I ran into a little issue today parsing an S3 SQS event that was sent to Lambda via an SQS trigger. I assumed the incoming event to Lambda was 100% of type dict, so I figured I could pull the bucket name and key using this syntax.
bucketname = event['Records'][0]['body']['Records'][0]['s3']['bucket']['name']
objectname = event['Records'][0]['body']['Records'][0]['s3']['object']['key']
As it turns out, the incoming event is not 100% of type dict, and I got the following error.
string indices must be integers
The value of the body key (['Records'][0]['body']) is of type str, a JSON string, not a dict, so it needs to be parsed with json.loads first. (Note that both Records keys hold lists, hence the [0] indexing.) Below is the updated code to pull the bucket name and key from the incoming event.

import json

event_body_records_string = event['Records'][0]['body']
event_body_records_dict = json.loads(event_body_records_string)
bucketname = event_body_records_dict['Records'][0]['s3']['bucket']['name']
objectname = event_body_records_dict['Records'][0]['s3']['object']['key']
Now everything works out great!!!
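Putting it together, here's a minimal sketch of a handler that loops over both layers of Records instead of hard-coding [0]. The sample event at the bottom (bucket and key names included) is made up purely for illustration:

```python
import json

def handler(event, context):
    # Outer loop: SQS records delivered to Lambda (a list).
    # Inner loop: S3 records wrapped inside each JSON-string body.
    results = []
    for sqs_record in event['Records']:
        body = json.loads(sqs_record['body'])   # body is a JSON string
        for s3_record in body['Records']:
            bucket = s3_record['s3']['bucket']['name']
            key = s3_record['s3']['object']['key']
            results.append((bucket, key))
    return results

# Illustrative event: one SQS record wrapping one S3 record.
sample = {'Records': [{'body': json.dumps(
    {'Records': [{'s3': {'bucket': {'name': 'my-bucket'},
                         'object': {'key': 'photo.jpg'}}}]})}]}
print(handler(sample, None))  # [('my-bucket', 'photo.jpg')]
```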
When using the “receive_message” Python Boto3 function to pull message(s) from an SQS queue, you will always get a response back when the command completes. However, how do you determine if the response you got back actually contains a valid message?
response = sqs.receive_message(QueueUrl=queue_url)

if 'Messages' in response:
    print("Message on the queue to process")
else:
    print("No messages on the queue to process")
That's about it!!
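Since it boils down to a key check, you can see the behavior without touching AWS at all. Here's a tiny sketch with simulated response dicts (the response shapes are mocked; only the 'Messages' key matters):

```python
def has_messages(response):
    # SQS omits the 'Messages' key entirely when the queue is empty,
    # so a membership test is the reliable check.
    return 'Messages' in response

# Simulated responses, no AWS call needed for illustration:
empty = {'ResponseMetadata': {'HTTPStatusCode': 200}}
full = {'Messages': [{'Body': 'hello'}],
        'ResponseMetadata': {'HTTPStatusCode': 200}}
print(has_messages(empty))  # False
print(has_messages(full))   # True
```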
It's time to study for AWS Certification #3. I took a little time off, but no more! This time I am going for the SysOps Administrator – Associate certificate. I guess it's a little harder than the other two certifications I passed, but still within range!
What's hard about some of these certifications is that you don't actually work with all of the covered material day-to-day. Spinning up EC2 instances and creating security groups is pretty standard, but when it comes to things like networking, those tasks are not typically done by me.
So my plan is going to be the following to pass the exam:
- Listen to a training course from acloud.guru on the way to and from work. (Long commute both ways)
- Practice heavily in AWS on all the relevant topics.
- Use the practice exam voucher I received from passing previous certifications to pay for an official AWS SysOps practice exam.
- Rely on personal experience.
Passed the AWS Certified Solutions Architect – Associate (Released February 2018) exam today!
These exams are both fun and a little stressful, but you do learn a lot while studying!!!
Will post the overall score results when I get them!
I’m getting ready to take the AWS Solutions Architect Associate 2018 test. Below are some final items I need to review before the exam.
- Redshift – https://aws.amazon.com/redshift/faqs/
- AWS Config – https://aws.amazon.com/config/faq/
- EBS – https://aws.amazon.com/ebs/faqs/
- EC2 Types – https://aws.amazon.com/ec2/instance-types/
- EBS Volume Types – https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EBSVolumeTypes.html
- ECS – https://aws.amazon.com/ecs/faqs/
- ECR – https://aws.amazon.com/ecr/faqs/
- S3 Perf – https://docs.aws.amazon.com/AmazonS3/latest/dev/request-rate-perf-considerations.html
- S3 Meta – https://aws.amazon.com/blogs/big-data/building-and-maintaining-an-amazon-s3-metadata-index-without-servers/
Passed the AWS Certified Developer – Associate exam! My broad study plan worked great!
Below are my results.
Overall Score: 96%

Topic Level Scoring:
- 1.0 AWS Fundamentals: 100%
- 2.0 Designing and Developing: 100%
- 3.0 Deployment and Security: 87%
- 4.0 Debugging: 100%
Next step is to attempt the AWS Certified Solutions Architect – Associate exam.
Below is my plan to obtain the AWS Developer Certification.
For each of the following areas listed further down, I am trying to do the following:
- Read the FAQs
- Practice in the console
- Practice with the CLI and understand the functions/parameters
- Review all HTTP codes
- Review all defaults and limits
- Review uniqueness of each area
Here are the areas I am covering in preparation for the exam.
- Route 53
- Elastic Beanstalk
- API Gateway
- Storage Gateway
The exam is only 55 questions, so I'm not sure how in depth the exam will go on each of these. Regardless, it's good to review all of the areas!
S3 Consistency Model
- Puts (New record) = Read-after-write consistency model
- Updates and Deletes = Eventual consistency model
DynamoDB Consistency Model
- Write = Eventual consistency model
- Read = Eventual consistency model
- Read (optional) = Strong consistency model
- Read-after-write consistency model = New objects should be available without delays to clients.
- Eventual consistency model = “Eventually” all access attempts to a particular item will return the last updated value. There is potential here for stale or old data reads while data replication occurs.
- Strong consistency model = All access attempts (e.g. parallel) to a particular item return the same unique state. Old/stale data reads are avoided, but it will cost you more.
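A toy model (not AWS code, just an illustration of the idea) makes the difference concrete: a lagging replica serves the eventually consistent reads, while the primary serves the strong ones:

```python
# Toy two-copy store: writes land on the primary; the replica only
# catches up when replicate() runs, mimicking replication lag.
class ToyStore:
    def __init__(self):
        self.primary = {}
        self.replica = {}

    def put(self, key, value):
        self.primary[key] = value       # write hits the primary first

    def replicate(self):
        self.replica.update(self.primary)

    def read(self, key, consistent=False):
        # Strong read checks the primary; eventual read may be stale.
        source = self.primary if consistent else self.replica
        return source.get(key)

store = ToyStore()
store.put('color', 'blue')
print(store.read('color'))                   # None (stale, not replicated yet)
print(store.read('color', consistent=True))  # blue (strong read, latest write)
store.replicate()
print(store.read('color'))                   # blue (eventually consistent)
```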
Good information to know for any AWS certification tests…. 🙂