DVA-C01 : AWS Certified Developer Associate : Part 14
-
A Developer has discovered that an application responsible for processing messages in an Amazon SQS queue is routinely falling behind. The application is capable of processing multiple messages in one execution, but is only receiving one message at a time.
What should the Developer do to increase the number of messages the application receives?
- Call the ChangeMessageVisibility API for the queue and set MaxNumberOfMessages to a value greater than the default of 1.
- Call the AddPermission API to set MaxNumberOfMessages for the ReceiveMessage action to a value greater than the default of 1.
- Call the ReceiveMessage API to set MaxNumberOfMessages to a value greater than the default of 1.
- Call the SetQueueAttributes API for the queue and set MaxNumberOfMessages to a value greater than the default of 1.
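Batch size is set per ReceiveMessage call, not as a queue attribute. A minimal boto3 sketch (the queue URL and helper names are illustrative):

```python
def build_receive_params(queue_url: str) -> dict:
    """Parameters for one ReceiveMessage call requesting a batch of messages."""
    return {
        "QueueUrl": queue_url,
        "MaxNumberOfMessages": 10,  # default is 1; the maximum is 10
        "WaitTimeSeconds": 20,      # long polling cuts down on empty responses
    }

def receive_batch(queue_url: str) -> list:
    import boto3  # AWS SDK for Python
    sqs = boto3.client("sqs")
    resp = sqs.receive_message(**build_receive_params(queue_url))
    return resp.get("Messages", [])
```

Each call can then return up to 10 messages, which the application processes in one execution.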
-
A Developer is investigating an application’s performance issues. The application consists of hundreds of microservices, and a single API call can potentially have a deep call stack. The Developer must isolate the component that is causing the issue.
Which AWS service or feature should the Developer use to gather information about what is happening and isolate the fault?
- AWS X-Ray
- VPC Flow Logs
- Amazon GuardDuty
- Amazon Macie
-
A Company runs continuous integration/continuous delivery (CI/CD) pipelines for its application on AWS CodePipeline. A Developer must write unit tests and run them as part of the pipelines before staging the artifacts for testing.
How should the Developer incorporate unit tests as part of CI/CD pipelines?
- Create a separate CodePipeline pipeline to run unit tests
- Update the AWS CodeBuild specification to include a phase for running unit tests
- Install the AWS CodeDeploy agent on an Amazon EC2 instance to run unit tests
- Create a testing branch in AWS CodeCommit to run unit tests
-
An application has the following requirements:
– Performance efficiency of seconds with up to a minute of latency.
– The data storage size may grow up to thousands of terabytes.
– Per-message sizes may vary between 100 KB and 100 MB.
– Data can be stored as key/value stores supporting eventual consistency.
What is the MOST cost-effective AWS service to meet these requirements?
- Amazon DynamoDB
- Amazon S3
- Amazon RDS (with a MySQL engine)
- Amazon ElastiCache
-
An application is experiencing performance issues based on increased demand. This increased demand is on read-only historical records pulled from an Amazon RDS-hosted database with custom views and queries. A Developer must improve performance without changing the database structure.
Which approach will improve performance and MINIMIZE management overhead?
- Deploy Amazon DynamoDB, move all the data, and point to DynamoDB.
- Deploy Amazon ElastiCache for Redis and cache the data for the application.
- Deploy Memcached on Amazon EC2 and cache the data for the application.
- Deploy Amazon DynamoDB Accelerator (DAX) on Amazon RDS to improve cache performance.
-
A Developer has an Amazon DynamoDB table that must be in provisioned mode to comply with user requirements. The application needs to support the following:
– Average item size: 10 KB
– Item reads each second: 10 strongly consistent
– Item writes each second: 2 transactional
Which read and write capacity cost-effectively meets these requirements?
- Read 10; write 2
- Read 30; write 40
- Use on-demand scaling
- Read 300; write 400
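The sizing can be checked against DynamoDB's capacity-unit rules: one read capacity unit covers a strongly consistent read of up to 4 KB per second, one write capacity unit covers a write of up to 1 KB per second, and transactional writes consume double. A quick Python check:

```python
import math

def read_capacity_units(item_size_kb: float, reads_per_sec: int) -> int:
    # One RCU = one strongly consistent read per second of an item up to 4 KB.
    return math.ceil(item_size_kb / 4) * reads_per_sec

def write_capacity_units(item_size_kb: float, writes_per_sec: int,
                         transactional: bool = True) -> int:
    # One WCU = one write per second of an item up to 1 KB;
    # transactional writes cost twice as many WCUs.
    units = math.ceil(item_size_kb / 1) * writes_per_sec
    return units * 2 if transactional else units
```

For a 10 KB item, 10 strongly consistent reads need ceil(10/4) × 10 = 30 RCUs, and 2 transactional writes need 10 × 2 × 2 = 40 WCUs.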
-
A company wants to containerize an existing three-tier web application and deploy it to Amazon ECS Fargate. The application is using session data to keep track of user activities.
Which approach would provide the BEST user experience?
- Provision a Redis cluster in Amazon ElastiCache and save the session data in the cluster.
- Create a session table in Amazon Redshift and save the session data in the database table.
- Enable session stickiness in the existing Network Load Balancer and manage the session data in the container.
- Use an Amazon S3 bucket as data store and save the session data in the bucket.
-
An application is using a single-node Amazon ElastiCache for Redis instance to improve read performance. Over time, demand for the application has increased exponentially, which has increased the load on the ElastiCache instance. It is critical that this cache layer handles the load and is resilient in case of node failures.
What can the Developer do to address the load and resiliency requirements?
- Add a read replica instance.
- Migrate to a Memcached cluster.
- Migrate to an Amazon Elasticsearch Service cluster.
- Vertically scale the ElastiCache instance.
-
A Developer is designing an AWS Lambda function that creates temporary files smaller than 10 MB during execution. The temporary files will be accessed and modified multiple times during execution. The Developer has no need to save or retrieve these files in the future.
Where should the temporary files be stored?
- the /tmp directory
- Amazon EFS
- Amazon EBS
- Amazon S3
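Lambda's /tmp directory is the function's ephemeral local storage (512 MB by default), suited to scratch files that are read and rewritten within a single invocation. A minimal handler sketch:

```python
import os

def lambda_handler(event, context):
    # /tmp is writable ephemeral storage local to the execution environment.
    path = "/tmp/scratch.txt"
    with open(path, "w") as f:
        f.write("first pass")
    with open(path, "a") as f:   # the file can be modified repeatedly
        f.write(" / second pass")
    with open(path) as f:
        data = f.read()
    os.remove(path)  # optional; /tmp is discarded when the environment is recycled
    return data
```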
-
A Developer is writing an application that runs on Amazon EC2 instances in an Auto Scaling group. The application data is stored in an Amazon DynamoDB table and records are constantly updated by all instances. An instance sometimes retrieves old data. The Developer wants to correct this by making sure the reads are strongly consistent.
How can the Developer accomplish this?
- Set ConsistentRead to true when calling GetItem.
- Create a new DynamoDB Accelerator (DAX) table.
- Set Consistency to strong when calling UpdateTable.
- Use the GetShardIterator command.
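Strong consistency is requested per read operation rather than per table. A boto3 sketch (the table name and key are illustrative):

```python
def build_get_item_request(table_name: str, key: dict) -> dict:
    # ConsistentRead=True forces a strongly consistent read; the default
    # (False) is eventually consistent and may return stale data.
    return {
        "TableName": table_name,
        "Key": key,
        "ConsistentRead": True,
    }

def get_item_strong(table_name: str, key: dict) -> dict:
    import boto3  # AWS SDK for Python
    dynamodb = boto3.client("dynamodb")
    resp = dynamodb.get_item(**build_get_item_request(table_name, key))
    return resp.get("Item", {})
```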
-
A Developer has an application that must accept a large amount of incoming data streams and process the data before sending it to many downstream users.
Which serverless solution should the Developer use to meet these requirements?
- Amazon RDS MySQL stored procedure with AWS Lambda
- AWS Direct Connect with AWS Lambda
- Amazon Kinesis Data Streams with AWS Lambda
- Amazon EC2 bash script with AWS Lambda
-
A company is using Amazon API Gateway to manage its public-facing API. The CISO requires that the APIs be used by test account users only.
What is the MOST secure way to restrict API access to users of this particular AWS account?
- Client-side SSL certificates for authentication
- API Gateway resource policies
- Cross-origin resource sharing (CORS)
- Usage plans
-
A Developer is migrating existing applications to AWS. These applications use MongoDB as their primary data store, and they will be deployed to Amazon EC2 instances. Management requires that the Developer minimize changes to applications while using AWS services.
Which solution should the Developer use to host MongoDB in AWS?
- Install MongoDB on the same instance where the application is running.
- Deploy Amazon DocumentDB in MongoDB compatibility mode.
- Use Amazon API Gateway to translate API calls from MongoDB to Amazon DynamoDB.
- Replicate the existing MongoDB workload to Amazon DynamoDB.
-
A company requires that AWS Lambda functions written by Developers log errors so System Administrators can more effectively troubleshoot issues.
What should the Developers implement to meet this need?
- Publish errors to a dedicated Amazon SQS queue.
- Create an Amazon CloudWatch Events event trigger based on certain Lambda events.
- Report errors through logging statements in Lambda function code.
- Set up an Amazon SNS topic that sends logging statements upon failure.
-
A Developer needs to deploy an application running on AWS Fargate using Amazon ECS. The application has environment variables that must be passed to a container for the application to initialize.
How should the environment variables be passed to the container?
- Define an array that includes the environment variables under the environment parameter within the service definition.
- Define an array that includes the environment variables under the environment parameter within the task definition.
- Define an array that includes the environment variables under the entryPoint parameter within the task definition.
- Define an array that includes the environment variables under the entryPoint parameter within the service definition.
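The environment parameter sits in the container definition inside the task definition, as an array of name/value objects. A sketch of building one (the container name, image, and variables are illustrative); the resulting task definition would be registered with ecs.register_task_definition:

```python
def build_container_definition(name: str, image: str, env: dict) -> dict:
    # ECS expects environment variables as an array of
    # {"name": ..., "value": ...} objects under "environment".
    return {
        "name": name,
        "image": image,
        "environment": [{"name": k, "value": v} for k, v in env.items()],
    }
```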
-
A company’s fleet of Amazon EC2 instances receives data from millions of users through an API. The servers batch the data, add an object for each user, and upload the objects to an S3 bucket to ensure high access rates. The object attributes are Customer ID, Server ID, TS-Server (TimeStamp and Server ID), the size of the object, and a timestamp. A Developer wants to find all the objects for a given user collected during a specified time range.
How can the Developer meet this requirement using the S3 object creation events?
- Execute an AWS Lambda function in response to the S3 object creation events that creates an Amazon DynamoDB record for every object with the Customer ID as the partition key and the Server ID as the sort key. Retrieve all the records using the Customer ID and Server ID attributes.
- Execute an AWS Lambda function in response to the S3 object creation events that creates an Amazon Redshift record for every object with the Customer ID as the partition key and TS-Server as the sort key. Retrieve all the records using the Customer ID and TS-Server attributes.
- Execute an AWS Lambda function in response to the S3 object creation events that creates an Amazon DynamoDB record for every object with the Customer ID as the partition key and TS-Server as the sort key. Retrieve all the records using the Customer ID and TS-Server attributes.
- Execute an AWS Lambda function in response to the S3 object creation events that creates an Amazon Redshift record for every object with the Customer ID as the partition key and the Server ID as the sort key. Retrieve all the records using the Customer ID and Server ID attributes.
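With Customer ID as the partition key and TS-Server as the sort key, a single Query with a BETWEEN condition on the sort key returns one customer's objects for a time range. A boto3-style request sketch (attribute names are assumptions):

```python
def build_time_range_query(table: str, customer_id: str,
                           start_ts: str, end_ts: str) -> dict:
    # Query by partition key (CustomerId) and restrict the sort key
    # (TSServer) to the requested time range.
    return {
        "TableName": table,
        "KeyConditionExpression":
            "CustomerId = :cid AND TSServer BETWEEN :start AND :end",
        "ExpressionAttributeValues": {
            ":cid": {"S": customer_id},
            ":start": {"S": start_ts},
            ":end": {"S": end_ts},
        },
    }
```

The request dict would be passed to dynamodb.query; a Redshift cluster, by contrast, is not keyed this way and is not a fit for per-object lookups.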
-
A company is managing a NoSQL database on-premises to host a critical component of an application, which is starting to have scaling issues. The company wants to migrate the application to Amazon DynamoDB with the following considerations:
– Optimize frequent queries
– Reduce read latencies
– Plan for frequent queries on certain key attributes of the table
Which solution would help achieve these objectives?
- Create global secondary indexes on keys that are frequently queried. Add the necessary attributes into the indexes.
- Create local secondary indexes on keys that are frequently queried. DynamoDB will fetch needed attributes from the table.
- Create DynamoDB global tables to speed up query responses. Use a scan to fetch data from the table.
- Create an AWS Auto Scaling policy for the DynamoDB table.
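A global secondary index is added with UpdateTable; projecting the frequently read attributes into the index lets queries complete without fetching from the base table. A request sketch assuming an on-demand table (index and attribute names are illustrative):

```python
def build_gsi_update(table: str, index_name: str, pk: str,
                     projected: list) -> dict:
    # UpdateTable request that creates a GSI keyed on the frequently
    # queried attribute, projecting only the attributes queries need.
    return {
        "TableName": table,
        "AttributeDefinitions": [
            {"AttributeName": pk, "AttributeType": "S"},
        ],
        "GlobalSecondaryIndexUpdates": [{
            "Create": {
                "IndexName": index_name,
                "KeySchema": [{"AttributeName": pk, "KeyType": "HASH"}],
                "Projection": {
                    "ProjectionType": "INCLUDE",
                    "NonKeyAttributes": projected,
                },
            }
        }],
    }
```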
-
A developer is writing an application that will process data delivered into an Amazon S3 bucket. The data is delivered approximately 10 times a day, and the developer expects the data will be processed in less than 1 minute, on average.
How can the developer deploy and invoke the application with the lowest cost and lowest latency?
- Deploy the application as an AWS Lambda function and invoke it with an Amazon CloudWatch alarm triggered by an S3 object upload.
- Deploy the application as an AWS Lambda function and invoke it with an S3 event notification.
- Deploy the application as an AWS Lambda function and invoke it with an Amazon CloudWatch scheduled event.
- Deploy the application onto an Amazon EC2 instance and have it poll the S3 bucket for new objects.
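An S3 event notification invokes the function directly with the new object's location, so there is no polling or scheduling cost and no alarm latency. A minimal handler sketch:

```python
def lambda_handler(event, context):
    # S3 event notifications deliver a "Records" payload; each record
    # identifies the bucket and the key of the newly created object.
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        processed.append(f"s3://{bucket}/{key}")
    return processed
```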
-
A developer converted an existing program to an AWS Lambda function in the console. The program runs properly on a local laptop, but shows an “Unable to import module” error when tested in the Lambda console.
Which of the following can fix the error?
- Install the missing module and specify the current directory as the target. Create a ZIP file to include all files under the current directory, and upload the ZIP file.
- Install the missing module in a lib directory. Create a ZIP file to include all files under the lib directory, and upload the ZIP file as dependency file.
- In the Lambda code, invoke a Linux command to install the missing modules under the /usr/lib directory.
- In the Lambda console, create an LD_LIBRARY_PATH environment variable and specify the value for the system library path.
-
A front-end web application is using Amazon Cognito user pools to handle the user authentication flow. A developer is integrating Amazon DynamoDB into the application using the AWS SDK for JavaScript.
How would the developer securely call the API without exposing the access or secret keys?
- Configure Amazon Cognito identity pools and exchange the JSON Web Token (JWT) for temporary credentials.
- Run the web application in an Amazon EC2 instance with the instance profile configured.
- Hardcode the credentials, use Amazon S3 to host the web application, and enable server-side encryption.
- Use Amazon Cognito user pool JSON Web Tokens (JWTs) to access the DynamoDB APIs.
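The user pool's ID token (a JWT) is exchanged at an identity pool for temporary, scoped AWS credentials, which the SDK then uses to sign DynamoDB calls; no long-lived keys ever reach the browser. A Python sketch of the exchange (pool IDs and the token are placeholders; the AWS SDK for JavaScript flow is analogous):

```python
def build_logins_map(region: str, user_pool_id: str, id_token: str) -> dict:
    # Identity pools accept the user pool ID token under a provider key
    # of the form cognito-idp.<region>.amazonaws.com/<user_pool_id>.
    return {f"cognito-idp.{region}.amazonaws.com/{user_pool_id}": id_token}

def get_temporary_credentials(identity_pool_id: str, region: str,
                              user_pool_id: str, id_token: str) -> dict:
    import boto3  # AWS SDK for Python
    ci = boto3.client("cognito-identity", region_name=region)
    logins = build_logins_map(region, user_pool_id, id_token)
    identity_id = ci.get_id(IdentityPoolId=identity_pool_id,
                            Logins=logins)["IdentityId"]
    creds = ci.get_credentials_for_identity(IdentityId=identity_id,
                                            Logins=logins)
    return creds["Credentials"]  # AccessKeyId, SecretKey, SessionToken
```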