Free Amazon Web Services SAA-C03 Practice Exam with Questions & Answers | Set: 4

Question 46

A company uses an AWS Transfer for SFTP public server endpoint and Amazon S3 storage to host large datasets for its customers. The company provides customers with SSH private keys to authenticate and download their datasets. The Transfer for SFTP server is configured with structured logging that is saved to an S3 bucket. The company wants to charge customers based on their monthly data download usage.

Which solution will meet these requirements?

Options:
A.

Configure VPC Flow Logs to write to a new S3 bucket. Run monthly queries on the flow logs to identify customer usage and calculate cost. Add the charges to the customers' monthly bills.

B.

Each month, use AWS Cost Explorer to examine the costs for Transfer for SFTP and obtain a breakdown by customer. Add the charges to the customers' monthly bills.

C.

Enable requester pays on the S3 bucket that hosts the datasets. Allocate the charges to each customer based on the customer's requests.

D.

Run Amazon Athena queries on the logging S3 bucket monthly to identify customer usage and calculate costs. Add the charges to the customers' monthly bills.
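
Option D relies on querying the Transfer Family structured logs with Amazon Athena. The following is a minimal sketch in Python, assuming an Athena table named transfer_logs has already been defined over the logging bucket and that the column and partition names (user, bytes_out, activity_type, month) match the actual log schema, which should be verified:

import boto3

athena = boto3.client("athena")

# Aggregate bytes downloaded per customer for one month. Table, database,
# column, and partition names are assumptions; adjust to the real schema.
QUERY = """
SELECT "user", SUM(CAST(bytes_out AS bigint)) AS bytes_downloaded
FROM transfer_logs
WHERE activity_type = 'READ'
  AND month = '2025-01'   -- assumed partition column
GROUP BY "user"
"""

response = athena.start_query_execution(
    QueryString=QUERY,
    QueryExecutionContext={"Database": "sftp_billing"},  # assumed database
    ResultConfiguration={"OutputLocation": "s3://athena-results-bucket/"},  # assumed bucket
)
print(response["QueryExecutionId"])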

Question 47

A company hosts an application in an Amazon EC2 Auto Scaling group. The company has observed that during periods of high demand, new instances take too long to join the Auto Scaling group and serve the increased demand. The company determines that the root cause of the issue is the long boot time of the instances in the Auto Scaling group. The company needs to reduce the time required to launch new instances to respond to demand.

Which solution will meet this requirement?

Options:
A.

Increase the maximum capacity of the Auto Scaling group by 50%.

B.

Create a warm pool for the Auto Scaling group. Use the default specification for the warm pool size.

C.

Increase the health check grace period for the Auto Scaling group by 50%.

D.

Create a scheduled scaling action. Set the desired capacity equal to the maximum capacity of the Auto Scaling group.
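
Option B maps to the Auto Scaling warm pools feature, which keeps pre-initialized instances ready so scale-out skips the long boot. A minimal sketch with an assumed group name; omitting the sizing parameters accepts the default warm pool size:

import boto3

autoscaling = boto3.client("autoscaling")

# Keep pre-initialized instances in a stopped state so they can join the
# group quickly when demand spikes. No MinSize or MaxGroupPreparedCapacity
# is given, so the default warm pool sizing applies.
autoscaling.put_warm_pool(
    AutoScalingGroupName="app-asg",  # assumed Auto Scaling group name
    PoolState="Stopped",
)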

Question 48

A company is developing a microservices-based application to manage the company's delivery operations. The application consists of microservices that process orders, manage a fleet of delivery vehicles, and optimize delivery routes.

The microservices must be able to scale independently and must be able to handle bursts of traffic without any data loss.

Which solution will meet these requirements with the LEAST operational overhead?

Options:
A.

Use Amazon API Gateway REST APIs to establish communication between microservices. Deploy the application on Amazon EC2 instances in Auto Scaling groups.

B.

Use Amazon Simple Queue Service (Amazon SQS) to establish communication between microservices. Deploy the application on Amazon Elastic Container Service (Amazon ECS) containers on AWS Fargate.

C.

Use WebSocket-based communication between microservices. Deploy the application on Amazon EC2 instances in Auto Scaling groups.

D.

Use Amazon Simple Notification Service (Amazon SNS) to establish communication between microservices. Deploy the application on Amazon Elastic Container Service (Amazon ECS) containers on Amazon EC2 instances.
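
Option B decouples the microservices with a durable queue so traffic bursts are buffered rather than dropped. A minimal sketch of the producer and consumer sides, with an assumed queue name and message shape:

import boto3

sqs = boto3.client("sqs")
queue_url = sqs.create_queue(QueueName="order-events")["QueueUrl"]  # assumed name

# Producer microservice (e.g., order processing): enqueue work durably.
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody='{"order_id": "1234", "action": "plan_route"}',
)

# Consumer microservice (e.g., route optimization on Fargate): long-poll
# and delete each message only after it is processed, so nothing is lost.
resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=20)
for msg in resp.get("Messages", []):
    print("processing", msg["Body"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])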

Question 49

A company runs a mobile game app on AWS. The app stores data for every user session. The data updates frequently during a gaming session. The app stores up to 256 KB for each session. Sessions can last up to 48 hours.

The company wants to automate the deletion of expired session data. The company must be able to restore all session data automatically if necessary.

Which solution will meet these requirements?

Options:
A.

Use an Amazon DynamoDB table to store the session data. Enable point-in-time recovery (PITR) and TTL for the table. Select the corresponding attribute for TTL in the session data.

B.

Use an Amazon MemoryDB table to store the session data. Enable point-in-time recovery (PITR) and TTL for the table. Select the corresponding attribute for TTL in the session data.

C.

Store session data in an Amazon S3 bucket. Use the S3 Standard storage class. Enable S3 Versioning for the bucket. Create an S3 Lifecycle configuration to expire objects after 48 hours.

D.

Store session data in an Amazon S3 bucket. Use the S3 Intelligent-Tiering storage class. Enable S3 Versioning for the bucket. Create an S3 Lifecycle configuration to expire objects after 48 hours.
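
Option A combines DynamoDB TTL (automated expiry) with point-in-time recovery (restore). A minimal sketch with an assumed table named game-sessions and an assumed TTL attribute named expires_at:

import time
import boto3

dynamodb = boto3.client("dynamodb")
TABLE = "game-sessions"  # assumed table name

# TTL deletes items automatically once the epoch time in "expires_at" passes.
dynamodb.update_time_to_live(
    TableName=TABLE,
    TimeToLiveSpecification={"Enabled": True, "AttributeName": "expires_at"},
)

# PITR allows restoring the table to any point in the last 35 days.
dynamodb.update_continuous_backups(
    TableName=TABLE,
    PointInTimeRecoverySpecification={"PointInTimeRecoveryEnabled": True},
)

# A session item that expires 48 hours after it is written.
dynamodb.put_item(
    TableName=TABLE,
    Item={
        "session_id": {"S": "abc-123"},
        "state": {"S": "level-7-checkpoint"},
        "expires_at": {"N": str(int(time.time()) + 48 * 3600)},
    },
)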

Question 50

A company runs an application on Amazon EC2 instances. The instances need to access an Amazon RDS database by using specific credentials. The company uses AWS Secrets Manager to store the credentials that the EC2 instances must use.

Which solution will meet this requirement?

Options:
A.

Create an IAM role, and attach the role to each EC2 instance profile. Use an identity-based policy to grant the new IAM role access to the secret that contains the database credentials.

B.

Create an IAM user, and attach the user to each EC2 instance profile. Use a resource-based policy to grant the new IAM user access to the secret that contains the database credentials.

C.

Create a resource-based policy for the secret that contains the database credentials. Use EC2 Instance Connect to access the secret.

D.

Create an identity-based policy for the secret that contains the database credentials. Grant direct access to the EC2 instances.
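
For option A, the role in the instance profile carries an identity-based policy that allows reading only the specific secret. A minimal sketch; the role name, policy name, and secret ARN are assumptions:

import json
import boto3

iam = boto3.client("iam")

# Identity-based policy scoped to a single secret.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "secretsmanager:GetSecretValue",
            "Resource": "arn:aws:secretsmanager:us-east-1:111122223333:secret:rds-creds-AbCdEf",
        }
    ],
}

iam.put_role_policy(
    RoleName="app-ec2-role",  # assumed role attached to the instance profile
    PolicyName="read-rds-secret",
    PolicyDocument=json.dumps(policy),
)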

Question 51

A company needs a solution to automate email ingestion. The company needs to automatically parse email messages, look for email attachments, and save any attachments to an Amazon S3 bucket in near real time. Email volume varies significantly from day to day.

Which solution will meet these requirements?

Options:
A.

Set up email receiving in Amazon Simple Email Service (Amazon SES). Create a rule set and a receipt rule. Create an AWS Lambda function that Amazon SES can invoke to process the email bodies and attachments.

B.

Set up email content filtering in Amazon Simple Email Service (Amazon SES). Create a content filtering rule based on sender, recipient, message body, and attachments.

C.

Set up email receiving in Amazon Simple Email Service (Amazon SES). Configure Amazon SES and S3 Event Notifications to process the email bodies and attachments.

D.

Create an AWS Lambda function to process the email bodies and attachments. Use Amazon EventBridge to invoke the Lambda function. Configure an EventBridge rule to listen for incoming emails.
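
Option A pairs an SES receipt rule with a Lambda function. The Lambda event itself carries only message metadata, so this sketch assumes the common pattern in which a preceding S3 action in the same receipt rule stores the raw MIME message; both bucket names are assumptions:

import email
import boto3

s3 = boto3.client("s3")
RAW_BUCKET = "inbound-email-raw"     # assumed bucket written by the receipt rule
ATTACH_BUCKET = "email-attachments"  # assumed destination bucket

def handler(event, context):
    # SES passes the message ID; the raw message is fetched from S3,
    # where the S3 action stores it under that ID.
    message_id = event["Records"][0]["ses"]["mail"]["messageId"]
    raw = s3.get_object(Bucket=RAW_BUCKET, Key=message_id)["Body"].read()
    msg = email.message_from_bytes(raw)
    # Any MIME part that carries a filename is treated as an attachment.
    for part in msg.walk():
        filename = part.get_filename()
        if filename:
            s3.put_object(
                Bucket=ATTACH_BUCKET,
                Key=f"{message_id}/{filename}",
                Body=part.get_payload(decode=True),
            )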

Question 52

A company has established a new AWS account. The account is newly provisioned and no changes have been made to the default settings. The company is concerned about the security of the AWS account root user.

What should be done to secure the root user?

Options:
A.

Create IAM users for daily administrative tasks. Disable the root user.

B.

Create IAM users for daily administrative tasks. Enable multi-factor authentication on the root user.

C.

Generate an access key for the root user. Use the access key for daily administration tasks instead of the AWS Management Console.

D.

Provide the root user credentials to the most senior solutions architect. Have the solutions architect use the root user for daily administration tasks.

Question 53

A company has built an application that uses an Amazon Simple Queue Service (Amazon SQS) standard queue and an AWS Lambda function. The Lambda function writes messages to the SQS queue. The company needs a solution to ensure that the consumer of the SQS queue never receives duplicate messages.

Which solution will meet this requirement with the FEWEST changes to the current architecture?

Options:
A.

Modify the SQS queue to enable long polling for the queue.

B.

Delete the existing SQS queue. Recreate the queue as a FIFO queue. Enable content-based deduplication for the queue.

C.

Modify the SQS queue to enable content-based deduplication for the queue.

D.

Delete the SQS queue. Create an Amazon MQ message broker. Configure the broker to deduplicate messages.
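
Option B trades the standard queue for a FIFO queue, where content-based deduplication hashes the message body and suppresses duplicates sent within the 5-minute deduplication interval. A minimal sketch with an assumed queue name:

import boto3

sqs = boto3.client("sqs")

# FIFO queue names must end in ".fifo".
queue_url = sqs.create_queue(
    QueueName="orders.fifo",  # assumed name
    Attributes={
        "FifoQueue": "true",
        "ContentBasedDeduplication": "true",
    },
)["QueueUrl"]

sqs.send_message(
    QueueUrl=queue_url,
    MessageBody='{"order_id": "1234"}',
    MessageGroupId="orders",  # required on FIFO queues
)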

Question 54

A company hosts an application on AWS that stores files that users need to access. The application uses two Amazon EC2 instances. One instance is in Availability Zone A, and the second instance is in Availability Zone B. Both instances use Amazon Elastic Block Store (Amazon EBS) volumes. Users must be able to access the files at any time without delay.

Users report that the two instances occasionally contain different versions of the same file. Users occasionally receive HTTP 404 errors when they try to download files. The company must address the customer issues. The company cannot make changes to the application code.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:
A.

Run the robocopy command on one of the EC2 instances on a schedule to copy files from the Availability Zone A instance to the Availability Zone B instance.

B.

Configure the application to store the files on both EBS volumes each time a user writes or updates a file.

C.

Mount an Amazon Elastic File System (Amazon EFS) file system to the EC2 instances. Copy the files from the EBS volumes to the EFS file system. Configure the application to store files in the EFS file system.

D.

Create an EC2 instance profile that allows the instance in Availability Zone A to access the S3 bucket. Re-associate the instance profile to the instance in Availability Zone B when needed.
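
Option C replaces the per-instance EBS volumes with one shared EFS file system that instances in both Availability Zones mount, so every user sees the same files. A minimal provisioning sketch; the subnet and security group IDs are assumptions, and the security group must allow NFS (TCP 2049):

import boto3

efs = boto3.client("efs")

# One regional file system shared by both instances.
fs_id = efs.create_file_system(
    CreationToken="shared-app-files",
    Encrypted=True,
)["FileSystemId"]

# A mount target in each Availability Zone's subnet.
# (A real script would wait for the file system to become available first.)
for subnet_id in ["subnet-az-a-example", "subnet-az-b-example"]:
    efs.create_mount_target(
        FileSystemId=fs_id,
        SubnetId=subnet_id,
        SecurityGroups=["sg-0123456789abcdef0"],
    )

# Each instance then mounts the same file system, for example with
# amazon-efs-utils:  sudo mount -t efs <fs_id>:/ /mnt/app-files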

Question 55

A company runs a critical public application on Amazon Elastic Kubernetes Service (Amazon EKS) clusters. The application has a microservices architecture. The company needs to implement a solution that collects, aggregates, and summarizes metrics and logs from the application in a centralized location.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:
A.

Run the Amazon CloudWatch agent in the existing EKS cluster. Use a CloudWatch dashboard to view the metrics and logs.

B.

Configure a data stream in Amazon Kinesis Data Streams. Use Amazon Kinesis Data Firehose to read events and to deliver the events to an Amazon S3 bucket. Use Amazon Athena to view the events.

C.

Configure AWS CloudTrail to capture data events. Use Amazon OpenSearch Service to query CloudTrail.

D.

Configure Amazon CloudWatch Container Insights in the existing EKS cluster. Use a CloudWatch dashboard to view the metrics and logs.
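
Option D corresponds to CloudWatch Container Insights, commonly enabled on EKS by installing the CloudWatch Observability add-on, which deploys the CloudWatch agent and Fluent Bit to collect metrics and logs from every node. A minimal sketch with an assumed cluster name; the node role also needs the CloudWatchAgentServerPolicy managed policy:

import boto3

eks = boto3.client("eks")

# Installs the CloudWatch agent and Fluent Bit across the cluster,
# enabling Container Insights metrics and centralized logs.
eks.create_addon(
    clusterName="prod-cluster",  # assumed cluster name
    addonName="amazon-cloudwatch-observability",
)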

Question 56

A mining company is using Amazon S3 as its data lake. The company wants to analyze the data collected by the sensors in its mines. A data pipeline is being built to capture data from the sensors, ingest the data into an S3 bucket, and convert the data to Apache Parquet format. The data must be processed in near real time. The data will be used for on-demand queries with Amazon Athena.

Which solution will meet these requirements?

Options:
A.

Use Amazon Data Firehose to invoke an AWS Lambda function that converts the data to Parquet format and stores the data in Amazon S3.

B.

Use Amazon Kinesis Data Streams to invoke an AWS Lambda function that converts the data to Parquet format and stores the data in Amazon S3.

C.

Use AWS DataSync to invoke an AWS Lambda function that converts the data to Parquet format and stores the data in Amazon S3.

D.

Use Amazon Simple Queue Service (Amazon SQS) to stream data directly to an AWS Glue job that converts the data to Parquet format and stores the data in Amazon S3.
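
Option A describes a Firehose stream whose transformation Lambda produces Parquet. A rough sketch of such a handler, assuming JSON sensor records and that pyarrow is packaged in a Lambda layer; note that Firehose record format conversion can also produce Parquet natively, without any Lambda:

import base64
import io
import json

import boto3
import pyarrow as pa           # assumed available via a Lambda layer
import pyarrow.parquet as pq

s3 = boto3.client("s3")
BUCKET = "mine-sensor-datalake"  # assumed bucket

def handler(event, context):
    # Decode the batch of base64-encoded JSON records from Firehose.
    rows = [json.loads(base64.b64decode(r["data"])) for r in event["records"]]

    # Write the batch as one Parquet object for Athena to query.
    table = pa.Table.from_pylist(rows)
    buf = io.BytesIO()
    pq.write_table(table, buf)
    s3.put_object(
        Bucket=BUCKET,
        Key=f"parquet/{context.aws_request_id}.parquet",
        Body=buf.getvalue(),
    )

    # Report the records as handled so Firehose does not deliver them itself.
    return {
        "records": [
            {"recordId": r["recordId"], "result": "Dropped"} for r in event["records"]
        ]
    }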

Question 57

A company hosts a web application on an on-premises server that processes incoming requests. Processing time for each request varies from 5 minutes to 20 minutes.

The number of requests is growing. The company wants to move the application to AWS. The company wants to update the architecture to scale automatically.

Which solution will meet these requirements?

Options:
A.

Convert the application to a microservices architecture that uses containers. Use Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type to run the containerized web application. Configure Service Auto Scaling. Use an Application Load Balancer to distribute incoming requests.

B.

Create an Amazon EC2 instance that has sufficient CPU and RAM capacity to run the application. Create metrics to track usage. Create alarms to notify the company when usage exceeds a specified threshold. Replace the EC2 instance with a larger instance size in the same family when usage is too high.

C.

Refactor the web application to use multiple AWS Lambda functions. Use an Amazon API Gateway REST API as an entry point to the Lambda functions.

D.

Refactor the web application to use a single AWS Lambda function. Use an Amazon API Gateway HTTP API as an entry point to the Lambda function.
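
Option A relies on ECS Service Auto Scaling, which is configured through Application Auto Scaling. A minimal target-tracking sketch with assumed cluster and service names:

import boto3

aas = boto3.client("application-autoscaling")
RESOURCE = "service/web-cluster/web-service"  # assumed cluster/service names

# Make the service's desired task count scalable between 2 and 20 tasks.
aas.register_scalable_target(
    ServiceNamespace="ecs",
    ResourceId=RESOURCE,
    ScalableDimension="ecs:service:DesiredCount",
    MinCapacity=2,
    MaxCapacity=20,
)

# Target tracking keeps average CPU across tasks near 60 percent, adding
# tasks during bursts of long-running requests and removing them after.
aas.put_scaling_policy(
    PolicyName="cpu-target-tracking",
    ServiceNamespace="ecs",
    ResourceId=RESOURCE,
    ScalableDimension="ecs:service:DesiredCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 60.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ECSServiceAverageCPUUtilization"
        },
    },
)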

Question 58

A company needs an automated solution to detect cryptocurrency mining activity on Amazon EC2 instances. The solution must automatically isolate any identified EC2 instances for forensic analysis.

Which solution will meet these requirements?

Options:
A.

Create an Amazon EventBridge rule that runs when Amazon GuardDuty detects cryptocurrency mining activity. Configure the rule to invoke an AWS Lambda function to isolate the identified EC2 instances.

B.

Create an AWS Security Hub custom action that runs when Amazon GuardDuty detects cryptocurrency mining activity. Configure the custom action to invoke an AWS Lambda function to isolate the identified EC2 instances.

C.

Create an Amazon Inspector rule that runs when Amazon GuardDuty detects cryptocurrency mining activity. Configure the rule to invoke an AWS Lambda function to isolate the identified EC2 instances.

D.

Create an AWS Config custom rule that runs when AWS Config detects cryptocurrency mining activity. Configure the rule to invoke an AWS Lambda function to isolate the identified EC2 instances.
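
Option A wires a GuardDuty finding to remediation through EventBridge. A minimal sketch: the rule matches cryptocurrency finding types by prefix, and the Lambda handler swaps the instance onto an assumed no-traffic quarantine security group (the put_targets call linking the rule to the function is omitted):

import json
import boto3

events = boto3.client("events")
ec2 = boto3.client("ec2")

# One-time setup: match GuardDuty findings whose type starts with
# "CryptoCurrency:".
events.put_rule(
    Name="guardduty-cryptomining",
    EventPattern=json.dumps({
        "source": ["aws.guardduty"],
        "detail-type": ["GuardDuty Finding"],
        "detail": {"type": [{"prefix": "CryptoCurrency:"}]},
    }),
)

# Lambda handler (deployed separately): isolate the flagged instance for
# forensic analysis by replacing its security groups with a quarantine
# group (assumed to exist) that allows no inbound or outbound traffic.
def handler(event, context):
    instance_id = event["detail"]["resource"]["instanceDetails"]["instanceId"]
    ec2.modify_instance_attribute(
        InstanceId=instance_id,
        Groups=["sg-quarantine-example"],
    )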

Question 59

A company generates approximately 20 GB of data multiple times each day. The company uses AWS DataSync to copy all data from on-premises storage to Amazon S3 every 6 hours for further processing.

The analytics team wants to modify the copy process to copy only data relevant to the analytics team and ignore the rest of the data. The team wants to copy data as soon as possible and receive a notification when the copy process is finished.

Which combination of steps will meet these requirements MOST cost-effectively? (Select THREE.)

Options:
A.

Modify the data generation process on-premises to create a manifest file at the end of the copy process with the names of the objects to be copied to Amazon S3. Create a custom script to upload the manifest file to an S3 bucket.

B.

Modify the data generation process on-premises to create a manifest file at the end of the copy process with the names of the objects to be copied to Amazon S3. Create an AWS Lambda function to load the manifest file data into an Amazon DynamoDB table.

C.

Create an AWS Lambda function that Amazon EventBridge invokes when the manifest file is loaded into Amazon DynamoDB. Configure the Lambda function to copy the data from on-premises storage to the S3 bucket that uses the manifest file.

D.

Create an AWS Lambda function that an S3 Event Notification invokes when the manifest file is uploaded. Configure the Lambda function to invoke the DataSync task by calling the StartTaskExecution API action with a manifest.

E.

Create an Amazon Simple Notification Service (Amazon SNS) topic. Create an Amazon EventBridge rule to send an email notification to the SNS topic when the DataSync task execution status changes to SUCCESS or to ERROR.

F.

Create an Amazon Simple Notification Service (Amazon SNS) topic. Create an AWS Lambda function to send an email notification to the SNS topic when the DataSync task execution status changes to SUCCESS or to ERROR.
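
Option D revolves around DataSync's manifest support on StartTaskExecution: the manifest lands in S3, an S3 Event Notification invokes a Lambda function, and that function starts the DataSync task with the manifest. A rough sketch of the Lambda, with assumed ARNs; the ManifestConfig shape follows the DataSync StartTaskExecution API but should be checked against current boto3 documentation:

import boto3

datasync = boto3.client("datasync")

TASK_ARN = "arn:aws:datasync:us-east-1:111122223333:task/task-0abc"      # assumed
MANIFEST_ROLE = "arn:aws:iam::111122223333:role/datasync-manifest-read"  # assumed

def handler(event, context):
    # Fired by the S3 Event Notification when the manifest file is uploaded.
    record = event["Records"][0]["s3"]
    datasync.start_task_execution(
        TaskArn=TASK_ARN,
        ManifestConfig={
            "Action": "TRANSFER",
            "Format": "CSV",
            "Source": {
                "S3": {
                    "ManifestObjectPath": record["object"]["key"],
                    "S3BucketArn": f"arn:aws:s3:::{record['bucket']['name']}",
                    "BucketAccessRoleArn": MANIFEST_ROLE,
                }
            },
        },
    )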

Question 60

A company has separate AWS accounts for its finance, data analytics, and development departments. Because of cost and security concerns, the company wants to control which services each AWS account can use.

Which solution will meet these requirements with the LEAST operational overhead?

Options:
A.

Use AWS Systems Manager templates to control which AWS services each department can use.

B.

Create organizational units (OUs) for each department in AWS Organizations. Attach service control policies (SCPs) to the OUs.

C.

Use AWS CloudFormation to automatically provision only the AWS services that each department can use.

D.

Set up a list of products in AWS Service Catalog in the AWS accounts to manage and control the usage of specific AWS services.
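
Option B uses service control policies in AWS Organizations to set guardrails per department. A minimal sketch that creates a deny-by-default SCP and attaches it to an OU; the service list and OU ID are illustrative assumptions:

import json
import boto3

orgs = boto3.client("organizations")

# Deny every action except those for the services this OU may use.
scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Deny",
            "NotAction": ["s3:*", "athena:*", "glue:*"],  # illustrative allow-list
            "Resource": "*",
        }
    ],
}

policy = orgs.create_policy(
    Name="analytics-allowed-services",
    Description="Limit the data analytics OU to approved services",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(scp),
)

orgs.attach_policy(
    PolicyId=policy["Policy"]["PolicySummary"]["Id"],
    TargetId="ou-ab12-analytics0",  # assumed OU ID for the analytics department
)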