
Free Amazon Web Services SAA-C03 Practice Exam with Questions & Answers | Set: 5

Questions 201

A company has a workflow on AWS that processes uploaded documents to perform sentiment analysis of photos and text. The processing workflow calls multiple AWS services.

The company needs a solution to automate the processing workflow. The solution must handle any failed uploads.

Which solution will meet these requirements with the LEAST effort?

Options:
A.

Use S3 Event Notifications to publish events to an Amazon Simple Notification Service (Amazon SNS) topic. Deploy a web application on the Amazon ECS cluster to subscribe to the SNS topic and listen for events to orchestrate the processing workflow.

B.

Use S3 Event Notifications to publish events to an Amazon Simple Queue Service (Amazon SQS) queue. Configure long polling. Deploy an Amazon EC2 instance that runs a script to orchestrate the processing workflow.

C.

Use S3 Event Notifications to publish events to an Amazon Simple Queue Service (Amazon SQS) queue. Create an ECS cluster that scales based on the number of messages in the queue. Configure the cluster to orchestrate the processing workflow.

D.

Use S3 Event Notifications to invoke an Amazon EventBridge rule. Configure the rule to initiate an AWS Step Functions workflow that orchestrates the processing workflow.
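
For illustration, the pattern described in option D (S3 Event Notifications delivered to EventBridge, which starts a Step Functions workflow) can be wired up with a few API calls. This is a minimal boto3 sketch; the bucket name, state machine ARN, and IAM role ARN are hypothetical placeholders.

```python
import json
import boto3

s3 = boto3.client("s3")
events = boto3.client("events")

BUCKET = "document-uploads-example"  # hypothetical bucket
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:111122223333:stateMachine:doc-processing"  # hypothetical
EVENTS_ROLE_ARN = "arn:aws:iam::111122223333:role/eventbridge-stepfunctions-role"        # hypothetical

# 1. Turn on EventBridge delivery for the bucket's S3 event notifications.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={"EventBridgeConfiguration": {}},
)

# 2. Match "Object Created" events for the bucket.
events.put_rule(
    Name="document-upload-rule",
    EventPattern=json.dumps({
        "source": ["aws.s3"],
        "detail-type": ["Object Created"],
        "detail": {"bucket": {"name": [BUCKET]}},
    }),
    State="ENABLED",
)

# 3. Start the Step Functions workflow for each matching event.
events.put_targets(
    Rule="document-upload-rule",
    Targets=[{
        "Id": "start-doc-workflow",
        "Arn": STATE_MACHINE_ARN,
        "RoleArn": EVENTS_ROLE_ARN,  # role must allow states:StartExecution
    }],
)
```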

Questions 202

A company has an ordering application that stores customer information in Amazon RDS for MySQL. During regular business hours, employees run one-time queries for reporting purposes. Timeouts are occurring during order processing because the reporting queries take a long time to run. The company needs to eliminate the timeouts without preventing employees from performing queries.

Which solution will meet these requirements?

Options:
A.

Create a read replica. Move reporting queries to the read replica.

B.

Create a read replica. Distribute the ordering application to the primary DB instance and the read replica.

C.

Migrate the ordering application to Amazon DynamoDB with on-demand capacity.

D.

Schedule the reporting queries for non-peak hours.
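
As a reference for the read replica approach mentioned in options A and B, creating a replica of an existing RDS for MySQL instance is a single API call. A minimal boto3 sketch, with hypothetical instance identifiers and instance class:

```python
import boto3

rds = boto3.client("rds")

# Create a read replica of the primary ordering database (identifiers are hypothetical).
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="ordering-db-replica",
    SourceDBInstanceIdentifier="ordering-db-primary",
    DBInstanceClass="db.r6g.large",
)

# Reporting tools would then point at the replica's endpoint, which can be read
# back once the replica is available.
rds.get_waiter("db_instance_available").wait(DBInstanceIdentifier="ordering-db-replica")
endpoint = rds.describe_db_instances(
    DBInstanceIdentifier="ordering-db-replica"
)["DBInstances"][0]["Endpoint"]["Address"]
print(endpoint)
```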

Questions 203

Which solution will meet these requirements?

Options:
A.

Create an Amazon EventBridge rule that runs when Amazon GuardDuty detects cryptocurrency mining activity. Configure the rule to invoke an AWS Lambda function to isolate the identified EC2 instances.

B.

Create an AWS Security Hub custom action that runs when Amazon GuardDuty detects cryptocurrency mining activity. Configure the custom action to invoke an AWS Lambda function to isolate the identified EC2 instances.

C.

Create an Amazon Inspector rule that runs when Amazon GuardDuty detects cryptocurrency mining activity. Configure the rule to invoke an AWS Lambda function to isolate the identified EC2 instances.

D.

Create an AWS Config custom rule that runs when AWS Config detects cryptocurrency mining activity. Configure the rule to invoke an AWS Lambda function to isolate the identified EC2 instances.
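
For reference, the EventBridge-based pattern in option A can be expressed in a few calls. This is a hedged boto3 sketch; the Lambda function name and the finding-type prefix filter are illustrative assumptions.

```python
import json
import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

ISOLATE_FUNCTION_ARN = "arn:aws:lambda:us-east-1:111122223333:function:isolate-instance"  # hypothetical

# Match GuardDuty findings whose type starts with "CryptoCurrency".
rule_arn = events.put_rule(
    Name="guardduty-crypto-mining",
    EventPattern=json.dumps({
        "source": ["aws.guardduty"],
        "detail-type": ["GuardDuty Finding"],
        "detail": {"type": [{"prefix": "CryptoCurrency"}]},
    }),
    State="ENABLED",
)["RuleArn"]

# Send matching findings to the isolation Lambda function.
events.put_targets(
    Rule="guardduty-crypto-mining",
    Targets=[{"Id": "isolate-ec2", "Arn": ISOLATE_FUNCTION_ARN}],
)

# Allow EventBridge to invoke the function.
lambda_client.add_permission(
    FunctionName="isolate-instance",
    StatementId="allow-eventbridge-guardduty-rule",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule_arn,
)
```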

Questions 204

A company hosts an application that processes highly sensitive customer transactions on AWS. The application uses Amazon RDS as its database. The company manages its own encryption keys to secure the data in Amazon RDS.

The company needs to update the customer-managed encryption keys at least once each year.

Which solution will meet these requirements with the LEAST operational overhead?

Options:
A.

Set up automatic key rotation in AWS Key Management Service (AWS KMS) for the encryption keys.

B.

Configure AWS Key Management Service (AWS KMS) to alert the company to rotate the encryption keys annually.

C.

Schedule an AWS Lambda function to rotate the encryption keys annually.

D.

Create an AWS CloudFormation stack to run an AWS Lambda function that deploys new encryption keys once each year.
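
For context, enabling automatic rotation of a customer managed KMS key (the mechanism named in option A) is a single call. A minimal boto3 sketch with a hypothetical key ID:

```python
import boto3

kms = boto3.client("kms")

KEY_ID = "1234abcd-12ab-34cd-56ef-1234567890ab"  # hypothetical customer managed key

# Turn on automatic yearly rotation for the customer managed key.
kms.enable_key_rotation(KeyId=KEY_ID)

# Confirm that rotation is now enabled.
status = kms.get_key_rotation_status(KeyId=KEY_ID)
print(status["KeyRotationEnabled"])
```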

Questions 205

Which solution meets these requirements?

Options:
A.

Replace the NAT gateway with a NAT instance.

B.

Replace the NAT gateway with an internet gateway.

C.

Replace the NAT gateway with a gateway VPC endpoint.

D.

Replace the NAT gateway with an AWS Direct Connect connection.

Questions 206

An ecommerce company runs an application that uses an Amazon DynamoDB table in a single AWS Region. The company wants to deploy the application to a second Region. The company needs to support multi-active replication with low latency reads and writes to the existing DynamoDB table in both Regions.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:
A.

Create a DynamoDB global secondary index (GSI) for the existing table. Create a new table in the second Region. Convert the existing DynamoDB table to a global table. Specify the new table as the secondary table.

B.

Enable Amazon DynamoDB Streams for the existing table. Create a new table in the second Region. Create a new application that uses the DynamoDB Streams Kinesis Adapter and the Amazon Kinesis Client Library (KCL). Configure the new application to read data from the DynamoDB table in the first Region and to write the data to the new table in the second Region.

C.

Convert the existing DynamoDB table to a global table. Choose the appropriate second Region to achieve active-active write capabilities in both Regions.

D.

Enable Amazon DynamoDB Streams for the existing table. Create a new table in the second Region. Create an AWS Lambda function in the first Region that reads data from the table in the first Region and writes the data to the new table in the second Region. Set a DynamoDB stream as the input trigger for the Lambda function.
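
For reference, with DynamoDB global tables (version 2019.11.21), adding a replica Region to an existing table (the idea behind option C) is a single UpdateTable call. A minimal boto3 sketch with hypothetical table and Region names; it assumes DynamoDB Streams (NEW_AND_OLD_IMAGES) is already enabled on the table.

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Add a replica in the second Region to convert the table into a global table.
dynamodb.update_table(
    TableName="orders",  # hypothetical table name
    ReplicaUpdates=[
        {"Create": {"RegionName": "eu-west-1"}},
    ],
)

# The replica becomes usable once its status is ACTIVE.
table = dynamodb.describe_table(TableName="orders")["Table"]
for replica in table.get("Replicas", []):
    print(replica["RegionName"], replica.get("ReplicaStatus"))
```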

Questions 207

A media company hosts a mobile app backend in the AWS Cloud. The company is releasing a new feature to allow users to upload short videos and apply special effects by using the mobile app. The company uses AWS Amplify to store the videos that customers upload in an Amazon S3 bucket.

The videos must be processed immediately. Users must receive a notification when processing is finished.

Which solution will meet these requirements?

Options:
A.

Use Amazon EventBridge Scheduler to schedule an AWS Lambda function to process the videos. Save the processed videos to the S3 bucket. Use Amazon Simple Notification Service (Amazon SNS) to send push notifications to customers when processing is finished.

B.

Use Amazon EventBridge Scheduler to schedule AWS Fargate to process the videos. Save the processed videos to the S3 bucket. Use Amazon Simple Notification Service (Amazon SNS) to send push notifications to customers when processing is finished.

C.

Use an S3 trigger to invoke an AWS Lambda function to process the videos. Save the processed videos to the S3 bucket. Use Amazon Simple Notification Service (Amazon SNS) to send push notifications to customers when processing is finished.

D.

Use an S3 trigger to invoke an AWS Lambda function to process the videos. Save the processed videos to the S3 bucket. Use AWS Amplify to send push notifications to customers when processing is finished.
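
To ground the S3-trigger pattern that options C and D describe, the following hedged boto3 sketch wires an S3 ObjectCreated notification to a processing Lambda function and shows the function publishing a completion message to an SNS topic. All names and ARNs are hypothetical.

```python
import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

BUCKET = "amplify-user-videos"                                                 # hypothetical
FUNCTION_ARN = "arn:aws:lambda:us-east-1:111122223333:function:process-video"  # hypothetical

# Allow S3 to invoke the processing function.
lambda_client.add_permission(
    FunctionName="process-video",
    StatementId="allow-s3-invoke",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn=f"arn:aws:s3:::{BUCKET}",
)

# Invoke the function for every new upload under the uploads/ prefix.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [{
            "LambdaFunctionArn": FUNCTION_ARN,
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {"Key": {"FilterRules": [{"Name": "prefix", "Value": "uploads/"}]}},
        }]
    },
)

# Inside the Lambda handler, a finished job could notify users via SNS.
def handler(event, context):
    sns = boto3.client("sns")
    for record in event["Records"]:
        key = record["s3"]["object"]["key"]
        # ... process the video and write the result back to the bucket ...
        sns.publish(
            TopicArn="arn:aws:sns:us-east-1:111122223333:video-processing-done",  # hypothetical
            Message=f"Processing finished for {key}",
        )
```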

Questions 208

A solutions architect is designing an AWS Identity and Access Management (IAM) authorization model for a company's AWS account. The company has designated five specific employees to have full access to AWS services and resources in the AWS account.

The solutions architect has created an IAM user for each of the five designated employees and has created an IAM user group.

Which solution will meet these requirements?

Options:
A.

Attach the AdministratorAccess resource-based policy to the IAM user group. Place each of the five designated employee IAM users in the IAM user group.

B.

Attach the SystemAdministrator identity-based policy to the IAM user group. Place each of the five designated employee IAM users in the IAM user group.

C.

Attach the AdministratorAccess identity-based policy to the IAM user group. Place each of the five designated employee IAM users in the IAM user group.

D.

Attach the SystemAdministrator resource-based policy to the IAM user group. Place each of the five designated employee IAM users in the IAM user group.
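
For reference, attaching the AWS managed AdministratorAccess policy to a group and adding users to that group (the mechanics behind option C) looks roughly like this in boto3; the group and user names are hypothetical.

```python
import boto3

iam = boto3.client("iam")

GROUP = "account-admins"  # hypothetical group name

iam.create_group(GroupName=GROUP)

# AdministratorAccess is an AWS managed, identity-based policy.
iam.attach_group_policy(
    GroupName=GROUP,
    PolicyArn="arn:aws:iam::aws:policy/AdministratorAccess",
)

# Place each designated employee's IAM user in the group.
for user in ["admin-user-1", "admin-user-2", "admin-user-3", "admin-user-4", "admin-user-5"]:
    iam.add_user_to_group(GroupName=GROUP, UserName=user)
```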

Questions 209

A company is running its production and nonproduction environment workloads in multiple AWS accounts. The accounts are in an organization in AWS Organizations. The company needs to design a solution that will prevent the modification of cost usage tags.

Which solution will meet these requirements?

Options:
A.

Create a custom AWS Config rule to prevent tag modification except by authorized principals.

B.

Create a custom trail in AWS CloudTrail to prevent tag modification.

C.

Create a service control policy (SCP) to prevent tag modification except by authorized principals.

D.

Create custom Amazon CloudWatch logs to prevent tag modification.
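
As background for option C, a service control policy that denies tag modification except for an authorized principal is commonly written as a Deny statement with an aws:PrincipalArn condition. The sketch below is one possible shape, not an official template; the role ARN, actions, and OU target are hypothetical.

```python
import json
import boto3

org = boto3.client("organizations")

# Deny tag changes on EC2 resources unless the call comes from an authorized role.
scp_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyTagModification",
        "Effect": "Deny",
        "Action": ["ec2:CreateTags", "ec2:DeleteTags"],
        "Resource": "*",
        "Condition": {
            "StringNotLike": {
                "aws:PrincipalArn": "arn:aws:iam::*:role/TagAdminRole"  # hypothetical role
            }
        },
    }],
}

policy_id = org.create_policy(
    Name="protect-cost-tags",
    Description="Prevent unauthorized tag modification",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(scp_document),
)["Policy"]["PolicySummary"]["Id"]

# Attach the SCP to an organizational unit (hypothetical OU ID).
org.attach_policy(PolicyId=policy_id, TargetId="ou-examp-12345678")
```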

Questions 210

A company has a web application that uses several web servers that run on Amazon EC2 instances. The instances use a shared Amazon RDS for MySQL database.

The company requires a secure method to store database credentials. The credentials must be automatically rotated every 30 days without affecting application availability.

Which solution will meet these requirements?

Options:
A.

Store database credentials in AWS Secrets Manager. Create an AWS Lambda function to automatically rotate the credentials. Use Amazon EventBridge to run the Lambda function on a schedule. Grant the necessary IAM permissions to allow the web servers to access Secrets Manager.

B.

Store database credentials in AWS Systems Manager OpsCenter. Grant the necessary IAM permissions to allow the web servers to access OpsCenter.

C.

Store database credentials in an Amazon S3 bucket. Create an AWS Lambda function to automatically rotate the credentials. Use Amazon EventBridge to run the Lambda function on a schedule. Grant the necessary IAM permissions to allow the web servers to retrieve credentials from the S3 bucket.

D.

Store the credentials in a local file on each of the web servers. Use an AWS KMS key to encrypt the credentials. Create a cron job on each server to rotate the credentials every 30 days.
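
To illustrate the Secrets Manager approach in option A, the service can store the database credentials and schedule rotation through a rotation Lambda function. A hedged boto3 sketch; the secret name, credential values, and rotation function ARN are hypothetical, and Secrets Manager can also schedule the 30-day rotation itself rather than relying on a separate EventBridge rule.

```python
import json
import boto3

secrets = boto3.client("secretsmanager")

# Store the database credentials as a secret (values are placeholders).
secret_arn = secrets.create_secret(
    Name="prod/ordering/mysql",
    SecretString=json.dumps({
        "username": "app_user",
        "password": "example-password",
        "host": "ordering-db.example.us-east-1.rds.amazonaws.com",
        "port": 3306,
    }),
)["ARN"]

# Rotate the secret every 30 days using a rotation Lambda function.
secrets.rotate_secret(
    SecretId=secret_arn,
    RotationLambdaARN="arn:aws:lambda:us-east-1:111122223333:function:mysql-rotation",  # hypothetical
    RotationRules={"AutomaticallyAfterDays": 30},
)

# The web servers read the current credentials at runtime instead of storing them locally.
current = json.loads(secrets.get_secret_value(SecretId=secret_arn)["SecretString"])
```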

Questions 211

A company wants to run its payment application on AWS. The application receives payment notifications from mobile devices. Payment notifications require a basic validation before they are sent for further processing.

The backend processing application is long running and requires compute and memory to be adjusted. The company does not want to manage the infrastructure.

Which solution will meet these requirements with the LEAST operational overhead?

Options:
A.

Create an Amazon Simple Queue Service (Amazon SQS) queue. Integrate the queue with an Amazon EventBridge rule to receive payment notifications from mobile devices. Configure the rule to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon Elastic Kubernetes Service (Amazon EKS) Anywhere. Create a standalone cluster.

B.

Create an Amazon API Gateway API. Integrate the API with an AWS Step Functions state machine to receive payment notifications from mobile devices. Invoke the state machine to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon Elastic Kubernetes Service (Amazon EKS). Configure an EKS cluster with self-managed nodes.

C.

Create an Amazon Simple Queue Service (Amazon SQS) queue. Integrate the queue with an Amazon EventBridge rule to receive payment notifications from mobile devices. Configure the rule to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon EC2 Spot Instances. Configure a Spot Fleet with a default allocation strategy.

D.

Create an Amazon API Gateway API. Integrate the API with AWS Lambda to receive payment notifications from mobile devices. Invoke a Lambda function to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon Elastic Container Service (Amazon ECS). Configure Amazon ECS with an AWS Fargate launch type.

Questions 212

The DNS provider that hosts a company's domain name records is experiencing outages that cause service disruption for a website running on AWS. The company needs to migrate to a more resilient managed DNS service and wants the service to run on AWS.

What should a solutions architect do to rapidly migrate the DNS hosting service?

Options:
A.

Create an Amazon Route 53 public hosted zone for the domain name. Import the zone file containing the domain records hosted by the previous provider.

B.

Create an Amazon Route 53 private hosted zone for the domain name. Import the zone file containing the domain records hosted by the previous provider.

C.

Create a Simple AD directory in AWS. Enable zone transfer between the DNS provider and AWS Directory Service for Microsoft Active Directory for the domain records.

D.

Create an Amazon Route 53 Resolver inbound endpoint in the VPC. Specify the IP addresses that the provider's DNS will forward DNS queries to. Configure the provider's DNS to forward DNS queries for the domain to the IP addresses that are specified in the inbound endpoint.

Questions 213

A company has Amazon EC2 instances that run nightly batch jobs to process data. The EC2 instances run in an Auto Scaling group that uses On-Demand billing. If a job fails on one instance, another instance will reprocess the job. The batch jobs run between 12:00 AM and 6:00 AM local time every day.

Which solution will provide EC2 instances to meet these requirements MOST cost-effectively?

Options:
A.

Purchase a 1-year Savings Plan for Amazon EC2 that covers the instance family of the Auto Scaling group that the batch job uses.

B.

Purchase a 1-year Reserved Instance for the specific instance type and operating system of the instances in the Auto Scaling group that the batch job uses.

C.

Create a new launch template for the Auto Scaling group. Set the instances to Spot Instances. Set a policy to scale out based on CPU usage.

D.

Create a new launch template for the Auto Scaling group. Increase the instance size. Set a policy to scale out based on CPU usage.

Questions 214

A company stores critical data in Amazon DynamoDB tables in the company's AWS account. An IT administrator accidentally deleted a DynamoDB table. The deletion caused a significant loss of data and disrupted the company's operations. The company wants to prevent this type of disruption in the future.

Which solution will meet this requirement with the LEAST operational overhead?

Options:
A.

Configure a trail in AWS CloudTrail. Create an Amazon EventBridge rule for delete actions. Create an AWS Lambda function to automatically restore deleted DynamoDB tables.

B.

Create a backup and restore plan for the DynamoDB tables. Recover the DynamoDB tables manually.

C.

Configure deletion protection on the DynamoDB tables.

D.

Enable point-in-time recovery on the DynamoDB tables.
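
For reference, both safeguards named in options C and D are single API calls. A minimal boto3 sketch with a hypothetical table name:

```python
import boto3

dynamodb = boto3.client("dynamodb")

TABLE = "critical-data"  # hypothetical table name

# Block accidental DeleteTable calls.
dynamodb.update_table(TableName=TABLE, DeletionProtectionEnabled=True)

# Keep continuous backups so the table can be restored to any point in the last 35 days.
dynamodb.update_continuous_backups(
    TableName=TABLE,
    PointInTimeRecoverySpecification={"PointInTimeRecoveryEnabled": True},
)
```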

Questions 215

A company has an organization in AWS Organizations that has all features enabled. The company requires that all API calls and logins in any existing or new AWS account must be audited. The company needs a managed solution to prevent additional work and to minimize costs. The company also needs to know when any AWS account is not compliant with the AWS Foundational Security Best Practices (FSBP) standard.

Which solution will meet these requirements with the LEAST operational overhead?

Options:
A.

Deploy an AWS Control Tower environment in the Organizations management account. Enable AWS Security Hub and AWS Control Tower Account Factory in the environment.

B.

Deploy an AWS Control Tower environment in a dedicated Organizations member account. Enable AWS Security Hub and AWS Control Tower Account Factory in the environment.

C.

Use AWS Managed Services (AMS) Accelerate to build a multi-account landing zone (MALZ). Submit an RFC to self-service provision Amazon GuardDuty in the MALZ.

D.

Use AWS Managed Services (AMS) Accelerate to build a multi-account landing zone (MALZ). Submit an RFC to self-service provision AWS Security Hub in the MALZ.

Questions 216

A development team is collaborating with another company to create an integrated product. The other company needs to access an Amazon Simple Queue Service (Amazon SQS) queue that is contained in the development team's account. The other company wants to poll the queue without giving up its own account permissions to do so.

How should a solutions architect provide access to the SQS queue?

Options:
A.

Create an instance profile that provides the other company access to the SQS queue.

B.

Create an IAM policy that provides the other company access to the SQS queue.

C.

Create an SQS access policy that provides the other company access to the SQS queue.

D.

Create an Amazon Simple Notification Service (Amazon SNS) access policy that provides the other company access to the SQS queue.
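
For context on option C, an SQS access policy (a resource-based policy on the queue) can grant another account permission to poll the queue. The sketch below is illustrative; the queue URL/ARN and the partner account ID are hypothetical.

```python
import json
import boto3

sqs = boto3.client("sqs")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/111122223333/integration-queue"  # hypothetical
QUEUE_ARN = "arn:aws:sqs:us-east-1:111122223333:integration-queue"                # hypothetical
PARTNER_ACCOUNT_ID = "444455556666"                                               # hypothetical

queue_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowPartnerToPoll",
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{PARTNER_ACCOUNT_ID}:root"},
        "Action": ["sqs:ReceiveMessage", "sqs:DeleteMessage", "sqs:GetQueueAttributes"],
        "Resource": QUEUE_ARN,
    }],
}

sqs.set_queue_attributes(
    QueueUrl=QUEUE_URL,
    Attributes={"Policy": json.dumps(queue_policy)},
)
```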

Questions 217

An online video game company must maintain ultra-low latency for its game servers. The game servers run on Amazon EC2 instances. The company needs a solution that can handle millions of UDP internet traffic requests each second.

Which solution will meet these requirements MOST cost-effectively?

Options:
A.

Configure an Application Load Balancer with the required protocol and ports for the internet traffic. Specify the EC2 instances as the targets.

B.

Configure a Gateway Load Balancer for the internet traffic. Specify the EC2 instances as the targets.

C.

Configure a Network Load Balancer with the required protocol and ports for the internet traffic. Specify the EC2 instances as the targets.

D.

Launch an identical set of game servers on EC2 instances in separate AWS Regions. Route internet traffic to both sets of EC2 instances.

Questions 218

A company is designing a new multi-tier web application that consists of the following components:

• Web and application servers that run on Amazon EC2 instances as part of Auto Scaling groups

• An Amazon RDS DB instance for data storage

A solutions architect needs to limit access to the application servers so that only the web servers can access them.

Which solution will meet these requirements?

Options:
A.

Deploy AWS PrivateLink in front of the application servers. Configure the network ACL to allow only the web servers to access the application servers.

B.

Deploy a VPC endpoint in front of the application servers. Configure the security group to allow only the web servers to access the application servers.

C.

Deploy a Network Load Balancer with a target group that contains the application servers' Auto Scaling group. Configure the network ACL to allow only the web servers to access the application servers.

D.

Deploy an Application Load Balancer with a target group that contains the application servers' Auto Scaling group. Configure the security group to allow only the web servers to access the application servers.

Questions 219

A company is building a microservices-based application that will be deployed on Amazon Elastic Kubernetes Service (Amazon EKS). The microservices will interact with each other. The company wants to ensure that the application is observable to identify performance issues in the future.

Which solution will meet these requirements?

Options:
A.

Configure the application to use Amazon ElastiCache to reduce the number of requests that are sent to the microservices.

B.

Configure Amazon CloudWatch Container Insights to collect metrics from the EKS clusters. Configure AWS X-Ray to trace the requests between the microservices.

C.

Configure AWS CloudTrail to review the API calls. Build an Amazon QuickSight dashboard to observe the microservice interactions.

D.

Use AWS Trusted Advisor to understand the performance of the application.

Questions 220

A company is migrating applications from an on-premises Microsoft Active Directory that the company manages to AWS. The company deploys the applications in multiple AWS accounts. The company uses AWS Organizations to manage the accounts centrally.

The company's security team needs a single sign-on solution across all the company's AWS accounts. The company must continue to manage users and groups that are in the on-premises Active Directory.

Which solution will meet these requirements?

Options:
A.

Create an Enterprise Edition Active Directory in AWS Directory Service for Microsoft Active Directory. Configure the Active Directory to be the identity source for AWS IAM Identity Center.

B.

Enable AWS IAM Identity Center. Configure a two-way forest trust relationship to connect the company's self-managed Active Directory with IAM Identity Center by using AWS Directory Service for Microsoft Active Directory.

C.

Use AWS Directory Service and create a two-way trust relationship with the company's self-managed Active Directory.

D.

Deploy an identity provider (IdP) on Amazon EC2. Link the IdP as an identity source within AWS IAM Identity Center.

Questions 221

A company runs HPC workloads requiring high IOPS.

Which combination of steps will meet these requirements? (Select TWO.)

Options:
A.

Use Amazon EFS as a high-performance file system.

B.

Use Amazon FSx for Lustre as a high-performance file system.

C.

Create an Auto Scaling group of EC2 instances. Use Reserved Instances. Configure a spread placement group. Use AWS Batch for analytics.

D.

Use Mountpoint for Amazon S3 as a high-performance file system.

E.

Create an Auto Scaling group of EC2 instances. Use mixed instance types and a cluster placement group. Use Amazon EMR for analytics.

Questions 222

How can a company detect personally identifiable information (PII) in Amazon S3 buckets and notify its security teams?

Options:
A.

Use Amazon Macie. Create an EventBridge rule for SensitiveData findings and send an SNS notification.

B.

Use Amazon GuardDuty. Create an EventBridge rule for CRITICAL findings and send an SNS notification.

C.

Use Amazon Macie. Create an EventBridge rule for SensitiveData:S3Object/Personal findings and send an SQS notification.

D.

Use Amazon GuardDuty. Create an EventBridge rule for CRITICAL findings and send an SQS notification.
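
For reference, the Macie-plus-EventBridge pattern in options A and C can be wired as shown below. The finding-type filter and the topic ARN are illustrative assumptions, and the SNS topic's access policy must also allow events.amazonaws.com to publish to it.

```python
import json
import boto3

macie = boto3.client("macie2")
events = boto3.client("events")

SNS_TOPIC_ARN = "arn:aws:sns:us-east-1:111122223333:pii-alerts"  # hypothetical

# Turn on Macie for the account (raises an error if Macie is already enabled).
macie.enable_macie()

# Forward sensitive-data findings to the security team's SNS topic.
events.put_rule(
    Name="macie-sensitive-data-findings",
    EventPattern=json.dumps({
        "source": ["aws.macie"],
        "detail-type": ["Macie Finding"],
        "detail": {"type": [{"prefix": "SensitiveData"}]},
    }),
    State="ENABLED",
)
events.put_targets(
    Rule="macie-sensitive-data-findings",
    Targets=[{"Id": "notify-security-team", "Arn": SNS_TOPIC_ARN}],
)
```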

Questions 223

A solutions architect needs to connect a company's corporate network to its VPC to allow on-premises access to its AWS resources. The solution must provide encryption of all traffic between the corporate network and the VPC at the network layer and the session layer. The solution also must provide security controls to prevent unrestricted access between AWS and the on-premises systems.

Which solution meets these requirements?

Options:
A.

Configure AWS Direct Connect to connect to the VPC. Configure the VPC route tables to allow and deny traffic between AWS and on premises as required.

B.

Create an IAM policy to allow access to the AWS Management Console only from a defined set of corporate IP addresses. Restrict user access based on job responsibility by using an IAM policy and roles.

C.

Configure AWS Site-to-Site VPN to connect to the VPC. Configure route table entries to direct traffic from on premises to the VPC. Configure instance security groups and network ACLs to allow only required traffic from on premises.

D.

Configure AWS Transit Gateway to connect to the VPC. Configure route table entries to direct traffic from on premises to the VPC. Configure instance security groups and network ACLs to allow only required traffic from on premises.

Questions 224

A company is building an application on AWS that connects to an Amazon RDS database. The company wants to manage the application configuration and to securely store and retrieve credentials for the database and other services.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:
A.

Use AWS AppConfig to store and manage the application configuration. Use AWS Secrets Manager to store and retrieve the credentials.

B.

Use AWS Lambda to store and manage the application configuration. Use AWS Systems Manager Parameter Store to store and retrieve the credentials.

C.

Use an encrypted application configuration file. Store the file in Amazon S3 for the application configuration. Create another S3 file to store and retrieve the credentials.

D.

Use AWS AppConfig to store and manage the application configuration. Use Amazon RDS to store and retrieve the credentials.

Questions 225

A company runs a Java-based job on an Amazon EC2 instance. The job runs every hour and takes 10 seconds to run. The job runs on a scheduled interval and consumes 1 GB of memory. The CPU utilization of the instance is low except for short surges during which the job uses the maximum CPU available. The company wants to optimize the costs to run the job.

Which solution will meet these requirements?

Options:
A.

Use AWS App2Container (A2C) to containerize the job. Run the job as an Amazon Elastic Container Service (Amazon ECS) task on AWS Fargate with 0.5 virtual CPU (vCPU) and 1 GB of memory.

B.

Copy the code into an AWS Lambda function that has 1 GB of memory. Create an Amazon EventBridge scheduled rule to run the code each hour.

C.

Use AWS App2Container (A2C) to containerize the job. Install the container in the existing Amazon Machine Image (AMI). Ensure that the schedule stops the container when the task finishes.

D.

Configure the existing schedule to stop the EC2 instance at the completion of the job and restart the EC2 instance when the next job starts.

Questions 226

A company wants to migrate its three-tier application from on premises to AWS. The web tier and the application tier are running on third-party virtual machines (VMs). The database tier is running on MySQL.

The company needs to migrate the application by making the fewest possible changes to the architecture. The company also needs a database solution that can restore data to a specific point in time.

Which solution will meet these requirements with the LEAST operational overhead?

Options:
A.

Migrate the web tier and the application tier to Amazon EC2 instances in private subnets. Migrate the database tier to Amazon RDS for MySQL in private subnets.

B.

Migrate the web tier to Amazon EC2 instances in public subnets. Migrate the application tier to EC2 instances in private subnets. Migrate the database tier to Amazon Aurora MySQL in private subnets.

C.

Migrate the web tier to Amazon EC2 instances in public subnets. Migrate the application tier to EC2 instances in private subnets. Migrate the database tier to Amazon RDS for MySQL in private subnets.

D.

Migrate the web tier and the application tier to Amazon EC2 instances in public subnets. Migrate the database tier to Amazon Aurora MySQL in public subnets.

Questions 227

A company website hosted on Amazon EC2 instances processes classified data. The application writes data to Amazon Elastic Block Store (Amazon EBS) volumes. The company needs to ensure that all data that is written to the EBS volumes is encrypted at rest.

Which solution will meet this requirement?

Options:
A.

Create an IAM role that specifies EBS encryption. Attach the role to the EC2 instances.

B.

Create the EBS volumes as encrypted volumes. Attach the EBS volumes to the EC2 instances.

C.

Create an EC2 instance tag that has a key of Encrypt and a value of True. Tag all instances that require encryption at the EBS level.

D.

Create an AWS Key Management Service (AWS KMS) key policy that enforces EBS encryption in the account. Ensure that the key policy is active.
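
To ground option B, EBS volumes can be created as encrypted volumes, and account-level encryption by default removes the need to remember the flag. A minimal boto3 sketch; the Availability Zone, instance ID, and device name are hypothetical.

```python
import boto3

ec2 = boto3.client("ec2")

# Optionally force every new EBS volume in this Region to be encrypted.
ec2.enable_ebs_encryption_by_default()

# Create an encrypted volume and attach it to an instance (IDs are hypothetical).
volume_id = ec2.create_volume(
    AvailabilityZone="us-east-1a",
    Size=100,           # GiB
    VolumeType="gp3",
    Encrypted=True,     # uses the default aws/ebs KMS key unless KmsKeyId is given
)["VolumeId"]

ec2.get_waiter("volume_available").wait(VolumeIds=[volume_id])
ec2.attach_volume(VolumeId=volume_id, InstanceId="i-0123456789abcdef0", Device="/dev/sdf")
```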

Questions 228

A company has a business-critical application that runs on Amazon EC2 instances. The application stores data in an Amazon DynamoDB table. The company must be able to revert the table to any point within the last 24 hours.

Which solution meets these requirements with the LEAST operational overhead?

Options:
A.

Configure point-in-time recovery for the table.

B.

Use AWS Backup for the table.

C.

Use an AWS Lambda function to make an on-demand backup of the table every hour.

D.

Turn on streams on the table to capture a log of all changes to the table in the last 24 hours. Store a copy of the stream in an Amazon S3 bucket.

Questions 229

A company's ecommerce website has unpredictable traffic and uses AWS Lambda functions to directly access a private Amazon RDS for PostgreSQL DB instance. The company wants to maintain predictable database performance and ensure that the Lambda invocations do not overload the database with too many connections.

What should a solutions architect do to meet these requirements?

Options:
A.

Point the client driver at an RDS custom endpoint. Deploy the Lambda functions inside a VPC.

B.

Point the client driver at an RDS proxy endpoint. Deploy the Lambda functions inside a VPC.

C.

Point the client driver at an RDS custom endpoint. Deploy the Lambda functions outside a VPC.

D.

Point the client driver at an RDS proxy endpoint. Deploy the Lambda functions outside a VPC.

Questions 230

A company's developers want a secure way to gain SSH access to the company's Amazon EC2 instances that run the latest version of Amazon Linux. The developers work remotely and in the corporate office.

The company wants to use AWS services as a part of the solution. The EC2 instances are hosted in a VPC private subnet and access the internet through a NAT gateway that is deployed in a public subnet.

What should a solutions architect do to meet these requirements MOST cost-effectively?

Options:
A.

Create a bastion host in the same subnet as the EC2 instances. Grant the ec2:CreateVpnConnection IAM permission to the developers. Install EC2 Instance Connect so that the developers can connect to the EC2 instances.

B.

Create an AWS Site-to-Site VPN connection between the corporate network and the VPC. Instruct the developers to use the Site-to-Site VPN connection to access the EC2 instances when the developers are on the corporate network. Instruct the developers to set up another VPN connection for access when they work remotely.

C.

Create a bastion host in the public subnet of the VPC. Configure the security groups and SSH keys of the bastion host to only allow connections and SSH authentication from the developers' corporate and remote networks. Instruct the developers to connect through the bastion host by using SSH to reach the EC2 instances.

D.

Attach the AmazonSSMManagedInstanceCore IAM policy to an IAM role that is associated with the EC2 instances. Instruct the developers to use AWS Systems Manager Session Manager to access the EC2 instances.

Questions 231

A company has a web application that includes an embedded NoSQL database. The application runs on Amazon EC2 instances behind an Application Load Balancer (ALB). The instances run in an Amazon EC2 Auto Scaling group in a single Availability Zone.

A recent increase in traffic requires the application to be highly available and the database to be eventually consistent.

Which solution will meet these requirements with the LEAST operational overhead?

Options:
A.

Replace the ALB with a Network Load Balancer. Maintain the embedded NoSQL database with its replication service on the EC2 instances.

B.

Replace the ALB with a Network Load Balancer. Migrate the embedded NoSQL database to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS).

C.

Modify the Auto Scaling group to use EC2 instances across three Availability Zones. Maintain the embedded NoSQL database with its replication service on the EC2 instances.

D.

Modify the Auto Scaling group to use EC2 instances across three Availability Zones. Migrate the embedded NoSQL database to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS).

Questions 232

A company is developing a mobile game that streams score updates to a backend processor and then posts results on a leaderboard. A solutions architect needs to design a solution that can handle large traffic spikes, process the mobile game updates in order of receipt, and store the processed updates in a highly available database. The company also wants to minimize the management overhead required to maintain the solution.

What should the solutions architect do to meet these requirements?

Options:
A.

Push score updates to Amazon Kinesis Data Streams. Process the updates in Kinesis Data Streams with AWS Lambda. Store the processed updates in Amazon DynamoDB.

B.

Push score updates to Amazon Kinesis Data Streams. Process the updates with a fleet of Amazon EC2 instances set up for Auto Scaling. Store the processed updates in Amazon Redshift.

C.

Push score updates to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe an AWS Lambda function to the SNS topic to process the updates. Store the processed updates in a SQL database running on Amazon EC2.

D.

Push score updates to an Amazon Simple Queue Service (Amazon SQS) queue. Use a fleet of Amazon EC2 instances with Auto Scaling to process the updates in the SQS queue. Store the processed updates in an Amazon RDS Multi-AZ DB instance.

Questions 233

A company has multiple AWS accounts with applications deployed in the us-west-2 Region. Application logs are stored within Amazon S3 buckets in each account. The company wants to build a centralized log analysis solution that uses a single S3 bucket. Logs must not leave us-west-2, and the company wants to incur minimal operational overhead.

Which solution meets these requirements and is MOST cost-effective?

Options:
A.

Create an S3 Lifecycle policy that copies the objects from one of the application S3 buckets to the centralized S3 bucket.

B.

Use S3 Same-Region Replication to replicate logs from the S3 buckets to another S3 bucket in us-west-2. Use this S3 bucket for log analysis.

C.

Write a script that uses the PutObject API operation every day to copy the entire contents of the buckets to another S3 bucket in us-west-2. Use this S3 bucket for log analysis.

D.

Write AWS Lambda functions in these accounts that are triggered every time logs are delivered to the S3 buckets (s3:ObjectCreated:* event). Copy the logs to another S3 bucket in us-west-2. Use this S3 bucket for log analysis.

Questions 234

A company has an AWS Direct Connect connection from its corporate data center to its VPC in the us-east-1 Region. The company recently acquired a corporation that has several VPCs and a Direct Connect connection between its on-premises data center and the eu-west-2 Region. The CIDR blocks for the VPCs of the company and the corporation do not overlap. The company requires connectivity between two Regions and the data centers. The company needs a solution that is scalable while reducing operational overhead.

What should a solutions architect do to meet these requirements?

Options:
A.

Set up inter-Region VPC peering between the VPC in us-east-1 and the VPCs in eu-west-2.

B.

Create private virtual interfaces from the Direct Connect connection in us-east-1 to the VPCs in eu-west-2.

C.

Establish VPN appliances in a fully meshed VPN network hosted by Amazon EC2. Use AWS VPN CloudHub to send and receive data between the data centers and each VPC.

D.

Connect the existing Direct Connect connection to a Direct Connect gateway. Route traffic from the virtual private gateways of the VPCs in each Region to the Direct Connect gateway.

Questions 235

A company has NFS servers in an on-premises data center that need to periodically back up small amounts of data to Amazon S3. Which solution meets these requirements and is MOST cost-effective?

Options:
A.

Set up AWS Glue to copy the data from the on-premises servers to Amazon S3.

B.

Set up an AWS DataSync agent on the on-premises servers, and sync the data to Amazon S3.

C.

Set up an SFTP sync using AWS Transfer for SFTP to sync data from on premises to Amazon S3.

D.

Set up an AWS Direct Connect connection between the on-premises data center and a VPC, and copy the data to Amazon S3.

Questions 236

A financial services company wants to shut down two data centers and migrate more than 100 TB of data to AWS. The data has an intricate directory structure with millions of small files stored in deep hierarchies of subfolders. Most of the data is unstructured, and the company's file storage consists of SMB-based storage types from multiple vendors. The company does not want to change its applications to access the data after migration.

What should a solutions architect do to meet these requirements with the LEAST operational overhead?

Options:
A.

Use AWS Direct Connect to migrate the data to Amazon S3.

B.

Use AWS DataSync to migrate the data to Amazon FSx for Lustre.

C.

Use AWS DataSync to migrate the data to Amazon FSx for Windows File Server.

D.

Use AWS Direct Connect to migrate the data from on-premises file storage to an AWS Storage Gateway volume gateway.

Questions 237

A company uses an Amazon S3 bucket as its data lake storage platform. The S3 bucket contains a massive amount of data that is accessed randomly by multiple teams and hundreds of applications. The company wants to reduce the S3 storage costs and provide immediate availability for frequently accessed objects.

What is the MOST operationally efficient solution that meets these requirements?

Options:
A.

Create an S3 Lifecycle rule to transition objects to the S3 Intelligent-Tiering storage class.

B.

Store objects in Amazon S3 Glacier. Use S3 Select to provide applications with access to the data.

C.

Use data from S3 storage class analysis to create S3 Lifecycle rules to automatically transition objects to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class.

D.

Transition objects to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Create an AWS Lambda function to transition objects to the S3 Standard storage class when they are accessed by an application.

Questions 238

A company uses a set of Amazon EC2 instances to host a website. The website uses an Amazon S3 bucket to store images and media files.

The company wants to automate website infrastructure creation to deploy the website to multiple AWS Regions. The company also wants to provide the EC2 instances access to the S3 bucket so the instances can store and access data by using AWS Identity and Access Management (IAM).

Which solution will meet these requirements MOST securely?

Options:
A.

Create an AWS CloudFormation template for the web server EC2 instances. Save an IAM access key in the UserData section of the AWS::EC2::Instance entity in the CloudFormation template.

B.

Create a file that contains an IAM secret access key and access key ID. Store the file in a new S3 bucket. Create an AWS CloudFormation template. In the template, create a parameter to specify the location of the S3 object that contains the access key and access key ID.

C.

Create an IAM role and an IAM access policy that allows the web server EC2 instances to access the S3 bucket. Create an AWS CloudFormation template for the web server EC2 instances that contains an IAM instance profile entity that references the IAM role and the IAM access policy.

D.

Create a script that retrieves an IAM secret access key and access key ID from IAM and stores them on the web server EC2 instances. Include the script in the UserData section of the AWS::EC2::Instance entity in an AWS CloudFormation template.

Questions 239

A company runs multiple workloads on virtual machines (VMs) in an on-premises data center. The company is expanding rapidly. The on-premises data center is not able to scale fast enough to meet business needs. The company wants to migrate the workloads to AWS.

The migration is time sensitive. The company wants to use a lift-and-shift strategy for non-critical workloads.

Which combination of steps will meet these requirements? (Select THREE.)

Options:
A.

Use the AWS Schema Conversion Tool (AWS SCT) to collect data about the VMs.

B.

Use AWS Application Migration Service. Install the AWS Replication Agent on the VMs.

C.

Complete the initial replication of the VMs. Launch test instances to perform acceptance tests on the VMs.

D.

Stop all operations on the VMs. Launch a cutover instance.

E.

Use AWS App2Container (A2C) to collect data about the VMs.

F.

Use AWS Database Migration Service (AWS DMS) to migrate the VMs.

Questions 240

A company needs a solution to automate email ingestion. The company needs to automatically parse email messages, look for email attachments, and save any attachments to an Amazon S3 bucket in near real time. Email volume varies significantly from day to day.

Which solution will meet these requirements?

Options:
A.

Set up email receiving in Amazon Simple Email Service (Amazon SES). Create a rule set and a receipt rule. Create an AWS Lambda function that Amazon SES can invoke to process the email bodies and attachments.

B.

Set up email content filtering in Amazon Simple Email Service (Amazon SES). Create a content filtering rule based on sender, recipient, message body, and attachments.

C.

Set up email receiving in Amazon Simple Email Service (Amazon SES). Configure Amazon SES and S3 Event Notifications to process the email bodies and attachments.

D.

Create an AWS Lambda function to process the email bodies and attachments. Use Amazon EventBridge to invoke the Lambda function. Configure an EventBridge rule to listen for incoming emails.
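
For reference, option A's SES email receiving setup amounts to a receipt rule set with a rule whose action invokes a Lambda function. A hedged boto3 sketch; the recipient address and function ARN are hypothetical, and the SES domain identity and the Lambda permission for ses.amazonaws.com must already be in place.

```python
import boto3

ses = boto3.client("ses")

RULE_SET = "inbound-email"
PARSER_FUNCTION_ARN = "arn:aws:lambda:us-east-1:111122223333:function:parse-email"  # hypothetical

ses.create_receipt_rule_set(RuleSetName=RULE_SET)

ses.create_receipt_rule(
    RuleSetName=RULE_SET,
    Rule={
        "Name": "parse-attachments",
        "Enabled": True,
        "Recipients": ["[email protected]"],  # hypothetical recipient address
        "ScanEnabled": True,
        "Actions": [{
            "LambdaAction": {
                "FunctionArn": PARSER_FUNCTION_ARN,
                "InvocationType": "Event",
            }
        }],
    },
)

# Only one receipt rule set can be active at a time.
ses.set_active_receipt_rule_set(RuleSetName=RULE_SET)
```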

Questions 241

A company's application runs on Amazon EC2 instances that are in multiple Availability Zones. The application needs to ingest real-time data from third-party applications.

The company needs a data ingestion solution that places the ingested raw data in an Amazon S3 bucket.

Which solution will meet these requirements?

Options:
A.

Create Amazon Kinesis data streams for data ingestion. Create Amazon Kinesis Data Firehose delivery streams to consume the Kinesis data streams. Specify the S3 bucket as the destination of the delivery streams.

B.

Create database migration tasks in AWS Database Migration Service (AWS DMS). Specify replication instances of the EC2 instances as the source endpoints. Specify the S3 bucket as the target endpoint. Set the migration type to migrate existing data and replicate ongoing changes.

C.

Create and configure AWS DataSync agents on the EC2 instances. Configure DataSync tasks to transfer data from the EC2 instances to the S3 bucket.

D.

Create an AWS Direct Connect connection to the application for data ingestion. Create Amazon Kinesis Data Firehose delivery streams to consume direct PUT operations from the application. Specify the S3 bucket as the destination of the delivery streams.

Questions 242

A company has an on-premises data center that is running out of storage capacity. The company wants to migrate its storage infrastructure to AWS while minimizing bandwidth costs. The solution must allow for immediate retrieval of data at no additional cost.

How can these requirements be met?

Options:
A.

Deploy Amazon S3 Glacier Vault and enable expedited retrieval. Enable provisioned retrieval capacity for the workload.

B.

Deploy AWS Storage Gateway using cached volumes. Use Storage Gateway to store data in Amazon S3 while retaining copies of frequently accessed data subsets locally.

C.

Deploy AWS Storage Gateway using stored volumes to store data locally. Use Storage Gateway to asynchronously back up point-in-time snapshots of the data to Amazon S3.

D.

Deploy AWS Direct Connect to connect with the on-premises data center. Configure AWS Storage Gateway to store data locally. Use Storage Gateway to asynchronously back up point-in-time snapshots of the data to Amazon S3.

Questions 243

A company has a mobile game that reads most of its metadata from an Amazon RDS DB instance. As the game increased in popularity, developers noticed slowdowns related to the game's metadata load times. Performance metrics indicate that simply scaling the database will not help. A solutions architect must explore all options that include capabilities for snapshots, replication, and sub-millisecond response times.

What should the solutions architect recommend to solve these issues?

Options:
A.

Migrate the database to Amazon Aurora with Aurora Replicas.

B.

Migrate the database to Amazon DynamoDB with global tables.

C.

Add an Amazon ElastiCache for Redis layer in front of the database.

D.

Add an Amazon ElastiCache for Memcached layer in front of the database.

Questions 244

A company's marketing data is uploaded from multiple sources to an Amazon S3 bucket. A series of data preparation jobs aggregate the data for reporting. The data preparation jobs need to run at regular intervals in parallel. A few jobs need to run in a specific order later.

The company wants to remove the operational overhead of job error handling, retry logic, and state management.

Which solution will meet these requirements?

Options:
A.

Use an AWS Lambda function to process the data as soon as the data is uploaded to the S3 bucket. Invoke other Lambda functions at regularly scheduled intervals.

B.

Use Amazon Athena to process the data. Use Amazon EventBridge Scheduler to invoke Athena on a regular interval.

C.

Use AWS Glue DataBrew to process the data. Use an AWS Step Functions state machine to run the DataBrew data preparation jobs.

D.

Use AWS Data Pipeline to process the data. Schedule Data Pipeline to process the data once at midnight.

Questions 245

A media company hosts its video processing workload on AWS. The workload uses Amazon EC2 instances in an Auto Scaling group to handle varying levels of demand. The workload stores the original videos and the processed videos in an Amazon S3 bucket.

The company wants to ensure that the video processing workload is scalable. The company wants to prevent failed processing attempts because of resource constraints. The architecture must be able to handle sudden spikes in video uploads without impacting the processing capability.

Which solution will meet these requirements with the LEAST overhead?

Options:
A.

Migrate the workload from Amazon EC2 instances to AWS Lambda functions. Configure an Amazon S3 event notification to invoke the Lambda functions when a new video is uploaded. Configure the Lambda functions to process videos directly and to save processed videos back to the S3 bucket.

B.

Migrate the workload from Amazon EC2 instances to AWS Lambda functions. Use Amazon S3 to invoke an Amazon Simple Notification Service (Amazon SNS) topic when a new video is uploaded. Subscribe the Lambda functions to the SNS topic. Configure the Lambda functions to process the videos asynchronously and to save processed videos back to the S3 bucket.

C.

Configure an Amazon S3 event notification to send a message to an Amazon Simple Queue Service (Amazon SQS) queue when a new video is uploaded. Configure the existing Auto Scaling group to poll the SQS queue, process the videos, and save processed videos back to the S3 bucket.

D.

Configure an Amazon S3 upload trigger to invoke an AWS Step Functions state machine when a new video is uploaded. Configure the state machine to orchestrate the video processing workflow by placing a job message in the Amazon SQS queue. Configure the job message to invoke the EC2 instances to process the videos. Save processed videos back to the S3 bucket.

Questions 246

A company runs its application on Oracle Database Enterprise Edition. The company needs to migrate the application and the database to AWS. The company can use the Bring Your Own License (BYOL) model while migrating to AWS. The application uses third-party database features that require privileged access.

A solutions architect must design a solution for the database migration.

Which solution will meet these requirements MOST cost-effectively?

Options:
A.

Migrate the database to Amazon RDS for Oracle by using native tools. Replace the third-party features with AWS Lambda.

B.

Migrate the database to Amazon RDS Custom for Oracle by using native tools. Customize the new database settings to support the third-party features.

C.

Migrate the database to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS). Customize the new database settings to support the third-party features.

D.

Migrate the database to Amazon RDS for PostgreSQL by using AWS Database Migration Service (AWS DMS). Rewrite the application code to remove the dependency on third-party features.

Questions 247

A company is migrating its databases to Amazon RDS for PostgreSQL. The company is migrating its applications to Amazon EC2 instances. The company wants to optimize costs for long-running workloads.

Which solution will meet this requirement MOST cost-effectively?

Options:
A.

Use On-Demand Instances for the Amazon RDS for PostgreSQL workloads. Purchase a 1-year Compute Savings Plan with the No Upfront option for the EC2 instances.

B.

Purchase Reserved Instances for a 1-year term with the No Upfront option for the Amazon RDS for PostgreSQL workloads. Purchase a 1-year EC2 Instance Savings Plan with the No Upfront option for the EC2 instances.

C.

Purchase Reserved Instances for a 1-year term with the Partial Upfront option for the Amazon RDS for PostgreSQL workloads. Purchase a 1-year EC2 Instance Savings Plan with the Partial Upfront option for the EC2 instances.

D.

Purchase Reserved Instances for a 3-year term with the All Upfront option for the Amazon RDS for PostgreSQL workloads. Purchase a 3-year EC2 Instance Savings Plan with the All Upfront option for the EC2 instances.

Questions 248

A solutions architect needs to implement a solution that can handle up to 5,000 messages per second. The solution must publish messages as events to multiple consumers. The messages are up to 500 KB in size. The message consumers need to have the ability to use multiple programming languages to consume the messages with minimal latency. The solution must retain published messages for more than 3 months. The solution must enforce strict ordering of the messages.

Which solution will meet these requirements?

Options:
A.

Publish messages to an Amazon Kinesis Data Streams data stream. Enable enhanced fan-out. Ensure that consumers ingest the data stream by using dedicated throughput.

B.

Publish messages to an Amazon Simple Notification Service (Amazon SNS) topic. Ensure that consumers use an Amazon Simple Queue Service (Amazon SQS) FIFO queue to subscribe to the topic.

C.

Publish messages to Amazon EventBridge. Allow each consumer to create rules to deliver messages to the consumer's own target.

D.

Publish messages to an Amazon Simple Notification Service (Amazon SNS) topic. Ensure that consumers use Amazon Data Firehose to subscribe to the topic.

Questions 249

A company has an application that uses Docker containers in its local data center. The application runs on a container host that stores persistent data in a volume on the host. The container instances use the stored persistent data.

The company wants to move the application to a fully managed service because the company does not want to manage any servers or storage infrastructure.

Which solution will meet these requirements?

Options:
A.

Use Amazon Elastic Kubernetes Service (Amazon EKS) with self-managed nodes. Create an Amazon Elastic Block Store (Amazon EBS) volume attached to an Amazon EC2 instance. Use the EBS volume as a persistent volume mounted in the containers.

B.

Use Amazon Elastic Container Service (Amazon ECS) with an AWS Fargate launch type. Create an Amazon Elastic File System (Amazon EFS) volume. Add the EFS volume as a persistent storage volume mounted in the containers.

C.

Use Amazon Elastic Container Service (Amazon ECS) with an AWS Fargate launch type. Create an Amazon S3 bucket. Map the S3 bucket as a persistent storage volume mounted in the containers.

D.

Use Amazon Elastic Container Service (Amazon ECS) with an Amazon EC2 launch type. Create an Amazon Elastic File System (Amazon EFS) volume. Add the EFS volume as a persistent storage volume mounted in the containers.

Questions 250

A company wants to use NAT gateways in its AWS environment. The company's Amazon EC2 instances in private subnets must be able to connect to the public internet through the NAT gateways.

Which solution will meet these requirements?

Options:
A.

Create public NAT gateways in the same private subnets as the EC2 instances.

B.

Create private NAT gateways in the same private subnets as the EC2 instances.

C.

Create public NAT gateways in public subnets in the same VPCs as the EC2 instances.

D.

Create private NAT gateways in public subnets in the same VPCs as the EC2 instances.
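
As background for this question, a public NAT gateway lives in a public subnet, and the private subnets route their internet-bound traffic to it. A minimal boto3 sketch; the subnet and route table IDs are hypothetical.

```python
import boto3

ec2 = boto3.client("ec2")

PUBLIC_SUBNET_ID = "subnet-0aaa1111bbbb2222c"     # hypothetical public subnet
PRIVATE_ROUTE_TABLE_ID = "rtb-0ddd3333eeee4444f"  # hypothetical private subnet route table

# A public NAT gateway needs an Elastic IP and must sit in a public subnet.
allocation_id = ec2.allocate_address(Domain="vpc")["AllocationId"]
nat_gateway_id = ec2.create_nat_gateway(
    SubnetId=PUBLIC_SUBNET_ID,
    AllocationId=allocation_id,
    ConnectivityType="public",
)["NatGateway"]["NatGatewayId"]

ec2.get_waiter("nat_gateway_available").wait(NatGatewayIds=[nat_gateway_id])

# Send the private subnets' internet-bound traffic through the NAT gateway.
ec2.create_route(
    RouteTableId=PRIVATE_ROUTE_TABLE_ID,
    DestinationCidrBlock="0.0.0.0/0",
    NatGatewayId=nat_gateway_id,
)
```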