
Free Amazon Web Services MLA-C01 Practice Exam with Questions & Answers | Set: 8

Question 71

A company uses a batching solution to process data analytics each day. The company wants to build an analytics platform to provide near real-time updates. The company wants to use open source technology and does not want to manage or scale the infrastructure.

Which solution will meet these requirements?

Options:
A.

Create Amazon Managed Streaming for Apache Kafka (Amazon MSK) Serverless clusters to process the data.

B.

Create Amazon Managed Streaming for Apache Kafka (Amazon MSK) Provisioned clusters. Configure the clusters based on data volume.

C.

Create data streams in Amazon Kinesis Data Streams. Use AWS Application Auto Scaling to scale the infrastructure.

D.

Create self-hosted Apache Flink applications on Amazon EC2. Run the applications as containers.

Question 72

Case Study

A company is building a web-based AI application by using Amazon SageMaker. The application will provide the following capabilities and features: ML experimentation, training, a central model registry, model deployment, and model monitoring.

The application must ensure secure and isolated use of training data during the ML lifecycle. The training data is stored in Amazon S3.

The company must implement a manual approval-based workflow to ensure that only approved models can be deployed to production endpoints.

Which solution will meet this requirement?

Options:
A.

Use SageMaker Experiments to facilitate the approval process during model registration.

B.

Use SageMaker ML Lineage Tracking on the central model registry. Create tracking entities for the approval process.

C.

Use SageMaker Model Monitor to evaluate the performance of the model and to manage the approval.

D.

Use SageMaker Pipelines. When a model version is registered, use the AWS SDK to change the approval status to "Approved."