
Amazon Web Services Updated SAA-C03 Exam Questions and Answers


Amazon Web Services SAA-C03 Exam Overview:

Exam Name: AWS Certified Solutions Architect - Associate (SAA-C03)
Exam Code: SAA-C03
Vendor: Amazon Web Services
Certification: AWS Certified Associate
Questions: 1093
Question 176
Question 176

A company collects data from sensors. The company needs a cloud-based solution to store and transform the sensor data to make critical decisions. The solution must store the data for up to 2 days. After 2 days, the solution must delete the data. The company needs to use the transformed data in an automated workflow that has manual approval steps.

Which solution will meet these requirements?

Options:

A. Load the data into an Amazon Simple Queue Service (Amazon SQS) queue that has a retention period of 2 days. Use an Amazon EventBridge pipe to retrieve data from the queue, transform the data, and pass the data to an AWS Step Functions workflow.

B. Load the data into AWS DataSync. Delete the DataSync task after 2 days. Invoke an AWS Lambda function to retrieve the data, transform the data, and invoke a second Lambda function that performs the remaining workflow steps.

C. Load the data into an Amazon Simple Notification Service (Amazon SNS) topic. Use an Amazon EventBridge pipe to retrieve the data from the topic, transform the data, and send the data to Amazon EC2 instances to perform the remaining workflow steps.

D. Load the data into an Amazon Simple Notification Service (Amazon SNS) topic. Use an Amazon EventBridge pipe to retrieve the data from the topic and transform the data into an appropriate format for an Amazon SQS queue. Use an AWS Lambda function to poll the queue to perform the remaining workflow steps.
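To make option A's moving parts concrete, here is a minimal boto3 sketch of that pattern. It is an illustration under stated assumptions, not a verified answer key: the queue name, IAM role ARN, and state machine ARN are hypothetical placeholders, and the pipe's transformation step and the workflow's manual-approval states are omitted.

```python
import boto3

# Sketch of option A: an SQS queue with a 2-day retention period, wired to
# a Step Functions workflow through an EventBridge pipe. All names and
# ARNs below are hypothetical placeholders.
sqs = boto3.client("sqs")
pipes = boto3.client("pipes")

# MessageRetentionPeriod is in seconds; 172800 s = 2 days. SQS deletes
# messages automatically once retention elapses, so no cleanup code is
# needed to satisfy the 2-day deletion requirement.
queue = sqs.create_queue(
    QueueName="sensor-data-queue",
    Attributes={"MessageRetentionPeriod": "172800"},
)
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue["QueueUrl"], AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# The pipe polls the queue and starts the Step Functions state machine,
# which can include manual-approval (task token callback) steps.
pipes.create_pipe(
    Name="sensor-data-pipe",
    RoleArn="arn:aws:iam::123456789012:role/pipe-role",  # placeholder
    Source=queue_arn,
    Target="arn:aws:states:us-east-1:123456789012:stateMachine:sensor-flow",  # placeholder
)
```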

Question 177

A company uses an AWS Transfer for SFTP public server endpoint and Amazon S3 storage to host large datasets for its customers. The company provides customers with SSH private keys to authenticate and download their datasets. The Transfer for SFTP server is configured with structured logging that is saved to an S3 bucket. The company wants to charge customers based on their monthly data download usage.

Which solution will meet these requirements?

Options:

A. Configure VPC Flow Logs to write to a new S3 bucket. Run monthly queries on the flow logs to identify customer usage and calculate costs. Add the charges to the customers' monthly bills.

B. Each month, use AWS Cost Explorer to examine the costs for Transfer for SFTP and obtain a breakdown by customer. Add the charges to the customers' monthly bills.

C. Enable requester pays on the S3 bucket that hosts the datasets. Allocate the charges to each customer based on the customer's requests.

D. Run Amazon Athena queries on the logging S3 bucket monthly to identify customer usage and calculate costs. Add the charges to the customers' monthly bills.
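For option D, a monthly billing job could issue an Athena query against the structured-log bucket. The sketch below is a rough illustration only: the Glue database, table, and column names are assumptions, since the real schema depends on how the Transfer Family logs were cataloged (for example, by a Glue crawler).

```python
import boto3

# Sketch of option D: a monthly Athena query over the Transfer Family
# structured logs to total each customer's downloaded bytes. Database,
# table, column names, and the results bucket are hypothetical.
athena = boto3.client("athena")

query = """
SELECT user, SUM(bytes_out) AS bytes_downloaded
FROM sftp_structured_logs          -- hypothetical table
WHERE month = '2024-08'
GROUP BY user
"""

athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "transfer_logs"},  # placeholder
    ResultConfiguration={"OutputLocation": "s3://query-results-bucket/"},  # placeholder
)
```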

Question 178

A company is developing a highly available natural language processing (NLP) application. The application handles large volumes of concurrent requests. The application performs NLP tasks such as entity recognition, sentiment analysis, and key phrase extraction on text data.

The company needs to store the data that the application processes in a highly available and scalable database.

Which solution will meet these requirements?

Options:

A. Create an Amazon API Gateway REST API endpoint to handle incoming requests. Configure the REST API to invoke an AWS Lambda function for each request. Configure the Lambda function to call Amazon Comprehend to perform NLP tasks on the text data. Store the processed data in Amazon DynamoDB.

B. Create an Amazon API Gateway HTTP API endpoint to handle incoming requests. Configure the HTTP API to invoke an AWS Lambda function for each request. Configure the Lambda function to call Amazon Translate to perform NLP tasks on the text data. Store the processed data in Amazon ElastiCache.

C. Create an Amazon SQS queue to buffer incoming requests. Deploy the NLP application on Amazon EC2 instances in an Auto Scaling group. Use Amazon Comprehend to perform NLP tasks. Store the processed data in an Amazon RDS database.

D. Create an Amazon API Gateway WebSocket API endpoint to handle incoming requests. Configure the WebSocket API to invoke an AWS Lambda function for each request. Configure the Lambda function to call Amazon Textract to perform NLP tasks on the text data. Store the processed data in Amazon ElastiCache.
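Option A's Lambda function is straightforward to sketch with boto3, since Amazon Comprehend exposes one API per NLP task named in the question. Everything below is illustrative: the DynamoDB table name is hypothetical, and the event parsing assumes an API Gateway proxy integration.

```python
import json
import boto3

comprehend = boto3.client("comprehend")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("nlp-results")  # hypothetical table name


def handler(event, context):
    # API Gateway proxy integration delivers the request body as a string.
    text = json.loads(event["body"])["text"]

    # One Comprehend call per NLP task listed in the question.
    entities = comprehend.detect_entities(Text=text, LanguageCode="en")
    sentiment = comprehend.detect_sentiment(Text=text, LanguageCode="en")
    phrases = comprehend.detect_key_phrases(Text=text, LanguageCode="en")

    # Persist the processed results in DynamoDB, keyed by the request ID.
    item = {
        "id": context.aws_request_id,
        "entities": [e["Text"] for e in entities["Entities"]],
        "sentiment": sentiment["Sentiment"],
        "key_phrases": [p["Text"] for p in phrases["KeyPhrases"]],
    }
    table.put_item(Item=item)
    return {"statusCode": 200, "body": json.dumps(item)}
```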

Question 179

A company's reporting system delivers hundreds of .csv files to an Amazon S3 bucket each day. The company must convert these files to Apache Parquet format and must store the files in a transformed data bucket.

Which solution will meet these requirements with the LEAST development effort?

Options:

A. Create an Amazon EMR cluster with Apache Spark installed. Write a Spark application to transform the data. Use EMR File System (EMRFS) to write files to the transformed data bucket.

B. Create an AWS Glue crawler to discover the data. Create an AWS Glue extract, transform, and load (ETL) job to transform the data. Specify the transformed data bucket in the output step.

C. Use AWS Batch to create a job definition with Bash syntax to transform the data and output the data to the transformed data bucket. Use the job definition to submit a job. Specify an array job as the job type.

D. Create an AWS Lambda function to transform the data and output the data to the transformed data bucket. Configure an event notification for the S3 bucket. Specify the Lambda function as the destination for the event notification.
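The heart of option B is a Glue ETL job, and Glue-generated scripts follow a small, stable pattern: read a DynamicFrame from the catalog, write it back out in another format. Below is a minimal hand-written version of that pattern; the catalog database, table, and output bucket names are hypothetical placeholders.

```python
# Sketch of option B's Glue ETL script: read the CSV table the crawler
# registered in the Glue Data Catalog, write it out as Parquet.
# Database, table, and bucket names are hypothetical.
from pyspark.context import SparkContext
from awsglue.context import GlueContext

glue_context = GlueContext(SparkContext.getOrCreate())

# Read the crawled CSV data from the Glue Data Catalog.
frame = glue_context.create_dynamic_frame.from_catalog(
    database="reporting_db",       # placeholder
    table_name="daily_csv_files",  # placeholder
)

# Write the same records to the transformed data bucket in Parquet format.
glue_context.write_dynamic_frame.from_options(
    frame=frame,
    connection_type="s3",
    connection_options={"path": "s3://transformed-data-bucket/"},  # placeholder
    format="parquet",
)
```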
