
Amazon Web Services Updated SAP-C02 Exam Questions and Answers by shiloh

Page: 43 / 48

Amazon Web Services SAP-C02 Exam Overview :

Exam Name: AWS Certified Solutions Architect - Professional
Exam Code: SAP-C02
Vendor: Amazon Web Services Certification: AWS Certified Professional
Questions: 645 Q&As Shared By: shiloh
Question 172

A company is developing a gene reporting device that will collect genomic information to assist researchers with collecting large samples of data from a diverse population. The device will push 8 KB of genomic data every second to a data platform that will need to process and analyze the data and provide information back to researchers. The data platform must meet the following requirements:

•Provide near-real-time analytics of the inbound genomic data

•Ensure the data is flexible, parallel, and durable

•Deliver results of processing to a data warehouse

Which strategy should a solutions architect use to meet these requirements?

Options:

A.

Use Amazon Kinesis Data Firehose to collect the inbound sensor data, analyze the data with Kinesis clients, and save the results to an Amazon RDS instance.

B.

Use Amazon Kinesis Data Streams to collect the inbound sensor data, analyze the data with Kinesis clients, and save the results to an Amazon Redshift cluster using Amazon EMR.

C.

Use Amazon S3 to collect the inbound device data, analyze the data from Amazon SQS with Kinesis, and save the results to an Amazon Redshift cluster.

D.

Use an Amazon API Gateway to put requests into an Amazon SQS queue, analyze the data with an AWS Lambda function, and save the results to an Amazon Redshift cluster using Amazon EMR.
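For context on option B: Kinesis Data Streams capacity is provisioned in shards, and each shard accepts up to 1 MB/s or 1,000 records/s of writes. A minimal sketch of sizing the stream for devices that each push one 8 KB record per second (the device count below is a hypothetical example, not from the question):

```python
import math

SHARD_WRITE_BYTES_PER_SEC = 1_000_000   # 1 MB/s write limit per shard
SHARD_WRITE_RECORDS_PER_SEC = 1_000     # 1,000 records/s write limit per shard

def shards_needed(device_count: int, record_bytes: int = 8 * 1024,
                  records_per_device_per_sec: int = 1) -> int:
    """Estimate how many shards are required to absorb the inbound load."""
    total_records = device_count * records_per_device_per_sec
    total_bytes = total_records * record_bytes
    by_throughput = math.ceil(total_bytes / SHARD_WRITE_BYTES_PER_SEC)
    by_records = math.ceil(total_records / SHARD_WRITE_RECORDS_PER_SEC)
    return max(1, by_throughput, by_records)

# Hypothetical fleet: 10,000 devices x 8 KB/s is ~82 MB/s, so the
# stream is throughput-bound rather than record-count-bound.
print(shards_needed(10_000))
```

Because write capacity scales by adding shards and records are replicated across three Availability Zones, a stream can satisfy the parallelism and durability requirements in the question.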

Question 173

A company hosts a multi-tier data processing application that consists of a static web application frontend and APIs that are hosted on multiple Amazon EC2 instances. The application stores search data on a single-node Amazon OpenSearch Service cluster that runs on an EC2 instance. The application stores additional data in a PostgreSQL database that runs on another EC2 instance. An NGINX server that is hosted on an EC2 instance serves the web application.

The company has experienced some support issues with the application and wants to modernize it.

Which solution meets these requirements with the LEAST operational overhead?

Options:

A.

Create an Amazon ECS cluster that runs on AWS Fargate. Configure the ECS cluster to pull images from the Amazon ECR public repositories for OpenSearch Service, PostgreSQL, and NGINX and from a private repository for the APIs.

B.

Host the web application on Amazon CloudFront by using an Amazon S3 origin. Use OpenSearch Service to store the search data and migrate the PostgreSQL database to an Amazon Aurora PostgreSQL cluster. Run the APIs on AWS App Runner.

C.

Create an Amazon EKS cluster that has a managed node group. Configure the EKS cluster to pull images from the Amazon ECR public repositories for OpenSearch Service, PostgreSQL, and NGINX and from a private repository for the APIs.

D.

Configure AWS App Runner to pull images from the Amazon ECR public repositories for OpenSearch Service, PostgreSQL, and NGINX and from a private repository for the APIs. Deploy the images to App Runner.

Question 174

A company established a data-sharing agreement with a supplier. A solutions architect must establish bidirectional access to Amazon S3 buckets for the company’s organization in AWS Organizations and the supplier’s organization. The company’s S3 buckets are in the us-east-1 Region. The supplier’s buckets are in the us-west-1 Region. The company must encrypt data at rest and collect logs of all S3 bucket access.

Which solution will meet these requirements?

Options:

A.

Create S3 Access Grants that have specific permissions in the source accounts. Use AWS Resource Access Manager (AWS RAM) to share the access grants with both organizations. Use server-side encryption with AWS KMS keys (SSE-KMS) to encrypt the data. Grant the target accounts access to the KMS keys. Configure AWS CloudTrail to log S3 data events.

B.

Set up VPC peering connections between the VPCs of the two organizations. Use AWS PrivateLink to implement S3 interface endpoints. Configure IAM policies in each organization to control endpoint access. Use Amazon S3 server-side encryption with Amazon S3 managed keys (SSE-S3) to encrypt the data. Enable VPC Flow Logs.

C.

Create S3 Access Points for both organizations, and use AWS Resource Access Manager (AWS RAM) to share the access points. Attach IAM policies that grant cross-organization access to the access points. Use customer managed AWS KMS keys to encrypt the data. Enable AWS CloudTrail in both Regions.

D.

Configure S3 Cross-Region Replication. Create an SCP to allow the s3:GetObject and s3:PutObject actions. Share the SCP between both organizations. Use Amazon S3 server-side encryption with Amazon S3 managed keys (SSE-S3) to encrypt the data. Implement S3 server access logging.
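The SSE-KMS approach in option A hinges on the key policy granting the target account use of the key. A minimal sketch of such a policy, built as a Python dict (the account IDs are hypothetical placeholders):

```python
import json

def kms_cross_account_policy(owner_account: str, target_account: str) -> dict:
    """Build a KMS key policy that keeps key administration with the
    owning account and lets a partner account use the key for the
    S3 SSE-KMS operations (decrypt and data-key generation)."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {   # Key administration stays with the owning account
                "Sid": "EnableOwnerAccount",
                "Effect": "Allow",
                "Principal": {"AWS": f"arn:aws:iam::{owner_account}:root"},
                "Action": "kms:*",
                "Resource": "*",
            },
            {   # Grant the partner account the operations S3 needs
                "Sid": "AllowCrossAccountUse",
                "Effect": "Allow",
                "Principal": {"AWS": f"arn:aws:iam::{target_account}:root"},
                "Action": ["kms:Decrypt", "kms:GenerateDataKey"],
                "Resource": "*",
            },
        ],
    }

# Hypothetical owner and partner account IDs
policy = kms_cross_account_policy("111111111111", "222222222222")
print(json.dumps(policy, indent=2))
```

Pairing this with CloudTrail S3 data events covers the question's logging requirement, since data events record object-level GetObject/PutObject calls.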

Question 175

A solutions architect is designing a solution to process events. The solution must have the ability to scale in and out based on the number of events that the solution receives. If a processing error occurs, the event must move into a separate queue for review.

Which solution will meet these requirements?

Options:

A.

Send event details to an Amazon Simple Notification Service (Amazon SNS) topic. Configure an AWS Lambda function as a subscriber to the SNS topic to process the events. Add an on-failure destination to the function. Set an Amazon Simple Queue Service (Amazon SQS) queue as the target.

B.

Publish events to an Amazon Simple Queue Service (Amazon SQS) queue. Create an Amazon EC2 Auto Scaling group. Configure the Auto Scaling group to scale in and out based on the ApproximateAgeOfOldestMessage metric of the queue. Configure the application to write failed messages to a dead-letter queue.

C.

Write events to an Amazon DynamoDB table. Configure a DynamoDB stream for the table. Configure the stream to invoke an AWS Lambda function. Configure the Lambda function to process the events.

D.

Publish events to an Amazon EventBridge event bus. Create and run an application on an Amazon EC2 instance with an Auto Scaling group that is behind an Application Load Balancer (ALB). Set the ALB as the event bus target. Configure the event bus to retry events. Write messages to a dead-letter queue if the application cannot process the messages.
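The "separate queue for review" requirement in option B maps to an SQS redrive policy: after a message has been received and not deleted maxReceiveCount times, SQS moves it to the dead-letter queue. A minimal sketch of the queue attributes you would pass when creating the source queue (the queue ARN is a hypothetical placeholder):

```python
import json

def redrive_attributes(dlq_arn: str, max_receives: int = 5) -> dict:
    """SQS queue attributes that send messages failing processing
    max_receives times to a dead-letter queue for later review.
    Note: SQS expects the RedrivePolicy as a JSON string, with
    maxReceiveCount encoded as a string."""
    return {
        "RedrivePolicy": json.dumps({
            "deadLetterTargetArn": dlq_arn,
            "maxReceiveCount": str(max_receives),
        })
    }

# Hypothetical dead-letter queue ARN
attrs = redrive_attributes("arn:aws:sqs:us-east-1:111111111111:events-dlq")
print(attrs["RedrivePolicy"])
```

Scaling the consumer fleet on ApproximateAgeOfOldestMessage (or ApproximateNumberOfMessagesVisible) then handles the scale-in/scale-out requirement, while the redrive policy isolates poison messages.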
