
Amazon Web Services SAP-C02 Exam Questions and Answers


Amazon Web Services SAP-C02 Exam Overview:

Exam Name: AWS Certified Solutions Architect - Professional
Exam Code: SAP-C02
Vendor: Amazon Web Services
Certification: AWS Certified Professional
Questions: 614
Question 176

A company has a website that serves many visitors. The company deploys a backend service for the website in a primary AWS Region and a disaster recovery (DR) Region.

A single Amazon CloudFront distribution is deployed for the website. The company creates an Amazon Route 53 record set with health checks and a failover routing policy for the primary Region's backend service. The company configures the Route 53 record set as an origin for the CloudFront distribution. The company configures another record set that points to the backend service's endpoint in the DR Region as a secondary failover record type. The TTL for both record sets is 60 seconds.

Currently, failover takes more than 1 minute. A solutions architect must design a solution that will provide the fastest failover time.

Which solution will achieve this goal?

Options:

A. Deploy an additional CloudFront distribution. Create a new Route 53 failover record set with health checks for both CloudFront distributions.

B. Set the TTL to 1 second for the existing Route 53 record sets that are used for the backend service in each Region.

C. Create new record sets for the backend services by using a latency routing policy. Use the record sets as an origin in the CloudFront distribution.

D. Create a CloudFront origin group that includes two origins, one for each backend service Region. Configure origin failover as a cache behavior for the CloudFront distribution.
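For study purposes, option D corresponds to CloudFront's origin failover feature. Below is a minimal Python sketch of the OriginGroups fragment of a DistributionConfig; the origin IDs primary-backend and dr-backend are hypothetical and would have to match origins already defined in the distribution.

```python
# Sketch of a CloudFront origin group for fast, DNS-independent failover.
# Assumes two origins with IDs "primary-backend" and "dr-backend" (hypothetical
# names) already exist in the distribution's DistributionConfig.
origin_groups = {
    "Quantity": 1,
    "Items": [
        {
            "Id": "backend-failover-group",
            "FailoverCriteria": {
                # CloudFront retries the request against the secondary origin
                # when the primary returns one of these status codes.
                "StatusCodes": {"Quantity": 3, "Items": [500, 502, 503]}
            },
            "Members": {
                "Quantity": 2,
                "Items": [
                    {"OriginId": "primary-backend"},  # primary Region backend
                    {"OriginId": "dr-backend"},       # DR Region backend
                ],
            },
        }
    ],
}
# The cache behavior would then set "TargetOriginId": "backend-failover-group".
```

Because the retry happens inside CloudFront on a per-request basis at the edge, failover is not gated by the 60-second Route 53 TTL described in the question.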

Question 177

A company is using GitHub Actions to run a CI/CD pipeline that accesses resources on AWS. The company has an IAM user that uses a secret key in the pipeline to authenticate to AWS. An existing IAM role with an attached policy grants the required permissions to deploy resources.

The company's security team implements a new requirement that pipelines can no longer use long-lived secret keys. A solutions architect must replace the secret key with a short-lived solution.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A. Create a SAML 2.0 identity provider (IdP) in IAM. Create a new IAM role with the appropriate trust policy that allows the sts:AssumeRole API call. Attach the existing IAM policy to the new IAM role. Update GitHub to use SAML authentication for the pipeline.

B. Create an OpenID Connect (OIDC) identity provider (IdP) in IAM. Create a new IAM role with the appropriate trust policy that allows the sts:AssumeRoleWithWebIdentity API call from the GitHub OIDC IdP. Update GitHub to assume the role for the pipeline.

C. Create an Amazon Cognito identity pool. Configure the authentication provider to use GitHub. Create a new IAM role with the appropriate trust policy that allows the sts:AssumeRoleWithWebIdentity API call from the GitHub authentication provider. Configure the pipeline to use Cognito as its authentication provider.

D. Create a trust anchor in AWS IAM Roles Anywhere that references AWS Private CA. Generate a client certificate to use with IAM Roles Anywhere. Create a new IAM role with the appropriate trust policy that allows the sts:AssumeRole API call. Attach the existing IAM policy to the new IAM role. Configure the pipeline to use the credential helper tool and to reference the client certificate public key to assume the new IAM role.
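To illustrate option B, here is a minimal boto3 sketch of creating such a role. The account ID, repository, and role name are hypothetical placeholders, and it assumes the token.actions.githubusercontent.com OIDC provider has already been registered in IAM.

```python
import json

import boto3

# Hypothetical placeholders: substitute a real account ID and repository.
ACCOUNT_ID = "123456789012"
REPO = "example-org/example-repo"

# Trust policy that lets GitHub Actions assume the role via the OIDC IdP.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Federated": f"arn:aws:iam::{ACCOUNT_ID}:oidc-provider/"
                             "token.actions.githubusercontent.com"
            },
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {
                "StringEquals": {
                    "token.actions.githubusercontent.com:aud": "sts.amazonaws.com"
                },
                "StringLike": {
                    # Restrict which repository (and optionally branch) may
                    # assume the role.
                    "token.actions.githubusercontent.com:sub": f"repo:{REPO}:*"
                },
            },
        }
    ],
}

iam = boto3.client("iam")
iam.create_role(
    RoleName="github-actions-deploy",  # hypothetical role name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
# The existing permissions policy would then be attached with
# iam.attach_role_policy, and the workflow assumes the role at run time,
# receiving short-lived credentials with no stored secret key.
```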

Question 178

A company wants to migrate its on-premises application to AWS. The database for the application stores structured product data and temporary user session data. The company needs to decouple the product data from the user session data. The company also needs to implement replication in another AWS Region for disaster recovery.

Which solution will meet these requirements with the HIGHEST performance?

Options:

A. Create an Amazon RDS DB instance with separate schemas to host the product data and the user session data. Configure a read replica for the DB instance in another Region.

B. Create an Amazon RDS DB instance to host the product data. Configure a read replica for the DB instance in another Region. Create a global datastore in Amazon ElastiCache for Memcached to host the user session data.

C. Create two Amazon DynamoDB global tables. Use one global table to host the product data. Use the other global table to host the user session data. Use DynamoDB Accelerator (DAX) for caching.

D. Create an Amazon RDS DB instance to host the product data. Configure a read replica for the DB instance in another Region. Create an Amazon DynamoDB global table to host the user session data.
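As a sketch of the session-data half of option D, the boto3 calls below create a session table and then add a Region replica, which converts it into a DynamoDB global table (version 2019.11.21). The table name, key attribute, TTL attribute, and Regions are all hypothetical.

```python
import boto3

# Hypothetical names and Regions for illustration.
TABLE = "user-sessions"
PRIMARY, DR = "us-east-1", "us-west-2"

ddb = boto3.client("dynamodb", region_name=PRIMARY)

# Session data fits DynamoDB's key-value model: one item per session ID.
# Global tables require streams with both new and old item images.
ddb.create_table(
    TableName=TABLE,
    AttributeDefinitions=[{"AttributeName": "session_id", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "session_id", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
    StreamSpecification={
        "StreamEnabled": True,
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)
ddb.get_waiter("table_exists").wait(TableName=TABLE)

# Expire stale sessions automatically via an epoch-seconds attribute.
ddb.update_time_to_live(
    TableName=TABLE,
    TimeToLiveSpecification={"Enabled": True, "AttributeName": "expires_at"},
)

# Adding a replica converts the table into a global table for DR.
ddb.update_table(
    TableName=TABLE,
    ReplicaUpdates=[{"Create": {"RegionName": DR}}],
)
```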

Question 179

A company has developed a hybrid solution between its data center and AWS. The company uses Amazon VPC and Amazon EC2 instances that send application logs to Amazon CloudWatch. The EC2 instances read data from multiple relational databases that are hosted on premises.

The company wants to monitor which EC2 instances are connected to the databases in near-real time. The company already has a monitoring solution that uses Splunk on premises. A solutions architect needs to determine how to send the network traffic data to Splunk.

How should the solutions architect meet these requirements?

Options:

A. Enable VPC Flow Logs, and send them to CloudWatch. Create an AWS Lambda function to periodically export the CloudWatch logs to an Amazon S3 bucket by using the predefined export function. Generate ACCESS_KEY and SECRET_KEY AWS credentials. Configure Splunk to pull the logs from the S3 bucket by using those credentials.

B. Create an Amazon Kinesis Data Firehose delivery stream with Splunk as the destination. Configure a preprocessing AWS Lambda function with a Kinesis Data Firehose stream processor that extracts individual log events from records sent by CloudWatch Logs subscription filters. Enable VPC Flow Logs, and send them to CloudWatch. Create a CloudWatch Logs subscription that sends log events to the Kinesis Data Firehose delivery stream.

C. Ask the company to log every request that is made to the databases along with the EC2 instance IP address. Export the CloudWatch logs to an Amazon S3 bucket. Use Amazon Athena to query the logs grouped by database name. Export Athena results to another S3 bucket. Invoke an AWS Lambda function to automatically send any new file that is put in the S3 bucket to Splunk.

D. Send the CloudWatch logs to an Amazon Kinesis data stream with Amazon Kinesis Data Analytics for SQL Applications. Configure a 1-minute sliding window to collect the events. Create a SQL query that uses the anomaly detection template to monitor any networking traffic anomalies in near-real time. Send the result to an Amazon Kinesis Data Firehose delivery stream with Splunk as the destination.
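For reference, the key wiring in option B is a CloudWatch Logs subscription filter that streams log events to a Kinesis Data Firehose delivery stream whose destination is Splunk. A minimal boto3 sketch follows; the log group name, ARNs, and Region are hypothetical, and the delivery stream and the IAM role that allows CloudWatch Logs to write to it are assumed to already exist.

```python
import boto3

logs = boto3.client("logs", region_name="us-east-1")  # hypothetical Region

# Hypothetical ARNs: the Firehose delivery stream (Splunk destination) and an
# IAM role that lets CloudWatch Logs put records onto that stream.
FIREHOSE_ARN = (
    "arn:aws:firehose:us-east-1:123456789012:deliverystream/splunk-flow-logs"
)
ROLE_ARN = "arn:aws:iam::123456789012:role/cwlogs-to-firehose"

# Stream every flow-log event to Firehose; an empty filter pattern matches
# all records. The preprocessing Lambda function attached to the Firehose
# stream then unpacks individual log events from the CloudWatch envelope.
logs.put_subscription_filter(
    logGroupName="/vpc/flow-logs",   # hypothetical VPC Flow Logs log group
    filterName="flow-logs-to-splunk",
    filterPattern="",                # empty pattern = forward everything
    destinationArn=FIREHOSE_ARN,
    roleArn=ROLE_ARN,
)
```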
