

AWS Certified DevOps Engineer - Professional (DOP-C02)

Last Updated: May 2, 2024
Total Questions: 207

To help you prepare for the Amazon Web Services DOP-C02 exam, we are offering free DOP-C02 practice questions. Sign up and provide your details, and you will have access to the entire pool of AWS Certified DevOps Engineer - Professional DOP-C02 test questions. You can also find a range of DOP-C02 resources online, such as video tutorials, blogs, and study guides, practice with realistic exam simulations and get feedback on your progress, and share your progress with friends and family for encouragement and support.

Question 4

A company manages an application that stores logs in Amazon CloudWatch Logs. The company wants to archive the logs to an Amazon S3 bucket. Logs are rarely accessed after 90 days and must be retained for 10 years.

Which combination of steps should a DevOps engineer take to meet these requirements? (Select TWO.)

Options:

A.  

Configure a CloudWatch Logs subscription filter to use AWS Glue to transfer all logs to an S3 bucket.

B.  

Configure a CloudWatch Logs subscription filter to use Amazon Kinesis Data Firehose to stream all logs to an S3 bucket.

C.  

Configure a CloudWatch Logs subscription filter to stream all logs to an S3 bucket.

D.  

Configure the S3 bucket lifecycle policy to transition logs to S3 Glacier after 90 days and to expire logs after 3,650 days.

E.  

Configure the S3 bucket lifecycle policy to transition logs to Reduced Redundancy after 90 days and to expire logs after 3,650 days.
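
For reference, options B and D together describe the commonly used archive pattern: a subscription filter streams log events through Kinesis Data Firehose into S3, and a lifecycle rule handles the 90-day transition and the 10-year expiry. A minimal boto3 sketch of that setup, with all bucket, stream, and role names as hypothetical placeholders:

```python
# Sketch: stream CloudWatch Logs to S3 via Kinesis Data Firehose (option B)
# and age the archived logs out with a lifecycle rule (option D).
# All names and ARNs below are hypothetical placeholders.
import boto3

logs = boto3.client("logs")
s3 = boto3.client("s3")

# Subscription filter that forwards every log event to a Firehose
# delivery stream whose destination is the archive bucket.
logs.put_subscription_filter(
    logGroupName="/app/production",  # hypothetical log group
    filterName="archive-to-s3",
    filterPattern="",                # empty pattern matches all events
    destinationArn="arn:aws:firehose:us-east-1:123456789012:deliverystream/log-archive",
    roleArn="arn:aws:iam::123456789012:role/CWLtoFirehoseRole",
)

# Lifecycle rule: transition to S3 Glacier after 90 days,
# expire after 10 years (3,650 days).
s3.put_bucket_lifecycle_configuration(
    Bucket="log-archive-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-then-expire",
                "Filter": {"Prefix": ""},  # apply to the whole bucket
                "Status": "Enabled",
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 3650},
            }
        ]
    },
)
```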

Question 5

A company is hosting a web application in an AWS Region. For disaster recovery purposes, a second region is being used as a standby. Disaster recovery requirements state that session data must be replicated between regions in near-real time and 1% of requests should route to the secondary region to continuously verify system functionality. Additionally, if there is a disruption in service in the main region, traffic should be automatically routed to the secondary region, and the secondary region must be able to scale up to handle all traffic.

How should a DevOps engineer meet these requirements?

Options:

A.  

In both regions, deploy the application on AWS Elastic Beanstalk and use Amazon DynamoDB global tables for session data. Use an Amazon Route 53 weighted routing policy with health checks to distribute the traffic across the regions.

B.  

In both regions, launch the application in Auto Scaling groups and use DynamoDB for session data. Use a Route 53 failover routing policy with health checks to distribute the traffic across the regions.

C.  

In both regions, deploy the application in AWS Lambda, exposed by Amazon API Gateway, and use Amazon RDS for PostgreSQL with cross-region replication for session data. Deploy the web application with client-side logic to call the API Gateway directly.

D.  

In both regions, launch the application in Auto Scaling groups and use DynamoDB global tables for session data. Enable an Amazon CloudFront weighted distribution across regions. Point the Amazon Route 53 DNS record at the CloudFront distribution.
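
Option A's 1% canary routing maps to a pair of weighted Route 53 records with health checks attached, so traffic fails over automatically when an endpoint becomes unhealthy. A minimal boto3 sketch, with the hosted zone ID, hostnames, and health check IDs as hypothetical placeholders:

```python
# Sketch: weighted DNS records that send ~1% of traffic to the standby
# Region, with health checks so Route 53 stops routing to an unhealthy
# endpoint. All IDs and hostnames below are hypothetical placeholders.
import boto3

route53 = boto3.client("route53")

def weighted_record(set_id: str, weight: int, dns_name: str, health_check_id: str) -> dict:
    """Build one weighted CNAME record for the application hostname."""
    return {
        "Action": "UPSERT",
        "ResourceRecordSet": {
            "Name": "app.example.com",
            "Type": "CNAME",
            "SetIdentifier": set_id,
            "Weight": weight,
            "TTL": 60,
            "HealthCheckId": health_check_id,
            "ResourceRecords": [{"Value": dns_name}],
        },
    }

route53.change_resource_record_sets(
    HostedZoneId="Z0HYPOTHETICAL",
    ChangeBatch={
        "Changes": [
            # A 99:1 weight split routes roughly 1% of requests
            # to the standby Region for continuous verification.
            weighted_record("primary", 99, "primary.us-east-1.elasticbeanstalk.com", "hc-primary"),
            weighted_record("standby", 1, "standby.us-west-2.elasticbeanstalk.com", "hc-standby"),
        ]
    },
)
```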

Question 6

A company's development team uses AWS CloudFormation to deploy its application resources. The team must use CloudFormation for all changes to the environment. The team cannot use the AWS Management Console or the AWS CLI to make manual changes directly.

The team uses a developer IAM role to access the environment. The role is configured with the AdministratorAccess managed policy. The company has created a new CloudFormationDeployment IAM role that has the following policy.

[Policy document shown as an image in the original question]

The company wants to ensure that only CloudFormation can use the new role. The development team cannot make any manual changes to the deployed resources.

Which combination of steps meets these requirements? (Select THREE.)

Options:

A.  

Remove the AdministratorAccess policy. Assign the ReadOnlyAccess managed IAM policy to the developer role. Instruct the developers to use the CloudFormationDeployment role as a CloudFormation service role when the developers deploy new stacks.

B.  

Update the trust policy of the CloudFormationDeployment role to allow the developer IAM role to assume the CloudFormationDeployment role.

C.  

Configure the developer IAM role so that it can get and pass the CloudFormationDeployment role for cloudformation actions on resources.

D.  

Update the trust policy of the CloudFormationDeployment role to allow the cloudformation.amazonaws.com AWS principal to perform the iam:AssumeRole action.

E.  

Remove the AdministratorAccess policy. Assign the ReadOnlyAccess managed IAM policy to the developer role. Instruct the developers to assume the CloudFormationDeployment role when the developers deploy new stacks.

F.  

Add an IAM policy to the CloudFormationDeployment role to allow cloudformation:* on all resources. Add a policy that allows the iam:PassRole action for the ARN of the CloudFormationDeployment role if iam:PassedToService equals cloudformation.amazonaws.com.
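
Several of these options turn on the same two IAM mechanics: a trust policy that restricts who can assume the CloudFormationDeployment role, and an iam:PassRole permission scoped with the iam:PassedToService condition key. A minimal boto3 sketch, with the account ID and all role and policy names as hypothetical placeholders:

```python
# Sketch: lock the CloudFormationDeployment role to the CloudFormation
# service and let developers only pass it to CloudFormation.
# Account ID, role names, and policy names are hypothetical.
import json
import boto3

iam = boto3.client("iam")

# Trust policy: only cloudformation.amazonaws.com may assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "cloudformation.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}
iam.update_assume_role_policy(
    RoleName="CloudFormationDeployment",
    PolicyDocument=json.dumps(trust_policy),
)

# Developer policy: iam:PassRole is allowed only when the role is
# being passed to the CloudFormation service (iam:PassedToService).
pass_role_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "iam:PassRole",
            "Resource": "arn:aws:iam::123456789012:role/CloudFormationDeployment",
            "Condition": {
                "StringEquals": {"iam:PassedToService": "cloudformation.amazonaws.com"}
            },
        }
    ],
}
iam.put_role_policy(
    RoleName="developer",  # hypothetical developer role name
    PolicyName="PassCloudFormationDeployment",
    PolicyDocument=json.dumps(pass_role_policy),
)
```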

Question 7

A video-sharing company stores its videos in Amazon S3. The company has observed a sudden increase in video access requests, but the company does not know which videos are most popular. The company needs to identify the general access pattern for the video files. This pattern includes the number of users who access a certain file on a given day, as well as the number of pull requests for certain files.

How can the company meet these requirements with the LEAST amount of effort?

Options:

A.  

Activate S3 server access logging. Import the access logs into an Amazon Aurora database. Use an Aurora SQL query to analyze the access patterns.

B.  

Activate S3 server access logging. Use Amazon Athena to create an external table with the log files. Use Athena to create a SQL query to analyze the access patterns.

C.  

Invoke an AWS Lambda function for every S3 object access event. Configure the Lambda function to write the file access information, such as user, S3 bucket, and file key, to an Amazon Aurora database. Use an Aurora SQL query to analyze the access patterns.

D.  

Record an Amazon CloudWatch Logs log message for every S3 object access event. Configure a CloudWatch Logs log stream to write the file access information, such as user, S3 bucket, and file key, to an Amazon Kinesis Data Analytics for SQL application. Perform a sliding window analysis.
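
Option B's logging-plus-Athena approach needs only a few calls once the log table exists. A minimal boto3 sketch, assuming hypothetical bucket names and an Athena table (here s3_access_logs_db.access_logs) already created from the documented S3 access-log format:

```python
# Sketch: turn on S3 server access logging and query the logs with Athena.
# Bucket names, database, table, and output location are hypothetical;
# the Athena table must be created beforehand from the S3 access-log schema.
import boto3

s3 = boto3.client("s3")
athena = boto3.client("athena")

# Deliver access logs for the video bucket into a separate log bucket.
s3.put_bucket_logging(
    Bucket="video-bucket",
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": "video-access-logs",
            "TargetPrefix": "logs/",
        }
    },
)

# Count GET requests per object per day to surface the most popular videos.
athena.start_query_execution(
    QueryString="""
        SELECT key,
               date_trunc('day', parse_datetime(requestdatetime,
                   'dd/MMM/yyyy:HH:mm:ss Z')) AS day,
               count(*) AS requests,
               count(DISTINCT remoteip) AS distinct_callers
        FROM s3_access_logs_db.access_logs
        WHERE operation = 'REST.GET.OBJECT'
        GROUP BY 1, 2
        ORDER BY requests DESC
    """,
    QueryExecutionContext={"Database": "s3_access_logs_db"},
    ResultConfiguration={"OutputLocation": "s3://athena-results-bucket/"},
)
```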

