| Exam Name: | AWS Certified Solutions Architect - Professional |
| --- | --- |
| Exam Code: | SAP-C02 |
| Vendor: | Amazon Web Services |
| Certification: | AWS Certified Professional |
| Questions: | 605 Q&A's |
| Shared By: | inaaya |
A company is planning to migrate an Amazon RDS for Oracle database to an RDS for PostgreSQL DB instance in another AWS account. A solutions architect needs to design a migration strategy that will require no downtime and that will minimize the amount of time necessary to complete the migration. The migration strategy must replicate all existing data and any new data that is created during the migration. The target database must be identical to the source database at completion of the migration process.
All applications currently use an Amazon Route 53 CNAME record as their endpoint for communication with the RDS for Oracle DB instance. The RDS for Oracle DB instance is in a private subnet.
Which combination of steps should the solutions architect take to meet these requirements? (Select THREE)
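A scenario like this usually maps to AWS DMS running a full-load-plus-CDC replication task, followed by a Route 53 CNAME cutover. The sketch below is a minimal boto3 illustration of those two steps, assuming the DMS source/target endpoints and replication instance already exist; every ARN, the hosted zone ID, the record name, and the target endpoint hostname are placeholder assumptions, not values from the question.

```python
import boto3

# Placeholder assumptions: replace with real ARNs / IDs for your environment.
REPLICATION_INSTANCE_ARN = "arn:aws:dms:us-east-1:111111111111:rep:EXAMPLE"
SOURCE_ENDPOINT_ARN = "arn:aws:dms:us-east-1:111111111111:endpoint:ORACLE-SRC"
TARGET_ENDPOINT_ARN = "arn:aws:dms:us-east-1:222222222222:endpoint:PG-TGT"
HOSTED_ZONE_ID = "Z0000000000EXAMPLE"
DB_CNAME = "db.example.internal"

dms = boto3.client("dms")
route53 = boto3.client("route53")

# Full load plus change data capture keeps the target in sync while
# applications continue writing to the source, which is what enables a
# near-zero-downtime cutover.
dms.create_replication_task(
    ReplicationTaskIdentifier="oracle-to-postgres-migration",
    SourceEndpointArn=SOURCE_ENDPOINT_ARN,
    TargetEndpointArn=TARGET_ENDPOINT_ARN,
    ReplicationInstanceArn=REPLICATION_INSTANCE_ARN,
    MigrationType="full-load-and-cdc",
    TableMappings='{"rules":[{"rule-type":"selection","rule-id":"1",'
                  '"rule-name":"1","object-locator":{"schema-name":"%",'
                  '"table-name":"%"},"rule-action":"include"}]}',
)

# At cutover, repoint the existing Route 53 CNAME so applications switch to
# the new PostgreSQL endpoint without any configuration change.
route53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": DB_CNAME,
                "Type": "CNAME",
                "TTL": 60,
                "ResourceRecords": [
                    {"Value": "target-db.xxxxxxxx.us-east-1.rds.amazonaws.com"}
                ],
            },
        }]
    },
)
```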
A software company hosts an application on AWS with resources in multiple AWS accounts and Regions. The application runs on a group of Amazon EC2 instances in an application VPC located in the us-east-1 Region with an IPv4 CIDR block of 10.10.0.0/16. In a different AWS account, a shared services VPC is located in the us-east-2 Region with an IPv4 CIDR block of 10.10.10.0/24. When a cloud engineer uses AWS CloudFormation to attempt to peer the application VPC with the shared services VPC, an error message indicates a peering failure.
Which factors could cause this error? (Choose two.)
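Two things are worth checking against the numbers in the scenario: the two CIDR blocks overlap (10.10.10.0/24 sits inside 10.10.0.0/16), which VPC peering does not permit, and a cross-account, cross-Region peering request must name the peer account and Region. A minimal sketch of both checks, with the VPC IDs below as placeholder assumptions:

```python
import ipaddress
import boto3

# Overlap check: 10.10.10.0/24 falls inside 10.10.0.0/16, so a peering
# connection between these two VPCs cannot be established.
app_cidr = ipaddress.ip_network("10.10.0.0/16")
shared_cidr = ipaddress.ip_network("10.10.10.0/24")
print(app_cidr.overlaps(shared_cidr))  # True -> peering request fails

# For a non-overlapping pair of VPCs, a cross-account, cross-Region request
# also needs PeerOwnerId and PeerRegion; the IDs here are placeholders.
ec2 = boto3.client("ec2", region_name="us-east-1")
ec2.create_vpc_peering_connection(
    VpcId="vpc-0appexample",
    PeerVpcId="vpc-0sharedexample",
    PeerOwnerId="222222222222",
    PeerRegion="us-east-2",
)
```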
A company with several AWS accounts is using AWS Organizations and service control policies (SCPs). An Administrator created the following SCP and has attached it to an organizational unit (OU) that contains AWS account 1111-1111-1111:
Developers working in account 1111-1111-1111 complain that they cannot create Amazon S3 buckets. How should the Administrator address this problem?
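The SCP referenced in the question is not reproduced above. Purely as a hypothetical illustration (not the policy from the question), a statement that denies s3:CreateBucket at the OU level would produce exactly this symptom, and the administrator's fix is to edit or replace the attached policy; the sketch below uses the Organizations API with a placeholder policy ID.

```python
import json
import boto3

organizations = boto3.client("organizations")

# Hypothetical example only: an SCP like this attached to the OU would block
# bucket creation in every member account, even if IAM policies allow it.
blocking_scp = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Deny", "Action": "s3:CreateBucket", "Resource": "*"}
    ],
}

# Updating the attached SCP so it no longer denies (or omits) s3:CreateBucket
# restores the developers' ability to create buckets. Policy ID is a
# placeholder assumption.
corrected_scp = {
    "Version": "2012-10-17",
    "Statement": [{"Effect": "Allow", "Action": "*", "Resource": "*"}],
}
organizations.update_policy(
    PolicyId="p-examplepolicyid",
    Content=json.dumps(corrected_scp),
)
```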
Question:
How should a company efficiently process infrequently uploaded S3 data using a long-running (up to 25 minutes) custom application?
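Because the processing can run up to 25 minutes, AWS Lambda's 15-minute limit rules it out, and a common pattern is S3 event notifications delivered to an SQS queue that a long-running container (for example on ECS with Fargate) consumes. A minimal sketch of the consumer side, assuming the queue URL and the event wiring already exist:

```python
import json
import boto3

# Placeholder assumption: an SQS queue that receives S3 event notifications
# for the upload bucket.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/111111111111/s3-uploads"

sqs = boto3.client("sqs")
s3 = boto3.client("s3")


def process_object(bucket: str, key: str) -> None:
    """Placeholder for the custom processing that may run up to 25 minutes."""
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    # ... long-running custom logic here ...


while True:
    # Long polling keeps the consumer inexpensive while uploads are infrequent.
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=1, WaitTimeSeconds=20
    )
    for message in resp.get("Messages", []):
        event = json.loads(message["Body"])
        for record in event.get("Records", []):
            process_object(
                record["s3"]["bucket"]["name"],
                record["s3"]["object"]["key"],
            )
        # Delete only after successful processing so failures are retried.
        sqs.delete_message(
            QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"]
        )
```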