Amazon Web Services Updated DAS-C01 Exam Questions and Answers by dawson

Page: 6 / 14

Amazon Web Services DAS-C01 Exam Overview:

Exam Name: AWS Certified Data Analytics - Specialty
Exam Code: DAS-C01
Vendor: Amazon Web Services
Certification: AWS Certified Data Analytics
Questions: 207 Q&As, shared by dawson
Question 24

A company is designing a data warehouse to support business intelligence reporting. Users will access the executive dashboard heavily each Monday and Friday morning for 1 hour. These read-only queries will run on the active Amazon Redshift cluster, which runs on dc2.8xlarge compute nodes 24 hours a day, 7 days a week. There are three queues set up in workload management: Dashboard, ETL, and System. The Amazon Redshift cluster needs to process the queries without wait time.

What is the MOST cost-effective way to ensure that the cluster processes these queries?

Options:

A. Perform a classic resize to place the cluster in read-only mode while adding an additional node to the cluster.

B. Enable automatic workload management.

C. Perform an elastic resize to add an additional node to the cluster.

D. Enable concurrency scaling for the Dashboard workload queue.
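For context on the concurrency-scaling option: it is enabled per WLM queue through the cluster parameter group's `wlm_json_configuration` parameter. A minimal sketch of that configuration, assuming illustrative queue routing (the queue names come from the question; the query groups and other settings are placeholders):

```python
import json

# WLM configuration enabling concurrency scaling only on the Dashboard queue,
# so burst capacity is billed only during the Monday/Friday morning peaks.
wlm_config = [
    {"name": "Dashboard", "query_group": ["dashboard"],
     "concurrency_scaling": "auto"},   # off-load read-only bursts
    {"name": "ETL", "query_group": ["etl"],
     "concurrency_scaling": "off"},
    {"name": "System", "query_group": ["system"],
     "concurrency_scaling": "off"},
]

wlm_json = json.dumps(wlm_config)
```

The JSON string would then be applied as the `wlm_json_configuration` parameter value of the cluster's parameter group (for example via the Redshift console or the `modify-cluster-parameter-group` CLI command).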

Question 25

A technology company is creating a dashboard that will visualize and analyze time-sensitive data. The data will come in through Amazon Kinesis Data Firehose with the buffer interval set to 60 seconds. The dashboard must support near-real-time data.

Which visualization solution will meet these requirements?

Options:

A. Select Amazon Elasticsearch Service (Amazon ES) as the endpoint for Kinesis Data Firehose. Set up a Kibana dashboard using the data in Amazon ES with the desired analyses and visualizations.

B. Select Amazon S3 as the endpoint for Kinesis Data Firehose. Read data into an Amazon SageMaker Jupyter notebook and carry out the desired analyses and visualizations.

C. Select Amazon Redshift as the endpoint for Kinesis Data Firehose. Connect Amazon QuickSight with SPICE to Amazon Redshift to create the desired analyses and visualizations.

D. Select Amazon S3 as the endpoint for Kinesis Data Firehose. Use AWS Glue to catalog the data and Amazon Athena to query it. Connect Amazon QuickSight with SPICE to Athena to create the desired analyses and visualizations.

Question 26

A company has developed several AWS Glue jobs to validate and transform its data from Amazon S3 and load it into Amazon RDS for MySQL in batches once every day. The ETL jobs read the S3 data using a DynamicFrame. Currently, the ETL developers are experiencing challenges in processing only the incremental data on every run, as the AWS Glue job processes all the S3 input data on each run.

Which approach would allow the developers to solve the issue with minimal coding effort?

Options:

A. Have the ETL jobs read the data from Amazon S3 using a DataFrame.

B. Enable job bookmarks on the AWS Glue jobs.

C. Create custom logic on the ETL jobs to track the processed S3 objects.

D. Have the ETL jobs delete the processed objects or data from Amazon S3 after each run.
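For reference, job bookmarks are switched on through a Glue job's default arguments rather than in code, which is why the option involves minimal coding effort. A sketch of the relevant job settings, assuming a hypothetical job name:

```python
# Enabling job bookmarks via the job's default arguments; the job name is a
# placeholder, not taken from the question.
job_update = {
    "Name": "daily-s3-to-rds-etl",
    "DefaultArguments": {
        "--job-bookmark-option": "job-bookmark-enable",
    },
}

# Inside the job script, each S3 source read must carry a transformation_ctx
# so Glue can record which objects were already processed, e.g.:
#   glueContext.create_dynamic_frame.from_catalog(
#       database="sales", table_name="raw_events",
#       transformation_ctx="raw_events_src")
```

On each subsequent run, Glue consults the bookmark state for `raw_events_src` and reads only S3 objects added since the last successful run, skipping everything already processed.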

Question 27

A central government organization is collecting events from various internal applications using Amazon Managed Streaming for Apache Kafka (Amazon MSK). The organization has configured a separate Kafka topic for each application to separate the data. For security reasons, the Kafka cluster has been configured to only allow TLS-encrypted data, and it encrypts the data at rest.

A recent application update showed that one of the applications was configured incorrectly, resulting in writing data to a Kafka topic that belongs to another application. This resulted in multiple errors in the analytics pipeline as data from different applications appeared on the same topic. After this incident, the organization wants to prevent applications from writing to any topic other than the one they are intended to write to.

Which solution meets these requirements with the least amount of effort?

Options:

A. Create a different Amazon EC2 security group for each application. Configure each security group to have access to a specific topic in the Amazon MSK cluster. Attach the security group to each application based on the topic that the applications should read and write to.

B. Install Kafka Connect on each application instance and configure each Kafka Connect instance to write to a specific topic only.

C. Use Kafka ACLs and configure read and write permissions for each topic. Use the distinguished name of the clients' TLS certificates as the principal of the ACL.

D. Create a different Amazon EC2 security group for each application. Create an Amazon MSK cluster and Kafka topic for each application. Configure each security group to have access to the specific cluster.
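To illustrate the ACL-based option: with TLS client authentication, Kafka uses the certificate's distinguished name as the principal, and a per-topic write permission can be granted with the standard `kafka-acls` tool. A sketch of that invocation built as an argument list, assuming a hypothetical DN, topic, and broker endpoint:

```python
# The DN comes from the application's TLS client certificate; all names and
# hosts below are placeholders, not values from the question.
principal = "User:CN=orders-app,OU=apps,O=gov,C=US"

acl_cmd = [
    "kafka-acls.sh",
    "--bootstrap-server", "b-1.example.msk:9094",
    "--command-config", "admin-tls.properties",  # admin client's TLS settings
    "--add",
    "--allow-principal", principal,
    "--operation", "Write",
    "--topic", "orders-events",  # the only topic this app may write to
]
```

Run once per application-topic pair, this restricts each certificate holder to its own topic; a misconfigured producer attempting to write elsewhere is rejected by the broker with an authorization error instead of polluting another application's topic.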

