
Amazon Web Services DAS-C01 Exam Questions and Answers


Amazon Web Services DAS-C01 Exam Overview:

Exam Name: AWS Certified Data Analytics - Specialty
Exam Code: DAS-C01
Vendor: Amazon Web Services
Certification: AWS Certified Data Analytics
Questions: 207 Q&As
Question 56

A company receives data from its vendor in JSON format with a timestamp in the file name. The vendor uploads the data to an Amazon S3 bucket, and the data is registered into the company’s data lake for analysis and reporting. The company has configured an S3 Lifecycle policy to archive all files to S3 Glacier after 5 days.

The company wants to ensure that its AWS Glue crawler catalogs data only from S3 Standard storage and ignores the archived files. A data analytics specialist must implement a solution to achieve this goal without changing the current S3 bucket configuration.

Which solution meets these requirements?

Options:

A.

Use the exclude patterns feature of AWS Glue to identify the S3 Glacier files for the crawler to exclude.

B.

Schedule an automation job that uses AWS Lambda to move files from the original S3 bucket to a new S3 bucket for S3 Glacier storage.

C.

Use the excludeStorageClasses property in the AWS Glue Data Catalog table to exclude files on S3 Glacier storage.

D.

Use the include patterns feature of AWS Glue to identify the S3 Standard files for the crawler to include.
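
For context, the two Glue mechanisms these options refer to behave differently: crawler exclude patterns are glob patterns matched against S3 object paths (they cannot see an object's storage class), while the excludeStorageClasses option is honored when a Glue ETL job reads a Data Catalog table. A minimal boto3/PySpark sketch of both, with hypothetical bucket, role, database, and table names:

```python
import boto3

glue = boto3.client("glue")

# Exclude-pattern approach (options A/D): patterns are path globs, so they
# skip Glacier objects only if those objects are identifiable by key name.
glue.create_crawler(
    Name="vendor-json-crawler",                             # hypothetical
    Role="arn:aws:iam::123456789012:role/GlueCrawlerRole",  # hypothetical
    DatabaseName="datalake_db",                             # hypothetical
    Targets={
        "S3Targets": [
            {
                "Path": "s3://my-data-lake/vendor/",        # hypothetical
                "Exclusions": ["**/archived/**"],  # matches paths, not storage class
            }
        ]
    },
)

# excludeStorageClasses approach (option C): applies when a Glue ETL job
# reads the catalog table, skipping objects archived to S3 Glacier.
# This part runs inside a Glue job environment, not a plain Python shell.
from awsglue.context import GlueContext
from pyspark.context import SparkContext

glue_context = GlueContext(SparkContext.getOrCreate())
frame = glue_context.create_dynamic_frame.from_catalog(
    database="datalake_db",
    table_name="vendor_json",                               # hypothetical
    additional_options={"excludeStorageClasses": ["GLACIER", "DEEP_ARCHIVE"]},
)
```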

Question 57

An online retail company is migrating its reporting system to AWS. The company’s legacy system runs data processing on online transactions using a complex series of nested Apache Hive queries. Transactional data is exported from the online system to the reporting system several times a day. Schemas in the files are stable between updates.

A data analyst wants to migrate the data processing to AWS quickly, so code changes should be minimized. To keep storage costs low, the data analyst decides to store the data in Amazon S3. It is vital that the reports and associated analytics are completely up to date with the data in Amazon S3.

Which solution meets these requirements?

Options:

A.

Create an AWS Glue Data Catalog to manage the Hive metadata. Create an AWS Glue crawler over Amazon S3 that runs when data is refreshed to ensure that data changes are updated. Create an Amazon EMR cluster and use the metadata in the AWS Glue Data Catalog to run Hive processing queries in Amazon EMR.

B.

Create an AWS Glue Data Catalog to manage the Hive metadata. Create an Amazon EMR cluster with consistent view enabled, and run emrfs sync before each analytics step to ensure data changes are updated. Use the metadata in the AWS Glue Data Catalog to run Hive processing queries in Amazon EMR.

C.

Create an Amazon Athena table with CREATE TABLE AS SELECT (CTAS) to ensure data is refreshed from underlying queries against the raw dataset. Create an AWS Glue Data Catalog to manage the Hive metadata over the CTAS table. Create an Amazon EMR cluster and use the metadata in the AWS Glue Data Catalog to run Hive processing queries in Amazon EMR.

D.

Use an S3 Select query to ensure that the data is properly updated. Create an AWS Glue Data Catalog to manage the Hive metadata over the S3 Select table. Create an Amazon EMR cluster and use the metadata in the AWS Glue Data Catalog to run Hive processing queries in Amazon EMR.
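
All four options rely on an Amazon EMR cluster that uses the AWS Glue Data Catalog as the Hive metastore, which is configured through the hive-site classification. A minimal boto3 sketch of launching such a cluster, with hypothetical names and instance sizes:

```python
import boto3

emr = boto3.client("emr")

emr.run_job_flow(
    Name="reporting-hive-cluster",  # hypothetical
    ReleaseLabel="emr-6.10.0",
    Applications=[{"Name": "Hive"}],
    Configurations=[
        {
            # Point Hive at the Glue Data Catalog so EMR queries the same
            # table definitions the crawler maintains over Amazon S3.
            "Classification": "hive-site",
            "Properties": {
                "hive.metastore.client.factory.class": (
                    "com.amazonaws.glue.catalog.metastore."
                    "AWSGlueDataCatalogHiveClientFactory"
                )
            },
        }
    ],
    Instances={
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
```

With this in place, a crawler scheduled to run on each data refresh keeps the catalog table definitions, and therefore the Hive queries that read them, aligned with the files in Amazon S3.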

Question 58

A company uses Amazon Kinesis Data Streams to ingest and process customer behavior information from application users each day. A data analytics specialist notices that the data stream is being throttled. The specialist has turned on enhanced monitoring for the Kinesis data stream and has verified that the data stream did not exceed the data limits. The specialist discovers that there are hot shards.

Which solution will resolve this issue?

Options:

A.

Use a random partition key to ingest the records.

B.

Increase the number of shards. Split the size of the log records.

C.

Limit the number of records that are sent each second by the producer to match the capacity of the stream.

D.

Decrease the size of the records that are sent from the producer to match the capacity of the stream.
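
Hot shards arise when a skewed partition key concentrates traffic on a few shards. A minimal boto3 sketch of a producer that uses a random partition key to spread records evenly (the stream name and event shape are hypothetical):

```python
import json
import uuid

import boto3

kinesis = boto3.client("kinesis")

def put_event(event: dict, stream_name: str = "customer-behavior") -> None:
    # A random partition key distributes records uniformly across shards,
    # avoiding the hot-shard throttling described above.
    kinesis.put_record(
        StreamName=stream_name,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(uuid.uuid4()),
    )

put_event({"user_id": "u-123", "action": "page_view"})
```

The trade-off is that a random key gives up per-key ordering: Kinesis guarantees record order only within a shard for a given partition key.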

