
Amazon Web Services Updated DAS-C01 Exam Questions and Answers by jiya


Amazon Web Services DAS-C01 Exam Overview:

Exam Name: AWS Certified Data Analytics - Specialty
Exam Code: DAS-C01
Vendor: Amazon Web Services
Certification: AWS Certified Data Analytics
Questions: 207 Q&As
Shared By: jiya
Question 56

A company receives data from its vendor in JSON format with a timestamp in the file name. The vendor uploads the data to an Amazon S3 bucket, and the data is registered into the company’s data lake for analysis and reporting. The company has configured an S3 Lifecycle policy to archive all files to S3 Glacier after 5 days.

The company wants to ensure that its AWS Glue crawler catalogs data only from S3 Standard storage and ignores the archived files. A data analytics specialist must implement a solution to achieve this goal without changing the current S3 bucket configuration.

Which solution meets these requirements?

Options:

A. Use the exclude patterns feature of AWS Glue to identify the S3 Glacier files for the crawler to exclude.

B. Schedule an automation job that uses AWS Lambda to move files from the original S3 bucket to a new S3 bucket for S3 Glacier storage.

C. Use the excludeStorageClasses property in the AWS Glue Data Catalog table to exclude files on S3 Glacier storage.

D. Use the include patterns feature of AWS Glue to identify the S3 Standard files for the crawler to include.
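For reference, the excludeStorageClasses setting described in option C can also be expressed in a Glue ETL script. The following is a minimal PySpark sketch that only illustrates the property; the database and table names are placeholders, not taken from the question.

# Sketch: skip objects archived to Glacier storage classes when reading
# the catalog table in an AWS Glue job. Names below are placeholders.
from awsglue.context import GlueContext
from pyspark.context import SparkContext

glue_context = GlueContext(SparkContext.getOrCreate())

vendor_frame = glue_context.create_dynamic_frame.from_catalog(
    database="vendor_data_lake",   # placeholder database name
    table_name="vendor_json",      # placeholder table name
    additional_options={
        # Exclude objects that have transitioned to Glacier storage classes
        "excludeStorageClasses": ["GLACIER", "DEEP_ARCHIVE"]
    },
)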

Discussion
Honey
I highly recommend it. They made a big difference for me and I'm sure they'll help you too. Just make sure to use them wisely and not solely rely on them. They should be used as a supplement to your regular studies.
Antoni Aug 4, 2025
Good point. Thanks for the advice. I'll definitely keep that in mind.
Miley
Hey, I tried Cramkey Dumps for my IT certification exam. They are really awesome and helped me pass my exam with a wonderful score.
Megan Aug 24, 2025
That’s great!!! I’ll definitely give it a try. Thanks!!!
Esmae
I highly recommend Cramkey Dumps to anyone preparing for the certification exam.
Mollie Aug 8, 2025
Absolutely. They really make it easier to study and retain all the important information. I'm so glad I found Cramkey Dumps.
Kylo
What makes Cramkey Dumps so reliable? Please guide.
Sami Aug 20, 2025
Well, for starters, they have a team of experts who are constantly updating their material to reflect the latest changes in the industry. Plus, they have a huge database of questions and answers, which makes it easy to study and prepare for the exam.
Faye
Yayyyy. I passed my exam. I think all students should give these dumps a try.
Emmeline Aug 14, 2025
Definitely! I have no doubt new students will find them to be just as helpful as I did.
Question 57

An online retail company is migrating its reporting system to AWS. The company’s legacy system runs data processing on online transactions using a complex series of nested Apache Hive queries. Transactional data is exported from the online system to the reporting system several times a day. Schemas in the files are stable between updates.

A data analyst wants to quickly migrate the data processing to AWS, so any code changes should be minimized. To keep storage costs low, the data analyst decides to store the data in Amazon S3. It is vital that the data from the reports and associated analytics is completely up to date based on the data in Amazon S3.

Which solution meets these requirements?

Options:

A. Create an AWS Glue Data Catalog to manage the Hive metadata. Create an AWS Glue crawler over Amazon S3 that runs when data is refreshed to ensure that data changes are updated. Create an Amazon EMR cluster and use the metadata in the AWS Glue Data Catalog to run Hive processing queries in Amazon EMR.

B. Create an AWS Glue Data Catalog to manage the Hive metadata. Create an Amazon EMR cluster with consistent view enabled. Run emrfs sync before each analytics step to ensure data changes are updated. Create an EMR cluster and use the metadata in the AWS Glue Data Catalog to run Hive processing queries in Amazon EMR.

C. Create an Amazon Athena table with CREATE TABLE AS SELECT (CTAS) to ensure data is refreshed from underlying queries against the raw dataset. Create an AWS Glue Data Catalog to manage the Hive metadata over the CTAS table. Create an Amazon EMR cluster and use the metadata in the AWS Glue Data Catalog to run Hive processing queries in Amazon EMR.

D. Use an S3 Select query to ensure that the data is properly updated. Create an AWS Glue Data Catalog to manage the Hive metadata over the S3 Select table. Create an Amazon EMR cluster and use the metadata in the AWS Glue Data Catalog to run Hive processing queries in Amazon EMR.
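The emrfs sync call mentioned in option B can be submitted to a running cluster as an EMR step. Below is a minimal boto3 sketch, assuming EMRFS consistent view is already enabled on the cluster; the cluster ID and S3 path are placeholders.

# Sketch: run "emrfs sync" as an EMR step before an analytics step so the
# consistent-view metadata reflects the latest objects in Amazon S3.
# The cluster ID and S3 path below are placeholders.
import boto3

emr = boto3.client("emr")

emr.add_job_flow_steps(
    JobFlowId="j-XXXXXXXXXXXXX",  # placeholder EMR cluster ID
    Steps=[
        {
            "Name": "Sync EMRFS consistent view metadata",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["emrfs", "sync", "s3://example-reporting-bucket/exports/"],
            },
        }
    ],
)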

Discussion
Question 58

A company uses Amazon Kinesis Data Streams to ingest and process customer behavior information from application users each day. A data analytics specialist notices that the data stream is throttling. The specialist has turned on enhanced monitoring for the Kinesis data stream and has verified that the data stream did not exceed the data limits. The specialist discovers that there are hot shards.
Which solution will resolve this issue?

Options:

A. Use a random partition key to ingest the records.

B. Increase the number of shards. Split the size of the log records.

C. Limit the number of records that are sent each second by the producer to match the capacity of the stream.

D. Decrease the size of the records that are sent from the producer to match the capacity of the stream.
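The random-partition-key approach in option A can be illustrated with a short boto3 producer sketch; the stream name and event payload are placeholders, not taken from the question.

# Sketch: use a random partition key so records spread evenly across shards
# instead of concentrating on a hot shard. The stream name is a placeholder.
import json
import uuid
import boto3

kinesis = boto3.client("kinesis")

def put_behavior_event(event: dict) -> None:
    kinesis.put_record(
        StreamName="customer-behavior-stream",  # placeholder stream name
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(uuid.uuid4()),         # random key distributes writes
    )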

Discussion