
Amazon Web Services DAS-C01 Exam Questions and Answers


Amazon Web Services DAS-C01 Exam Overview:

Exam Name: AWS Certified Data Analytics - Specialty
Exam Code: DAS-C01
Vendor: Amazon Web Services
Certification: AWS Certified Data Analytics
Questions: 207 Q&As
Question 40

A large company receives files from external parties in Amazon EC2 throughout the day. At the end of the day, the files are combined into a single file, compressed into a gzip file, and uploaded to Amazon S3. The total size of all the files is close to 100 GB daily. Once the files are uploaded to Amazon S3, an AWS Batch program executes a COPY command to load the files into an Amazon Redshift cluster.

Which program modification will accelerate the COPY process?

Options:

A.

Upload the individual files to Amazon S3 and run the COPY command as soon as the files become available.

B.

Split the number of files so they are equal to a multiple of the number of slices in the Amazon Redshift cluster. Gzip and upload the files to Amazon S3. Run the COPY command on the files.

C.

Split the number of files so they are equal to a multiple of the number of compute nodes in the Amazon Redshift cluster. Gzip and upload the files to Amazon S3. Run the COPY command on the files.

D.

Apply sharding by breaking up the files so the distkey columns with the same values go to the same file. Gzip and upload the sharded files to Amazon S3. Run the COPY command on the files.
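For background on why the split matters: COPY parallelizes ingestion across the slices of the cluster, so handing it many gzip files under one S3 prefix lets every slice load a file at once instead of a single slice decompressing one 100 GB archive. A minimal sketch of issuing such a COPY through the Redshift Data API follows; the cluster, database, table, bucket, role, and delimiter are hypothetical placeholders, not values from the question.

```python
import boto3

# Hypothetical names -- replace with your own cluster, database, bucket, and role.
CLUSTER_ID = "analytics-cluster"
DATABASE = "dev"
DB_USER = "awsuser"
S3_PREFIX = "s3://example-bucket/daily-load/"  # prefix holding the split gzip files
IAM_ROLE = "arn:aws:iam::123456789012:role/RedshiftCopyRole"

# COPY pointed at a prefix of many gzip files lets each slice ingest a file in
# parallel, which is why splitting the daily archive into a multiple of the
# slice count speeds the load. The delimiter here is an assumption.
copy_sql = f"""
    COPY sales_staging
    FROM '{S3_PREFIX}'
    IAM_ROLE '{IAM_ROLE}'
    GZIP
    DELIMITER '|';
"""

client = boto3.client("redshift-data")
response = client.execute_statement(
    ClusterIdentifier=CLUSTER_ID,
    Database=DATABASE,
    DbUser=DB_USER,
    Sql=copy_sql,
)
print(response["Id"])  # statement ID, usable with describe_statement to poll status
```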

Question 41

A company has a data lake on AWS that ingests sources of data from multiple business units and uses Amazon Athena for queries. The storage layer is Amazon S3 using the AWS Glue Data Catalog. The company wants to make the data available to its data scientists and business analysts. However, the company first needs to manage data access for Athena based on user roles and responsibilities.

What should the company do to apply these access controls with the LEAST operational overhead?

Options:

A.

Define security policy-based rules for the users and applications by role in AWS Lake Formation.

B.

Define security policy-based rules for the users and applications by role in AWS Identity and Access Management (IAM).

C.

Define security policy-based rules for the tables and columns by role in AWS Glue.

D.

Define security policy-based rules for the tables and columns by role in AWS Identity and Access Management (IAM).
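For context on how role-based grants look in AWS Lake Formation: permissions are attached to Data Catalog databases, tables, and columns and are enforced by Athena at query time, so no per-bucket S3 policies need to be maintained. A minimal sketch using boto3 follows; the role ARN, database, table, and column names are hypothetical.

```python
import boto3

lf = boto3.client("lakeformation")

# Hypothetical principal -- the IAM role assumed by business analysts.
ANALYST_ROLE_ARN = "arn:aws:iam::123456789012:role/BusinessAnalystRole"

# Grant column-filtered SELECT on a Glue Data Catalog table; Athena enforces
# the resulting permissions when queries run against this table.
lf.grant_permissions(
    Principal={"DataLakePrincipalIdentifier": ANALYST_ROLE_ARN},
    Resource={
        "TableWithColumns": {
            "DatabaseName": "sales_db",
            "Name": "orders",
            "ColumnNames": ["order_id", "order_date", "region"],
        }
    },
    Permissions=["SELECT"],
)
```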

Question 42

A company has several Amazon EC2 instances sitting behind an Application Load Balancer (ALB). The company wants its IT infrastructure team to analyze the IP addresses coming into the company's ALB. The ALB is configured to store access logs in Amazon S3. The access logs create about 1 TB of data each day, and access to the data will be infrequent. The company needs a solution that is scalable, cost-effective, and has minimal maintenance requirements.

Which solution meets these requirements?

Options:

A.

Copy the data into Amazon Redshift and query the data.

B.

Use Amazon EMR and Apache Hive to query the S3 data.

C.

Use Amazon Athena to query the S3 data.

D.

Use Amazon Redshift Spectrum to query the S3 data.
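To make the serverless query path concrete: once a table is defined over the ALB log prefix in the Glue Data Catalog, queries run directly against the data in S3 with nothing to provision or maintain, and infrequent access incurs cost only per query. A minimal sketch follows, assuming a table named alb_logs with a client_ip column has already been defined; the database and result-bucket names are hypothetical.

```python
import boto3

athena = boto3.client("athena")

# Hypothetical database, table, and result location -- adjust to your environment.
QUERY = """
    SELECT client_ip, COUNT(*) AS request_count
    FROM alb_logs
    GROUP BY client_ip
    ORDER BY request_count DESC
    LIMIT 100
"""

response = athena.start_query_execution(
    QueryString=QUERY,
    QueryExecutionContext={"Database": "web_logs"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print(response["QueryExecutionId"])  # poll get_query_execution for completion
```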

Question 43

A company is streaming its high-volume billing data (100 MBps) to Amazon Kinesis Data Streams. A data analyst partitioned the data on account_id to ensure that all records belonging to an account go to the same Kinesis shard and order is maintained. While building a custom consumer using the Kinesis Java SDK, the data analyst notices that, sometimes, the messages arrive out of order for account_id. Upon further investigation, the data analyst discovers the messages that are out of order seem to be arriving from different shards for the same account_id and are seen when a stream resize runs.

What is an explanation for this behavior and what is the solution?

Options:

A.

There are multiple shards in a stream and order needs to be maintained in the shard. The data analyst needs to make sure there is only a single shard in the stream and no stream resize runs.

B.

The hash key generation process for the records is not working correctly. The data analyst should generate an explicit hash key on the producer side so the records are directed to the appropriate shard accurately.

C.

The records are not being received by Kinesis Data Streams in order. The producer should use the PutRecords API call instead of the PutRecord API call with the SequenceNumberForOrdering parameter.

D.

The consumer is not processing the parent shard completely before processing the child shards after a stream resize. The data analyst should process the parent shard completely first before processing the child shards.
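Background on the reshard behavior the question describes: when a stream is resized, each new child shard records its parent, and per-key ordering is preserved only if a consumer finishes reading a parent shard before it starts reading its children (the Kinesis Client Library handles this automatically). A rough sketch of honoring that lineage with the low-level API follows; the stream name is hypothetical and shard-list pagination is omitted for brevity.

```python
import boto3

kinesis = boto3.client("kinesis")
STREAM_NAME = "billing-stream"  # hypothetical stream name

# List the stream's shards, including closed parents that may still hold unread records.
shards = kinesis.list_shards(StreamName=STREAM_NAME)["Shards"]
by_id = {s["ShardId"]: s for s in shards}

def lineage_depth(shard):
    """Depth in the reshard lineage; parents sort before their children."""
    depth, parent = 0, shard.get("ParentShardId")
    while parent in by_id:
        depth += 1
        parent = by_id[parent].get("ParentShardId")
    return depth

# Drain parent shards before touching their children so records that share a
# partition key (account_id) are always handed to the application in order.
for shard in sorted(shards, key=lineage_depth):
    iterator = kinesis.get_shard_iterator(
        StreamName=STREAM_NAME,
        ShardId=shard["ShardId"],
        ShardIteratorType="TRIM_HORIZON",
    )["ShardIterator"]
    while iterator:
        batch = kinesis.get_records(ShardIterator=iterator, Limit=1000)
        for record in batch["Records"]:
            pass  # hand each record to the application's processing logic here
        # NextShardIterator becomes null once a closed (parent) shard is fully read;
        # for an open shard this loop keeps polling, as a real consumer would.
        iterator = batch.get("NextShardIterator")
```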
