
Amazon Web Services Updated DAS-C01 Exam Questions and Answers by juniper


Amazon Web Services DAS-C01 Exam Overview:

Exam Name: AWS Certified Data Analytics - Specialty
Exam Code: DAS-C01
Vendor: Amazon Web Services
Certification: AWS Certified Data Analytics
Questions: 207 Q&As
Shared By: juniper
Question 12

An advertising company has a data lake that is built on Amazon S3. The company uses the AWS Glue Data Catalog to maintain the metadata. The data lake is several years old, and its overall size has increased exponentially as additional data sources and metadata are stored in the data lake. The data lake administrator wants to implement a mechanism that simplifies permissions management between Amazon S3 and the Data Catalog and keeps them in sync.

Which solution will simplify permissions management with minimal development effort?

Options:

A.

Set AWS Identity and Access Management (IAM) permissions for AWS Glue.

B.

Use AWS Lake Formation permissions.

C.

Manage AWS Glue and S3 permissions by using bucket policies.

D.

Use Amazon Cognito user pools.
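
For context on option B, here is a minimal sketch of granting Lake Formation permissions with boto3; the role ARN, database name, and table name are hypothetical:

```python
import boto3

lf = boto3.client("lakeformation")

# Grant SELECT on a Data Catalog table to an IAM role. Lake Formation
# enforces the grant centrally for both the catalog metadata and the
# underlying S3 data, so the two no longer have to be kept in sync by hand.
lf.grant_permissions(
    Principal={
        "DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/AnalystRole"  # hypothetical
    },
    Resource={"Table": {"DatabaseName": "ad_datalake", "Name": "impressions"}},
    Permissions=["SELECT"],
)
```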

Discussion
Sarah
Yeah, I was so relieved when I saw that the questions that appeared in the exam were similar to their exam dumps. It made the exam a lot easier and I felt confident going into it.
Aaliyah Aug 27, 2024
Same here. I've heard mixed reviews about using exam dumps, but for us, it definitely paid off.
Faye
Yayyyy, I passed my exam! I think all students should give these dumps a try.
Emmeline Sep 12, 2024
Definitely! I have no doubt new students will find them to be just as helpful as I did.
Freddy
I passed my exam with flying colors, and I'm confident that anyone who tries these will ace the exam too.
Aleksander Sep 26, 2024
Thanks for the recommendation! I'll check it out.
Inaaya
Are these Dumps worth buying?
Fraser Oct 9, 2024
Yes, of course, they are necessary to pass the exam. They give you an insight into the types of questions that could come up and help you prepare effectively.
Question 13

A manufacturing company has many IoT devices in different facilities across the world. The company is using Amazon Kinesis Data Streams to collect the data from the devices.

The company's operations team has started to observe many WriteThroughputExceeded exceptions. The operations team determines that the cause is the number of records that are being written to certain shards. The data contains device ID, capture date, measurement type, measurement value, and facility ID. The facility ID is used as the partition key.

Which action will resolve this issue?

Options:

A.

Change the partition key from facility ID to a randomly generated key

B.

Increase the number of shards

C.

Archive the data on the producers' side

D.

Change the partition key from facility ID to capture date
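
For context on option A, here is a minimal sketch of writing to Kinesis Data Streams with a randomly generated partition key instead of the low-cardinality facility ID; the stream name and payload fields are hypothetical:

```python
import json
import uuid

import boto3

kinesis = boto3.client("kinesis")

record = {
    "device_id": "sensor-0042",  # hypothetical payload
    "capture_date": "2024-09-26T12:00:00Z",
    "measurement_type": "temperature",
    "measurement_value": 21.4,
    "facility_id": "facility-7",
}

# A random partition key spreads records evenly across all shards, avoiding
# the hot shards that a handful of high-volume facility IDs would create.
kinesis.put_record(
    StreamName="iot-measurements",  # hypothetical stream name
    Data=json.dumps(record).encode("utf-8"),
    PartitionKey=str(uuid.uuid4()),
)
```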

Question 14

A financial company uses Amazon S3 as its data lake and has set up a data warehouse using a multi-node Amazon Redshift cluster. The data files in the data lake are organized in folders based on the data source of each data file. All the data files are loaded to one table in the Amazon Redshift cluster using a separate COPY command for each data file location. With this approach, loading all the data files into Amazon Redshift takes a long time to complete. Users want a faster solution with little or no increase in cost while maintaining the segregation of the data files in the S3 data lake.

Which solution meets these requirements?

Options:

A.

Use Amazon EMR to copy all the data files into one folder and issue a COPY command to load the data into Amazon Redshift.

B.

Load all the data files in parallel to Amazon Aurora, and run an AWS Glue job to load the data into Amazon Redshift.

C.

Use an AWS Glue job to copy all the data files into one folder and issue a COPY command to load the data into Amazon Redshift.

D.

Create a manifest file that contains the data file locations and issue a COPY command to load the data into Amazon Redshift.
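
For context on option D, here is a minimal sketch of a manifest-based load using boto3 and the Redshift Data API; the bucket, cluster, table, and role names are hypothetical:

```python
import json

import boto3

# A manifest that enumerates files from every source folder lets a single
# COPY load them all in parallel while the folders stay segregated in S3.
manifest = {
    "entries": [
        {"url": "s3://example-data-lake/source-a/part-0000.csv", "mandatory": True},
        {"url": "s3://example-data-lake/source-b/part-0000.csv", "mandatory": True},
    ]
}
boto3.client("s3").put_object(
    Bucket="example-data-lake",  # hypothetical bucket
    Key="manifests/load.manifest",
    Body=json.dumps(manifest).encode("utf-8"),
)

# Issue one COPY that reads the manifest instead of one COPY per folder.
boto3.client("redshift-data").execute_statement(
    ClusterIdentifier="analytics-cluster",  # hypothetical cluster
    Database="dev",
    DbUser="loader",
    Sql=(
        "COPY sales "
        "FROM 's3://example-data-lake/manifests/load.manifest' "
        "IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole' "
        "MANIFEST CSV;"
    ),
)
```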

Question 15

A large marketing company needs to store all of its streaming logs and create near-real-time dashboards. The dashboards will be used to help the company make critical business decisions and must be highly available.

Which solution meets these requirements?

Options:

A.

Store the streaming logs in Amazon S3 with replication to an S3 bucket in a different Availability Zone. Create the dashboards by using Amazon QuickSight.

B.

Deploy an Amazon Redshift cluster with at least three nodes in a VPC that spans two Availability Zones. Store the streaming logs and use the Redshift cluster as a source to create the dashboards by using Amazon QuickSight.

C.

Store the streaming logs in Amazon S3 with replication to an S3 bucket in a different Availability Zone. Every time a new log is added to the bucket, invoke an AWS Lambda function to update the dashboards in Amazon QuickSight.

D.

Store the streaming logs in Amazon OpenSearch Service deployed across three Availability Zones and with three dedicated master nodes. Create the dashboards by using OpenSearch Dashboards.
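
For context on option D, here is a minimal sketch of creating a three-AZ Amazon OpenSearch Service domain with three dedicated master nodes via boto3; the domain name, engine version, and instance sizes are hypothetical:

```python
import boto3

opensearch = boto3.client("opensearch")

# Spreading data nodes across three Availability Zones and adding three
# dedicated master nodes keeps the cluster (and its dashboards) available
# through the loss of a node or an entire AZ.
opensearch.create_domain(
    DomainName="streaming-logs",  # hypothetical domain name
    EngineVersion="OpenSearch_2.11",
    ClusterConfig={
        "InstanceType": "r6g.large.search",
        "InstanceCount": 3,
        "ZoneAwarenessEnabled": True,
        "ZoneAwarenessConfig": {"AvailabilityZoneCount": 3},
        "DedicatedMasterEnabled": True,
        "DedicatedMasterType": "m6g.large.search",
        "DedicatedMasterCount": 3,
    },
    EBSOptions={"EBSEnabled": True, "VolumeType": "gp3", "VolumeSize": 100},
)
```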

