
Amazon Web Services Updated DAS-C01 Exam Questions and Answers by kiana

Page: 2 / 14

Amazon Web Services DAS-C01 Exam Overview :

Exam Name: AWS Certified Data Analytics - Specialty
Exam Code: DAS-C01
Vendor: Amazon Web Services
Certification: AWS Certified Data Analytics
Questions: 207 Q&As Shared By: kiana
Question 8

A hospital is building a research data lake to ingest data from electronic health records (EHR) systems from multiple hospitals and clinics. The EHR systems are independent of each other and do not have a common patient identifier. The data engineering team is not experienced in machine learning (ML) and has been asked to generate a unique patient identifier for the ingested records.

Which solution will accomplish this task?

Options:

A. An AWS Glue ETL job with the FindMatches transform
B. Amazon Kendra
C. Amazon SageMaker Ground Truth
D. An AWS Glue ETL job with the ResolveChoice transform

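FindMatches uses machine learning to link records that lack a common key, which fits a team without ML experience because the transform is trained from labeled examples rather than hand-written models. As a rough illustration of the underlying idea only, not the Glue API itself, the pure-Python sketch below scores candidate record pairs by fuzzy field similarity and reuses a patient ID when a record closely matches one already seen; the field names and threshold are hypothetical.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1] of how closely two strings match."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_score(rec_a: dict, rec_b: dict, fields=("name", "dob", "zip")) -> float:
    """Average fuzzy similarity across the comparison fields."""
    return sum(similarity(str(rec_a[f]), str(rec_b[f])) for f in fields) / len(fields)

def assign_patient_ids(records, threshold=0.85):
    """Greedy linkage: reuse an existing ID when a record closely
    matches a previously seen record, otherwise mint a new ID."""
    seen = []      # list of (record, patient_id) pairs
    next_id = 1
    linked = []
    for rec in records:
        for prev, pid in seen:
            if match_score(rec, prev) >= threshold:
                linked.append({**rec, "patient_id": pid})
                seen.append((rec, pid))
                break
        else:
            pid = f"P{next_id:04d}"
            next_id += 1
            linked.append({**rec, "patient_id": pid})
            seen.append((rec, pid))
    return linked

ehr = [
    {"name": "Jane Doe",   "dob": "1980-02-14", "zip": "98101"},  # hospital A
    {"name": "Jane Doe",   "dob": "1980-02-14", "zip": "98101"},  # clinic B, same person
    {"name": "John Smith", "dob": "1975-07-01", "zip": "10001"},  # different person
]
linked = assign_patient_ids(ehr)
```

In Glue itself, the matching model is created as an ML transform, trained with a labeling set, and then applied inside an ETL job; the greedy loop above only conveys the record-linkage intent.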
Question 9

A financial company uses Apache Hive on Amazon EMR for ad hoc queries. Users are complaining of sluggish performance.

A data analyst notes the following:

  • Approximately 90% of queries are submitted 1 hour after the market opens.
  • Hadoop Distributed File System (HDFS) utilization never exceeds 10%.

Which solution would help address the performance issues?

Options:

A. Create instance fleet configurations for core and task nodes. Create an automatic scaling policy to scale out the instance groups based on the Amazon CloudWatch CapacityRemainingGB metric. Create an automatic scaling policy to scale in the instance fleet based on the CloudWatch CapacityRemainingGB metric.
B. Create instance fleet configurations for core and task nodes. Create an automatic scaling policy to scale out the instance groups based on the Amazon CloudWatch YARNMemoryAvailablePercentage metric. Create an automatic scaling policy to scale in the instance fleet based on the CloudWatch YARNMemoryAvailablePercentage metric.
C. Create instance group configurations for core and task nodes. Create an automatic scaling policy to scale out the instance groups based on the Amazon CloudWatch CapacityRemainingGB metric. Create an automatic scaling policy to scale in the instance groups based on the CloudWatch CapacityRemainingGB metric.
D. Create instance group configurations for core and task nodes. Create an automatic scaling policy to scale out the instance groups based on the Amazon CloudWatch YARNMemoryAvailablePercentage metric. Create an automatic scaling policy to scale in the instance groups based on the CloudWatch YARNMemoryAvailablePercentage metric.

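Two details in the scenario matter: HDFS utilization never exceeds 10%, so the storage metric CapacityRemainingGB is not the bottleneck, and the load spike is compute/memory pressure from concurrent Hive queries, which YARNMemoryAvailablePercentage tracks. EMR custom automatic scaling policies also attach to instance groups rather than instance fleets. A sketch of such a policy, in the JSON shape used by EMR's PutAutoScalingPolicy, with placeholder capacities, thresholds, and cooldowns:

```json
{
  "Constraints": { "MinCapacity": 2, "MaxCapacity": 10 },
  "Rules": [
    {
      "Name": "ScaleOutOnLowYarnMemory",
      "Action": {
        "SimpleScalingPolicyConfiguration": {
          "AdjustmentType": "CHANGE_IN_CAPACITY",
          "ScalingAdjustment": 2,
          "CoolDown": 300
        }
      },
      "Trigger": {
        "CloudWatchAlarmDefinition": {
          "MetricName": "YARNMemoryAvailablePercentage",
          "Namespace": "AWS/ElasticMapReduce",
          "ComparisonOperator": "LESS_THAN",
          "Threshold": 15,
          "EvaluationPeriods": 1,
          "Period": 300,
          "Statistic": "AVERAGE",
          "Unit": "PERCENT"
        }
      }
    },
    {
      "Name": "ScaleInOnHighYarnMemory",
      "Action": {
        "SimpleScalingPolicyConfiguration": {
          "AdjustmentType": "CHANGE_IN_CAPACITY",
          "ScalingAdjustment": -1,
          "CoolDown": 300
        }
      },
      "Trigger": {
        "CloudWatchAlarmDefinition": {
          "MetricName": "YARNMemoryAvailablePercentage",
          "Namespace": "AWS/ElasticMapReduce",
          "ComparisonOperator": "GREATER_THAN",
          "Threshold": 75,
          "EvaluationPeriods": 1,
          "Period": 300,
          "Statistic": "AVERAGE",
          "Unit": "PERCENT"
        }
      }
    }
  ]
}
```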
Question 10

A data analytics specialist is building an automated ETL ingestion pipeline using AWS Glue to ingest compressed files that have been uploaded to an Amazon S3 bucket. The ingestion pipeline should support incremental data processing.

Which AWS Glue feature should the data analytics specialist use to meet this requirement?

Options:

A. Workflows
B. Triggers
C. Job bookmarks
D. Classifiers

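Job bookmarks are the Glue feature built for incremental processing: they persist state about data a job has already processed so reruns pick up only new objects (in Glue they are enabled with the `--job-bookmark-option job-bookmark-enable` job argument). The plain-Python sketch below mimics the idea with a persisted set of processed S3 keys; the state file and key names are hypothetical stand-ins for Glue's managed bookmark state.

```python
import json
from pathlib import Path

BOOKMARK_FILE = Path("bookmark_state.json")  # stand-in for Glue's managed bookmark state
BOOKMARK_FILE.unlink(missing_ok=True)        # start the demo from a clean state

def load_bookmark() -> set:
    """Read the set of keys processed by previous runs, if any."""
    if BOOKMARK_FILE.exists():
        return set(json.loads(BOOKMARK_FILE.read_text()))
    return set()

def save_bookmark(processed: set) -> None:
    """Advance the bookmark so later runs skip these keys."""
    BOOKMARK_FILE.write_text(json.dumps(sorted(processed)))

def run_ingestion(s3_keys):
    """Process only keys not seen in a previous run, then advance the bookmark."""
    processed = load_bookmark()
    new_keys = [k for k in s3_keys if k not in processed]
    # ... decompress and transform each new object here ...
    processed.update(new_keys)
    save_bookmark(processed)
    return new_keys

# First run ingests everything; the second run sees only the newly uploaded file.
first = run_ingestion(["2024/01/a.gz", "2024/01/b.gz"])
second = run_ingestion(["2024/01/a.gz", "2024/01/b.gz", "2024/02/c.gz"])
```

Workflows and triggers only orchestrate when jobs run, and classifiers only help crawlers infer schemas; none of them track what has already been processed.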
Question 11

A company uses Amazon Redshift as its data warehouse. The Redshift cluster is not encrypted. A data analytics specialist needs to use hardware security module (HSM) managed encryption keys to encrypt the data that is stored in the Redshift cluster.

Which combination of steps will meet these requirements? (Select THREE.)

Options:

A. Stop all write operations on the source cluster. Unload data from the source cluster.
B. Copy the data to a new target cluster that is encrypted with AWS Key Management Service (AWS KMS).
C. Modify the source cluster by activating AWS CloudHSM encryption. Configure Amazon Redshift to automatically migrate data to a new encrypted cluster.
D. Modify the source cluster by activating encryption from an external HSM. Configure Amazon Redshift to automatically migrate data to a new encrypted cluster.
E. Copy the data to a new target cluster that is encrypted with an HSM from AWS CloudHSM.
F. Rename the source cluster and the target cluster after the migration so that the target cluster is using the original endpoint.

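Moving an unencrypted Redshift cluster to HSM-managed keys generally means migrating the data to a new cluster that was created with HSM encryption, then swapping cluster identifiers so clients keep the original endpoint. The CLI commands below are an illustrative sketch of the rename step only; all cluster identifiers are placeholders, and the unload/reload of the data would happen before the swap.

```
# After loading the data into the new HSM-encrypted cluster,
# rename both clusters so the encrypted one takes over the original endpoint.
aws redshift modify-cluster \
    --cluster-identifier source-cluster \
    --new-cluster-identifier source-cluster-old

aws redshift modify-cluster \
    --cluster-identifier target-cluster-encrypted \
    --new-cluster-identifier source-cluster
```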