
Amazon Web Services Updated DAS-C01 Exam Questions and Answers by kiana

Page: 2 / 14

Amazon Web Services DAS-C01 Exam Overview :

Exam Name: AWS Certified Data Analytics - Specialty
Exam Code: DAS-C01
Vendor: Amazon Web Services
Certification: AWS Certified Data Analytics
Questions: 207 Q&As
Shared By: kiana
Question 8

A hospital is building a research data lake to ingest data from electronic health records (EHR) systems from multiple hospitals and clinics. The EHR systems are independent of each other and do not have a common patient identifier. The data engineering team is not experienced in machine learning (ML) and has been asked to generate a unique patient identifier for the ingested records.

Which solution will accomplish this task?

Options:

A.

An AWS Glue ETL job with the FindMatches transform

B.

Amazon Kendra

C.

Amazon SageMaker Ground Truth

D.

An AWS Glue ETL job with the ResolveChoice transform

Discussion
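For context on option A: the FindMatches transform is AWS Glue's ML-based record-linkage feature, designed for exactly this case where records from independent systems lack a shared key. A minimal sketch of the request payload a team might pass to `glue.create_ml_transform` via boto3 (the transform name, database, table, column name, and role ARN below are placeholder assumptions, not real resources):

```python
# Sketch of a FindMatches ML transform request for AWS Glue.
# All identifiers (role ARN, database, table, column) are placeholders.

def build_find_matches_request(database, table, role_arn):
    """Build the payload for glue.create_ml_transform with FindMatches,
    which learns to link records that have no common identifier."""
    return {
        "Name": "ehr-patient-matching",  # hypothetical transform name
        "Role": role_arn,
        "InputRecordTables": [
            {"DatabaseName": database, "TableName": table},
        ],
        "Parameters": {
            "TransformType": "FIND_MATCHES",
            "FindMatchesParameters": {
                "PrimaryKeyColumnName": "record_id",  # assumed source column
                "PrecisionRecallTradeoff": 0.9,       # favor precision for patient safety
                "EnforceProvidedLabels": False,
            },
        },
    }

request = build_find_matches_request(
    "ehr_db", "patients", "arn:aws:iam::123456789012:role/GlueServiceRole"
)
# A boto3 Glue client would then call: glue.create_ml_transform(**request)
```

After the transform is trained with labeled match/no-match examples, a Glue ETL job applies it to assign cluster IDs that serve as the unique patient identifier, with no ML expertise required of the team.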
Question 9

A financial company uses Apache Hive on Amazon EMR for ad-hoc queries. Users are complaining of sluggish performance.

A data analyst notes the following:

  • Approximately 90% of queries are submitted 1 hour after the market opens.
  • Hadoop Distributed File System (HDFS) utilization never exceeds 10%.

Which solution would help address the performance issues?

Options:

A.

Create instance fleet configurations for core and task nodes. Create an automatic scaling policy to scale out the instance groups based on the Amazon CloudWatch CapacityRemainingGB metric. Create an automatic scaling policy to scale in the instance fleet based on the CloudWatch CapacityRemainingGB metric.

B.

Create instance fleet configurations for core and task nodes. Create an automatic scaling policy to scale out the instance groups based on the Amazon CloudWatch YARNMemoryAvailablePercentage metric. Create an automatic scaling policy to scale in the instance fleet based on the CloudWatch YARNMemoryAvailablePercentage metric.

C.

Create instance group configurations for core and task nodes. Create an automatic scaling policy to scale out the instance groups based on the Amazon CloudWatch CapacityRemainingGB metric. Create an automatic scaling policy to scale in the instance groups based on the CloudWatch CapacityRemainingGB metric.

D.

Create instance group configurations for core and task nodes. Create an automatic scaling policy to scale out the instance groups based on the Amazon CloudWatch YARNMemoryAvailablePercentage metric. Create an automatic scaling policy to scale in the instance groups based on the CloudWatch YARNMemoryAvailablePercentage metric.

Discussion
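To see what the YARN-based options (B and D) describe concretely: EMR automatic scaling policies attach to instance groups and trigger on CloudWatch alarms. Since HDFS utilization is low, the bottleneck is compute (YARN memory), so `YARNMemoryAvailablePercentage` is the relevant metric. A sketch of the policy payload for `emr.put_auto_scaling_policy` (the capacity bounds, thresholds, and cooldowns are illustrative assumptions):

```python
# Sketch of an EMR automatic scaling policy keyed on YARNMemoryAvailablePercentage.
# Thresholds, adjustments, and capacity limits are illustrative, not prescriptive.

def build_scaling_policy(min_capacity=2, max_capacity=10):
    """Payload for emr.put_auto_scaling_policy on a core/task instance group."""
    def rule(name, operator, threshold, adjustment):
        return {
            "Name": name,
            "Action": {
                "SimpleScalingPolicyConfiguration": {
                    "AdjustmentType": "CHANGE_IN_CAPACITY",
                    "ScalingAdjustment": adjustment,
                    "CoolDown": 300,
                }
            },
            "Trigger": {
                "CloudWatchAlarmDefinition": {
                    "ComparisonOperator": operator,
                    "EvaluationPeriods": 1,
                    "MetricName": "YARNMemoryAvailablePercentage",
                    "Namespace": "AWS/ElasticMapReduce",
                    "Period": 300,
                    "Threshold": threshold,
                    "Unit": "PERCENT",
                }
            },
        }

    return {
        "Constraints": {"MinCapacity": min_capacity, "MaxCapacity": max_capacity},
        "Rules": [
            # Scale out when free YARN memory drops below 15% (market-open surge).
            rule("ScaleOutOnLowYarnMemory", "LESS_THAN", 15.0, 2),
            # Scale in when most YARN memory is idle again.
            rule("ScaleInOnHighYarnMemory", "GREATER_THAN", 75.0, -1),
        ],
    }

policy = build_scaling_policy()
# A boto3 EMR client would apply it with:
# emr.put_auto_scaling_policy(ClusterId=..., InstanceGroupId=..., AutoScalingPolicy=policy)
```

Note the distinction the options draw: this style of automatic scaling policy applies to instance groups; instance fleets use a different mechanism, which is why the group/fleet wording in each option matters.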
Question 10

A data analytics specialist is building an automated ETL ingestion pipeline using AWS Glue to ingest compressed files that have been uploaded to an Amazon S3 bucket. The ingestion pipeline should support incremental data processing.

Which AWS Glue feature should the data analytics specialist use to meet this requirement?

Options:

A.

Workflows

B.

Triggers

C.

Job bookmarks

D.

Classifiers

Discussion
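For context on option C: job bookmarks are the Glue feature that persists state between runs so a job reprocesses only data it has not already seen, which is what "incremental data processing" refers to here. They are switched on through the `--job-bookmark-option` default argument. A sketch of a `glue.create_job` payload with bookmarks enabled (the job name, role, and script location are placeholders):

```python
# Sketch: enabling job bookmarks when defining an AWS Glue job.
# Job name, role ARN, and script path are placeholder assumptions.

def build_glue_job_definition(script_s3_path, role_arn):
    """Payload for glue.create_job with job bookmarks enabled, so each run
    processes only S3 objects not handled by a previous successful run."""
    return {
        "Name": "incremental-ingest-job",
        "Role": role_arn,
        "Command": {
            "Name": "glueetl",
            "ScriptLocation": script_s3_path,
            "PythonVersion": "3",
        },
        "DefaultArguments": {
            # This flag is what makes processing incremental across runs.
            "--job-bookmark-option": "job-bookmark-enable",
        },
    }

job_def = build_glue_job_definition(
    "s3://example-bucket/scripts/ingest.py",  # placeholder path
    "arn:aws:iam::123456789012:role/GlueServiceRole",
)
# A boto3 Glue client would register it with: glue.create_job(**job_def)
```

One caveat worth remembering for the exam: inside the ETL script itself, each source must be read with a `transformation_ctx` argument, since that is the key Glue uses to store and look up bookmark state.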
Question 11

A company uses Amazon Redshift as its data warehouse. The Redshift cluster is not encrypted. A data analytics specialist needs to use hardware security module (HSM) managed encryption keys to encrypt the data that is stored in the Redshift cluster.

Which combination of steps will meet these requirements? (Select THREE.)

Options:

A.

Stop all write operations on the source cluster. Unload data from the source cluster.

B.

Copy the data to a new target cluster that is encrypted with AWS Key Management Service (AWS KMS).

C.

Modify the source cluster by activating AWS CloudHSM encryption. Configure Amazon Redshift to automatically migrate data to a new encrypted cluster.

D.

Modify the source cluster by activating encryption from an external HSM. Configure Amazon Redshift to automatically migrate data to a new encrypted cluster.

E.

Copy the data to a new target cluster that is encrypted with an HSM from AWS CloudHSM.

F.

Rename the source cluster and the target cluster after the migration so that the target cluster is using the original endpoint.

Discussion
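To ground the HSM-related options: when a new Redshift cluster is created as the encrypted target of a migration, HSM-managed keys are specified through the `HsmClientCertificateIdentifier` and `HsmConfigurationIdentifier` parameters of `redshift.create_cluster`. A sketch of that request payload (the cluster identifier, node type, and credentials below are placeholders; a real password would come from a secrets store):

```python
# Sketch: parameters for creating a new HSM-encrypted Redshift cluster,
# the target of an unload/reload migration. All identifiers are placeholders.

def build_encrypted_cluster_request(cluster_id, hsm_cert_id, hsm_config_id):
    """Payload for redshift.create_cluster using HSM-managed encryption keys."""
    return {
        "ClusterIdentifier": cluster_id,
        "NodeType": "ra3.xlplus",            # assumed node type
        "NumberOfNodes": 2,
        "MasterUsername": "admin",
        "MasterUserPassword": "REPLACE_ME",  # placeholder; never hardcode secrets
        "Encrypted": True,
        "HsmClientCertificateIdentifier": hsm_cert_id,
        "HsmConfigurationIdentifier": hsm_config_id,
    }

request = build_encrypted_cluster_request(
    "analytics-encrypted", "example-hsm-cert", "example-hsm-config"
)
# A boto3 Redshift client would then call: redshift.create_cluster(**request)
```

After the data is reloaded into the encrypted target, renaming the clusters (as option F describes) lets applications keep using the original cluster endpoint.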
