

AWS Certified Data Analytics - Specialty

Last Update: Dec 11, 2023
Total Questions: 157

To help you prepare for the Amazon Web Services DAS-C01 exam, we offer free DAS-C01 exam questions. Sign up, provide your details, and you will have access to the entire pool of AWS Certified Data Analytics - Specialty DAS-C01 test questions to prepare with. You can also find a range of AWS Certified Data Analytics - Specialty resources online to help you better understand the topics covered on the exam, such as DAS-C01 video tutorials, blogs, and study guides, and you can practice with realistic Amazon Web Services DAS-C01 exam simulations and get feedback on your progress. Finally, you can share your progress with friends and family and get encouragement and support from them.

Question 4

A company wants to research user turnover by analyzing the past 3 months of user activities. With millions of users, 1.5 TB of uncompressed data is generated each day. A 30-node Amazon Redshift cluster with 2.56 TB of solid state drive (SSD) storage for each node is required to meet the query performance goals.

The company wants to run an additional analysis on a year’s worth of historical data to examine trends indicating which features are most popular. This analysis will be done once a week.

What is the MOST cost-effective solution?

Options:

A.  

Increase the size of the Amazon Redshift cluster to 120 nodes so it has enough storage capacity to hold 1 year of data. Then use Amazon Redshift for the additional analysis.

B.  

Keep the data from the last 90 days in Amazon Redshift. Move data older than 90 days to Amazon S3 and store it in Apache Parquet format partitioned by date. Then use Amazon Redshift Spectrum for the additional analysis.

C.  

Keep the data from the last 90 days in Amazon Redshift. Move data older than 90 days to Amazon S3 and store it in Apache Parquet format partitioned by date. Then provision a persistent Amazon EMR cluster and use Apache Presto for the additional analysis.

D.  

Resize the cluster node type to the dense storage node type (DS2) for an additional 16 TB storage capacity on each individual node in the Amazon Redshift cluster. Then use Amazon Redshift for the additional analysis.
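
As a hedged illustration of how option B's Redshift Spectrum layout could be put in place, the minimal sketch below uses the Redshift Data API. Every identifier is an assumption made for the example, not taken from the question: a cluster named analytics-cluster, database dev, an S3 prefix s3://example-bucket/activity/, and an IAM role SpectrumRole.

import boto3

# Redshift Data API client; the cluster keeps only the last 90 days locally,
# while older, date-partitioned Parquet data lives on S3 behind Spectrum.
rsd = boto3.client("redshift-data")

statements = [
    # External schema backed by the AWS Glue Data Catalog.
    """
    CREATE EXTERNAL SCHEMA IF NOT EXISTS activity_spectrum
    FROM DATA CATALOG DATABASE 'activity_db'
    IAM_ROLE 'arn:aws:iam::123456789012:role/SpectrumRole'
    CREATE EXTERNAL DATABASE IF NOT EXISTS
    """,
    # External table over the Parquet files; partitions are registered later
    # with ALTER TABLE ... ADD PARTITION or a Glue crawler.
    """
    CREATE EXTERNAL TABLE activity_spectrum.user_activity (
        user_id   BIGINT,
        feature   VARCHAR(64),
        event_ts  TIMESTAMP
    )
    PARTITIONED BY (event_date DATE)
    STORED AS PARQUET
    LOCATION 's3://example-bucket/activity/'
    """,
]

for sql in statements:
    rsd.execute_statement(
        ClusterIdentifier="analytics-cluster",  # hypothetical cluster name
        Database="dev",
        DbUser="admin",
        Sql=sql,
    )

The weekly trend query can then join the local Redshift tables with activity_spectrum.user_activity, so only 90 days of hot data occupy cluster storage while the full year stays in cheaper S3 storage.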

Question 5

A data analyst is designing a solution to interactively query datasets with SQL using a JDBC connection. Users will join data stored in Amazon S3 in Apache ORC format with data stored in Amazon Elasticsearch Service (Amazon ES) and Amazon Aurora MySQL.

Which solution will provide the MOST up-to-date results?

Options:

A.  

Use AWS Glue jobs to ETL data from Amazon ES and Aurora MySQL to Amazon S3. Query the data with Amazon Athena.

B.  

Use AWS DMS to stream data from Amazon ES and Aurora MySQL to Amazon Redshift. Query the data with Amazon Redshift.

C.  

Query all the datasets in place with Apache Spark SQL running on an AWS Glue developer endpoint.

D.  

Query all the datasets in place with Apache Presto running on Amazon EMR.
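
For option D, the sketch below shows roughly what "query in place" looks like from a client. It assumes Presto on an EMR cluster with a hypothetical master host, the default coordinator port 8889, and connector catalogs named hive (the ORC data on S3), elasticsearch, and mysql already configured; users would normally connect through the Presto JDBC driver, and the Python client here is only to show the shape of a federated query. All table and column names are illustrative.

import prestodb  # presto-python-client; pip install presto-python-client

conn = prestodb.dbapi.connect(
    host="emr-master.example.internal",  # hypothetical EMR master node DNS name
    port=8889,                           # default Presto coordinator port on EMR
    user="analyst",
    catalog="hive",
    schema="default",
)

cur = conn.cursor()
# The join reads the ORC files on S3 (hive catalog), the documents in Amazon ES
# (elasticsearch catalog), and the Aurora MySQL rows (mysql catalog) at query
# time, so the results reflect whatever is in each store right now.
cur.execute("""
    SELECT o.order_id, e.click_count, m.customer_segment
    FROM hive.default.orders_orc AS o
    JOIN elasticsearch.default.click_events AS e ON o.order_id = e.order_id
    JOIN mysql.crm.customers AS m ON o.customer_id = m.customer_id
    LIMIT 100
""")
for row in cur.fetchall():
    print(row)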

Question 6

A regional energy company collects voltage data from sensors attached to buildings. To address any known dangerous conditions, the company wants to be alerted when a sequence of two voltage drops is detected within 10 minutes of a voltage spike at the same building. It is important to ensure that all messages are delivered as quickly as possible. The system must be fully managed and highly available. The company also needs a solution that will automatically scale up as it covers additional cities with this monitoring feature. The alerting system is subscribed to an Amazon SNS topic for remediation.

Which solution meets these requirements?

Options:

A.  

Create an Amazon Managed Streaming for Apache Kafka (Amazon MSK) cluster to ingest the data, and use Apache Spark Streaming with the Apache Kafka consumer API in an automatically scaled Amazon EMR cluster to process the incoming data. Use the Spark Streaming application to detect the known event sequence and send the SNS message.

B.  

Create a REST-based web service using Amazon API Gateway in front of an AWS Lambda function. Create an Amazon RDS for PostgreSQL database with sufficient Provisioned IOPS (PIOPS). In the Lambda function, store incoming events in the RDS database and query the latest data to detect the known event sequence and send the SNS message.

C.  

Create an Amazon Kinesis Data Firehose delivery stream to capture the incoming sensor data. Use an AWS Lambda transformation function to detect the known event sequence and send the SNS message.

D.  

Create an Amazon Kinesis data stream to capture the incoming sensor data and create another stream for alert messages. Set up AWS Application Auto Scaling on both. Create a Kinesis Data Analytics for Java application to detect the known event sequence, and add a message to the message stream. Configure an AWS Lambda function to poll the message stream and publish to the SNS topic.
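
For option D, the Lambda consumer at the end of the pipeline could look roughly like the sketch below. It assumes the function is attached to the alert stream through a Kinesis event source mapping and that the SNS topic ARN arrives in a hypothetical ALERT_TOPIC_ARN environment variable; the spike-then-two-drops detection itself would live in the Kinesis Data Analytics for Java application, not in this function.

import base64
import json
import os

import boto3

sns = boto3.client("sns")


def handler(event, context):
    # Each record is a base64-encoded alert emitted by the Kinesis Data
    # Analytics application when it detects a voltage spike followed by two
    # drops within 10 minutes at the same building.
    for record in event.get("Records", []):
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        sns.publish(
            TopicArn=os.environ["ALERT_TOPIC_ARN"],
            Subject="Voltage anomaly detected",
            Message=json.dumps(payload),
        )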

Question 7

A company recently created a test AWS account to use for a development environment. The company also created a production AWS account in another AWS Region. As part of its security testing, the company wants to send log data from Amazon CloudWatch Logs in its production account to an Amazon Kinesis data stream in its test account.

Which solution will allow the company to accomplish this goal?

Options:

A.  

Create a subscription filter in the production account's CloudWatch Logs to target the Kinesis data stream in the test account as its destination. In the test account, create an IAM role that grants access to the Kinesis data stream and the CloudWatch Logs resources in the production account.

B.  

In the test account, create an IAM role that grants access to the Kinesis data stream and the CloudWatch Logs resources in the production account. Create a destination data stream in Kinesis Data Streams in the test account with an IAM role and a trust policy that allow CloudWatch Logs in the production account to write to the test account.

C.  

In the test account, create an IAM role that grants access to the Kinesis data stream and the CloudWatch Logs resources in the production account. Create a destination data stream in Kinesis Data Streams in the test account with an IAM role and a trust policy that allow CloudWatch Logs in the production account to write to the test account.

D.  

Create a destination data stream in Kinesis Data Streams in the test account with an IAM role and a trust policy that allow CloudWatch Logs in the production account to write to the test account. Create a subscription filter in the production account's CloudWatch Logs to target the Kinesis data stream in the test account as its destination.
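
Option D's two steps (a destination plus trust policy in the test account, then a subscription filter in the production account) could be scripted roughly as in the sketch below. All values are assumptions for illustration: account IDs 111111111111 (production) and 222222222222 (test), a Kinesis stream named cross-account-logs, a role CWLtoKinesisRole that CloudWatch Logs can assume, and placeholder Regions and log group names.

import json

import boto3

# --- In the test account: put a CloudWatch Logs destination in front of the stream.
test_logs = boto3.client("logs", region_name="us-east-1")
destination = test_logs.put_destination(
    destinationName="prod-log-destination",
    targetArn="arn:aws:kinesis:us-east-1:222222222222:stream/cross-account-logs",
    roleArn="arn:aws:iam::222222222222:role/CWLtoKinesisRole",
)["destination"]

# Trust policy allowing the production account to subscribe to this destination.
test_logs.put_destination_policy(
    destinationName="prod-log-destination",
    accessPolicy=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": "111111111111"},
            "Action": "logs:PutSubscriptionFilter",
            "Resource": destination["arn"],
        }],
    }),
)

# --- In the production account and Region: create the subscription filter.
# In practice this client would use credentials for the production account.
prod_logs = boto3.client("logs", region_name="eu-west-1")
prod_logs.put_subscription_filter(
    logGroupName="/aws/security-testing/app",  # hypothetical log group
    filterName="to-test-account",
    filterPattern="",  # empty pattern forwards all log events
    destinationArn=destination["arn"],
)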

