
Google Associate-Cloud-Engineer Exam Questions and Answers


Google Associate-Cloud-Engineer Exam Overview:

Exam Name: Google Cloud Certified - Associate Cloud Engineer
Exam Code: Associate-Cloud-Engineer
Vendor: Google
Certification: Google Cloud Certified
Questions: 332
Question 88

You have a number of applications with bursty workloads that depend heavily on topics to decouple publishing systems from consuming systems. Your company would like to go serverless so that developers can focus on writing code without worrying about infrastructure. Your solution architect has already identified Cloud Pub/Sub as a suitable option for decoupling these systems. You have been asked to identify a suitable GCP serverless service that is easy to use with Cloud Pub/Sub. You want the ability to scale down to zero when there is no traffic in order to minimize costs, and you want to follow Google-recommended practices. What should you suggest?

Options:

A. Cloud Run for Anthos
B. Cloud Run
C. App Engine Standard
D. Cloud Functions

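For background on why Cloud Run pairs naturally with Cloud Pub/Sub, here is a minimal sketch of a push-subscription handler, assuming Flask and the documented Pub/Sub push envelope (a JSON body whose message.data field is base64-encoded). The service logic and names are illustrative assumptions, not part of the question.

```python
# main.py - minimal Cloud Run service that accepts Pub/Sub push deliveries.
# Assumes a push subscription is configured to POST to this service's URL.
import base64
import os

from flask import Flask, request

app = Flask(__name__)


@app.route("/", methods=["POST"])
def receive_message():
    envelope = request.get_json(silent=True)
    if not envelope or "message" not in envelope:
        # Malformed request: return an error so Pub/Sub retries or dead-letters it.
        return ("Bad Request: invalid Pub/Sub envelope", 400)

    message = envelope["message"]
    payload = base64.b64decode(message.get("data", "")).decode("utf-8")
    print(f"Received Pub/Sub message: {payload}")  # replace with real processing

    # Returning a 2xx acknowledges the message.
    return ("", 204)


if __name__ == "__main__":
    # Cloud Run injects the PORT environment variable (default 8080).
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```

Because the container only runs while message deliveries are in flight, the service can scale to zero between bursts, which is the cost behavior the question asks for.
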
Question 89

A company wants to build an application that stores images in a Cloud Storage bucket and needs to generate thumbnails and resized versions of those images. They want to use a Google-managed service that can scale up, and scale down to zero, automatically with minimal effort. You have been asked to recommend a service. Which GCP service would you suggest?

Options:

A. Google Compute Engine
B. Google App Engine
C. Cloud Functions
D. Google Kubernetes Engine

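To make the thumbnail scenario concrete, below is a sketch of a first-generation, event-driven Cloud Function triggered when an object is finalized in a Cloud Storage bucket. The bucket names, the 256x256 size, and the Pillow-based resizing are assumptions for illustration only.

```python
# main.py - background Cloud Function (1st gen) triggered by
# google.storage.object.finalize events on an upload bucket.
import os
import tempfile

from google.cloud import storage
from PIL import Image  # Pillow, declared in requirements.txt

THUMBNAIL_BUCKET = os.environ.get("THUMBNAIL_BUCKET", "my-thumbnails")  # hypothetical


def generate_thumbnail(event, context):
    """Downloads the uploaded image, resizes it, and writes the thumbnail."""
    client = storage.Client()
    source_bucket = client.bucket(event["bucket"])
    blob_name = event["name"]

    # Work in the function's writable temp space.
    local_path = os.path.join(tempfile.gettempdir(), os.path.basename(blob_name))
    source_bucket.blob(blob_name).download_to_filename(local_path)

    # Resize in place to at most 256x256, preserving aspect ratio.
    with Image.open(local_path) as img:
        img.thumbnail((256, 256))
        img.save(local_path)

    # Upload the result to a separate bucket to avoid re-triggering this function.
    client.bucket(THUMBNAIL_BUCKET).blob(
        f"thumb_{os.path.basename(blob_name)}"
    ).upload_from_filename(local_path)
```
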
Question 90

A team of data scientists infrequently needs to use a Google Kubernetes Engine (GKE) cluster that you manage. They require GPUs for some long-running, non-restartable jobs. You want to minimize cost. What should you do?

Options:

A. Enable node auto-provisioning on the GKE cluster.
B. Create a VerticalPodAutoscaler for those workloads.
C. Create a node pool with preemptible VMs and GPUs attached to those VMs.
D. Create a node pool of instances with GPUs, and enable autoscaling on this node pool with a minimum size of 1.

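As background for the GPU node-pool options, the sketch below uses the google-cloud-container client to add a dedicated, autoscaled GPU node pool to an existing cluster. The project, location, cluster, machine type, and accelerator values are placeholder assumptions, and the preemptible flag only marks where that trade-off (lower cost versus possible reclamation mid-job) is made; it does not assert which option is correct.

```python
# create_gpu_pool.py - sketch of adding a GPU node pool to an existing GKE cluster.
from google.cloud import container_v1

# Placeholder identifiers; substitute real project/location/cluster values.
PARENT = "projects/my-project/locations/us-central1-a/clusters/my-cluster"


def create_gpu_node_pool(preemptible: bool = True):
    client = container_v1.ClusterManagerClient()

    node_pool = container_v1.NodePool(
        name="gpu-pool",
        initial_node_count=0,
        config=container_v1.NodeConfig(
            machine_type="n1-standard-4",
            preemptible=preemptible,  # cheaper, but nodes can be reclaimed at any time
            accelerators=[
                container_v1.AcceleratorConfig(
                    accelerator_count=1,
                    accelerator_type="nvidia-tesla-t4",
                )
            ],
            oauth_scopes=["https://www.googleapis.com/auth/cloud-platform"],
        ),
        autoscaling=container_v1.NodePoolAutoscaling(
            enabled=True,
            min_node_count=0,  # let the pool scale away while the team is idle
            max_node_count=3,
        ),
    )

    operation = client.create_node_pool(parent=PARENT, node_pool=node_pool)
    print(f"Started node pool creation: {operation.name}")


if __name__ == "__main__":
    create_gpu_node_pool()
```
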
Question 91

For analysis purposes, you need to send all the logs from all of your Compute Engine instances to a BigQuery dataset called platform-logs. You have already installed the Stackdriver Logging agent on all the instances. You want to minimize cost. What should you do?

Options:

A. 1. Give the BigQuery Data Editor role on the platform-logs dataset to the service accounts used by your instances. 2. Update your instances’ metadata to add the following value: logs-destination: bq://platform-logs.

B. 1. In Stackdriver Logging, create a logs export with a Cloud Pub/Sub topic called logs as a sink. 2. Create a Cloud Function that is triggered by messages in the logs topic. 3. Configure that Cloud Function to drop logs that are not from Compute Engine and to insert Compute Engine logs into the platform-logs dataset.

C. 1. In Stackdriver Logging, create a filter to view only Compute Engine logs. 2. Click Create Export. 3. Choose BigQuery as Sink Service, and the platform-logs dataset as Sink Destination.

D. 1. Create a Cloud Function that has the BigQuery User role on the platform-logs dataset. 2. Configure this Cloud Function to create a BigQuery job that executes this query: INSERT INTO dataset.platform-logs (timestamp, log) SELECT timestamp, log FROM compute.logs WHERE timestamp > DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY) 3. Use Cloud Scheduler to trigger this Cloud Function once a day.

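For reference, a Cloud Logging (formerly Stackdriver Logging) export sink to BigQuery can also be created programmatically rather than through the console. The sketch below assumes the google-cloud-logging client library, a placeholder project ID, and a placeholder sink name; it filters to Compute Engine logs and points the sink at the platform-logs dataset.

```python
# create_sink.py - sketch of exporting Compute Engine logs to BigQuery.
from google.cloud import logging  # google-cloud-logging client library

PROJECT_ID = "my-project"  # placeholder


def create_platform_logs_sink():
    client = logging.Client(project=PROJECT_ID)

    # Only Compute Engine instance logs should reach the dataset.
    log_filter = 'resource.type="gce_instance"'
    destination = (
        f"bigquery.googleapis.com/projects/{PROJECT_ID}/datasets/platform-logs"
    )

    sink = client.sink(
        "platform-logs-sink", filter_=log_filter, destination=destination
    )
    if not sink.exists():
        sink.create()
        print(f"Created sink {sink.name} -> {destination}")
    # Note: the sink's service-account writer identity must still be granted
    # write access on the platform-logs dataset before entries will flow.


if __name__ == "__main__":
    create_platform_logs_sink()
```
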