Practice Free Associate-Cloud-Engineer (Google Cloud Certified - Associate Cloud Engineer) Exam Questions and Answers with Explanations

We at Crack4sure are committed to giving students who are preparing for the Google Associate-Cloud-Engineer exam the most current and reliable questions. To help people study, we have made some of our Google Cloud Certified - Associate Cloud Engineer exam materials available to everyone for free. You can take the free Associate-Cloud-Engineer practice test as many times as you want. The answers to the practice questions are provided, and each answer is explained.

Question # 6

Your application is running on Google Cloud in a managed instance group (MIG). You see errors in Cloud Logging for one VM that one of the processes is not responsive. You want to replace this VM in the MIG quickly. What should you do?

A.

Select the MIG from the Compute Engine console and, in the menu, select Replace VMs.

B.

Use the gcloud compute instance-groups managed recreate-instances command to recreate the VM.

C.

Use the gcloud compute instances update command with a REFRESH action for the VM.

D.

Update and apply the instance template of the MIG.
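
For reference, option B names an existing gcloud command. A minimal sketch, assuming a hypothetical MIG named my-mig, an unresponsive VM named my-vm-1, and zone us-central1-a:

# Delete and recreate a single instance inside the managed instance group
gcloud compute instance-groups managed recreate-instances my-mig \
    --instances=my-vm-1 --zone=us-central1-a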

Question # 7

You have a large 5-TB AVRO file stored in a Cloud Storage bucket. Your analysts are proficient only in SQL and need access to the data stored in this file. You want to find a cost-effective way to complete their request as soon as possible. What should you do?

A.

Load data in Cloud Datastore and run a SQL query against it.

B.

Create a BigQuery table and load data in BigQuery. Run a SQL query on this table and drop this table after you complete your request.

C.

Create external tables in BigQuery that point to Cloud Storage buckets and run a SQL query on these external tables to complete your request.

D.

Create a Hadoop cluster and copy the AVRO file to NDFS by compressing it. Load the file in a hive table and provide access to your analysts so that they can run SQL queries.
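
For reference, BigQuery can query Avro files in Cloud Storage through an external table definition, as described in option C. A minimal sketch with the bq CLI, assuming a hypothetical bucket, dataset, and table name:

# Build an external table definition from the self-describing Avro file
bq mkdef --source_format=AVRO "gs://my-bucket/data.avro" > avro_def.json
# Create an external table that points at the file in Cloud Storage
bq mk --external_table_definition=avro_def.json my_dataset.avro_external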

Question # 8

You want to configure an SSH connection to a single Compute Engine instance for users in the dev1 group. This instance is the only resource in this particular Google Cloud Platform project that the dev1 users should be able to connect to. What should you do?

A.

Set metadata to enable-oslogin=true for the instance. Grant the dev1 group the compute.osLogin role. Direct them to use the Cloud Shell to ssh to that instance.

B.

Set metadata to enable-oslogin=true for the instance. Set the service account to no service account for that instance. Direct them to use the Cloud Shell to ssh to that instance.

C.

Enable block project wide keys for the instance. Generate an SSH key for each user in the dev1 group. Distribute the keys to dev1 users and direct them to use their third-party tools to connect.

D.

Enable block project wide keys for the instance. Generate an SSH key and associate the key with that instance. Distribute the key to dev1 users and direct them to use their third-party tools to connect.
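
For reference, options A and B rely on the enable-oslogin metadata key, and the compute.osLogin role can be granted on a single instance. A minimal sketch, assuming a hypothetical instance name, zone, and group address:

# Turn on OS Login for just this instance
gcloud compute instances add-metadata dev-instance --zone=us-central1-a \
    --metadata=enable-oslogin=TRUE
# Grant the dev1 group SSH access via OS Login on this instance only
gcloud compute instances add-iam-policy-binding dev-instance --zone=us-central1-a \
    --member="group:dev1@example.com" --role="roles/compute.osLogin"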

Question # 9

You need to create a custom VPC with a single subnet. The subnet’s range must be as large as possible. Which range should you use?

A.

0.0.0.0/0

B.

10.0.0.0/8

C.

172.16.0.0/12

D.

192.168.0.0/16
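
For reference on the range sizes: a /8 prefix leaves 32 - 8 = 24 host bits (about 16.7 million addresses), a /12 leaves 20 bits (about 1 million), and a /16 leaves 16 bits (65,536), while 0.0.0.0/0 denotes all IPv4 addresses and is not usable as a subnet range.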

Question # 10

You built an application on your development laptop that uses Google Cloud services. Your application uses Application Default Credentials for authentication and works fine on your development laptop. You want to migrate this application to a Compute Engine virtual machine (VM) and set up authentication using Google-recommended practices and minimal changes. What should you do?

A.

Assign appropriate access for Google services to the service account used by the Compute Engine VM.

B.

Create a service account with appropriate access for Google services, and configure the application to use this account.

C.

Store credentials for service accounts with appropriate access for Google services in a config file, and deploy this config file with your application.

D.

Store credentials for your user account with appropriate access for Google services in a config file, and deploy this config file with your application.
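
For reference, option A amounts to granting IAM roles to the service account already attached to the VM. A minimal sketch, assuming a hypothetical project ID, service account, and role:

# Grant the VM's service account the access the application needs
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:app-sa@my-project.iam.gserviceaccount.com" \
    --role="roles/storage.objectViewer"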

Question # 11

Your company has an existing GCP organization with hundreds of projects and a billing account. Your company recently acquired another company that also has hundreds of projects and its own billing account. You would like to consolidate all GCP costs of both GCP organizations onto a single invoice. You would like to consolidate all costs as of tomorrow. What should you do?

A.

Link the acquired company’s projects to your company's billing account.

B.

Configure the acquired company's billing account and your company's billing account to export the billing data into the same BigQuery dataset.

C.

Migrate the acquired company’s projects into your company’s GCP organization. Link the migrated projects to your company's billing account.

D.

Create a new GCP organization and a new billing account. Migrate the acquired company's projects and your company's projects into the new GCP organization and link the projects to the new billing account.

Question # 12

(You are managing a stateful application deployed on Google Kubernetes Engine (GKE) that can only have one replica. You recently discovered that the application becomes unstable at peak times. You have identified that the application needs more CPU than what has been configured in the manifest at these peak times. You want Kubernetes to allocate the application sufficient CPU resources during these peak times, while ensuring cost efficiency during off-peak periods. What should you do?)

A.

Enable cluster autoscaling on the GKE cluster.

B.

Configure a Vertical Pod Autoscaler on the Deployment.

C.

Configure a Horizontal Pod Autoscaler on the Deployment.

D.

Enable node auto-provisioning on the GKE cluster.

Question # 13

You’ve deployed a microservice called myapp1 to a Google Kubernetes Engine cluster using the YAML file specified below:

[The YAML manifest is shown as an image in the original and is not reproduced here.]

You need to refactor this configuration so that the database password is not stored in plain text. You want to follow Google-recommended practices. What should you do?

A.

Store the database password inside the Docker image of the container, not in the YAML file.

B.

Store the database password inside a Secret object. Modify the YAML file to populate the DB_PASSWORD environment variable from the Secret.

C.

Store the database password inside a ConfigMap object. Modify the YAML file to populate the DB_PASSWORD environment variable from the ConfigMap.

D.

Store the database password in a file inside a Kubernetes persistent volume, and use a persistent volume claim to mount the volume to the container.
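
For reference, the Secret in option B can be created with kubectl and then referenced from the Deployment's env section via valueFrom.secretKeyRef. A minimal sketch, assuming a hypothetical secret name and key:

# Create a Kubernetes Secret holding the database password
kubectl create secret generic db-secret --from-literal=DB_PASSWORD='example-password'

In the manifest, the DB_PASSWORD environment variable would then reference the key in db-secret instead of a literal value.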

Question # 14

Your company completed the acquisition of a startup and is now merging the IT systems of both companies. The startup had a production Google Cloud project in their organization. You need to move this project into your organization and ensure that the project is billed to your organization. You want to accomplish this task with minimal effort. What should you do?

A.

Use the projects. move method to move the project to your organization. Update the billing account of the project to that of your organization.

B.

Ensure that you have an Organization Administrator Identity and Access Management (IAM) role assigned to you in both organizations. Navigate to the Resource Manager in the startup's Google Cloud organization, and drag the project to your company's organization.

C.

Create a Private Catalog for the Google Cloud Marketplace, and upload the resources of the startup's production project to the Catalog. Share the Catalog with your organization, and deploy the resources in your company's project.

D.

Create an infrastructure-as-code template for all resources in the project by using Terraform, and deploy that template to a new project in your organization. Delete the project from the startup's Google Cloud organization.

Question # 15

Your company runs its Linux workloads on Compute Engine instances. Your company will be working with a new operations partner that does not use Google Accounts. You need to grant access to the instances to your operations partner so they can maintain the installed tooling. What should you do?

A.

Enable Cloud IAP for the Compute Engine instances, and add the operations partner as a Cloud IAP Tunnel User.

B.

Tag all the instances with the same network tag. Create a firewall rule in the VPC to grant TCP access on port 22 for traffic from the operations partner to instances with the network tag.

C.

Set up Cloud VPN between your Google Cloud VPC and the internal network of the operations partner.

D.

Ask the operations partner to generate SSH key pairs, and add the public keys to the VM instances.

Question # 16

You created a Google Cloud Platform project with an App Engine application inside the project. You initially configured the application to be served from the us-central region. Now you want the application to be served from the asia-northeast1 region. What should you do?

A.

Change the default region property setting in the existing GCP project to asia-northeast1.

B.

Change the region property setting in the existing App Engine application from us-central to asia-northeast1.

C.

Create a second App Engine application in the existing GCP project and specify asia-northeast1 as the region to serve your application.

D.

Create a new GCP project and create an App Engine application inside this new project. Specify asia-northeast1 as the region to serve your application.

Question # 17

Your company is closely monitoring their cloud spend. You need to allow different teams to monitor their Google Cloud costs. You must ensure that team members receive notifications when their cloud spend reaches certain thresholds and give team members the ability to create dashboards for additional insights with detailed billing data. You want to follow Google-recommended practices and minimize engineering costs. What should you do?

A.

Deploy Grafana to Compute Engine. Create a dashboard for each team that uses the data from the Cloud Billing API. Ask each team to create their own alerts in Cloud Monitoring.

B.

Set up alerts for each team based on required thresholds. Create a shell script to read data from the Cloud Billing API and push the results to BigQuery. Grant team members access to BigQuery.

C.

Deploy Grafana to Compute Engine. Create a dashboard for each team that uses the data from the Cloud Billing Budget API. Ask each team to create their own alerts in Grafana.

D.

Set up alerts for each team based on required thresholds. Set up billing exports to BigQuery. Grant team members access to BigQuery.

Question # 18

You are responsible for a web application on Compute Engine. You want your support team to be notified automatically if users experience high latency for at least 5 minutes. You need a Google-recommended solution with no development cost. What should you do?

A.

Create an alert policy to send a notification when the HTTP response latency exceeds the specified threshold.

B.

Implement an App Engine service which invokes the Cloud Monitoring API and sends a notification in case of anomalies.

C.

Use the Cloud Monitoring dashboard to observe latency and take the necessary actions when the response latency exceeds the specified threshold.

D.

Export Cloud Monitoring metrics to BigQuery and use a Looker Studio dashboard to monitor your web application's latency.

Question # 19

You have a number of compute instances belonging to an unmanaged instances group. You need to SSH to one of the Compute Engine instances to run an ad hoc script. You’ve already authenticated gcloud, however, you don’t have an SSH key deployed yet. In the fewest steps possible, what’s the easiest way to SSH to the instance?

A.

Run gcloud compute instances list to get the IP address of the instance, then use the ssh command.

B.

Use the gcloud compute ssh command.

C.

Create a key with the ssh-keygen command. Then use the gcloud compute ssh command.

D.

Create a key with the ssh-keygen command. Upload the key to the instance. Run gcloud compute instances list to get the IP address of the instance, then use the ssh command.
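
For reference, gcloud compute ssh (option B) generates and propagates an SSH key for you on first use, so no separate ssh-keygen step is required. A minimal sketch with a hypothetical instance name and zone:

gcloud compute ssh my-instance --zone=us-central1-a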

Question # 20

Your company wants to migrate your data from an on-premises relational database to Google Cloud. Your current database can no longer scale with respect to the growth of your users and you expect the number of users to rapidly grow. You need to choose a relational database that allows you to globally scale while minimizing your management and administration efforts. You also want to follow Google-recommended practices. What should you do?

A.

Use Cloud SQL

B.

Use Filestore.

C.

Use Spanner.

D.

Use BigQuery.

Question # 21

Your coworker has helped you set up several configurations for gcloud. You've noticed that you're running commands against the wrong project. Being new to the company, you haven't yet memorized any of the projects. With the fewest steps possible, what's the fastest way to switch to the correct configuration?

A.

Run gcloud configurations list followed by gcloud configurations activate .

B.

Run gcloud config list followed by gcloud config activate.

C.

Run gcloud config configurations list followed by gcloud config configurations activate.

D.

Re-authenticate with the gcloud auth login command and select the correct configurations on login.
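
For reference, the configuration subcommands mentioned in option C live under gcloud config configurations. A minimal sketch with a hypothetical configuration name:

# Show all named configurations and which one is active
gcloud config configurations list
# Switch to the configuration tied to the correct project
gcloud config configurations activate my-config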

Question # 22

You have just created a new project which will be used to deploy a globally distributed application. You will use Cloud Spanner for data storage. You want to create a Cloud Spanner instance. You want to perform the first step in preparation of creating the instance. What should you do?

A.

Grant yourself the IAM role of Cloud Spanner Admin

B.

Create a new VPC network with subnetworks in all desired regions

C.

Configure your Cloud Spanner instance to be multi-regional

D.

Enable the Cloud Spanner API

Question # 23

You are using Container Registry to centrally store your company’s container images in a separate project. In another project, you want to create a Google Kubernetes Engine (GKE) cluster. You want to ensure that Kubernetes can download images from Container Registry. What should you do?

A.

In the project where the images are stored, grant the Storage Object Viewer IAM role to the service account used by the Kubernetes nodes.

B.

When you create the GKE cluster, choose the Allow full access to all Cloud APIs option under ‘Access scopes’.

C.

Create a service account, and give it access to Cloud Storage. Create a P12 key for this service account and use it as an imagePullSecrets in Kubernetes.

D.

Configure the ACLs on each image in Cloud Storage to give read-only access to the default Compute Engine service account.
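
For reference, option A corresponds to an IAM binding in the project that hosts Container Registry, since the images are stored in a Cloud Storage bucket in that project. A minimal sketch, assuming hypothetical project IDs and the default Compute Engine service account used by the GKE nodes:

gcloud projects add-iam-policy-binding registry-project \
    --member="serviceAccount:123456789012-compute@developer.gserviceaccount.com" \
    --role="roles/storage.objectViewer"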

Question # 24

(You are migrating your on-premises workload to Google Cloud. Your company is implementing its Cloud Billing configuration and requires access to a granular breakdown of its Google Cloud costs. You need to ensure that the Cloud Billing datasets are available in BigQuery so you can conduct a detailed analysis of costs. What should you do?)

A.

Enable the BigQuery API and ensure that the BigQuery User IAM role is selected. Change the BigQuery dataset to select a data location.

B.

Create a Cloud Billing account. Enable the BigQuery Data Transfer Service API to export pricing data.

C.

Enable Cloud Billing data export to BigQuery when you create a Cloud Billing account.

D.

Enable Cloud Billing on the project and link a Cloud Billing account. Then view the billing data table in the BigQuery dataset.

Question # 25

You used the gcloud container clusters command to create two Google Kubernetes Engine (GKE) clusters: prod-cluster and dev-cluster.

• prod-cluster is a standard cluster.

• dev-cluster is an Autopilot cluster.

When you run the kubectl get nodes command, you only see the nodes from prod-cluster. Which command should you run to check the node status for dev-cluster?

A.

[Command shown as an image in the original; not reproduced here.]

B.

[Command shown as an image in the original; not reproduced here.]

C.

[Command shown as an image in the original; not reproduced here.]

D.

[Command shown as an image in the original; not reproduced here.]

Question # 26

You have an application on a general-purpose Compute Engine instance that is experiencing excessive disk read throttling on its Zonal SSD Persistent Disk. The application primarily reads large files from disk. The disk size is currently 350 GB. You want to provide the maximum amount of throughput while minimizing costs. What should you do?

A.

Increase the size of the disk to 1 TB.

B.

Increase the allocated CPU to the instance.

C.

Migrate to use a Local SSD on the instance.

D.

Migrate to use a Regional SSD on the instance.

Question # 27

You want to run a single caching HTTP reverse proxy on GCP for a latency-sensitive website. This specific reverse proxy consumes almost no CPU. You want to have a 30-GB in-memory cache, and need an additional 2 GB of memory for the rest of the processes. You want to minimize cost. How should you run this reverse proxy?

A.

Create a Cloud Memorystore for Redis instance with 32-GB capacity.

B.

Run it on Compute Engine, and choose a custom instance type with 6 vCPUs and 32 GB of memory.

C.

Package it in a container image, and run it on Kubernetes Engine, using n1-standard-32 instances as nodes.

D.

Run it on Compute Engine, choose the instance type n1-standard-1, and add an SSD persistent disk of 32 GB.

Question # 28

You need to run an important query in BigQuery but expect it to return a lot of records. You want to find out how much it will cost to run the query. You are using on-demand pricing. What should you do?

A.

Arrange to switch to Flat-Rate pricing for this query, then move back to on-demand.

B.

Use the command line to run a dry run query to estimate the number of bytes read. Then convert that bytes estimate to dollars using the Pricing Calculator.

C.

Use the command line to run a dry run query to estimate the number of bytes returned. Then convert that bytes estimate to dollars using the Pricing Calculator.

D.

Run a select count (*) to get an idea of how many records your query will look through. Then convert that number of rows to dollars using the Pricing Calculator.
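
For reference, the dry run mentioned in options B and C is a flag on bq query; it reports how many bytes the query would process without actually running it. A minimal sketch with a hypothetical table:

bq query --use_legacy_sql=false --dry_run \
    'SELECT name FROM `my-project.my_dataset.my_table`'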

Question # 29

Several employees at your company have been creating projects with Cloud Platform and paying for it with their personal credit cards, which the company reimburses. The company wants to centralize all these projects under a single, new billing account. What should you do?

A.

Contact cloud-billing@google.com with your bank account details and request a corporate billing account for your company.

B.

Create a ticket with Google Support and wait for their call to share your credit card details over the phone.

C.

In the Google Platform Console, go to the Resource Manage and move all projects to the root Organization.

D.

In the Google Cloud Platform Console, create a new billing account and set up a payment method.

Question # 30

You want to configure 10 Compute Engine instances for availability when maintenance occurs. Your requirements state that these instances should attempt to automatically restart if they crash. Also, the instances should be highly available including during system maintenance. What should you do?

A.

Create an instance template for the instances. Set the ‘Automatic Restart’ to on. Set the ‘On-host maintenance’ to Migrate VM instance. Add the instance template to an instance group.

B.

Create an instance template for the instances. Set ‘Automatic Restart’ to off. Set ‘On-host maintenance’ to Terminate VM instances. Add the instance template to an instance group.

C.

Create an instance group for the instances. Set the ‘Autohealing’ health check to healthy (HTTP).

D.

Create an instance group for the instance. Verify that the ‘Advanced creation options’ setting for ‘do not retry machine creation’ is set to off.

Question # 31

You are planning to migrate your on-premises data to Google Cloud. The data includes:

• 200 TB of video files in SAN storage

• Data warehouse data stored on Amazon Redshift

• 20 GB of PNG files stored on an S3 bucket

You need to load the video files into a Cloud Storage bucket, transfer the data warehouse data into BigQuery, and load the PNG files into a second Cloud Storage bucket. You want to follow Google-recommended practices and avoid writing any code for the migration. What should you do?

A.

Use gcloud storage for the video files, Dataflow for the data warehouse data, and Storage Transfer Service for the PNG files.

B.

Use Transfer Appliance for the videos, BigQuery Data Transfer Service for the data warehouse data, and Storage Transfer Service for the PNG files.

C.

Use Storage Transfer Service for the video files, BigQuery Data Transfer Service for the data warehouse data, and Storage Transfer Service for the PNG files.

D.

Use Cloud Data Fusion for the video files, Dataflow for the data warehouse data, and Storage Transfer Service for the PNG files.

Question # 32

Your company uses BigQuery to store and analyze data. Upon submitting your query in BigQuery, the query fails with a quotaExceeded error. You need to diagnose the issue causing the error. What should you do?

Choose 2 answers

A.

Search errors in Cloud Audit Logs to analyze the issue.

B.

Configure Cloud Trace to analyze the issue.

C.

View errors in Cloud Monitoring to analyze the issue.

D.

Use the information schema views to analyze the underlying issue.

E.

Use BigQuery BI Engine to analyze the issue.

Question # 33

You are creating a Google Kubernetes Engine (GKE) cluster with a cluster autoscaler feature enabled. You need to make sure that each node of the cluster will run a monitoring pod that sends container metrics to a third-party monitoring solution. What should you do?

A.

Deploy the monitoring pod in a StatefulSet object.

B.

Deploy the monitoring pod in a DaemonSet object.

C.

Reference the monitoring pod in a Deployment object.

D.

Reference the monitoring pod in a cluster initializer at the GKE cluster creation time.

Question # 34

You need to set a budget alert for use of Compute Engine services on one of the three Google Cloud Platform projects that you manage. All three projects are linked to a single billing account. What should you do?

A.

Verify that you are the project billing administrator. Select the associated billing account and create a budget and alert for the appropriate project.

B.

Verify that you are the project billing administrator. Select the associated billing account and create a budget and a custom alert.

C.

Verify that you are the project administrator. Select the associated billing account and create a budget for the appropriate project.

D.

Verify that you are project administrator. Select the associated billing account and create a budget and a custom alert.

Question # 35

You created an instance of SQL Server 2017 on Compute Engine to test features in the new version. You want to connect to this instance using the fewest number of steps. What should you do?

A.

Install an RDP client on your desktop. Verify that a firewall rule for port 3389 exists.

B.

Install an RDP client on your desktop. Set a Windows username and password in the GCP Console. Use the credentials to log in to the instance.

C.

Set a Windows password in the GCP Console. Verify that a firewall rule for port 22 exists. Click the RDP button in the GCP Console and supply the credentials to log in.

D.

Set a Windows username and password in the GCP Console. Verify that a firewall rule for port 3389 exists. Click the RDP button in the GCP Console, and supply the credentials to log in.

Question # 36

You have downloaded and installed the gcloud command line interface (CLI) and have authenticated with your Google Account. Most of your Compute Engine instances in your project run in the europe-west1-d zone. You want to avoid having to specify this zone with each CLI command when managing these instances. What should you do?

A.

Set the europe-west1-d zone as the default zone using the gcloud config subcommand.

B.

In the Settings page for Compute Engine under Default location, set the zone to europe-west1-d.

C.

In the CLI installation directory, create a file called default.conf containing zone=europe-west1-d.

D.

Create a Metadata entry on the Compute Engine page with key compute/zone and value europe-west1-d.
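
For reference, the gcloud config subcommand in option A looks like the following; the zone comes from the question itself:

# Make europe-west1-d the default zone for subsequent gcloud commands
gcloud config set compute/zone europe-west1-d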

Question # 37

You have deployed an application on a single Compute Engine instance. The application writes logs to disk. Users start reporting errors with the application. You want to diagnose the problem. What should you do?

A.

Navigate to Cloud Logging and view the application logs.

B.

Connect to the instance’s serial console and read the application logs.

C.

Configure a Health Check on the instance and set a Low Healthy Threshold value.

D.

Install and configure the Cloud Logging Agent and view the logs from Cloud Logging.

Question # 38

Your preview application, deployed on a single-zone Google Kubernetes Engine (GKE) cluster in us-central1, has gained popularity. You are now ready to make the application generally available. You need to deploy the application to production while ensuring high availability and resilience. You also want to follow Google-recommended practices. What should you do?

A.

Use the gcloud container clusters create command with the options --enable-multi-networking and --enable-autoscaling to create an autoscaling zonal cluster and deploy the application to it.

B.

Use the gcloud container clusters create-auto command to create an autopilot cluster and deploy the application to it.

C.

Use the gcloud container clusters update command with the option --region us-central1 to update the cluster and deploy the application to it.

D.

Use the gcloud container clusters update command with the option --node-locations us-central1-a,us-central1-b to update the cluster and deploy the application to the nodes.

Question # 39

Your company runs one batch process in an on-premises server that takes around 30 hours to complete. The task runs monthly, can be performed offline, and must be restarted if interrupted. You want to migrate this workload to the cloud while minimizing cost. What should you do?

A.

Migrate the workload to a Compute Engine Preemptible VM.

B.

Migrate the workload to a Google Kubernetes Engine cluster with Preemptible nodes.

C.

Migrate the workload to a Compute Engine VM. Start and stop the instance as needed.

D.

Create an Instance Template with Preemptible VMs On. Create a Managed Instance Group from the template and adjust Target CPU Utilization. Migrate the workload.

Question # 40

Your company requires that Google Cloud products are created with a specific configuration to comply with your company's security policies. You need to implement a mechanism that will allow software engineers at your company to deploy and update Google Cloud products in a preconfigured and approved manner. What should you do?

A.

Create Java packages that utilize the Google Cloud Client Libraries for Java to configure Google Cloud products. Store and share the packages in a source code repository.

B.

Create bash scripts that utilize the Google Cloud CLI to configure Google Cloud products. Store and share the bash scripts in a source code repository.

C.

Create Terraform modules that utilize the Google Cloud Terraform Provider to configure Google Cloud products. Store and share the modules in a source code repository.

D.

Use the Google Cloud APIs by using curl to configure Google Cloud products. Store and share the curl commands in a source code repository.

Question # 41

You want to add a new auditor to a Google Cloud Platform project. The auditor should be allowed to read, but not modify, all project items.

How should you configure the auditor's permissions?

A.

Create a custom role with view-only project permissions. Add the user's account to the custom role.

B.

Create a custom role with view-only service permissions. Add the user's account to the custom role.

C.

Select the built-in IAM project Viewer role. Add the user's account to this role.

D.

Select the built-in IAM service Viewer role. Add the user's account to this role.

Question # 42

Your team is using Linux instances on Google Cloud. You need to ensure that your team logs in to these instances in the most secure and cost efficient way. What should you do?

A.

Attach a public IP to the instances and allow incoming connections from the internet on port 22 for SSH.

B.

Use a third party tool to provide remote access to the instances.

C.

Use the gcloud compute ssh command with the --tunnel-through-iap flag. Allow ingress traffic from the IP range 35.235.240.0/20 on port 22.

D.

Create a bastion host with public internet access. Create the SSH tunnel to the instance through the bastion host.
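
For reference, option C combines an IAP-tunneled SSH session with a firewall rule that admits Google's IAP range. A minimal sketch, assuming a hypothetical network, instance, and zone:

# Allow IAP to reach port 22 on instances in the network
gcloud compute firewall-rules create allow-ssh-from-iap --network=default \
    --direction=INGRESS --action=ALLOW --rules=tcp:22 --source-ranges=35.235.240.0/20
# SSH through the IAP tunnel instead of a public IP
gcloud compute ssh my-instance --zone=us-central1-a --tunnel-through-iap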

Question # 43

You need to select and configure compute resources for a set of batch processing jobs. These jobs take around 2 hours to complete and are run nightly. You want to minimize service costs. What should you do?

A.

Select Google Kubernetes Engine. Use a single-node cluster with a small instance type.

B.

Select Google Kubernetes Engine. Use a three-node cluster with micro instance types.

C.

Select Compute Engine. Use preemptible VM instances of the appropriate standard machine type.

D.

Select Compute Engine. Use VM instance types that support micro bursting.

Question # 44

Your company has an internal application for managing transactional orders. The application is used exclusively by employees in a single physical location. The application requires strong consistency, fast queries, and ACID guarantees for multi-table transactional updates. The first version of the application is implemented in PostgreSQL, and you want to deploy it to the cloud with minimal code changes. Which database is most appropriate for this application?

A.

BigQuery

B.

Cloud SQL

C.

Cloud Spanner

D.

Cloud Datastore

Question # 45

Your company is seeking a scalable solution to retain and explore application logs hosted on Compute Engine. You must be able to analyze your logs with SQL queries, and you want to be able to create charts to identify patterns and trends in your logs over time. You want to follow Google-recommended practices and minimize your operational costs. What should you do?

A.

Use a custom script to push your application logs to Cloud SQL for exploration.

B.

Ingest your application logs to Cloud Logging by using Ops Agent, and explore your logs in Logs Explorer.

C.

Ingest your application logs to Cloud Logging by using Ops Agent, and explore your logs with Log Analytics.

D.

Use a custom script to push your application logs to BigQuery for exploration.

Question # 46

You want to deploy a new containerized application into Google Cloud by using a Kubernetes manifest. You want to have full control over the Kubernetes deployment, and at the same time, you want to minimize configuring infrastructure. What should you do?

A.

Deploy the application on GKE Autopilot.

B.

Deploy the application on GKE Standard.

C.

Deploy the application on Cloud Functions.

D.

Deploy the application on Cloud Run.

Question # 47

You have a development project with appropriate IAM roles defined. You are creating a production project and want to have the same IAM roles on the new project, using the fewest possible steps. What should you do?

A.

Use gcloud iam roles copy and specify the production project as the destination project.

B.

Use gcloud iam roles copy and specify your organization as the destination organization.

C.

In the Google Cloud Platform Console, use the ‘create role from role’ functionality.

D.

In the Google Cloud Platform Console, use the ‘create role’ functionality and select all applicable permissions.
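
For reference, gcloud iam roles copy (options A and B) accepts source and destination flags. A minimal sketch, assuming hypothetical role and project IDs:

gcloud iam roles copy --source="projects/dev-project/roles/customDevRole" \
    --destination="customProdRole" --dest-project="prod-project"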

Question # 48

You have been asked to migrate a Docker application from your data center to the cloud. Your solution architect has suggested uploading Docker images to GCR in one project and running the application in a GKE cluster in a separate project. You want to store images in the project img-278322 and run the application in the project prod-278986. You want to tag the image as acme_track_n_trace:v1. You want to follow Google-recommended practices. What should you do?

A.

Run gcloud builds submit --tag gcr.io/img-278322/acme_track_n_trace

B.

Run gcloud builds submit --tag gcr.io/img-278322/acme_track_n_trace:v1

C.

Run gcloud builds submit --tag gcr.io/prod-278986/acme_track_n_trace

D.

Run gcloud builds submit --tag gcr.io/prod-278986/acme_track_n_trace:v1

Question # 49

You recently deployed a new version of an application to App Engine and then discovered a bug in the release. You need to immediately revert to the prior version of the application. What should you do?

A.

Run gcloud app restore.

B.

On the App Engine page of the GCP Console, select the application that needs to be reverted and click Revert.

C.

On the App Engine Versions page of the GCP Console, route 100% of the traffic to the previous version.

D.

Deploy the original version as a separate application. Then go to App Engine settings and split traffic between applications so that the original version serves 100% of the requests.

Question # 50

Your company uses BigQuery for data warehousing. Over time, many different business units in your company have created 1000+ datasets across hundreds of projects. Your CIO wants you to examine all datasets to find tables that contain an employee_ssn column. You want to minimize effort in performing this task. What should you do?

A.

Go to Data Catalog and search for employee_ssn in the search box.

B.

Write a shell script that uses the bq command line tool to loop through all the projects in your organization.

C.

Write a script that loops through all the projects in your organization and runs a query on INFORMATION_SCHEMA.COLUMNS view to find the employee_ssn column.

D.

Write a Cloud Dataflow job that loops through all the projects in your organization and runs a query on INFORMATION_SCHEMA.COLUMNS view to find employee_ssn column.

Question # 51

You have several hundred microservice applications running in a Google Kubernetes Engine (GKE) cluster. Each microservice is a deployment with resource limits configured for each container in the deployment. You've observed that the resource limits for memory and CPU are not appropriately set for many of the microservices. You want to ensure that each microservice has right sized limits for memory and CPU. What should you do?

A.

Modify the cluster's node pool machine type and choose a machine type with more memory and CPU.

B.

Configure a Horizontal Pod Autoscaler for each microservice.

C.

Configure GKE cluster autoscaling.

D.

Configure a Vertical Pod Autoscaler for each microservice.

Question # 52

(You are deploying an application to Google Kubernetes Engine (GKE). The application needs to make API calls to a private Cloud Storage bucket. You need to configure your application Pods to authenticate to the Cloud Storage API, but your organization policy prevents the usage of service account keys. You want to follow Google-recommended practices. What should you do?)

A.

Create the GKE cluster and deploy the application. Request a security exception to create a Google service account key. Set the constraints/iam.serviceAccountKeyExpiryHours organization policy to 8 hours.

B.

Create the GKE cluster and deploy the application. Request a security exception to create a Google service account key. Set the constraints/iam.serviceAccountKeyExpiryHours organization policy to 24 hours.

C.

Create the GKE cluster with Workload Identity Federation. Configure the default node service account to access the bucket. Deploy the application into the cluster so the application can use the node service account permissions. Use Identity and Access Management (IAM) to grant the service account access to the bucket.

D.

Create the GKE cluster with Workload Identity Federation. Create a Google service account and a Kubernetes ServiceAccount, and configure both service accounts to use Workload Identity Federation. Attach the Kubernetes ServiceAccount to the application Pods and configure the Google service account to access the bucket with Identity and Access Management (IAM).
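
For reference, the Workload Identity setup described in option D pairs a Kubernetes ServiceAccount with a Google service account. A minimal sketch, assuming hypothetical account, namespace, and project names:

# Let the Kubernetes ServiceAccount impersonate the Google service account
gcloud iam service-accounts add-iam-policy-binding gcs-reader@my-project.iam.gserviceaccount.com \
    --role="roles/iam.workloadIdentityUser" \
    --member="serviceAccount:my-project.svc.id.goog[default/app-ksa]"
# Point the Kubernetes ServiceAccount at the Google service account
kubectl annotate serviceaccount app-ksa --namespace default \
    iam.gke.io/gcp-service-account=gcs-reader@my-project.iam.gserviceaccount.com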

Question # 53

You are designing an application that uses WebSockets and HTTP sessions that are not distributed across the web servers. You want to ensure the application runs properly on Google Cloud Platform. What should you do?

A.

Meet with the cloud enablement team to discuss load balancer options.

B.

Redesign the application to use a distributed user session service that does not rely on WebSockets and HTTP sessions.

C.

Review the encryption requirements for WebSocket connections with the security team.

D.

Convert the WebSocket code to use HTTP streaming.

Question # 54

(Your company uses a multi-cloud strategy that includes Google Cloud. You want to centralize application logs from all environments in a third-party software-as-a-service (SaaS) tool. You need to integrate logs originating from Cloud Logging, and you want to ensure the export occurs with the least amount of delay possible. What should you do?)

A.

Use a Cloud Scheduler cron job to trigger a Cloud Function that queries Cloud Logging and sends the logs to the SaaS tool.

B.

Create a Cloud Logging sink and configure Pub/Sub as the destination. Configure the SaaS tool to subscribe to the Pub/Sub topic to retrieve the logs.

C.

Create a Cloud Logging sink and configure Cloud Storage as the destination. Configure the SaaS tool to read the Cloud Storage bucket to retrieve the logs.

D.

Create a Cloud Logging sink and configure BigQuery as the destination. Configure the SaaS tool to query BigQuery to retrieve the logs.

Question # 55

You built an application on Google Cloud Platform that uses Cloud Spanner. Your support team needs to monitor the environment but should not have access to table data. You need a streamlined solution to grant the correct permissions to your support team, and you want to follow Google-recommended practices. What should you do?

A.

Add the support team group to the roles/monitoring.viewer role

B.

Add the support team group to the roles/spanner.databaseUser role.

C.

Add the support team group to the roles/spanner.databaseReader role.

D.

Add the support team group to the roles/stackdriver.accounts.viewer role.

Question # 56

You are monitoring an application and receive user feedback that a specific error is spiking. You notice that the error is caused by a Service Account having insufficient permissions. You are able to solve the problem but want to be notified if the problem recurs. What should you do?

A.

In the Log Viewer, filter the logs on severity 'Error' and the name of the Service Account.

B.

Create a sink to BigQuery to export all the logs. Create a Data Studio dashboard on the exported logs.

C.

Create a custom log-based metric for the specific error to be used in an Alerting Policy.

D.

Grant Project Owner access to the Service Account.

Question # 57

You need to create a new billing account and then link it with an existing Google Cloud Platform project. What should you do?

A.

Verify that you are Project Billing Manager for the GCP project. Update the existing project to link it to the existing billing account.

B.

Verify that you are Project Billing Manager for the GCP project. Create a new billing account and link the new billing account to the existing project.

C.

Verify that you are Billing Administrator for the billing account. Create a new project and link the new project to the existing billing account.

D.

Verify that you are Billing Administrator for the billing account. Update the existing project to link it to the existing billing account.

Question # 58

(You are deploying a web application using Compute Engine. You created a managed instance group (MIG) to host the application. You want to follow Google-recommended practices to implement a secure and highly available solution. What should you do?)

A.

Use a proxy Network Load Balancer for the MIG and an A record in your DNS private zone with the load balancer's IP address.

B.

Use a proxy Network Load Balancer for the MIG and a CNAME record in your DNS public zone with the load balancer's IP address.

C.

Use an Application Load Balancer for the MIG and a CNAME record in your DNS private zone with the load balancer's IP address.

D.

Use an Application Load Balancer for the MIG and an A record in your DNS public zone with the load balancer's IP address.

Question # 59

An employee was terminated, but their access to Google Cloud Platform (GCP) was not removed until 2 weeks later. You need to find out whether this employee accessed any sensitive customer information after their termination. What should you do?

A.

View System Event Logs in Stackdriver. Search for the user’s email as the principal.

B.

View System Event Logs in Stackdriver. Search for the service account associated with the user.

C.

View Data Access audit logs in Stackdriver. Search for the user’s email as the principal.

D.

View the Admin Activity log in Stackdriver. Search for the service account associated with the user.

Question # 60

You are asked to set up application performance monitoring on Google Cloud projects A, B, and C as a single pane of glass. You want to monitor CPU, memory, and disk. What should you do?

A.

Enable API and then share charts from project A, B, and C.

B.

Enable API and then give the metrics.reader role to projects A, B, and C.

C.

Enable API and then use default dashboards to view all projects in sequence.

D.

Enable API, create a workspace under project A, and then add project B and C.

Question # 61

You need to manage multiple Google Cloud Platform (GCP) projects in the fewest steps possible. You want to configure the Google Cloud SDK command line interface (CLI) so that you can easily manage multiple GCP projects. What should you?

A.

1. Create a configuration for each project you need to manage. 2. Activate the appropriate configuration when you work with each of your assigned GCP projects.

B.

1. Create a configuration for each project you need to manage. 2. Use gcloud init to update the configuration values when you need to work with a non-default project.

C.

1. Use the default configuration for one project you need to manage. 2. Activate the appropriate configuration when you work with each of your assigned GCP projects.

D.

1. Use the default configuration for one project you need to manage. 2. Use gcloud init to update the configuration values when you need to work with a non-default project.

Question # 62

You are planning to move your company's website and a specific asynchronous background job to Google Cloud. Your website contains only static HTML content. The background job is started through an HTTP endpoint and generates monthly invoices for your customers. Your website needs to be available in multiple geographic locations and requires autoscaling. You want to have no costs when your workloads are not in use and follow recommended practices. What should you do?

A.

Move your website to Google Kubernetes Engine (GKE), and move your background job to Cloud Functions.

B.

Move both your website and background job to Compute Engine

C.

Move both your website and background job to Cloud Run.

D.

Move your website to Google Kubernetes Engine (GKE), and move your background job to Compute Engine.

Question # 63

You have a Compute Engine instance hosting a production application. You want to receive an email if the instance consumes more than 90% of its CPU resources for more than 15 minutes. You want to use Google services. What should you do?

A.

1. Create a consumer Gmail account. 2. Write a script that monitors the CPU usage. 3. When the CPU usage exceeds the threshold, have that script send an email using the Gmail account and smtp.gmail.com on port 25 as the SMTP server.

B.

1. Create a Stackdriver Workspace, and associate your Google Cloud Platform (GCP) project with it. 2. Create an Alerting Policy in Stackdriver that uses the threshold as a trigger condition. 3. Configure your email address in the notification channel.

C.

1. Create a Stackdriver Workspace, and associate your GCP project with it. 2. Write a script that monitors the CPU usage and sends it as a custom metric to Stackdriver. 3. Create an uptime check for the instance in Stackdriver.

D.

1. In Stackdriver Logging, create a logs-based metric to extract the CPU usage by using this regular expression: CPU Usage: ([0-9]{1,3})% 2. In Stackdriver Monitoring, create an Alerting Policy based on this metric. 3. Configure your email address in the notification channel.

Question # 64

After a recent security incident, your startup company wants better insight into what is happening in the Google Cloud environment. You need to monitor unexpected firewall changes and instance creation. Your company prefers simple solutions. What should you do?

A.

Use Cloud Logging filters to create log-based metrics for firewall and instance actions. Monitor the changes and set up reasonable alerts.

B.

Install Kibana on a compute instance. Create a log sink to forward Cloud Audit Logs filtered for firewalls and compute instances to Pub/Sub. Target the Pub/Sub topic to push messages to the Kibana instance. Analyze the logs on Kibana in real time.

C.

Turn on Google Cloud firewall rules logging, and set up alerts for any insert, update, or delete events.

D.

Create a log sink to forward Cloud Audit Logs filtered for firewalls and compute instances to Cloud Storage. Use BigQuery to periodically analyze log events in the storage bucket.

Question # 65

You have 32 GB of data in a single file that you need to upload to a Nearline Storage bucket. The WAN connection you are using is rated at 1 Gbps, and you are the only one on the connection. You want to use as much of the rated 1 Gbps as possible to transfer the file rapidly. How should you upload the file?

A.

Use the GCP Console to transfer the file instead of gsutil.

B.

Enable parallel composite uploads using gsutil on the file transfer.

C.

Decrease the TCP window size on the machine initiating the transfer.

D.

Change the storage class of the bucket from Nearline to Multi-Regional.

Question # 66

Your organization has a dedicated person who creates and manages all service accounts for Google Cloud projects. You need to assign this person the minimum role for projects. What should you do?

A.

Add the user to roles/iam.roleAdmin role.

B.

Add the user to roles/iam.securityAdmin role.

C.

Add the user to roles/iam.serviceAccountUser role.

D.

Add the user to roles/iam.serviceAccountAdmin role.

Question # 67

You are using Deployment Manager to create a Google Kubernetes Engine cluster. Using the same Deployment Manager deployment, you also want to create a DaemonSet in the kube-system namespace of the cluster. You want a solution that uses the fewest possible services. What should you do?

A.

Add the cluster’s API as a new Type Provider in Deployment Manager, and use the new type to create the DaemonSet.

B.

Use the Deployment Manager Runtime Configurator to create a new Config resource that contains the DaemonSet definition.

C.

With Deployment Manager, create a Compute Engine instance with a startup script that uses kubectl to create the DaemonSet.

D.

In the cluster’s definition in Deployment Manager, add a metadata that has kube-system as key and the DaemonSet manifest as value.

Question # 68

You are building an application that stores relational data from users. Users across the globe will use this application. Your CTO is concerned about the scaling requirements because the size of the user base is unknown. You need to implement a database solution that can scale with your user growth with minimum configuration changes. Which storage solution should you use?

A.

Cloud SQL

B.

Cloud Spanner

C.

Cloud Firestore

D.

Cloud Datastore

Question # 69

(Your company’s developers use an automation that you recently built to provision Linux VMs in Compute Engine within a Google Cloud project to perform various tasks. You need to manage the Linux account lifecycle and access for these users. You want to follow Google-recommended practices to simplify access management while minimizing operational costs. What should you do?)

A.

Enable OS Login for all VMs. Use IAM roles to grant user permissions.

B.

Enable OS Login for all VMs. Write custom startup scripts to update user permissions.

C.

Require your developers to create public SSH keys. Make the owner of the public key the root user.

D.

Require your developers to create public SSH keys. Write custom startup scripts to update user permissions.

Question # 70

An application generates daily reports in a Compute Engine virtual machine (VM). The VM is in the project corp-iot-insights. Your team operates only in the project corp-aggregate-reports and needs a copy of the daily exports in the bucket corp-aggregate-reports-storage. You want to configure access so that the daily reports from the VM are available in the bucket corp-aggregate-reports-storage and use as few steps as possible while following Google-recommended practices. What should you do?

A.

Move both projects under the same folder.

B.

Grant the VM Service Account the role Storage Object Creator on corp-aggregate-reports-storage.

C.

Create a Shared VPC network between both projects. Grant the VM Service Account the role Storage Object Creator on corp-iot-insights.

D.

Make corp-aggregate-reports-storage public and create a folder with a pseudo-randomized suffix name. Share the folder with the IoT team.

Question # 71

An external member of your team needs list access to compute images and disks in one of your projects. You want to follow Google-recommended practices when you grant the required permissions to this user. What should you do?

A.

Create a custom role, and add all the required compute.disks.list and compute.images.list permissions as includedPermissions. Grant the custom role to the user at the project level.

B.

Create a custom role based on the Compute Image User role. Add compute.disks.list to the includedPermissions field. Grant the custom role to the user at the project level.

C.

Grant the Compute Storage Admin role at the project level.

D.

Create a custom role based on the Compute Storage Admin role. Exclude unnecessary permissions from the custom role. Grant the custom role to the user at the project level.

Question # 72

You need to verify that a Google Cloud Platform service account was created at a particular time. What should you do?

A.

Filter the Activity log to view the Configuration category. Filter the Resource type to Service Account.

B.

Filter the Activity log to view the Configuration category. Filter the Resource type to Google Project.

C.

Filter the Activity log to view the Data Access category. Filter the Resource type to Service Account.

D.

Filter the Activity log to view the Data Access category. Filter the Resource type to Google Project.

Question # 73

You have a Linux VM that must connect to Cloud SQL. You created a service account with the appropriate access rights. You want to make sure that the VM uses this service account instead of the default Compute Engine service account. What should you do?

A.

When creating the VM via the web console, specify the service account under the ‘Identity and API Access’ section.

B.

Download a JSON Private Key for the service account. On the Project Metadata, add that JSON as the value for the key compute-engine-service-account.

C.

Download a JSON Private Key for the service account. On the Custom Metadata of the VM, add that JSON as the value for the key compute-engine-service-account.

D.

Download a JSON Private Key for the service account. After creating the VM, ssh into the VM and save the JSON under ~/.gcloud/compute-engine-service-account.json.
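
For reference, the console step in option A has a CLI equivalent: the service account is specified when the VM is created. A minimal sketch with hypothetical names:

gcloud compute instances create sql-client-vm --zone=us-central1-a \
    --service-account=cloudsql-client@my-project.iam.gserviceaccount.com \
    --scopes=cloud-platform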

Question # 74

(You are developing an internet of things (IoT) application that captures sensor data from multiple devices that have already been set up. You need to identify the global data storage product your company should use to store this data. You must ensure that the storage solution you choose meets your requirements of sub-millisecond latency. What should you do?)

A.

Store the IoT data in Spanner. Use caches to speed up the process and avoid latencies.

B.

Store the IoT data in Bigtable.

C.

Capture IoT data in BigQuery datasets.

D.

Store the IoT data in Cloud Storage. Implement caching by using Cloud CDN.

Question # 75

You recently received a new Google Cloud project with an attached billing account where you will work. You need to create instances, set firewalls, and store data in Cloud Storage. You want to follow Google-recommended practices. What should you do?

A.

Use the gcloud CLI services enable cloudresourcemanager.googleapis.com command to enable all resources.

B.

Use the gcloud services enable compute.googleapis.com command to enable Compute Engine, and the gcloud services enable storage-api.googleapis.com command to enable the Cloud Storage APIs.

C.

Open the Google Cloud console and enable all Google Cloud APIs from the API dashboard.

D.

Open the Google Cloud console and run gcloud init --project in a Cloud Shell.

Question # 76

Your organization is migrating to Google Cloud. You want only users with company-issued Google accounts to access your Google Cloud environment. You must ensure that users of the same department can only access resources within their own department. You want to minimize operational costs while following Google-recommended practices. What should you do?

A.

Create a folder for each department in Resource Manager. Grant the users of each department the Folder Admin role on the folder of their department.

B.

Assign users to the relevant Google Groups and provide access to cloud resources through Identity and Access Management (IAM) roles. Use organization policies to block non-company issued emails.

C.

Create a folder for each department in Resource Manager. Grant all company users the Folder Admin role on the organization level.

D.

Assign users to the relevant Google Groups and provide access to cloud resources through Identity and Access Management (IAM) roles. Periodically identify and remove non-company issued Google accounts.

Question # 77

You have deployed an application on a Compute Engine instance. An external consultant needs to access the Linux-based instance. The consultant is connected to your corporate network through a VPN connection, but the consultant has no Google account. What should you do?

A.

Instruct the external consultant to use the gcloud compute ssh command line tool by using Identity-Aware Proxy to access the instance.

B.

Instruct the external consultant to use the gcloud compute ssh command line tool by using the public IP address of the instance to access it.

C.

Instruct the external consultant to generate an SSH key pair, and request the public key from the consultant.Add the public key to the instance yourself, and have the consultant access the instance through SSH with their private key.

D.

Instruct the external consultant to generate an SSH key pair, and request the private key from the consultant.Add the private key to the instance yourself, and have the consultant access the instance through SSH with their public key.

Question # 78

You have a developer laptop with the Cloud SDK installed on Ubuntu. The Cloud SDK was installed from the Google Cloud Ubuntu package repository. You want to test your application locally on your laptop with Cloud Datastore. What should you do?

A.

Export Cloud Datastore data using gcloud datastore export.

B.

Create a Cloud Datastore index using gcloud datastore indexes create.

C.

Install the google-cloud-sdk-datastore-emulator component using the apt get install command.

D.

Install the cloud-datastore-emulator component using the gcloud components install command.

Question # 79

Your company is active in the European Economic Area (EEA), and will adopt Google Cloud for its workloads. Projects are currently structured within different folders. You need to ensure any resources that will be deployed are using Google Cloud locations within the EEA by using the Organization Policy Service resource locations constraint. What should you do?

A.

Configure the policy at the folder level and add all allowed locations to the policy.

B.

Configure the policy at the organization level and add all allowed locations to the policy.

C.

Configure the policy at the folder level and add all disallowed locations to the policy.

D.

Configure the policy at the organization level and add all disallowed locations to the policy.
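
As a rough sketch of setting the resource locations constraint with the gcloud resource-manager command group (the organization ID and the in:eu-locations value group are illustrative assumptions; the exact value group for your EEA requirements may differ):

  # Allow only the chosen location group for new resources across the organization
  gcloud resource-manager org-policies allow constraints/gcp.resourceLocations in:eu-locations --organization=123456789012
  # Inspect the effective policy
  gcloud resource-manager org-policies describe constraints/gcp.resourceLocations --organization=123456789012 --effective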

Question # 80

Your company has a rapidly growing social media platform and a user base primarily located in North America. Due to increasing demand, your current on-premises PostgreSQL database, hosted in your United States headquarters data center, no longer meets your needs. You need to identify a cloud-based database solution that offers automatic scaling, multi-region support for future expansion, and maintains low latency. What should you do?

A.

Use Bigtable.

B.

Use BigQuery.

C.

Use Spanner.

D.

Use Cloud SQL for PostgreSQL.

Question # 81

Your organization has three existing Google Cloud projects. You need to bill the Marketing department for only their Google Cloud services for a new initiative within their group. What should you do?

A.

1. Verify that you are assigned the Billing Administrator IAM role for your organization's Google Cloud Project for the Marketing department. 2. Link the new project to a Marketing Billing Account.

B.

1. Verify that you are assigned the Billing Administrator IAM role for your organization's Google Cloud account. 2. Create a new Google Cloud Project for the Marketing department. 3. Set the default key-value project labels to department:marketing for all services in this project.

C.

1. Verify that you are assigned the Organization Administrator IAM role for your organization's Google Cloud account. 2. Create a new Google Cloud Project for the Marketing department. 3. Link the new project to a Marketing Billing Account.

D.

1. Verify that you are assigned the Organization Administrator IAM role for your organization's Google Cloud account. 2. Create a new Google Cloud Project for the Marketing department. 3. Set the default key-value project labels to department:marketing for all services in this project.
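
For reference, linking a new project to a billing account from the CLI might look roughly like this (the project ID and billing account ID are placeholders):

  gcloud billing projects link marketing-initiative --billing-account=0X0X0X-0X0X0X-0X0X0X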

Question # 82

A colleague handed over a Google Cloud project for you to maintain. As part of a security checkup, you want to review who has been granted the Project Owner role. What should you do?

A.

In the Google Cloud console, validate which SSH keys have been stored as project-wide keys.

B.

Navigate to Identity-Aware Proxy and check the permissions for these resources.

C.

Enable Audit logs on the IAM & admin page for all resources, and validate the results.

D.

Use the gcloud projects get-iam-policy command to view the current role assignments.
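
For reference, the gcloud approach in option D can be narrowed down to just the Owner bindings with something like this (the project ID is a placeholder):

  # List every member bound to roles/owner in the project
  gcloud projects get-iam-policy my-sample-project --flatten="bindings[].members" --filter="bindings.role:roles/owner" --format="value(bindings.members)"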

Question # 83

You have a managed instance group composed of preemptible VMs. All of the VMs keep deleting and recreating themselves every minute. What is a possible cause of this behavior?

A.

Your zonal capacity is limited, causing all preemptible VMs to be shut down to recover capacity. Try deploying your group to another zone.

B.

You have hit your instance quota for the region.

C.

Your managed instance group's VMs are configured to only last 1 minute in the preemptible settings.

D.

Your managed instance group's health check is repeatedly failing, either due to a misconfigured health check or misconfigured firewall rules that do not allow the health check to access the instances.
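
If misconfigured firewall rules were the cause, as option D suggests, a rule admitting Google Cloud's documented health-check probe ranges could look roughly like this sketch (the network name and port are assumptions):

  gcloud compute firewall-rules create allow-health-checks --network=my-network --direction=INGRESS --action=ALLOW --rules=tcp:80 --source-ranges=130.211.0.0/22,35.191.0.0/16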

Question # 84

You are managing a Data Warehouse on BigQuery. An external auditor will review your company's processes, and multiple external consultants will need view access to the data. You need to provide them with view access while following Google-recommended practices. What should you do?

A.

Grant each individual external consultant the role of BigQuery Editor

B.

Grant each individual external consultant the role of BigQuery Viewer

C.

Create a Google Group that contains the consultants and grant the group the role of BigQuery Editor

D.

Create a Google Group that contains the consultants, and grant the group the role of BigQuery Viewer
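
For reference, granting a read-only BigQuery role to a Google Group through IAM could look roughly like this (the project ID, group address, and the choice of roles/bigquery.dataViewer are assumptions for illustration; access can also be granted at the dataset level for tighter scoping):

  gcloud projects add-iam-policy-binding my-sample-project --member="group:consultants@example.com" --role="roles/bigquery.dataViewer"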

Question # 85

Your company is moving its entire workload to Compute Engine. Some servers should be accessible through the Internet, and other servers should only be accessible over the internal network. All servers need to be able to talk to each other over specific ports and protocols. The current on-premises network relies on a demilitarized zone (DMZ) for the public servers and a Local Area Network (LAN) for the private servers. You need to design the networking infrastructure on Google Cloud to match these requirements. What should you do?

A.

1. Create a single VPC with a subnet for the DMZ and a subnet for the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and the LAN subnets, and another firewall rule to allow public ingress traffic for the DMZ.

B.

1. Create a single VPC with a subnet for the DMZ and a subnet for the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and the LAN subnets, and another firewall rule to allow public egress traffic for the DMZ.

C.

1. Create a VPC with a subnet for the DMZ and another VPC with a subnet for the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and the LAN subnets, and another firewall rule to allow public ingress traffic for the DMZ.

D.

1. Create a VPC with a subnet for the DMZ and another VPC with a subnet for the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and the LAN subnets, and another firewall rule to allow public egress traffic for the DMZ.
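
A minimal sketch of the single-VPC layout that options A and B describe, with placeholder names, ranges, and ports:

  # One custom-mode VPC with a subnet per tier
  gcloud compute networks create corp-vpc --subnet-mode=custom
  gcloud compute networks subnets create dmz --network=corp-vpc --region=us-central1 --range=10.0.1.0/24
  gcloud compute networks subnets create lan --network=corp-vpc --region=us-central1 --range=10.0.2.0/24
  # Allow specific traffic from the DMZ subnet to instances tagged as LAN servers
  gcloud compute firewall-rules create allow-dmz-to-lan --network=corp-vpc --direction=INGRESS --action=ALLOW --rules=tcp:8080 --source-ranges=10.0.1.0/24 --target-tags=lan
  # Allow public ingress to instances tagged as DMZ servers
  gcloud compute firewall-rules create allow-public-dmz --network=corp-vpc --direction=INGRESS --action=ALLOW --rules=tcp:443 --source-ranges=0.0.0.0/0 --target-tags=dmz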

Question # 86

You are running a data warehouse on BigQuery. A partner company is offering a recommendation engine based on the data in your data warehouse. The partner company is also running their application on Google Cloud. They manage the resources in their own project, but they need access to the BigQuery dataset in your project. You want to provide the partner company with access to the dataset. What should you do?

A.

Create a Service Account in your own project, and grant this Service Account access to BigQuery in your project

B.

Create a Service Account in your own project, and ask the partner to grant this Service Account access to BigQuery in their project

C.

Ask the partner to create a Service Account in their project, and have them give the Service Account access to BigQuery in their project

D.

Ask the partner to create a Service Account in their project, and grant their Service Account access to the BigQuery dataset in your project

Question # 87

You have developed a web application that serves traffic for a local event and are expecting unpredictable traffic. You have containerized the application, and you now want to deploy the application on Google Cloud. You also want to minimize costs. What should you do?

A.

Deploy the web application as a Cloud Run service.

B.

Deploy the web application on Google Kubernetes Engine in Standard mode.

C.

Deploy the web application as a Cloud Run job.

D.

Deploy the web application on Google Kubernetes Engine in Autopilot mode.
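
For reference, deploying a container image as a Cloud Run service, as in option A, might look roughly like this (the service name, image, and region are placeholders); Cloud Run services scale to zero between requests, which keeps idle costs low for unpredictable traffic:

  gcloud run deploy event-web --image=gcr.io/my-sample-project/event-web:v1 --region=us-central1 --allow-unauthenticated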

Question # 88

You are managing a fleet of Compute Engine Linux instances in a Google Cloud project. Your company's engineering team requires SSH access to all instances to perform routine maintenance tasks. You need to manage the SSH access for the engineering team and you want to minimize operational overhead when engineers join or leave the team. What should you do?

A.

Create a Google Group for all engineering team members and set up OS Login for this group on the project. Manage group membership when engineers join or leave the team.

B.

Create a Google Group for all engineering team members, and grant them the Compute Viewer IAM role. Manage group membership when engineers join or leave the team.

C.

Create a single SSH key pair to be shared by all engineering team members. Add the public SSH key to project metadata.

D.

Create an SSH key pair for each engineer on the team and add the public SSH key to the metadata of the relevant instances.
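
As a sketch of the OS Login approach referenced in option A (the project ID and group address are placeholders):

  # Enable OS Login for every instance in the project
  gcloud compute project-info add-metadata --metadata enable-oslogin=TRUE
  # Allow the engineering group to log in through OS Login
  gcloud projects add-iam-policy-binding my-sample-project --member="group:eng-team@example.com" --role="roles/compute.osLogin"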

Question # 89

You have designed a solution on Google Cloud Platform (GCP) that uses multiple GCP products. Your company has asked you to estimate the costs of the solution. You need to provide estimates for the monthly total cost. What should you do?

A.

For each GCP product in the solution, review the pricing details on the product's pricing page. Use the pricing calculator to total the monthly costs for each GCP product.

B.

For each GCP product in the solution, review the pricing details on the product's pricing page. Create a Google Sheet that summarizes the expected monthly costs for each product.

C.

Provision the solution on GCP. Leave the solution provisioned for 1 week. Navigate to the Billing Report page in the Google Cloud Platform Console. Multiply the 1 week cost to determine the monthly costs.

D.

Provision the solution on GCP. Leave the solution provisioned for 1 week. Use Stackdriver to determine the provisioned and used resource amounts. Multiply the 1 week cost to determine the monthly costs.

Question # 90

You need to enable traffic between multiple groups of Compute Engine instances that are currently running in two different GCP projects. Each group of Compute Engine instances is running in its own VPC. What should you do?

A.

Verify that both projects are in a GCP Organization. Create a new VPC and add all instances.

B.

Verify that both projects are in a GCP Organization. Share the VPC from one project and request that the Compute Engine instances in the other project use this shared VPC.

C.

Verify that you are the Project Administrator of both projects. Create two new VPCs and add all instances.

D.

Verify that you are the Project Administrator of both projects. Create a new VPC and add all instances.
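
For reference, the Shared VPC setup mentioned in option B uses commands roughly like these (the project IDs are placeholders, and the account running them typically needs the Shared VPC Admin role at the organization level):

  # Designate the host project, then attach the service project
  gcloud compute shared-vpc enable host-project-id
  gcloud compute shared-vpc associated-projects add service-project-id --host-project=host-project-id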

Question # 91

You are working for a startup that was officially registered as a business 6 months ago. As your customer base grows, your use of Google Cloud increases. You want to allow all engineers to create new projects without asking them for their credit card information. What should you do?

A.

Create a Billing account, associate a payment method with it, and provide all project creators with permission to associate that billing account with their projects.

B.

Grant all engineers permission to create their own billing accounts for each new project.

C.

Apply for monthly invoiced billing, and have a single invoice for the project paid by the finance team.

D.

Create a billing account, associate it with a monthly purchase order (PO), and send the PO to Google Cloud.

Question # 92

You need to monitor resources that are distributed over different projects in Google Cloud Platform. You want to consolidate reporting under the same Stackdriver Monitoring dashboard. What should you do?

A.

Use Shared VPC to connect all projects, and link Stackdriver to one of the projects.

B.

For each project, create a Stackdriver account. In each project, create a service account for that project and grant it the role of Stackdriver Account Editor in all other projects.

C.

Configure a single Stackdriver account, and link all projects to the same account.

D.

Configure a single Stackdriver account for one of the projects. In Stackdriver, create a Group and add the other project names as criteria for that Group.

Question # 93

You have developed an application that consists of multiple microservices, with each microservice packaged in its own Docker container image. You want to deploy the entire application on Google Kubernetes Engine so that each microservice can be scaled individually. What should you do?

A.

Create and deploy a Custom Resource Definition per microservice.

B.

Create and deploy a Docker Compose File.

C.

Create and deploy a Job per microservice.

D.

Create and deploy a Deployment per microservice.
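
As a rough sketch of option D, each microservice gets its own Deployment, and each Deployment can then be scaled on its own (the names and images below are placeholders):

  kubectl create deployment orders --image=gcr.io/my-sample-project/orders:v1
  kubectl create deployment payments --image=gcr.io/my-sample-project/payments:v1
  # Scale each microservice independently
  kubectl scale deployment orders --replicas=5
  kubectl autoscale deployment payments --min=2 --max=10 --cpu-percent=70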

Question # 94

Your Dataproc cluster runs in a single Virtual Private Cloud (VPC) network in a single subnet with range 172.16.20.128/25. There are no private IP addresses available in the VPC network. You want to add new VMs to communicate with your cluster using the minimum number of steps. What should you do?

A.

Modify the existing subnet range to 172.16.20.0/24.

B.

Create a new Secondary IP Range in the VPC and configure the VMs to use that range.

C.

Create a new VPC network for the VMs. Enable VPC Peering between the VMs’ VPC network and the Dataproc cluster VPC network.

D.

Create a new VPC network for the VMs with a subnet of 172.32.0.0/16. Enable VPC Network Peering between the Dataproc VPC network and the VMs' VPC network. Configure a custom route exchange.
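
For reference, expanding an existing subnet's primary range in place, as in option A, is a single command roughly like this (the subnet name and region are placeholders; a subnet range can only be expanded, never shrunk):

  gcloud compute networks subnets expand-ip-range dataproc-subnet --region=us-central1 --prefix-length=24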

Question # 95

You need to reduce GCP service costs for a division of your company using the fewest possible steps. You need to turn off all configured services in an existing GCP project. What should you do?

A.

1. Verify that you are assigned the Project Owners IAM role for this project. 2. Locate the project in the GCP console, click Shut down and then enter the project ID.

B.

1. Verify that you are assigned the Project Owners IAM role for this project. 2. Switch to the project in the GCP console, locate the resources and delete them.

C.

1. Verify that you are assigned the Organizational Administrator IAM role for this project. 2. Locate the project in the GCP console, enter the project ID and then click Shut down.

D.

1. Verify that you are assigned the Organizational Administrators IAM role for this project. 2. Switch to the project in the GCP console, locate the resources and delete them.

Question # 96

Your organization is a financial company that needs to store audit log files for 3 years. Your organization has hundreds of Google Cloud projects. You need to implement a cost-effective approach for log file retention. What should you do?

A.

Create an export to the sink that saves logs from Cloud Audit to BigQuery.

B.

Create an export to the sink that saves logs from Cloud Audit to a Coldline Storage bucket.

C.

Write a custom script that uses logging API to copy the logs from Stackdriver logs to BigQuery.

D.

Export these logs to Cloud Pub/Sub and write a Cloud Dataflow pipeline to store logs to Cloud SQL.
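
A minimal sketch of exporting audit logs to a Coldline bucket through an aggregated sink (the organization ID, bucket name, and log filter are illustrative assumptions; the sink's writer identity also needs permission to create objects in the bucket):

  # Create a Coldline bucket for long-term log retention
  gsutil mb -c coldline -l us gs://my-audit-log-archive
  # Route audit logs from every project under the organization to the bucket
  gcloud logging sinks create audit-archive storage.googleapis.com/my-audit-log-archive --organization=123456789012 --include-children --log-filter='logName:"cloudaudit.googleapis.com"'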

Question # 97

You are building an application that will run in your data center. The application will use Google Cloud Platform (GCP) services like AutoML. You created a service account that has appropriate access to AutoML. You need to enable authentication to the APIs from your on-premises environment. What should you do?

A.

Use service account credentials in your on-premises application.

B.

Use gcloud to create a key file for the service account that has appropriate permissions.

C.

Set up direct interconnect between your data center and Google Cloud Platform to enable authentication for your on-premises applications.

D.

Go to the IAM & admin console, grant a user account permissions similar to the service account permissions, and use this user account for authentication from your data center.
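
For reference, creating a key for the existing service account and pointing Application Default Credentials at it on-premises, as the gcloud-based options describe, might look roughly like this (the service account address is a placeholder):

  gcloud iam service-accounts keys create ./automl-key.json --iam-account=automl-caller@my-sample-project.iam.gserviceaccount.com
  # The on-premises application then picks the key up through Application Default Credentials
  export GOOGLE_APPLICATION_CREDENTIALS=./automl-key.json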

Question # 98

Your company uses Pub/Sub for event-driven workloads. You have a subscription named email-updates attached to the new-orders topic. You need to fetch and acknowledge waiting messages from this subscription. What should you do?

A.

Use the gcloud pubsub subscriptions seek email-updates command.

B.

Use the gcloud pubsub topics describe new-orders command.

C.

Use the gcloud pubsub subscriptions pull email-updates --auto-ack command.

D.

Use the gcloud pubsub topics list-subscriptions new-orders --filter="email-updates" command.
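
For reference, pulling and acknowledging waiting messages from the subscription, as in option C, looks roughly like this (the --limit value is an illustrative assumption):

  gcloud pubsub subscriptions pull email-updates --auto-ack --limit=10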

Question # 99

You have two subnets (subnet-a and subnet-b) in the default VPC. Your database servers are running in subnet-a. Your application servers and web servers are running in subnet-b. You want to configure a firewall rule that only allows database traffic from the application servers to the database servers. What should you do?

A.

• Create service accounts sa-app and sa-db. • Associate the service account sa-app with the application servers and the service account sa-db with the database servers. • Create an ingress firewall rule to allow network traffic from source service account sa-app to target service account sa-db.

B.

• Create network tags app-server and db-server. • Add the app-server tag to the application servers and the db-server tag to the database servers. • Create an egress firewall rule to allow network traffic from source network tag app-server to target network tag db-server.

C.

• Create a service account sa-app and a network tag db-server. • Associate the service account sa-app with the application servers and the network tag db-server with the database servers. • Create an ingress firewall rule to allow network traffic from source VPC IP addresses and target the subnet-a IP addresses.

D.

• Create a network tag app-server and a service account sa-db. • Add the tag to the application servers and associate the service account with the database servers. • Create an egress firewall rule to allow network traffic from source network tag app-server to target service account sa-db.
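
A minimal sketch of a service-account-scoped ingress rule of the kind option A describes (the project ID, network, and port are assumptions):

  gcloud compute firewall-rules create allow-app-to-db --network=default --direction=INGRESS --action=ALLOW --rules=tcp:5432 --source-service-accounts=sa-app@my-sample-project.iam.gserviceaccount.com --target-service-accounts=sa-db@my-sample-project.iam.gserviceaccount.com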

Question # 100

For analysis purposes, you need to send all the logs from all of your Compute Engine instances to a BigQuery dataset called platform-logs. You have already installed the Stackdriver Logging agent on all the instances. You want to minimize cost. What should you do?

A.

1. Give the BigQuery Data Editor role on the platform-logs dataset to the service accounts used by your instances. 2. Update your instances' metadata to add the following value: logs-destination: bq://platform-logs.

B.

1. In Stackdriver Logging, create a logs export with a Cloud Pub/Sub topic called logs as a sink. 2. Create a Cloud Function that is triggered by messages in the logs topic. 3. Configure that Cloud Function to drop logs that are not from Compute Engine and to insert Compute Engine logs in the platform-logs dataset.

C.

1. In Stackdriver Logging, create a filter to view only Compute Engine logs. 2. Click Create Export. 3. Choose BigQuery as Sink Service, and the platform-logs dataset as Sink Destination.

D.

1. Create a Cloud Function that has the BigQuery User role on the platform-logs dataset. 2. Configure this Cloud Function to create a BigQuery job that executes this query: INSERT INTO dataset.platform-logs (timestamp, log) SELECT timestamp, log FROM compute.logs WHERE timestamp > DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY). 3. Use Cloud Scheduler to trigger this Cloud Function once a day.
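
For reference, a sink that routes only Compute Engine logs into a BigQuery dataset, as sketched in option C, could be created from the CLI like this (the project ID is a placeholder, and the dataset is shown with an underscore because BigQuery dataset names cannot contain hyphens; the sink's writer identity also needs write access to the dataset):

  gcloud logging sinks create compute-to-bq bigquery.googleapis.com/projects/my-sample-project/datasets/platform_logs --log-filter='resource.type="gce_instance"'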

Question # 101

You are building a product on top of Google Kubernetes Engine (GKE). You have a single GKE cluster. For each of your customers, a Pod is running in that cluster, and your customers can run arbitrary code inside their Pod. You want to maximize the isolation between your customers’ Pods. What should you do?

A.

Use Binary Authorization and whitelist only the container images used by your customers’ Pods.

B.

Use the Container Analysis API to detect vulnerabilities in the containers used by your customers’ Pods.

C.

Create a GKE node pool with a sandbox type configured to gvisor. Add the parameter runtimeClassName: gvisor to the specification of your customers’ Pods.

D.

Use the cos_containerd image for your GKE nodes. Add a nodeSelector with the value cloud.google.com/gke-os-distribution: cos_containerd to the specification of your customers’ Pods.
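
As a sketch of the sandboxing approach in option C, a GKE node pool with gVisor enabled could be created roughly like this (the cluster name and zone are placeholders); customer Pods then opt in by setting runtimeClassName: gvisor in their spec:

  gcloud container node-pools create sandboxed-pool --cluster=tenant-cluster --zone=us-central1-a --sandbox type=gvisor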

Question # 102

Your company wants to provide engineers with access to explore Google Cloud freely in a sandbox environment. The total budget for testing across your organization is $1,000. You need to ensure that engineers are notified when the budget is about to be reached. You want to automate your solution as much as possible. What should you do?

A.

Configure a billing data export to a BigQuery dataset on the Cloud Billing account. Create a dashboard for all costs related to the sandbox experiments. Share the dashboard with all engineers.

B.

Create a Google Cloud Folder and ensure that all sandbox projects are located under that Folder. Create a Budget Alert for $1,000 and scope it to the Folder. Configure an email alert to billing administrators and users once 90% of the budget is reached.

C.

Create a separate Cloud Billing account for all sandbox projects. Link a credit card with a limit of $1,000 to this billing account. Ensure all sandbox projects are linked to this new Cloud Billing account.

D.

Create an email template reminding people to regularly check their Google Cloud spend. Create a Cloud Run function that sends the email to all the project owners. Create a daily job in Cloud Scheduler that triggers the Cloud Run function. Deploy this solution for each sandbox.
