
Practice Free Professional-Cloud-Security-Engineer (Google Cloud Certified - Professional Cloud Security Engineer) Exam Questions and Answers With Explanations

We at Crack4sure are committed to giving students preparing for the Google Professional-Cloud-Security-Engineer exam the most current and reliable questions. To help people study, we've made some of our Google Cloud Certified - Professional Cloud Security Engineer exam materials available for free to everyone. You can take the Free Professional-Cloud-Security-Engineer Practice Test as many times as you want. The answers to the practice questions are given, and each answer is explained.

Question # 6

You’re developing the incident response plan for your company. You need to define the access strategy that your DevOps team will use when reviewing and investigating a deployment issue in your Google Cloud environment. There are two main requirements:

    Least-privilege access must be enforced at all times.

    The DevOps team must be able to access the required resources only during the deployment issue.

How should you grant access while following Google-recommended best practices?

A.

Assign the Project Viewer Identity and Access Management (IAM) role to the DevOps team.

B.

Create a custom IAM role with limited list/view permissions, and assign it to the DevOps team.

C.

Create a service account, and grant it the Project Owner IAM role. Give the Service Account User Role on this service account to the DevOps team.

D.

Create a service account, and grant it limited list/view permissions. Give the Service Account User Role on this service account to the DevOps team.

Question # 7

Your company is using GSuite and has developed an application meant for internal usage on Google App Engine. You need to make sure that an external user cannot gain access to the application even when an employee’s password has been compromised.

What should you do?

A.

Enforce 2-factor authentication in GSuite for all users.

B.

Configure Cloud Identity-Aware Proxy for the App Engine Application.

C.

Provision user passwords using GSuite Password Sync.

D.

Configure Cloud VPN between your private network and GCP.

Question # 8

You will create a new Service Account that should be able to list the Compute Engine instances in the project. You want to follow Google-recommended practices.

What should you do?

A.

Create an Instance Template, and allow the Service Account Read Only access for the Compute Engine Access Scope.

B.

Create a custom role with the permission compute.instances.list and grant the Service Account this role.

C.

Give the Service Account the role of Compute Viewer, and use the new Service Account for all instances.

D.

Give the Service Account the role of Project Viewer, and use the new Service Account for all instances.
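
For context, listing instances only requires the compute.instances.list permission, which is included in the Compute Viewer role or can be placed in a custom role. Below is a minimal sketch of how a workload authenticated as such a service account might list instances with the google-cloud-compute client; the project ID is a placeholder.

```python
# Sketch: listing Compute Engine instances with the google-cloud-compute client.
# Assumes Application Default Credentials resolve to a service account that
# holds a role containing compute.instances.list (for example, roles/compute.viewer
# or a custom role). The project ID is a placeholder.
from google.cloud import compute_v1

def list_instances(project_id: str = "my-project") -> None:
    client = compute_v1.InstancesClient()
    # aggregated_list returns instances grouped by zone across the whole project.
    for zone, scoped_list in client.aggregated_list(project=project_id):
        for instance in scoped_list.instances:
            print(f"{zone}: {instance.name} ({instance.status})")

if __name__ == "__main__":
    list_instances()
```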

Question # 9

Your application is deployed as a highly available cross-region solution behind a global external HTTP(S) load balancer. You notice significant spikes in traffic from multiple IP addresses but it is unknown whether the IPs are malicious. You are concerned about your application's availability. You want to limit traffic from these clients over a specified time interval.

What should you do?

A.

Configure a rate_based_ban action by using Google Cloud Armor and set the ban_duration_sec parameter to the specified time interval.

B.

Configure a deny action by using Google Cloud Armor to deny the clients that issued too many requests over the specified time interval.

C.

Configure a throttle action by using Google Cloud Armor to limit the number of requests per client over a specified time interval.

D.

Configure a firewall rule in your VPC to throttle traffic from the identified IP addresses.
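
For context, Google Cloud Armor security policy rules support rate_based_ban and throttle actions, both configured through rate_limit_options. Below is a hedged sketch that assumes a pre-existing security policy already attached to the load balancer's backend service; the policy name, threshold, and ban duration are placeholder values, not prescriptions.

```python
# Sketch: adding a rate_based_ban rule to an existing Cloud Armor security policy.
from google.cloud import compute_v1

def add_rate_based_ban(project_id: str, policy_name: str) -> None:
    rule = compute_v1.SecurityPolicyRule(
        priority=1000,
        action="rate_based_ban",
        match=compute_v1.SecurityPolicyRuleMatcher(
            versioned_expr="SRC_IPS_V1",
            config=compute_v1.SecurityPolicyRuleMatcherConfig(src_ip_ranges=["*"]),
        ),
        rate_limit_options=compute_v1.SecurityPolicyRuleRateLimitOptions(
            conform_action="allow",
            exceed_action="deny(429)",
            enforce_on_key="IP",
            rate_limit_threshold=compute_v1.SecurityPolicyRuleRateLimitOptionsThreshold(
                count=100, interval_sec=60  # placeholder: 100 requests per minute per client IP
            ),
            ban_duration_sec=600,  # placeholder: ban offending clients for 10 minutes
        ),
    )
    client = compute_v1.SecurityPoliciesClient()
    operation = client.add_rule(
        project=project_id,
        security_policy=policy_name,
        security_policy_rule_resource=rule,
    )
    operation.result()  # wait for the rule to be applied
```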

Question # 10

Your company has multiple teams needing access to specific datasets across various Google Cloud data services for different projects. You need to ensure that team members can only access the data relevant to their projects and prevent unauthorized access to sensitive information within BigQuery, Cloud Storage, and Cloud SQL. What should you do?

A.

Grant project-level group permissions by using specific Cloud IAM roles. Use BigQuery authorized views, Cloud Storage uniform bucket-level access, and Cloud SQL database roles.

B.

Configure an access level to control access to the Google Cloud console for users managing these data services. Require multi-factor authentication for all access attempts.

C.

Use VPC Service Controls to create security perimeters around the projects for the BigQuery, Cloud Storage, and Cloud SQL services, restricting access based on the network origin of the requests.

D.

Enable project-level data access logs for BigQuery, Cloud Storage, and Cloud SQL. Configure log sinks to export these logs to Security Command Center to identify unauthorized access attempts.
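
For context, option A mentions BigQuery authorized views, which let a view in one dataset query tables in another dataset without granting users direct access to the source tables. Below is a minimal sketch using the google-cloud-bigquery client; the project, dataset, and view names are placeholders.

```python
# Sketch: authorizing a view to read a source dataset (BigQuery authorized view).
from google.cloud import bigquery

client = bigquery.Client()
source = client.get_dataset("my-project.source_dataset")  # dataset holding the raw tables
view_ref = {
    "projectId": "my-project",
    "datasetId": "team_dataset",
    "tableId": "filtered_view",
}
entries = list(source.access_entries)
entries.append(bigquery.AccessEntry(None, "view", view_ref))  # view entries carry no role
source.access_entries = entries
client.update_dataset(source, ["access_entries"])  # persist the new access list
```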

Question # 11

A customer implements Cloud Identity-Aware Proxy for their ERP system hosted on Compute Engine. Their security team wants to add a security layer so that the ERP systems only accept traffic from Cloud Identity-Aware Proxy.

What should the customer do to meet these requirements?

A.

Make sure that the ERP system can validate the JWT assertion in the HTTP requests.

B.

Make sure that the ERP system can validate the identity headers in the HTTP requests.

C.

Make sure that the ERP system can validate the x-forwarded-for headers in the HTTP requests.

D.

Make sure that the ERP system can validate the user’s unique identifier headers in the HTTP requests.
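
For context, Identity-Aware Proxy adds a signed JWT to every proxied request in the x-goog-iap-jwt-assertion header, and a backend can verify it with the google-auth library. Below is a minimal sketch; the expected audience string is a placeholder that depends on how IAP is configured.

```python
# Sketch: verifying the IAP-signed JWT assertion on the backend.
from google.auth.transport import requests as google_requests
from google.oauth2 import id_token

IAP_PUBLIC_KEYS_URL = "https://www.gstatic.com/iap/verify/public_key"

def verify_iap_jwt(iap_jwt: str, expected_audience: str) -> dict:
    # Raises ValueError if the signature, issuer, expiry, or audience check fails.
    claims = id_token.verify_token(
        iap_jwt,
        google_requests.Request(),
        audience=expected_audience,
        certs_url=IAP_PUBLIC_KEYS_URL,
    )
    return claims  # contains the verified user email and subject
```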

Question # 12

Your company’s new CEO recently sold two of the company’s divisions. Your Director asks you to help migrate the Google Cloud projects associated with those divisions to a new organization node. Which preparation steps are necessary before this migration occurs? (Choose two.)

A.

Remove all project-level custom Identity and Access Management (IAM) roles.

B.

Disallow inheritance of organization policies.

C.

Identify inherited Identity and Access Management (IAM) roles on projects to be migrated.

D.

Create a new folder for all projects to be migrated.

E.

Remove the specific migration projects from any VPC Service Controls perimeters and bridges.

Question # 13

Users are reporting an outage on your public-facing application that is hosted on Compute Engine. You suspect that a recent change to your firewall rules is responsible. You need to test whether your firewall rules are working properly. What should you do?

A.

Enable Firewall Rules Logging on the latest rules that were changed. Use Logs Explorer to analyze whether the rules are working correctly.

B.

Connect to a bastion host in your VPC. Use a network traffic analyzer to determine at which point your requests are being blocked.

C.

In a pre-production environment, disable all firewall rules individually to determine which one is blocking user traffic.

D.

Enable VPC Flow Logs in your VPC. Use Logs Explorer to analyze whether the rules are working correctly.

Question # 14

A patch for a vulnerability has been released, and a DevOps team needs to update their running containers in Google Kubernetes Engine (GKE).

How should the DevOps team accomplish this?

A.

Use Puppet or Chef to push out the patch to the running container.

B.

Verify that auto upgrade is enabled; if so, Google will upgrade the nodes in a GKE cluster.

C.

Update the application code or apply a patch, build a new image, and redeploy it.

D.

Configure containers to automatically upgrade when the base image is available in Container Registry.

Question # 15

Your organization has established a highly sensitive project within a VPC Service Controls perimeter. You need to ensure that only users meeting specific contextual requirements such as having a company-managed device, a specific location, and a valid user identity can access resources within this perimeter. You want to evaluate the impact of this change without blocking legitimate access. What should you do?

A.

Configure a VPC Service Controls perimeter in dry run mode, and enforce strict network segmentation using firewall rules. Use multi-factor authentication (MFA) for user verification.

B.

Use Cloud Audit Logs to monitor user access to the project resources. Use post-incident analysis to identify unauthorized access attempts.

C.

Establish a Context-Aware Access policy that specifies the required contextual attributes, and associate the policy with the VPC Service Controls perimeter in dry run mode.

D.

Use the VPC Service Control Violation dashboard to identify the impact of details about access denials by service perimeters.

Question # 16

Your organization uses the top-tier folder to separate application environments (prod and dev). The developers need to see all application development audit logs, but they are not permitted to review production logs. Your security team can review all logs in production and development environments. You must grant Identity and Access Management (IAM) roles at the right resource level for the developers and security team while you ensure least privilege.

What should you do?

A.

1. Grant the logging.viewer role to the security team at the organization resource level. 2. Grant the logging.viewer role to the developer team at the folder resource level that contains all the dev projects.

B.

1. Grant the logging.viewer role to the security team at the organization resource level. 2. Grant the logging.admin role to the developer team at the organization resource level.

C.

1. Grant the logging.admin role to the security team at the organization resource level. 2. Grant the logging.viewer role to the developer team at the folder resource level that contains all the dev projects.

D.

1. Grant the logging.admin role to the security team at the organization resource level. 2. Grant the logging.admin role to the developer team at the organization resource level.

Question # 17

Your financial services company needs to process customer personally identifiable information (PII) for analytics while adhering to strict privacy regulations. You must transform this data to protect individual privacy while ensuring that the data retains its original format and consistency for analytical integrity. Your solution must avoid full irreversible deletion. What should you do?

A.

Configure Sensitive Data Protection (SDP) to de-identify PII using format-preserving encryption (FPE).

B.

Use Cloud Key Management Service (Cloud KMS) to encrypt the entire dataset with a customer-managed encryption key (CMEK).

C.

Implement a custom BigQuery user-defined function (UDF) by using JavaScript to hash all sensitive fields before they are loaded into the analytical tables.

D.

Set up VPC Service Controls around the BigQuery project. Implement row-level encryption.

Question # 18

Your organization wants to be compliant with the General Data Protection Regulation (GDPR) on Google Cloud. You must implement data residency and operational sovereignty in the EU.

What should you do?

(Choose two.)

A.

Limit the physical location of a new resource with the Organization Policy Service resource locations constraint.

B.

Use Cloud IDS to get east-west and north-south traffic visibility in the EU to monitor intra-VPC and inter-VPC communication.

C.

Limit Google personnel access based on predefined attributes such as their citizenship or geographic location by using Key Access Justifications.

D.

Use identity federation to limit access to Google Cloud resources from non-EU entities.

E.

Use VPC Flow Logs to monitor intra-VPC and inter-VPC traffic in the EU.

Question # 19

Your company requires the security and network engineering teams to identify all network anomalies within and across VPCs, internal traffic from VMs to VMs, traffic between end locations on the internet and VMs, and traffic between VMs and Google Cloud services in production. Which method should you use?

A.

Define an organization policy constraint.

B.

Configure packet mirroring policies.

C.

Enable VPC Flow Logs on the subnet.

D.

Monitor and analyze Cloud Audit Logs.
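
For context, VPC Flow Logs are enabled per subnet. Below is a hedged sketch of turning them on for an existing subnet with the google-cloud-compute client; the project, region, and subnet names are placeholders.

```python
# Sketch: enabling VPC Flow Logs on an existing subnet.
from google.cloud import compute_v1

def enable_flow_logs(project: str, region: str, subnet_name: str) -> None:
    client = compute_v1.SubnetworksClient()
    current = client.get(project=project, region=region, subnetwork=subnet_name)
    patch = compute_v1.Subnetwork(
        fingerprint=current.fingerprint,  # required by patch to guard against concurrent updates
        enable_flow_logs=True,
    )
    operation = client.patch(
        project=project,
        region=region,
        subnetwork=subnet_name,
        subnetwork_resource=patch,
    )
    operation.result()  # wait for the subnet update to finish
```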

Question # 20

You are responsible for managing identities in your company's Google Cloud organization. Employees are frequently using your organization's corporate domain name to create unmanaged Google accounts. You want to implement a practical and efficient solution to prevent employees from completing this action in the future. What should you do?

A.

Implement an automated process that scans all identities in your organization and disables any unmanaged accounts.

B.

Create a Google Cloud identity for all users in your organization. Ensure that new users are added automatically.

C.

Register a new domain for your Google Cloud resources. Move all existing identities and resources to this domain.

D.

Switch your corporate email system to another domain to avoid using the same domain for Google Cloud identities and corporate emails.

Question # 21

An organization is starting to move its infrastructure from its on-premises environment to Google Cloud Platform (GCP). The first step the organization wants to take is to migrate its ongoing data backup and disaster recovery solutions to GCP. The organization's on-premises production environment is going to be the next phase for migration to GCP. Stable networking connectivity between the on-premises environment and GCP is also being implemented.

Which GCP solution should the organization use?

A.

BigQuery using a data pipeline job with continuous updates via Cloud VPN

B.

Cloud Storage using a scheduled task and gsutil via Cloud Interconnect

C.

Compute Engine Virtual Machines using Persistent Disk via Cloud Interconnect

D.

Cloud Datastore using regularly scheduled batch upload jobs via Cloud VPN

Question # 22

You want data on Compute Engine disks to be encrypted at rest with keys managed by Cloud Key Management Service (KMS). Cloud Identity and Access Management (IAM) permissions to these keys must be managed in a grouped way because the permissions should be the same for all keys.

What should you do?

A.

Create a single KeyRing for all persistent disks and all Keys in this KeyRing. Manage the IAM permissions at the Key level.

B.

Create a single KeyRing for all persistent disks and all Keys in this KeyRing. Manage the IAM permissions at the KeyRing level.

C.

Create a KeyRing per persistent disk, with each KeyRing containing a single Key. Manage the IAM permissions at the Key level.

D.

Create a KeyRing per persistent disk, with each KeyRing containing a single Key. Manage the IAM permissions at the KeyRing level.
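
For context, IAM permissions on Cloud KMS can be granted at the key ring level so that every key in the ring inherits the same bindings. Below is a hedged sketch with the google-cloud-kms client; the project, location, key names, and the service account member are placeholders.

```python
# Sketch: one key ring for all disk keys, with IAM granted at the key ring level.
from google.cloud import kms

client = kms.KeyManagementServiceClient()
location = client.common_location_path("my-project", "us-central1")

# Create the shared key ring.
key_ring = client.create_key_ring(
    request={"parent": location, "key_ring_id": "disk-keys", "key_ring": {}}
)

# Create one key per persistent disk inside the same key ring.
client.create_crypto_key(
    request={
        "parent": key_ring.name,
        "crypto_key_id": "disk-1-key",
        "crypto_key": {"purpose": kms.CryptoKey.CryptoKeyPurpose.ENCRYPT_DECRYPT},
    }
)

# Grant permissions once, on the key ring, so they apply to every key in it.
policy = client.get_iam_policy(request={"resource": key_ring.name})
policy.bindings.add(
    role="roles/cloudkms.cryptoKeyEncrypterDecrypter",
    members=["serviceAccount:disk-service@my-project.iam.gserviceaccount.com"],
)
client.set_iam_policy(request={"resource": key_ring.name, "policy": policy})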

Question # 23

An engineering team is launching a web application that will be public on the internet. The web application is hosted in multiple GCP regions and will be directed to the respective backend based on the URL request.

Your team wants to avoid exposing the application directly on the internet and wants to deny traffic from a specific list of malicious IP addresses

Which solution should your team implement to meet these requirements?

A.

Cloud Armor

B.

Network Load Balancing

C.

SSL Proxy Load Balancing

D.

NAT Gateway

Question # 24

You work for a healthcare provider that is expanding into the cloud to store and process sensitive patient data. You must ensure the chosen Google Cloud configuration meets these strict regulatory requirements:

    Data must reside within specific geographic regions.

    Certain administrative actions on patient data require explicit approval from designated compliance officers.

    Access to patient data must be auditable.

What should you do?

A.

Select multiple standard Google Cloud regions for high availability. Implement Access Control Lists (ACLs) on individual storage objects containing patient data. Enable Cloud Audit Logs.

B.

Deploy an Assured Workloads environment in multiple regions for redundancy. Utilize custom IAM roles with granular permissions. Isolate network-level data by using VPC Service Controls.

C.

Deploy an Assured Workloads environment in an approved region. Configure Access Approval for sensitive operations on patient data. Enable both Cloud Audit Logs and Access Transparency.

D.

Select a standard Google Cloud region. Restrict access to patient data based on user location and job function by using Access Context Manager. Enable both Cloud Audit Logging and Access Transparency.

Question # 25

A security audit uncovered several inconsistencies in your project's Identity and Access Management (IAM) configuration. Some service accounts have overly permissive roles, and a few external collaborators have more access than necessary. You need to gain detailed visibility into changes to IAM policies, user activity, service account behavior, and access to sensitive projects. What should you do?

A.

Enable the Metrics Explorer in Cloud Monitoring to follow service account authentication events and build alerts based on them.

B.

Use Cloud Audit Logs. Create log export sinks to send these logs to a security information and event management (SIEM) solution for correlation with other event sources.

C.

Configure Google Cloud Functions to be triggered by changes to IAM policies. Analyze changes by using the policy simulator, send alerts upon risky modifications, and store event details.

D.

Deploy the OS Config Management agent to your VMs. Use OS Config Management to create patch management jobs and monitor system modifications.

Question # 26
A.

Configure IAM permissions on individual Model Garden resources to restrict access to specific models.

B.

Regularly audit user activity logs in Vertex AI to identify and revoke access to unapproved models.

C.

Train custom models within your Vertex AI project and restrict user access to these models.

D.

Implement an organization policy that restricts the vertexai.allowedModels constraint.

Question # 27

You are setting up a CI/CD pipeline to deploy containerized applications to your production clusters on Google Kubernetes Engine (GKE). You need to prevent containers with known vulnerabilities from being deployed. You have the following requirements for your solution:

Must be cloud-native

Must be cost-efficient

Minimize operational overhead

How should you accomplish this? (Choose two.)

A.

Create a Cloud Build pipeline that will monitor changes to your container templates in a Cloud Source Repositories repository. Add a step to analyze Container Analysis results before allowing the build to continue.

B.

Use a Cloud Function triggered by log events in Google Cloud's operations suite to automatically scan your container images in Container Registry.

C.

Use a cron job on a Compute Engine instance to scan your existing repositories for known vulnerabilities and raise an alert if a non-compliant container image is found.

D.

Deploy Jenkins on GKE and configure a CI/CD pipeline to deploy your containers to Container Registry. Add a step to validate your container images before deploying your container to the cluster.

E.

In your CI/CD pipeline, add an attestation on your container image when no vulnerabilities have been found. Use a Binary Authorization policy to block deployments of containers with no attestation in your cluster.

Question # 28

When creating a secure container image, which two items should you incorporate into the build if possible? (Choose two.)

A.

Ensure that the app does not run as PID 1.

B.

Package a single app as a container.

C.

Remove any unnecessary tools not needed by the app.

D.

Use public container images as a base image for the app.

E.

Use many container image layers to hide sensitive information.

Question # 29

Your security team wants to implement a defense-in-depth approach to protect sensitive data stored in a Cloud Storage bucket. Your team has the following requirements:

    The Cloud Storage bucket in Project A can only be readable from Project B.

    The Cloud Storage bucket in Project A cannot be accessed from outside the network.

    Data in the Cloud Storage bucket cannot be copied to an external Cloud Storage bucket.

What should the security team do?

A.

Enable domain restricted sharing in an organization policy, and enable uniform bucket-level access on the Cloud Storage bucket.

B.

Enable VPC Service Controls, create a perimeter around Projects A and B, and include the Cloud Storage API in the Service Perimeter configuration.

C.

Enable Private Access in both Project A and B's networks with strict firewall rules that allow communication between the networks.

D.

Enable VPC Peering between Project A and B's networks with strict firewall rules that allow communication between the networks.

Question # 30

You have the following resource hierarchy. There is an organization policy at each node in the hierarchy as shown. Which load balancer types are denied in VPC A?

(Diagram: resource hierarchy with an organization policy at each node; image not shown.)

A.

All load balancer types are denied in accordance with the global node’s policy.

B.

INTERNAL_TCP_UDP and INTERNAL_HTTP_HTTPS are denied in accordance with the folder’s policy.

C.

EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY are denied in accordance with the project’s policy.

D.

EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY, INTERNAL_TCP_UDP, and INTERNAL_HTTP_HTTPS are denied in accordance with the folder and project’s policies.

Question # 31

Your team needs to make sure that their backend database can only be accessed by the frontend application and no other instances on the network.

How should your team design this network?

A.

Create an ingress firewall rule to allow access only from the application to the database using firewall tags.

B.

Create a different subnet for the frontend application and database to ensure network isolation.

C.

Create two VPC networks, and connect the two networks using Cloud VPN gateways to ensure network isolation.

D.

Create two VPC networks, and connect the two networks using VPC peering to ensure network isolation.

Question # 32

You recently joined the networking team supporting your company's Google Cloud implementation. You are tasked with familiarizing yourself with the firewall rules configuration and providing recommendations based on your networking and Google Cloud experience. What product should you recommend to detect firewall rules that are overlapped by attributes from other firewall rules with higher or equal priority?

A.

Security Command Center

B.

Firewall Rules Logging

C.

VPC Flow Logs

D.

Firewall Insights

Question # 33

Your company has been creating users manually in Cloud Identity to provide access to Google Cloud resources. Due to continued growth of the environment, you want to authorize the Google Cloud Directory Sync (GCDS) instance and integrate it with your on-premises LDAP server to onboard hundreds of users. You are required to:

Replicate user and group lifecycle changes from the on-premises LDAP server in Cloud Identity.

Disable any manually created users in Cloud Identity.

You have already configured the LDAP search attributes to include the users and security groups in scope for Google Cloud. What should you do next to complete this solution?

A.

1. Configure the option to suspend domain users not found in LDAP. 2. Set up a recurring GCDS task.

B.

1. Configure the option to delete domain users not found in LDAP. 2. Run GCDS after user and group lifecycle changes.

C.

1. Configure the LDAP search attributes to exclude manually created Cloud Identity users not found in LDAP. 2. Set up a recurring GCDS task.

D.

1. Configure the LDAP search attributes to exclude manually created Cloud Identity users not found in LDAP. 2. Run GCDS after user and group lifecycle changes.

Question # 34

A customer has an analytics workload running on Compute Engine that should have limited internet access.

Your team created an egress firewall rule to deny (priority 1000) all traffic to the internet.

The Compute Engine instances now need to reach out to the public repository to get security updates. What should your team do?

A.

Create an egress firewall rule to allow traffic to the CIDR range of the repository with a priority greater than 1000.

B.

Create an egress firewall rule to allow traffic to the CIDR range of the repository with a priority less than 1000.

C.

Create an egress firewall rule to allow traffic to the hostname of the repository with a priority greater than 1000.

D.

Create an egress firewall rule to allow traffic to the hostname of the repository with a priority less than 1000.
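
For context, VPC firewall rules are evaluated by numeric priority, where a lower number means higher precedence, and rules match CIDR ranges rather than hostnames. Below is a hedged sketch of an egress allow rule that takes precedence over a priority-1000 deny rule; the project, network, and CIDR values are placeholders.

```python
# Sketch: egress allow rule with priority 900 so it is evaluated before the
# existing deny-all rule at priority 1000.
from google.cloud import compute_v1

def allow_repo_egress(project: str, network: str, repo_cidr: str) -> None:
    rule = compute_v1.Firewall(
        name="allow-egress-security-updates",
        network=f"projects/{project}/global/networks/{network}",
        direction="EGRESS",
        priority=900,  # lower number = higher precedence than the priority-1000 deny rule
        destination_ranges=[repo_cidr],
        allowed=[compute_v1.Allowed(I_p_protocol="tcp", ports=["80", "443"])],
    )
    operation = compute_v1.FirewallsClient().insert(
        project=project, firewall_resource=rule
    )
    operation.result()  # wait for the rule to be created
```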

Question # 35

Your organization uses Google Workspace as the primary identity provider for Google Cloud. Users in your organization initially created their passwords. You need to improve password security due to a recent security event. What should you do?

A.

Audit user activity for suspicious logins by using the audit and investigation tool.

B.

Conduct a security awareness training session, and set the password expiration settings to require more frequent updates.

C.

Check the Enforce strong password box, and set the password expiration to occur more frequently.

D.

Check the Enforce strong password box, and check Enforce password policy at the next sign-in.

Question # 36

You are onboarding new users into Cloud Identity and discover that some users have created consumer user accounts using the corporate domain name. How should you manage these consumer user accounts with Cloud Identity?

A.

Use Google Cloud Directory Sync to convert the unmanaged user accounts.

B.

Create a new managed user account for each consumer user account.

C.

Use the transfer tool for unmanaged user accounts.

D.

Configure single sign-on using a customer's third-party provider.

Question # 37

You are troubleshooting access denied errors between Compute Engine instances connected to a Shared VPC and BigQuery datasets. The datasets reside in a project protected by a VPC Service Controls perimeter. What should you do?

A.

Add the host project containing the Shared VPC to the service perimeter.

B.

Add the service project where the Compute Engine instances reside to the service perimeter.

C.

Create a service perimeter between the service project where the Compute Engine instances reside and the host project that contains the Shared VPC.

D.

Create a perimeter bridge between the service project where the Compute Engine instances reside and the perimeter that contains the protected BigQuery datasets.

Question # 38

Your organization hosts a financial services application running on Compute Engine instances for a third-party company. The third-party company’s servers that will consume the application also run on Compute Engine in a separate Google Cloud organization. You need to configure a secure network connection between the Compute Engine instances. You have the following requirements:

    The network connection must be encrypted.

    The communication between servers must be over private IP addresses.

What should you do?

A.

Configure a Cloud VPN connection between your organization's VPC network and the third party's that is controlled by VPC firewall rules.

B.

Configure a VPC peering connection between your organization's VPC network and the third party's that is controlled by VPC firewall rules.

C.

Configure a VPC Service Controls perimeter around your Compute Engine instances, and provide access to the third party via an access level.

D.

Configure an Apigee proxy that exposes your Compute Engine-hosted application as an API, and is encrypted with TLS which allows access only to the third party.

Question # 39

You have noticed an increased number of phishing attacks across your enterprise user accounts. You want to implement the Google 2-Step Verification (2SV) option that uses a cryptographic signature to authenticate a user and verify the URL of the login page. Which Google 2SV option should you use?

A.

Titan Security Keys

B.

Google prompt

C.

Google Authenticator app

D.

Cloud HSM keys

Question # 40

An organization receives an increasing number of phishing emails.

Which method should be used to protect employee credentials in this situation?

A.

Multifactor Authentication

B.

A strict password policy

C.

Captcha on login pages

D.

Encrypted emails

Question # 41

After completing a security vulnerability assessment, you learned that cloud administrators leave Google Cloud CLI sessions open for days. You need to reduce the risk of attackers who might exploit these open sessions by setting these sessions to the minimum duration.

What should you do?

A.

Set the session duration for the Google session control to one hour.

B.

Set the reauthentication frequency for the Google Cloud session control to one hour.

C.

Set the organization policy constraint constraints/iam.allowServiceAccountCredentialLifetimeExtension to one hour.

D.

Set the organization policy constraint constraints/iam.serviceAccountKeyExpiryHours to one hour and inheritFromParent to false.

Question # 42

Your company’s chief information security officer (CISO) is requiring business data to be stored in specific locations due to regulatory requirements that affect the company’s global expansion plans. After working on a plan to implement this requirement, you determine the following:

    The services in scope are included in the Google Cloud data residency requirements.

    The business data remains within specific locations under the same organization.

    The folder structure can contain multiple data residency locations.

    The projects are aligned to specific locations.

You plan to use the Resource Location Restriction organization policy constraint with very granular control. At which level in the hierarchy should you set the constraint?

A.

Organization

B.

Resource

C.

Project

D.

Folder

Question # 43

Your organization's record data exists in Cloud Storage. You must retain all record data for at least seven years. This policy must be permanent.

What should you do?

A.

1. Identify buckets with record data. 2. Apply a retention policy and set it to retain data for seven years. 3. Monitor the buckets by using log-based alerts to ensure that no modifications to the retention policy occur.

B.

1. Identify buckets with record data. 2. Apply a retention policy and set it to retain data for seven years. 3. Remove any Identity and Access Management (IAM) roles that contain the storage.buckets.update permission.

C.

1. Identify buckets with record data. 2. Enable bucket policy only to ensure that data is retained. 3. Enable Bucket Lock.

D.

1. Identify buckets with record data. 2. Apply a retention policy and set it to retain data for seven years. 3. Enable Bucket Lock.
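
For context, a Cloud Storage retention policy becomes permanent once the bucket is locked, and locking is irreversible. Below is a minimal sketch with the google-cloud-storage client; the bucket name is a placeholder.

```python
# Sketch: apply a seven-year retention policy, then lock it to make it permanent.
from google.cloud import storage

SEVEN_YEARS_SECONDS = 7 * 365 * 24 * 60 * 60

client = storage.Client()
bucket = client.get_bucket("record-data-bucket")
bucket.retention_period = SEVEN_YEARS_SECONDS
bucket.patch()
# Locking is irreversible: once locked, the retention period cannot be shortened or removed.
bucket.lock_retention_policy()
```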

Question # 44

Which two security characteristics are related to the use of VPC peering to connect two VPC networks? (Choose two.)

A.

Central management of routes, firewalls, and VPNs for peered networks

B.

Non-transitive peered networks; where only directly peered networks can communicate

C.

Ability to peer networks that belong to different Google Cloud Platform organizations

D.

Firewall rules that can be created with a tag from one peered network to another peered network

E.

Ability to share specific subnets across peered networks

Question # 45

You are in charge of migrating a legacy application from your company datacenters to GCP before the current maintenance contract expires. You do not know what ports the application is using and no documentation is available for you to check. You want to complete the migration without putting your environment at risk.

What should you do?

A.

Migrate the application into an isolated project using a “Lift & Shift” approach. Enable all internal TCP traffic using VPC Firewall rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.

B.

Migrate the application into an isolated project using a “Lift & Shift” approach in a custom network. Disable all traffic within the VPC and look at the Firewall logs to determine what traffic should be allowed for the application to work properly.

C.

Refactor the application into a micro-services architecture in a GKE cluster. Disable all traffic from outside the cluster using Firewall Rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.

D.

Refactor the application into a micro-services architecture hosted in Cloud Functions in an isolated project. Disable all traffic from outside your project using Firewall Rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.

Question # 46

In a shared security responsibility model for IaaS, which two layers of the stack does the customer share responsibility for? (Choose two.)

A.

Hardware

B.

Network Security

C.

Storage Encryption

D.

Access Policies

E.

Boot

Question # 47

You plan to use a Google Cloud Armor policy to prevent common attacks such as cross-site scripting (XSS) and SQL injection (SQLi) from reaching your web application's backend. What are two requirements for using Google Cloud Armor security policies? (Choose two.)

A.

The load balancer must be an external SSL proxy load balancer.

B.

Google Cloud Armor Policy rules can only match on Layer 7 (L7) attributes.

C.

The load balancer must use the Premium Network Service Tier.

D.

The backend service's load balancing scheme must be EXTERNAL.

E.

The load balancer must be an external HTTP(S) load balancer.

Question # 48

Your organization has an application hosted in Cloud Run. You must control access to the application by using Cloud Identity-Aware Proxy (IAP) with these requirements:

Only users from the AppDev group may have access.

Access must be restricted to internal network IP addresses.

What should you do?

A.

Configure IAP to enforce multi-factor authentication (MFA) for all users and use network intrusion detection systems (NIDS) to block unauthorized access attempts.

B.

Configure firewall rules to limit access to IAP based on the AppDev group and source IP addresses.

C.

Create an access level that includes conditions for internal IP address ranges and AppDev groups. Apply this access level to the application's IAP policy.

D.

Deploy a VPN gateway and instruct the AppDev group to connect to the company network before accessing the application.

Question # 49

A customer wants to move their sensitive workloads to a Compute Engine-based cluster using Managed Instance Groups (MIGs). The jobs are bursty and must be completed quickly. They have a requirement to be able to manage and rotate the encryption keys.

Which boot disk encryption solution should you use on the cluster to meet this customer’s requirements?

A.

Customer-supplied encryption keys (CSEK)

B.

Customer-managed encryption keys (CMEK) using Cloud Key Management Service (KMS)

C.

Encryption by default

D.

Pre-encrypting files before transferring to Google Cloud Platform (GCP) for analysis

Question # 50

Your organization is using Google Workspace, Google Cloud, and a third-party SIEM. You need to export events such as user logins, successful logins, and failed logins to the SIEM. Logs need to be ingested in real time or near real time. What should you do?

A.

Create a Cloud Logging sink to export relevant authentication logs to a Pub/Sub topic for SIEM subscription.

B.

Poll Cloud Logging for authentication events using the gcloud logging read tool. Forward the events to the SIEM.

C.

Configure Google Workspace to directly send logs to the API endpoint of the third-party SIEM.

D.

Create a Cloud Storage bucket as a sink for all logs. Configure the SIEM to periodically scan the bucket for new log files.

Question # 51

You are setting up a new Cloud Storage bucket in your environment that is encrypted with a customer-managed encryption key (CMEK). The CMEK is stored in Cloud Key Management Service (KMS) in project "prj-a", and the Cloud Storage bucket will use project "prj-b". The key is backed by a Cloud Hardware Security Module (HSM) and resides in the region europe-west3. Your storage bucket will be located in the region europe-west1. When you create the bucket, you cannot access the key, and you need to troubleshoot why.

What has caused the access issue?

A.

A firewall rule prevents the key from being accessible.

B.

Cloud HSM does not support Cloud Storage.

C.

The CMEK is in a different project than the Cloud Storage bucket.

D.

The CMEK is in a different region than the Cloud Storage bucket.
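
For context, a Cloud Storage bucket can only use a Cloud KMS key whose location matches the bucket's location, and the Cloud Storage service agent also needs the Encrypter/Decrypter role on that key. Below is a minimal sketch of creating a bucket with a default CMEK in a matching region; the project, key, and bucket names are placeholders.

```python
# Sketch: creating a bucket whose default CMEK lives in the same location.
from google.cloud import storage

# Key in prj-a, bucket in prj-b; both resources use europe-west3 so the key is usable.
KMS_KEY = "projects/prj-a/locations/europe-west3/keyRings/my-ring/cryptoKeys/my-key"

client = storage.Client(project="prj-b")
bucket = client.bucket("sensitive-data-bucket")
bucket.default_kms_key_name = KMS_KEY
# A location mismatch between the key ring and the bucket makes the key unusable.
client.create_bucket(bucket, location="europe-west3")
# Separately, the Cloud Storage service agent of prj-b must hold
# roles/cloudkms.cryptoKeyEncrypterDecrypter on the key.
```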

Question # 52

Your organization operates Virtual Machines (VMs) with only private IPs in the Virtual Private Cloud (VPC), with internet access through Cloud NAT. Every day, you must patch all VMs with critical OS updates and provide summary reports.

What should you do?

A.

Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM and execute OS-specific update commands. Configure a Cloud Scheduler job to apply critical patches daily.

B.

Ensure that VM Manager is installed and running on the VMs. In the OS patch management service, configure the patch jobs to apply critical patches daily.

C.

Assign public IPs to the VMs. Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM, and configure a daily cron job that runs OS updates at night during low-activity periods.

D.

Copy the latest patches to a Cloud Storage bucket. Log in to each VM, download the patches from the bucket, and install them.

Question # 53

A manager wants to start retaining security event logs for 2 years while minimizing costs. You write a filter to select the appropriate log entries.

Where should you export the logs?

A.

BigQuery datasets

B.

Cloud Storage buckets

C.

Stackdriver Logging

D.

Cloud Pub/Sub topics
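
For context, exporting filtered log entries is done with a Cloud Logging sink that points at a destination such as a Cloud Storage bucket. Below is a hedged sketch with the google-cloud-logging client; the sink name, filter, and bucket are placeholders.

```python
# Sketch: project-level log sink exporting security events to Cloud Storage
# for low-cost, long-term retention.
from google.cloud import logging

client = logging.Client()
sink = client.sink(
    "security-events-archive",
    filter_='logName:"cloudaudit.googleapis.com" AND severity>=NOTICE',  # placeholder filter
    destination="storage.googleapis.com/security-event-logs-archive",
)
if not sink.exists():
    sink.create()
    # The destination bucket must also grant the sink's writer identity
    # the Storage Object Creator role so exports can be written.
```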

Question # 54

Your company has deployed an application on Compute Engine. The application is accessible by clients on port 587. You need to balance the load between the different instances running the application. The connection should be secured using TLS, and terminated by the Load Balancer.

What type of Load Balancing should you use?

A.

Network Load Balancing

B.

HTTP(S) Load Balancing

C.

TCP Proxy Load Balancing

D.

SSL Proxy Load Balancing

Question # 55

Your organization strives to be a market leader in software innovation. You provided a large number of Google Cloud environments so developers can test the integration of Gemini in Vertex AI into their existing applications or create new projects. Your organization has 200 developers and a five-person security team. You must prevent and detect violations of proper security policies across the Google Cloud environments. What should you do? (Choose two.)

A.

Apply a predefined AI-recommended security posture template for Gemini in Vertex AI in the Security Command Center Enterprise or Premium tiers.

B.

Publish internal policies and clear guidelines to securely develop applications.

C.

Implement least-privilege Identity and Access Management roles to prevent misconfigurations.

D.

Apply organization policy constraints. Detect and monitor drift by using Security Health Analytics.

E.

Use Cloud Logging to create log filters to detect misconfigurations. Trigger Cloud Run functions to remediate misconfigurations.

Question # 56

Your company runs a website that will store PII on Google Cloud Platform. To comply with data privacy regulations, this data can only be stored for a specific amount of time and must be fully deleted after this specific period. Data that has not yet reached the time period should not be deleted. You want to automate the process of complying with this regulation.

What should you do?

A.

Store the data in a single Persistent Disk, and delete the disk at expiration time.

B.

Store the data in a single BigQuery table and set the appropriate table expiration time.

C.

Store the data in a Cloud Storage bucket, and configure the bucket's Object Lifecycle Management feature.

D.

Store the data in a single BigTable table and set an expiration time on the column families.
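
For context, Cloud Storage Object Lifecycle Management can delete objects automatically once they reach a given age, while leaving newer objects untouched. Below is a minimal sketch with the google-cloud-storage client; the bucket name and the 365-day age are placeholders.

```python
# Sketch: lifecycle rule that deletes objects once they reach the regulated age.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("pii-data-bucket")
bucket.add_lifecycle_delete_rule(age=365)  # days since object creation; placeholder value
bucket.patch()  # persist the lifecycle configuration on the bucket
```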

Question # 57

You are the project owner for a regulated workload that runs in a project you own and manage as an Identity and Access Management (IAM) admin. For an upcoming audit, you need to provide access reviews evidence. Which tool should you use?

A.

Policy Troubleshooter

B.

Policy Analyzer

C.

IAM Recommender

D.

Policy Simulator

Question # 58

Your organization is using Active Directory and wants to configure Security Assertion Markup Language (SAML). You must set up and enforce single sign-on (SSO) for all users.

What should you do?

A.

1. Manage SAML profile assignments. 2. Enable OpenID Connect (OIDC) in your Active Directory (AD) tenant. 3. Verify the domain.

B.

1. Create a new SAML profile. 2. Upload the X.509 certificate. 3. Enable the change password URL. 4. Configure the Entity ID and ACS URL in your IdP.

C.

1. Create a new SAML profile. 2. Populate the sign-in and sign-out page URLs. 3. Upload the X.509 certificate. 4. Configure the Entity ID and ACS URL in your IdP.

D.

1. Configure prerequisites for OpenID Connect (OIDC) in your Active Directory (AD) tenant. 2. Verify the AD domain. 3. Decide which users should use SAML. 4. Assign the pre-configured profile to the selected organizational units (OUs) and groups.

Question # 59

Your organization’s Google Cloud VMs are deployed via an instance template that configures them with a public IP address in order to host web services for external users. The VMs reside in a service project that is attached to a host (VPC) project containing one custom Shared VPC for the VMs. You have been asked to reduce the exposure of the VMs to the internet while continuing to service external users. You have already recreated the instance template without a public IP address configuration to launch the managed instance group (MIG). What should you do?

A.

Deploy a Cloud NAT Gateway in the service project for the MIG.

B.

Deploy a Cloud NAT Gateway in the host (VPC) project for the MIG.

C.

Deploy an external HTTP(S) load balancer in the service project with the MIG as a backend.

D.

Deploy an external HTTP(S) load balancer in the host (VPC) project with the MIG as a backend.

Question # 60

A customer wants to run a batch processing system on VMs and store the output files in a Cloud Storage bucket. The networking and security teams have decided that no VMs may reach the public internet.

How should this be accomplished?

A.

Create a firewall rule to block internet traffic from the VM.

B.

Provision a NAT Gateway to access the Cloud Storage API endpoint.

C.

Enable Private Google Access on the VPC.

D.

Mount a Cloud Storage bucket as a local filesystem on every VM.

Question # 61

You are consulting with a client that requires end-to-end encryption of application data (including data in transit, data in use, and data at rest) within Google Cloud. Which options should you utilize to accomplish this? (Choose two.)

A.

External Key Manager

B.

Customer-supplied encryption keys

C.

Hardware Security Module

D.

Confidential Computing and Istio

E.

Client-side encryption

Question # 62

Your organization wants full control of the keys used to encrypt data at rest in their Google Cloud environments. Keys must be generated and stored outside of Google and must integrate with many Google services, including BigQuery.

What should you do?

A.

Create a Cloud Key Management Service (KMS) key with imported key material. Wrap the key for protection during import. Import the key generated on a trusted system into Cloud KMS.

B.

Create a KMS key that is stored on a Google-managed FIPS 140-2 Level 3 Hardware Security Module (HSM). Manage the Identity and Access Management (IAM) permission settings, and set up the key rotation period.

C.

Use Cloud External Key Management (EKM) that integrates with an external Hardware Security Module (HSM) system from supported vendors.

D.

Use customer-supplied encryption keys (CSEK) with keys generated on trusted external systems. Provide the raw CSEK as part of the API call.

Question # 63

A website design company recently migrated all customer sites to App Engine. Some sites are still in progress and should only be visible to customers and company employees from any location.

Which solution will restrict access to the in-progress sites?

A.

Upload an .htaccess file containing the customer and employee user accounts to App Engine.

B.

Create an App Engine firewall rule that allows access from the customer and employee networks and denies all other traffic.

C.

Enable Cloud Identity-Aware Proxy (IAP), and allow access to a Google Group that contains the customer and employee user accounts.

D.

Use Cloud VPN to create a VPN connection between the relevant on-premises networks and the company’s GCP Virtual Private Cloud (VPC) network.

Question # 64

You discovered that sensitive personally identifiable information (PII) is being ingested to your Google Cloud environment in the daily ETL process from an on-premises environment to your BigQuery datasets. You need to redact this data to obfuscate the PII, but need to re-identify it for data analytics purposes. Which components should you use in your solution? (Choose two.)

A.

Secret Manager

B.

Cloud Key Management Service

C.

Cloud Data Loss Prevention with cryptographic hashing

D.

Cloud Data Loss Prevention with automatic text redaction

E.

Cloud Data Loss Prevention with deterministic encryption using AES-SIV
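
For context, Sensitive Data Protection (Cloud DLP) supports reversible de-identification through deterministic encryption (AES-SIV) with a key wrapped by Cloud KMS, so obfuscated values can later be re-identified. Below is a hedged sketch of the de-identify call; the wrapped key, KMS key name, info types, and surrogate name are placeholders.

```python
# Sketch: de-identifying text with deterministic encryption (AES-SIV) via Cloud DLP.
from google.cloud import dlp_v2

def deidentify(project_id: str, text: str, wrapped_key: bytes, kms_key_name: str) -> str:
    client = dlp_v2.DlpServiceClient()
    response = client.deidentify_content(
        request={
            "parent": f"projects/{project_id}",
            "item": {"value": text},
            "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},  # placeholder info type
            "deidentify_config": {
                "info_type_transformations": {
                    "transformations": [
                        {
                            "primitive_transformation": {
                                "crypto_deterministic_config": {
                                    "crypto_key": {
                                        "kms_wrapped": {
                                            "wrapped_key": wrapped_key,       # key wrapped by Cloud KMS
                                            "crypto_key_name": kms_key_name,  # KMS key used for wrapping
                                        }
                                    },
                                    "surrogate_info_type": {"name": "EMAIL_TOKEN"},  # placeholder surrogate
                                }
                            }
                        }
                    ]
                }
            },
        }
    )
    return response.item.value  # text with PII replaced by reversible surrogates
```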

Question # 65

Your team needs to configure their Google Cloud Platform (GCP) environment so they can centralize the control over networking resources like firewall rules, subnets, and routes. They also have an on-premises environment where resources need access back to the GCP resources through a private VPN connection. The networking resources will need to be controlled by the network security team.

Which type of networking design should your team use to meet these requirements?

A.

Shared VPC Network with a host project and service projects

B.

Grant Compute Admin role to the networking team for each engineering project

C.

VPC peering between all engineering projects using a hub and spoke model

D.

Cloud VPN Gateway between all engineering projects using a hub and spoke model

Question # 66

Your organization enforces a custom organization policy that disables the use of Compute Engine VM instances with external IP addresses. However, a regulated business unit requires an exception to temporarily use external IPs for a third-party audit process. The regulated business workload must comply with least privilege principles and minimize policy drift. You need to ensure secure policy management and proper handling. What should you do?

A.

Create a folder. Apply the restrictive organization policy for non-regulated business workloads in the folder. Place the regulated business workload in that folder.

B.

Apply the custom organization policy at the organization level to restrict external IPs. Move the regulated business workload to a separate folder. Override the policy at that folder level.

C.

Create an IAM custom role with permissions to bypass organization policies. Assign the custom role to the regulated business team for the specific project.

D.

Modify the custom organization policy at the organization level to allow external IPs for all projects. Configure VPC firewall rules to restrict egress traffic except for the regulated business workload.

Question # 67

You need to implement an encryption-at-rest strategy that protects sensitive data and reduces key management complexity for non-sensitive data. Your solution has the following requirements:

    Schedule key rotation for sensitive data.

    Control which region the encryption keys for sensitive data are stored in.

    Minimize the latency to access encryption keys for both sensitive and non-sensitive data.

What should you do?

A.

Encrypt non-sensitive data and sensitive data with Cloud External Key Manager.

B.

Encrypt non-sensitive data and sensitive data with Cloud Key Management Service.

C.

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud External Key Manager.

D.

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud Key Management Service.

Question # 68

You are a member of the security team at an organization. Your team has a single GCP project with credit card payment processing systems alongside web applications and data processing systems. You want to reduce the scope of systems subject to PCI audit standards.

What should you do?

A.

Use multi-factor authentication for admin access to the web application.

B.

Use only applications certified compliant with PA-DSS.

C.

Move the cardholder data environment into a separate GCP project.

D.

Use VPN for all connections between your office and cloud environments.

Question # 69

Your organization recently deployed a new application on Google Kubernetes Engine. You need to deploy a solution to protect the application. The solution has the following requirements:

Scans must run at least once per week

Must be able to detect cross-site scripting vulnerabilities

Must be able to authenticate using Google accounts

Which solution should you use?

A.

Google Cloud Armor

B.

Web Security Scanner

C.

Security Health Analytics

D.

Container Threat Detection

Question # 70

You are in charge of creating a new Google Cloud organization for your company. Which two actions should you take when creating the super administrator accounts? (Choose two.)

A.

Create an access level in the Google Admin console to prevent super admin from logging in to Google Cloud.

B.

Disable any Identity and Access Management (IAM) roles for super admin at the organization level in the Google Cloud Console.

C.

Use a physical token to secure the super admin credentials with multi-factor authentication (MFA).

D.

Use a private connection to create the super admin accounts to avoid sending your credentials over the Internet.

E.

Provide non-privileged identities to the super admin users for their day-to-day activities.

Question # 71

Which Identity-Aware Proxy role should you grant to an Identity and Access Management (IAM) user to access HTTPS resources?

A.

Security Reviewer

B.

IAP-Secured Tunnel User

C.

IAP-Secured Web App User

D.

Service Broker Operator

Question # 72

Your organization is deploying a serverless web application on Cloud Run that must be publicly accessible over HTTPS. To meet security requirements, you need to terminate TLS at the edge, apply threat mitigation, and prepare for geo-based access restrictions. What should you do?

A.

Make the Cloud Run service public by enabling allUsers access. Configure Identity-Aware Proxy (IAP) for authentication and IP-based access control. Use custom SSL certificates for HTTPS.

B.

Assign a custom domain to the Cloud Run service. Enable HTTPS. Configure IAM to allow allUsers to invoke the service. Use firewall rules and VPC Service Controls for geo-based restriction and traffic filtering.

C.

Deploy an external HTTP(S) load balancer with a serverless NEG that points to the Cloud Run service. Use a Google-managed certificate for TLS termination. Configure a Cloud Armor policy with geo-based access control.

D.

Create a Cloud DNS public zone for the Cloud Run URL. Bind a static IP to the service. Use VPC firewall rules to restrict incoming traffic based on IP ranges and threat signatures.

Question # 73

Your organization recently activated the Security Command Center (SCC) Standard tier. There are a few Cloud Storage buckets that were accidentally made accessible to the public. You need to investigate the impact of the incident and remediate it.

What should you do?

A.

1. Remove the Identity and Access Management (IAM) bindings granting access to allUsers from the buckets. 2. Apply the organization policy storage.uniformBucketLevelAccess to prevent regressions. 3. Query the data access logs to report on unauthorized access.

B.

1. Change bucket permissions to limit access. 2. Query the data access audit logs for any unauthorized access to the buckets. 3. After the misconfiguration is corrected, mute the finding in the Security Command Center.

C.

1. Change permissions to limit access for authorized users. 2. Enforce a VPC Service Controls perimeter around all the production projects to immediately stop any unauthorized access. 3. Review the administrator activity audit logs to report on any unauthorized access.

D.

1. Change the bucket permissions to limit access. 2. Query the bucket usage logs to report on unauthorized access to the data. 3. Enforce the organization policy storage.publicAccessPrevention to avoid regressions.

Question # 74

In an effort to make your company's messaging app comply with FIPS 140-2, a decision was made to use GCP compute and network services. The messaging app architecture includes a Managed Instance Group (MIG) that controls a cluster of Compute Engine instances. The instances use Local SSDs for data caching and UDP for instance-to-instance communications. The app development team is willing to make any changes necessary to comply with the standard.

Which options should you recommend to meet the requirements?

A.

Encrypt all cache storage and VM-to-VM communication using the BoringCrypto module.

B.

Set Disk Encryption on the Instance Template used by the MIG to customer-managed key and use BoringSSL for all data transit between instances.

C.

Change the app instance-to-instance communications from UDP to TCP and enable BoringSSL on clients' TLS connections.

D.

Set Disk Encryption on the Instance Template used by the MIG to Google-managed Key and use BoringSSL library on all instance-to-instance communications.

Question # 75

You need to use Cloud External Key Manager to create an encryption key to encrypt specific BigQuery data at rest in Google Cloud. Which steps should you do first?

A.

1. Create or use an existing key with a unique uniform resource identifier (URI) in your Google Cloud project. 2. Grant your Google Cloud project access to a supported external key management partner system.

B.

1. Create or use an existing key with a unique uniform resource identifier (URI) in Cloud Key Management Service (Cloud KMS). 2. In Cloud KMS, grant your Google Cloud project access to use the key.

C.

1. Create or use an existing key with a unique uniform resource identifier (URI) in a supported external key management partner system. 2. In the external key management partner system, grant access for this key to use your Google Cloud project.

D.

1. Create an external key with a unique uniform resource identifier (URI) in Cloud Key Management Service (Cloud KMS). 2. In Cloud KMS, grant your Google Cloud project access to use the key.

Question # 76

Your organization's application is being integrated with a partner application that requires read access to customer data to process customer orders. The customer data is stored in one of your Cloud Storage buckets. You have evaluated different options and determined that this activity requires the use of service account keys. You must advise the partner on how to minimize the risk of a compromised service account key causing a loss of data. What should you advise the partner to do?

A.

Define a VPC Service Controls perimeter, and restrict the Cloud Storage API. Add an ingress rule to the perimeter to allow access to the Cloud Storage API for the service account from outside of the perimeter.

B.

Scan the Cloud Storage bucket with Sensitive Data Protection when new data is added, and automatically mask all customer data.

C.

Ensure that all data for the application that is accessed through the relevant service accounts is encrypted at rest by using customer-managed encryption keys (CMEK).

D.

Implement a secret management service. Configure the service to frequently rotate the service account key. Configure proper access control to the key, and restrict who can create service account keys.

Question # 77

A customer has 300 engineers. The company wants to grant different levels of access and efficiently manage IAM permissions between users in the development and production environment projects.

Which two steps should the company take to meet these requirements? (Choose two.)

A.

Create a project with multiple VPC networks for each environment.

B.

Create a folder for each development and production environment.

C.

Create a Google Group for the Engineering team, and assign permissions at the folder level.

D.

Create an Organizational Policy constraint for each folder environment.

E.

Create projects for each environment, and grant IAM rights to each engineering user.
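
For reference, a minimal sketch of the folder-level group grant mentioned in option C, assuming the google-cloud-resource-manager (v3) Python client; the folder ID, role, and group address are placeholders. The point of the pattern is that one binding on a folder replaces hundreds of per-user grants on individual projects.

    from google.cloud import resourcemanager_v3
    from google.iam.v1 import policy_pb2

    client = resourcemanager_v3.FoldersClient()
    folder = "folders/123456789"  # placeholder folder ID for one environment

    # Read-modify-write of the folder's IAM policy.
    policy = client.get_iam_policy(request={"resource": folder})
    policy.bindings.append(
        policy_pb2.Binding(
            role="roles/viewer",  # placeholder role
            members=["group:engineering@example.com"],  # placeholder group
        )
    )
    client.set_iam_policy(request={"resource": folder, "policy": policy})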

Question # 78

A customer is collaborating with another company to build an application on Compute Engine. The customer is building the application tier in their GCP Organization, and the other company is building the storage tier in a different GCP Organization. This is a 3-tier web application. Communication between portions of the application must not traverse the public internet by any means.

Which connectivity option should be implemented?

A.

VPC peering

B.

Cloud VPN

C.

Cloud Interconnect

D.

Shared VPC

Question # 79

Your financial services company operates under a strict regulatory framework that requires comprehensive, immutable audit trails for all administrative and data access activity, with the data retained for seven years. Your current logging is fragmented across individual projects. You need to establish a centralized, tamper-proof, long-term logging solution that is accessible for audits. What should you do?

A.

Implement Pub/Sub to stream all audit logs from each project in real-time to an external Security Information and Event Management (SIEM) for long-term analysis.

B.

Establish organization-level Cloud Logging sinks to export Cloud Audit Logs to a dedicated Cloud Storage bucket with object retention lock.

C.

Enable Security Command Center across the organization to gain centralized visibility into threats and manage compliance posture for all Google Cloud projects.

D.

Individually configure Cloud Audit Logs for all Google Cloud services in each project. Store the logs in regional Cloud Logging buckets with 30-day retention policies.
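
For reference, a minimal sketch of the storage side of option B (the immutability and seven-year retention part), assuming the google-cloud-storage Python client; the bucket name and location are placeholders. The organization-level log sink that exports into this bucket would be configured separately.

    from google.cloud import storage

    client = storage.Client()
    bucket = client.create_bucket("audit-log-archive-example", location="us")  # placeholders

    # Seven-year retention, expressed in seconds.
    bucket.retention_period = 7 * 365 * 24 * 60 * 60
    bucket.patch()

    # Locking is irreversible: once locked, the retention period cannot be
    # shortened or removed, which is what makes the archive tamper-proof.
    bucket.lock_retention_policy()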

Question # 80

You are responsible for managing your company’s identities in Google Cloud. Your company enforces 2-Step Verification (2SV) for all users. You need to reset a user’s access, but the user lost their second factor for 2SV. You want to minimize risk. What should you do?

A.

On the Google Admin console, select the appropriate user account, and generate a backup code to allow the user to sign in. Ask the user to update their second factor.

B.

On the Google Admin console, temporarily disable the 2SV requirements for all users. Ask the user to log in and add their new second factor to their account. Re-enable the 2SV requirement for all users.

C.

On the Google Admin console, select the appropriate user account, and temporarily disable 2SV for this account. Ask the user to update their second factor, and then re-enable 2SV for this account.

D.

On the Google Admin console, use a super administrator account to reset the user account's credentials. Ask the user to update their credentials after their first login.

Question # 81

You are tasked with exporting and auditing security logs for login activity events for Google Cloud console and API calls that modify configurations to Google Cloud resources. Your export must meet the following requirements:

Export related logs for all projects in the Google Cloud organization.

Export logs in near real-time to an external SIEM.

What should you do? (Choose two.)

A.

Create a Log Sink at the organization level with a Pub/Sub destination.

B.

Create a Log Sink at the organization level with the includeChildren parameter, and set the destination to a Pub/Sub topic.

C.

Enable Data Access audit logs at the organization level to apply to all projects.

D.

Enable Google Workspace audit logs to be shared with Google Cloud in the Admin Console.

E.

Ensure that the SIEM processes the AuthenticationInfo field in the audit log entry to gather identity information.
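
For reference, a minimal sketch of an organization-level sink with includeChildren set, routing to a Pub/Sub topic for a SIEM to consume (the mechanism named in option B), assuming the google-cloud-logging Python client; the organization ID, topic, and filter are placeholders.

    from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
    from google.cloud.logging_v2.types import LogSink

    client = ConfigServiceV2Client()

    sink = LogSink(
        name="org-audit-to-siem",  # placeholder sink name
        destination="pubsub.googleapis.com/projects/my-project/topics/audit-logs",  # placeholder topic
        filter='logName:"cloudaudit.googleapis.com"',  # placeholder audit-log filter
        include_children=True,  # apply to every project under the organization
    )

    created = client.create_sink(parent="organizations/123456789012", sink=sink)
    # The returned writer identity must be granted Pub/Sub Publisher on the topic.
    print("Writer identity:", created.writer_identity)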

Question # 82

A large e-retailer is moving to Google Cloud Platform with its ecommerce website. The company wants to ensure payment information is encrypted between the customer’s browser and GCP when customers check out online.

What should they do?

A.

Configure an SSL Certificate on an L7 Load Balancer and require encryption.

B.

Configure an SSL Certificate on a Network TCP Load Balancer and require encryption.

C.

Configure the firewall to allow inbound traffic on port 443, and block all other inbound traffic.

D.

Configure the firewall to allow outbound traffic on port 443, and block all other outbound traffic.

Question # 83

You are backing up application logs to a shared Cloud Storage bucket that is accessible to both the administrator and analysts. Analysts should not have access to logs that contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible to the administrator. What should you do?

A.

Upload the logs to both the shared bucket and the bucket with PII that is only accessible to the administrator. Use the Cloud Data Loss Prevention API to create a job trigger. Configure the trigger to delete any files that contain PII from the shared bucket.

B.

On the shared bucket, configure Object Lifecycle Management to delete objects that contain PII.

C.

On the shared bucket, configure a Cloud Storage trigger that is only triggered when PII is uploaded. Use Cloud Functions to capture the trigger and delete the files that contain PII.

D.

Use Pub/Sub and Cloud Functions to trigger a Cloud Data Loss Prevention scan every time a file is uploaded to the administrator's bucket. If the scan does not detect PII, have the function move the objects into the shared Cloud Storage bucket.
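
For reference, a minimal sketch of the flow described in option D, assuming the google-cloud-dlp and google-cloud-storage Python clients; the bucket names, info types, and function name are placeholders, and in practice this logic would run inside a Cloud Function invoked by the upload trigger.

    from google.cloud import dlp_v2, storage

    def move_if_clean(project_id: str, blob_name: str) -> None:
        storage_client = storage.Client()
        admin_bucket = storage_client.bucket("admin-logs-example")    # placeholder
        shared_bucket = storage_client.bucket("shared-logs-example")  # placeholder

        blob = admin_bucket.blob(blob_name)
        content = blob.download_as_text()

        # Inspect the uploaded log file for a few example PII info types.
        dlp = dlp_v2.DlpServiceClient()
        response = dlp.inspect_content(
            request={
                "parent": f"projects/{project_id}",
                "inspect_config": {
                    "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
                },
                "item": {"value": content},
            }
        )

        if not response.result.findings:
            # No PII detected: copy into the analyst-visible bucket, then delete
            # the original so the object is not retained twice.
            admin_bucket.copy_blob(blob, shared_bucket, blob_name)
            blob.delete()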

Question # 84

What are the steps to encrypt data using envelope encryption?

A.

Generate a data encryption key (DEK) locally. Use a key encryption key (KEK) to wrap the DEK. Encrypt data with the KEK. Store the encrypted data and the wrapped KEK.

B.

Generate a key encryption key (KEK) locally. Use the KEK to generate a data encryption key (DEK). Encrypt data with the DEK. Store the encrypted data and the wrapped DEK.

C.

Generate a data encryption key (DEK) locally. Encrypt data with the DEK. Use a key encryption key (KEK) to wrap the DEK. Store the encrypted data and the wrapped DEK.

D.

Generate a key encryption key (KEK) locally. Generate a data encryption key (DEK) locally. Encrypt data with the KEK. Store the encrypted data and the wrapped DEK.
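
For reference, a minimal sketch of the envelope-encryption order described in option C, using the cryptography package purely for illustration; in Google Cloud the KEK would normally live in Cloud KMS rather than be generated locally as it is here.

    from cryptography.fernet import Fernet

    # 1. Generate a data encryption key (DEK) locally.
    dek = Fernet.generate_key()

    # 2. Encrypt the data with the DEK.
    ciphertext = Fernet(dek).encrypt(b"customer record")

    # 3. Wrap (encrypt) the DEK with a key encryption key (KEK).
    kek = Fernet.generate_key()          # stand-in for a KMS-held key
    wrapped_dek = Fernet(kek).encrypt(dek)

    # 4. Store the encrypted data together with the wrapped DEK; only the KEK
    #    is needed later to unwrap the DEK and decrypt the data.
    stored = {"ciphertext": ciphertext, "wrapped_dek": wrapped_dek}

    # Decryption reverses the steps: unwrap the DEK with the KEK, then decrypt.
    recovered_dek = Fernet(kek).decrypt(stored["wrapped_dek"])
    plaintext = Fernet(recovered_dek).decrypt(stored["ciphertext"])
    assert plaintext == b"customer record"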

Question # 85

You are creating an internal App Engine application that needs to access a user’s Google Drive on the user’s behalf. Your company does not want to rely on the current user’s credentials. It also wants to follow Google-recommended practices.

What should you do?

A.

Create a new Service account, and give all application users the role of Service Account User.

B.

Create a new Service account, and add all application users to a Google Group. Give this group the role of Service Account User.

C.

Use a dedicated G Suite Admin account, and authenticate the application’s operations with these G Suite credentials.

D.

Create a new service account, and grant it G Suite domain-wide delegation. Have the application use it to impersonate the user.
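
For reference, a minimal sketch of the impersonation pattern in option D, assuming the google-auth and google-api-python-client libraries; the key file path, scope, and user email are placeholders, and the sketch assumes the domain admin has already authorized the service account's client ID for the Drive scope (domain-wide delegation).

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]  # placeholder scope

    credentials = service_account.Credentials.from_service_account_file(
        "delegated-sa-key.json", scopes=SCOPES  # placeholder key file
    )

    # with_subject() is what makes the call run on the user's behalf.
    delegated = credentials.with_subject("employee@example.com")  # placeholder user

    drive = build("drive", "v3", credentials=delegated)
    files = drive.files().list(pageSize=10).execute()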

Question # 86

You are responsible for the operation of your company's application that runs on Google Cloud. The database for the application will be maintained by an external partner. You need to give the partner team access to the database. This access must be restricted solely to the database and cannot extend to any other resources within your company's network. Your solution should follow Google-recommended practices. What should you do?

A.

Add a public IP address to the application's database. Create database users for each of the partner's employees. Securely distribute the credentials for these users to the partner team.

B.

Create accounts for the partner team in your corporate identity provider. Synchronize these accounts with Google Cloud Identity. Grant the accounts access to the database.

C.

Ask the partner team to set up Cloud Identity accounts within their own corporate environment and identity provider. Grant the partner’s Cloud Identity accounts access to the database.

D.

Configure Workforce Identity Federation for the partner. Connect the identity pool provider to the partner's identity provider. Grant the workforce pool resources access to the database.

Question # 87

You are a Security Administrator at your organization. You need to restrict service account creation capability within production environments. You want to accomplish this centrally across the organization. What should you do?

A.

Use Identity and Access Management (IAM) to restrict access of all users and service accounts that have access to the production environment.

B.

Use organization policy constraints/iam.disableServiceAccountKeyCreation boolean to disable the creation of new service accounts.

C.

Use organization policy constraints/iam.disableServiceAccountKeyUpload boolean to disable the creation of new service accounts.

D.

Use organization policy constraints/iam.disableServiceAccountCreation boolean to disable the creation of new service accounts.
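
For reference, a minimal sketch of applying the constraints/iam.disableServiceAccountCreation constraint programmatically, assuming the google-cloud-org-policy (v2) Python client; the folder ID is a placeholder, and attaching the policy to a production folder is one way to scope it centrally so every project beneath the folder inherits it.

    from google.cloud import orgpolicy_v2

    client = orgpolicy_v2.OrgPolicyClient()
    parent = "folders/123456789"  # placeholder production folder

    policy = orgpolicy_v2.Policy(
        name=f"{parent}/policies/iam.disableServiceAccountCreation",
        spec=orgpolicy_v2.PolicySpec(
            rules=[orgpolicy_v2.PolicySpec.PolicyRule(enforce=True)]
        ),
    )

    # Enforce the boolean constraint; new service accounts can no longer be
    # created in any project under this folder.
    client.create_policy(parent=parent, policy=policy)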

Question # 88

Which Google Cloud service should you use to enforce access control policies for applications and resources?

A.

Identity-Aware Proxy

B.

Cloud NAT

C.

Google Cloud Armor

D.

Shielded VMs

Question # 89

You control network traffic for a folder in your Google Cloud environment. Your folder includes multiple projects and Virtual Private Cloud (VPC) networks. You want to enforce at the folder level that egress connections are limited to IP range 10.58.5.0/24 and only from the VPC network "dev-vpc". You want to minimize implementation and maintenance effort.

What should you do?

A.

1. Attach external IP addresses to the VMs in scope. 2. Configure a VPC firewall rule in "dev-vpc" that allows egress connectivity to IP range 10.58.5.0/24 for all source addresses in this network.

B.

1. Attach external IP addresses to the VMs in scope. 2. Define and apply a hierarchical firewall policy at the folder level to deny all egress connections and to allow egress to IP range 10.58.5.0/24 from network "dev-vpc".

C.

1. Leave the network configuration of the VMs in scope unchanged. 2. Create a new project including a new VPC network "new-vpc". 3. Deploy a network appliance in "new-vpc" to filter access requests and only allow egress connections from "dev-vpc" to 10.58.5.0/24.

D.

1. Leave the network configuration of the VMs in scope unchanged. 2. Enable Cloud NAT for "dev-vpc" and restrict the target range in Cloud NAT to 10.58.5.0/24.
