Google Professional Cloud Security Engineer Dump Free – 50 Practice Questions to Sharpen Your Exam Readiness.
Looking for a reliable way to prepare for your Google Professional Cloud Security Engineer certification? Our Google Professional Cloud Security Engineer Dump Free includes 50 exam-style practice questions designed to reflect real test scenarios—helping you study smarter and pass with confidence.
Using a Google Professional Cloud Security Engineer dump free set of questions can give you an edge in your exam prep by helping you:
- Understand the format and types of questions you’ll face
- Pinpoint weak areas and focus your study efforts
- Boost your confidence with realistic question practice
Below, you will find 50 free questions from our Google Professional Cloud Security Engineer Dump Free collection. These cover key topics and are structured to simulate the difficulty level of the real exam, making them a valuable tool for review or final prep.
Your company is storing sensitive data in Cloud Storage. You want a key generated on-premises to be used in the encryption process.
What should you do?
A. Use the Cloud Key Management Service to manage a data encryption key (DEK).
B. Use the Cloud Key Management Service to manage a key encryption key (KEK).
C. Use customer-supplied encryption keys to manage the data encryption key (DEK).
D. Use customer-supplied encryption keys to manage the key encryption key (KEK).
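For reference, using a customer-supplied encryption key (CSEK) with gsutil looks roughly like the sketch below. The bucket and file names are hypothetical; the key must be a base64-encoded 256-bit AES key generated and kept on-premises:

```shell
# Generate a 256-bit AES key on-premises and base64-encode it
# (the key itself never leaves your control; Google stores only a hash of it).
openssl rand -base64 32 > csek.key

# Upload with the customer-supplied key via a one-off boto config override.
gsutil -o "GSUtil:encryption_key=$(cat csek.key)" \
  cp sensitive.dat gs://example-sensitive-bucket/sensitive.dat
```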
For compliance reporting purposes, the internal audit department needs you to provide the list of virtual machines (VMs) that have critical operating system (OS) security updates available, but not installed. You must provide this list every six months, and you want to perform this task quickly.
What should you do?
A. Run a Security Command Center security scan on all VMs to extract a list of VMs with critical OS vulnerabilities every six months.
B. Run a gcloud CLI command from the Command Line Interface (CLI) to extract the VM's OS version information every six months.
C. Ensure that the Cloud Logging agent is installed on all VMs, and extract the OS last update log date every six months.
D. Ensure the OS Config agent is installed on all VMs and extract the patch status dashboard every six months.
Your team uses a service account to authenticate data transfers from a given Compute Engine virtual machine instance to a specified Cloud Storage bucket. An engineer accidentally deletes the service account, which breaks application functionality. You want to recover the application as quickly as possible without compromising security.
What should you do?
A. Temporarily disable authentication on the Cloud Storage bucket.
B. Use the undelete command to recover the deleted service account.
C. Create a new service account with the same name as the deleted service account.
D. Update the permissions of another existing service account and supply those credentials to the applications.
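If the service account was deleted within the undelete window (roughly 30 days), it can be restored by its numeric unique ID, which appears in the Admin Activity audit log entry for the deletion. A sketch with a hypothetical ID:

```shell
# Restore a recently deleted service account by its numeric unique ID.
gcloud iam service-accounts undelete 123456789012345678901
```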
You are working with a client that is concerned about control of their encryption keys for sensitive data. The client does not want to store encryption keys at rest in the same cloud service provider (CSP) as the data that the keys are encrypting. Which Google Cloud encryption solutions should you recommend to this client?
(Choose two.)
A. Customer-supplied encryption keys
B. Google default encryption
C. Secret Manager
D. Cloud External Key Manager
E. Customer-managed encryption keys
You manage a fleet of virtual machines (VMs) in your organization. You have encountered issues with lack of patching in many VMs. You need to automate regular patching in your VMs and view the patch management data across multiple projects.
What should you do? (Choose two.)
A. View patch management data in VM Manager by using OS patch management.
B. View patch management data in Artifact Registry.
C. View patch management data in a Security Command Center dashboard.
D. Deploy patches with Security Command Center by using Rapid Vulnerability Detection.
E. Deploy patches with VM Manager by using OS patch management.
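As a sketch of the VM Manager approach, a one-off patch job across all VMs in a project can be started from the CLI (VM Manager must be enabled and the OS Config agent installed on the VMs):

```shell
# Run a patch job against every VM in the current project.
gcloud compute os-config patch-jobs execute --instance-filter-all
```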
You are deploying a web application hosted on Compute Engine. A business requirement mandates that application logs are preserved for 12 years and data is kept within European boundaries. You want to implement a storage solution that minimizes overhead and is cost-effective. What should you do?
A. Create a Cloud Storage bucket to store your logs in the EUROPE-WEST1 region. Modify your application code to ship logs directly to your bucket for increased efficiency.
B. Configure your Compute Engine instances to use the Google Cloud's operations suite Cloud Logging agent to send application logs to a custom log bucket in the EUROPE-WEST1 region with a custom retention of 12 years.
C. Use a Pub/Sub topic to forward your application logs to a Cloud Storage bucket in the EUROPE-WEST1 region.
D. Configure a custom retention policy of 12 years on your Google Cloud's operations suite log bucket in the EUROPE-WEST1 region.
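A custom log bucket with a long retention period can be created in one command; the bucket name is hypothetical, and 4380 days approximates 12 years:

```shell
# Regional log bucket in Europe with ~12-year retention.
gcloud logging buckets create app-logs-eu \
  --location=europe-west1 \
  --retention-days=4380
```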
Your security team believes that a former employee of your company gained unauthorized access to Google Cloud resources some time in the past 2 months by using a service account key. You need to confirm the unauthorized access and determine the user activity. What should you do?
A. Use Security Health Analytics to determine user activity.
B. Use the Cloud Monitoring console to filter audit logs by user.
C. Use the Cloud Data Loss Prevention API to query logs in Cloud Storage.
D. Use the Logs Explorer to search for user activity.
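In the Logs Explorer, activity tied to a specific service account can be surfaced with a query along these lines (the service account address is hypothetical; set the time range to the last two months):

```
protoPayload.authenticationInfo.principalEmail="suspect-sa@example-project.iam.gserviceaccount.com"
logName:"cloudaudit.googleapis.com"
```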
An administrative application is running on a virtual machine (VM) in a managed instance group at port 5601, inside a Virtual Private Cloud (VPC) network that currently has no internet access. You want to expose the web interface at port 5601 to users and enforce authentication and authorization with Google credentials.
What should you do?
A. Configure the bastion host with OS Login enabled and allow connections to port 5601 in the VPC firewall. Log in to the bastion host from the Google Cloud console by using SSH-in-browser and then to the web application.
B. Modify the VPC routing so the default route points to the default internet gateway. Modify the VPC firewall rule to allow access from the internet (0.0.0.0/0) to port 5601 on the application instance.
C. Configure a Secure Shell (SSH) bastion host in a public network, and allow only the bastion host to connect to the application on port 5601. Use the bastion host as a jump host to connect to the application.
D. Configure an HTTP Load Balancing instance that points to the managed group with Identity-Aware Proxy (IAP) protection with Google credentials. Modify the VPC firewall to allow access from IAP network range.
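When fronting the instance group with a load balancer and IAP, the VPC firewall only needs to admit Google's load balancer and health-check ranges to the application port. A sketch with a hypothetical network name (ranges per Google's load balancing documentation):

```shell
# Allow the Google front-end / health-check ranges to reach the backend on 5601.
gcloud compute firewall-rules create allow-lb-to-app \
  --network=prod-vpc \
  --direction=INGRESS \
  --action=ALLOW \
  --rules=tcp:5601 \
  --source-ranges=130.211.0.0/22,35.191.0.0/16
```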
You are a member of the security team at an organization. Your team has a single GCP project with credit card payment processing systems alongside web applications and data processing systems. You want to reduce the scope of systems subject to PCI audit standards.
What should you do?
A. Use multi-factor authentication for admin access to the web application.
B. Use only applications certified compliant with PA-DSS.
C. Move the cardholder data environment into a separate GCP project.
D. Use VPN for all connections between your office and cloud environments.
Your company's chief information security officer (CISO) is requiring business data to be stored in specific locations due to regulatory requirements that affect the company's global expansion plans. After working on a plan to implement this requirement, you determine the following:
• The services in scope are included in the Google Cloud data residency requirements.
• The business data remains within specific locations under the same organization.
• The folder structure can contain multiple data residency locations.
• The projects are aligned to specific locations.
You plan to use the Resource Location Restriction organization policy constraint with very granular control. At which level in the hierarchy should you set the constraint?
A. Organization
B. Resource
C. Project
D. Folder
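Setting the Resource Location Restriction constraint at a given hierarchy level is a single command; this sketch applies it at the project level (the project ID and value group are illustrative):

```shell
# Allow only EU locations for resources created in this project.
gcloud resource-manager org-policies allow \
  constraints/gcp.resourceLocations in:eu-locations \
  --project=example-project
```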
Your customer has an on-premises Public Key Infrastructure (PKI) with a certificate authority (CA). You need to issue certificates for many HTTP load balancer frontends. The on-premises PKI should be minimally affected due to many manual processes, and the solution needs to scale.
What should you do?
A. Use Certificate Manager to issue Google-managed public certificates and configure them on the HTTP load balancers in your infrastructure as code (IaC).
B. Use a subordinate CA in the Google Certificate Authority Service from the on-premises PKI system to issue certificates for the load balancers.
C. Use Certificate Manager to import certificates issued by the on-premises PKI for the frontends. Leverage the gcloud tool for importing.
D. Use PKCS12 certificates for the web applications, issued from a subordinate CA based on OpenSSL on-premises. Use the gcloud tool for importing. Use the external TCP/UDP network load balancer instead of an external HTTP load balancer.
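Creating a subordinate CA in Certificate Authority Service that chains to an external (on-premises) root typically starts by generating a CSR for offline signing. A rough sketch with hypothetical pool and CA names; exact flags may differ by gcloud version:

```shell
# Create the subordinate CA and emit a CSR for the on-premises root CA to sign.
gcloud privateca subordinates create lb-issuing-ca \
  --pool=lb-ca-pool \
  --location=europe-west1 \
  --create-csr \
  --csr-output-file=lb-issuing-ca.csr
```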
Your organization uses the top-tier folder to separate application environments (prod and dev). The developers need to see all application development audit logs, but they are not permitted to review production logs. Your security team can review all logs in production and development environments. You must grant Identity and Access Management (IAM) roles at the right resource level for the developers and security team while you ensure least privilege.
What should you do?
A. 1. Grant the logging.viewer role to the security team at the organization resource level. 2. Grant the logging.viewer role to the developer team at the folder resource level that contains all the dev projects.
B. 1. Grant the logging.viewer role to the security team at the organization resource level. 2. Grant the logging.admin role to the developer team at the organization resource level.
C. 1. Grant the logging.admin role to the security team at the organization resource level. 2. Grant the logging.viewer role to the developer team at the folder resource level that contains all the dev projects.
D. 1. Grant the logging.admin role to the security team at the organization resource level. 2. Grant the logging.admin role to the developer team at the organization resource level.
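Granting a logging role once at the folder level, rather than per project, is one command; the folder ID and group address below are hypothetical:

```shell
# Dev group gets Logs Viewer on the dev folder only; all dev projects inherit it.
gcloud resource-manager folders add-iam-policy-binding 123456789012 \
  --member="group:dev-team@example.com" \
  --role="roles/logging.viewer"
```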
You want to use the gcloud command-line tool to authenticate using a third-party single sign-on (SSO) SAML identity provider. Which options are necessary to ensure that authentication is supported by the third-party identity provider (IdP)? (Choose two.)
A. SSO SAML as a third-party IdP
B. Identity Platform
C. OpenID Connect
D. Identity-Aware Proxy
E. Cloud Identity
You have created an OS image that is hardened per your organization's security standards and is being stored in a project managed by the security team. As a Google Cloud administrator, you need to make sure all VMs in your Google Cloud organization can only use that specific OS image while minimizing operational overhead. What should you do? (Choose two.)
A. Grant users the compute.imageUser role in their own projects.
B. Grant users the compute.imageUser role in the OS image project.
C. Store the image in every project that is spun up in your organization.
D. Set up an image access organization policy constraint, and list the security team managed project in the project's allow list.
E. Remove VM instance creation permission from users of the projects, and only allow you and your team to create VM instances.
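The image-restriction approach can be enforced with the compute.trustedImageProjects constraint; this sketch allows boot images only from a hypothetical security-team project:

```shell
# Only images from the security team's project may be used for boot disks.
gcloud resource-manager org-policies allow \
  constraints/compute.trustedImageProjects \
  "projects/security-images-prod" \
  --organization=123456789012
```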
You work for an organization in a regulated industry that has strict data protection requirements. The organization backs up their data in the cloud. To comply with data privacy regulations, this data can only be stored for a specific length of time and must be deleted after this specific period.
You want to automate the compliance with this regulation while minimizing storage costs. What should you do?
A. Store the data in a persistent disk, and delete the disk at expiration time.
B. Store the data in a Cloud Bigtable table, and set an expiration time on the column families.
C. Store the data in a BigQuery table, and set the table's expiration time.
D. Store the data in a Cloud Storage bucket, and configure the bucket's Object Lifecycle Management feature.
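Object Lifecycle Management automates the deletion; the retention period below (365 days) is an assumption, and the bucket name is hypothetical:

```shell
# Delete objects automatically once they exceed the allowed retention age.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    { "action": { "type": "Delete" }, "condition": { "age": 365 } }
  ]
}
EOF
gsutil lifecycle set lifecycle.json gs://example-backup-bucket
```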
You are migrating an on-premises data warehouse to BigQuery, Cloud SQL, and Cloud Storage. You need to configure security services in the data warehouse. Your company compliance policies mandate that the data warehouse must:
• Protect data at rest with full lifecycle management on cryptographic keys.
• Implement a separate key management provider from data management.
• Provide visibility into all encryption key requests.
What services should be included in the data warehouse implementation? (Choose two.)
A. Customer-managed encryption keys
B. Customer-Supplied Encryption Keys
C. Key Access Justifications
D. Access Transparency and Approval
E. Cloud External Key Manager
Your company has deployed an application on Compute Engine. The application is accessible by clients on port 587. You need to balance the load between the different instances running the application. The connection should be secured using TLS, and terminated by the Load Balancer.
What type of Load Balancing should you use?
A. Network Load Balancing
B. HTTP(S) Load Balancing
C. TCP Proxy Load Balancing
D. SSL Proxy Load Balancing
Your team wants to make sure Compute Engine instances running in your production project do not have public IP addresses. The frontend application Compute Engine instances will require public IPs. The product engineers have the Editor role to modify resources. Your team wants to enforce this requirement.
How should your team meet these requirements?
A. Enable Private Access on the VPC network in the production project.
B. Remove the Editor role and grant the Compute Admin IAM role to the engineers.
C. Set up an organization policy to only permit public IPs for the front-end Compute Engine instances.
D. Set up a VPC network with two subnets: one with public IPs and one without public IPs.
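As a sketch of the organization-policy approach, the compute.vmExternalIpAccess constraint can allowlist only the frontend instances (all names are hypothetical):

```shell
# Permit an external IP only on the named frontend instance.
gcloud resource-manager org-policies allow \
  constraints/compute.vmExternalIpAccess \
  projects/prod-project/zones/us-central1-a/instances/frontend-1 \
  --project=prod-project
```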
You need to set up a Cloud Interconnect connection between your company's on-premises data center and VPC host network. You want to make sure that on-premises applications can only access Google APIs over the Cloud Interconnect and not through the public internet. You are required to only use APIs that are supported by VPC Service Controls to mitigate against exfiltration risk to non-supported APIs. How should you configure the network?
A. Enable Private Google Access on the regional subnets and global dynamic routing mode.
B. Set up a Private Service Connect endpoint IP address with the API bundle of "all-apis", which is advertised as a route over the Cloud Interconnect connection.
C. Use private.googleapis.com to access Google APIs using a set of IP addresses only routable from within Google Cloud, which are advertised as routes over the connection.
D. Use restricted.googleapis.com to access Google APIs using a set of IP addresses only routable from within Google Cloud, which are advertised as routes over the Cloud Interconnect connection.
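Pointing on-premises resolvers at the restricted VIP is usually done with a private DNS zone that maps googleapis.com names to restricted.googleapis.com (VIP range 199.36.153.4/30 per Google's VPC Service Controls documentation). A sketch with a hypothetical network name:

```shell
# Private zone; then add a CNAME *.googleapis.com -> restricted.googleapis.com
# and A records for restricted.googleapis.com at 199.36.153.4-7.
gcloud dns managed-zones create googleapis \
  --description="Route Google APIs to the restricted VIP" \
  --dns-name=googleapis.com. \
  --networks=prod-vpc \
  --visibility=private
```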
Your company wants to determine what products they can build to help customers improve their credit scores depending on their age range. To achieve this, you need to join user information in the company's banking app with customers' credit score data received from a third party. While using this raw data will allow you to complete this task, it exposes sensitive data, which could be propagated into new systems.
This risk needs to be addressed using de-identification and tokenization with Cloud Data Loss Prevention while maintaining the referential integrity across the database. Which cryptographic token format should you use to meet these requirements?
A. Deterministic encryption
B. Secure, key-based hashes
C. Format-preserving encryption
D. Cryptographic hashing
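A deterministic-encryption transformation in a DLP de-identify request looks roughly like this JSON fragment; the KMS key path and surrogate infoType name are hypothetical:

```
{
  "deidentifyConfig": {
    "infoTypeTransformations": {
      "transformations": [
        {
          "primitiveTransformation": {
            "cryptoDeterministicConfig": {
              "cryptoKey": {
                "kmsWrapped": {
                  "wrappedKey": "BASE64_WRAPPED_KEY",
                  "cryptoKeyName": "projects/example-project/locations/global/keyRings/dlp/cryptoKeys/tokenize"
                }
              },
              "surrogateInfoType": { "name": "CUSTOMER_ID_TOKEN" }
            }
          }
        }
      ]
    }
  }
}
```

Because the same plaintext always yields the same token, joins across the banking and credit-score tables still line up after de-identification.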
A company is backing up application logs to a Cloud Storage bucket shared with both analysts and the administrator. Analysts should only have access to logs that do not contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible by the administrator.
What should you do?
A. Use Cloud Pub/Sub and Cloud Functions to trigger a Data Loss Prevention scan every time a file is uploaded to the shared bucket. If the scan detects PII, have the function move the file into a Cloud Storage bucket only accessible by the administrator.
B. Upload the logs to both the shared bucket and the bucket only accessible by the administrator. Create a job trigger using the Cloud Data Loss Prevention API. Configure the trigger to delete any files from the shared bucket that contain PII.
C. On the bucket shared with both the analysts and the administrator, configure Object Lifecycle Management to delete objects that contain any PII.
D. On the bucket shared with both the analysts and the administrator, configure a Cloud Storage Trigger that is only triggered when PII data is uploaded. Use Cloud Functions to capture the trigger and delete such files.
Which Identity-Aware Proxy role should you grant to an Identity and Access Management (IAM) user to access HTTPS resources?
A. Security Reviewer
B. IAP-Secured Tunnel User
C. IAP-Secured Web App User
D. Service Broker Operator
A manager wants to start retaining security event logs for 2 years while minimizing costs. You write a filter to select the appropriate log entries.
Where should you export the logs?
A. BigQuery datasets
B. Cloud Storage buckets
C. Stackdriver Logging
D. Cloud Pub/Sub topics
Your organization previously stored files in Cloud Storage by using Google Managed Encryption Keys (GMEK), but has recently updated the internal policy to require Customer Managed Encryption Keys (CMEK). You need to re-encrypt the files quickly and efficiently with minimal cost.
What should you do?
A. Reupload the files to the same Cloud Storage bucket specifying a key file by using gsutil.
B. Encrypt the files locally, and then use gsutil to upload the files to a new bucket.
C. Copy the files to a new bucket with CMEK enabled in a secondary region.
D. Change the encryption type on the bucket to CMEK, and rewrite the objects.
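The rewrite-in-place approach looks roughly like this; the key path and bucket name are hypothetical:

```shell
# Set the bucket's default CMEK for new objects...
gsutil kms encryption \
  -k projects/example-project/locations/eu/keyRings/storage/cryptoKeys/bucket-key \
  gs://example-bucket

# ...then re-encrypt existing objects in place under the new key.
gsutil -m rewrite -r \
  -k projects/example-project/locations/eu/keyRings/storage/cryptoKeys/bucket-key \
  gs://example-bucket
```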
A customer deploys an application to App Engine and needs to check for Open Web Application Security Project (OWASP) vulnerabilities.
Which service should be used to accomplish this?
A. Cloud Armor
B. Google Cloud Audit Logs
C. Web Security Scanner
D. Anomaly Detection
You are in charge of creating a new Google Cloud organization for your company. Which two actions should you take when creating the super administrator accounts? (Choose two.)
A. Create an access level in the Google Admin console to prevent super admin from logging in to Google Cloud.
B. Disable any Identity and Access Management (IAM) roles for super admin at the organization level in the Google Cloud Console.
C. Use a physical token to secure the super admin credentials with multi-factor authentication (MFA).
D. Use a private connection to create the super admin accounts to avoid sending your credentials over the Internet.
E. Provide non-privileged identities to the super admin users for their day-to-day activities.
In an effort for your company messaging app to comply with FIPS 140-2, a decision was made to use GCP compute and network services. The messaging app architecture includes a Managed Instance Group (MIG) that controls a cluster of Compute Engine instances. The instances use Local SSDs for data caching and UDP for instance-to-instance communications. The app development team is willing to make any changes necessary to comply with the standard.
Which options should you recommend to meet the requirements?
A. Encrypt all cache storage and VM-to-VM communication using the BoringCrypto module.
B. Set Disk Encryption on the Instance Template used by the MIG to customer-managed key and use BoringSSL for all data transit between instances.
C. Change the app instance-to-instance communications from UDP to TCP and enable BoringSSL on clients' TLS connections.
D. Set Disk Encryption on the Instance Template used by the MIG to Google-managed Key and use BoringSSL library on all instance-to-instance communications.
You're developing the incident response plan for your company. You need to define the access strategy that your DevOps team will use when reviewing and investigating a deployment issue in your Google Cloud environment. There are two main requirements:
• Least-privilege access must be enforced at all times.
• The DevOps team must be able to access the required resources only during the deployment issue.
How should you grant access while following Google-recommended best practices?
A. Assign the Project Viewer Identity and Access Management (IAM) role to the DevOps team.
B. Create a custom IAM role with limited list/view permissions, and assign it to the DevOps team.
C. Create a service account, and grant it the Project Owner IAM role. Give the Service Account User Role on this service account to the DevOps team.
D. Create a service account, and grant it limited list/view permissions. Give the Service Account User Role on this service account to the DevOps team.
Your organization recently activated the Security Command Center (SCC) standard tier. There are a few Cloud Storage buckets that were accidentally made accessible to the public. You need to investigate the impact of the incident and remediate it.
What should you do?
A. 1. Remove the Identity and Access Management (IAM) granting access to all Users from the buckets. 2. Apply the organization policy storage.uniformBucketLevelAccess to prevent regressions. 3. Query the data access logs to report on unauthorized access.
B. 1. Change permissions to limit access for authorized users. 2. Enforce a VPC Service Controls perimeter around all the production projects to immediately stop any unauthorized access. 3. Review the administrator activity audit logs to report on any unauthorized access.
C. 1. Change the bucket permissions to limit access. 2. Query the bucket's usage logs to report on unauthorized access to the data. 3. Enforce the organization policy storage.publicAccessPrevention to avoid regressions.
D. 1. Change bucket permissions to limit access. 2. Query the data access audit logs for any unauthorized access to the buckets. 3. After the misconfiguration is corrected, mute the finding in the Security Command Center.
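Remediation and prevention can be sketched as follows (the bucket name and organization ID are hypothetical):

```shell
# Stop public access on the affected bucket immediately...
gsutil pap set enforced gs://example-leaky-bucket

# ...and enforce the org policy so no bucket can be made public again.
gcloud resource-manager org-policies enable-enforce \
  storage.publicAccessPrevention --organization=123456789012
```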
You need to connect your organization's on-premises network with an existing Google Cloud environment that includes one Shared VPC with two subnets named Production and Non-Production. You are required to:
• Use a private transport link.
• Configure access to Google Cloud APIs through private API endpoints originating from on-premises environments.
• Ensure that Google Cloud APIs are only consumed via VPC Service Controls.
What should you do?
A. 1. Set up a Cloud VPN link between the on-premises environment and Google Cloud. 2. Configure private access using the restricted.googleapis.com domains in on-premises DNS configurations.
B. 1. Set up a Partner Interconnect link between the on-premises environment and Google Cloud. 2. Configure private access using the private.googleapis.com domains in on-premises DNS configurations.
C. 1. Set up a Direct Peering link between the on-premises environment and Google Cloud. 2. Configure private access for both VPC subnets.
D. 1. Set up a Dedicated Interconnect link between the on-premises environment and Google Cloud. 2. Configure private access using the restricted.googleapis.com domains in on-premises DNS configurations.
Your organization wants to protect all workloads that run on Compute Engine VMs to ensure that the instances weren't compromised by boot-level or kernel-level malware. You also need to ensure that data in use on the VM cannot be read by the underlying host system, by using a hardware-based solution.
What should you do?
A. 1. Use Google Shielded VM including secure boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring. 2. Create a Cloud Run function to check for the VM settings, generate metrics, and run the function regularly.
B. 1. Activate Virtual Machine Threat Detection in Security Command Center (SCC) Premium. 2. Monitor the findings in SCC.
C. 1. Use Google Shielded VM including secure boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring. 2. Activate Confidential Computing. 3. Enforce these actions by using organization policies.
D. 1. Use secure hardened images from the Google Cloud Marketplace. 2. When deploying the images, activate the Confidential Computing option. 3. Enforce the use of the correct images and Confidential Computing by using organization policies.
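Shielded VM features and Confidential Computing can be combined on a single instance; a sketch with hypothetical names (Confidential VMs require an AMD-based machine type such as n2d):

```shell
gcloud compute instances create secure-vm-1 \
  --zone=europe-west1-b \
  --machine-type=n2d-standard-2 \
  --shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring \
  --confidential-compute \
  --maintenance-policy=TERMINATE
```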
You are troubleshooting access denied errors between Compute Engine instances connected to a Shared VPC and BigQuery datasets. The datasets reside in a project protected by a VPC Service Controls perimeter. What should you do?
A. Add the host project containing the Shared VPC to the service perimeter.
B. Add the service project where the Compute Engine instances reside to the service perimeter.
C. Create a service perimeter between the service project where the Compute Engine instances reside and the host project that contains the Shared VPC.
D. Create a perimeter bridge between the service project where the Compute Engine instances reside and the perimeter that contains the protected BigQuery datasets.
Your company's Google Cloud organization has about 200 projects and 1,500 virtual machines. There is no uniform strategy for logs and events management, which reduces visibility for your security operations team. You need to design a logs management solution that provides visibility and allows the security team to view the environment's configuration.
What should you do?
A. 1. Create a dedicated log sink for each project that is in scope. 2. Use a BigQuery dataset with time partitioning enabled as a destination of the log sinks. 3. Deploy alerts based on log metrics in every project. 4. Grant the role "Monitoring Viewer" to the security operations team in each project.
B. 1. Create one log sink at the organization level that includes all the child resources. 2. Use as destination a Pub/Sub topic to ingest the logs into the security information and event management (SIEM) on-premises, and ensure that the right team can access the SIEM. 3. Grant the Viewer role at organization level to the security operations team.
C. 1. Enable network logs and data access logs for all resources in the "Production" folder. 2. Do not create log sinks to avoid unnecessary costs and latency. 3. Grant the roles "Logs Viewer" and "Browser" at project level to the security operations team.
D. 1. Create one sink for the "Production" folder that includes child resources and one sink for the logs ingested at the organization level that excludes child resources. 2. As destination, use a log bucket with a minimum retention period of 90 days in a project that can be accessed by the security team. 3. Grant the security operations team the role of Security Reviewer at organization level.
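An organization-level aggregated sink with a Pub/Sub destination is one command; all IDs below are hypothetical:

```shell
# One sink covers every child project; the topic feeds the on-premises SIEM.
gcloud logging sinks create org-audit-sink \
  pubsub.googleapis.com/projects/example-sec/topics/siem-export \
  --organization=123456789012 \
  --include-children \
  --log-filter='logName:"cloudaudit.googleapis.com"'
```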
Last week, a company deployed a new App Engine application that writes logs to BigQuery. No other workloads are running in the project. You need to validate that all data written to BigQuery was done using the App Engine Default Service Account.
What should you do?
A. 1. Use Cloud Logging and filter on BigQuery Insert Jobs. 2. Click on the email address in line with the App Engine Default Service Account in the authentication field. 3. Click Hide Matching Entries. 4. Make sure the resulting list is empty.
B. 1. Use Cloud Logging and filter on BigQuery Insert Jobs. 2. Click on the email address in line with the App Engine Default Service Account in the authentication field. 3. Click Show Matching Entries. 4. Make sure the resulting list is empty.
C. 1. In BigQuery, select the related dataset. 2. Make sure that the App Engine Default Service Account is the only account that can write to the dataset.
D. 1. Go to the Identity and Access Management (IAM) section of the project. 2. Validate that the App Engine Default Service Account is the only account that has a role that can write to BigQuery.
An organization is starting to move its infrastructure from its on-premises environment to Google Cloud Platform (GCP). The first step the organization wants to take is to migrate its current data backup and disaster recovery solutions to GCP for later analysis. The organization's production environment will remain on-premises for an indefinite time. The organization wants a scalable and cost-efficient solution.
Which GCP solution should the organization use?
A. BigQuery using a data pipeline job with continuous updates
B. Cloud Storage using a scheduled task and gsutil
C. Compute Engine Virtual Machines using Persistent Disk
D. Cloud Datastore using regularly scheduled batch upload jobs
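The scheduled-task approach can be as simple as a cron entry that synchronizes the on-premises backup directory to a bucket (the paths and bucket name are hypothetical; rsync copies only new or changed files):

```shell
# Nightly at 02:00: ship on-premises backups to Cloud Storage.
0 2 * * * gsutil -m rsync -r /backups gs://example-dr-backups
```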
Your organization recently deployed a new application on Google Kubernetes Engine. You need to deploy a solution to protect the application. The solution has the following requirements:
• Scans must run at least once per week
• Must be able to detect cross-site scripting vulnerabilities
• Must be able to authenticate using Google accounts
Which solution should you use?
A. Google Cloud Armor
B. Web Security Scanner
C. Security Health Analytics
D. Container Threat Detection
You need to implement an encryption-at-rest strategy that protects sensitive data and reduces key management complexity for non-sensitive data. Your solution has the following requirements:
• Schedule key rotation for sensitive data.
• Control which region the encryption keys for sensitive data are stored in.
• Minimize the latency to access encryption keys for both sensitive and non-sensitive data.
What should you do?
A. Encrypt non-sensitive data and sensitive data with Cloud External Key Manager.
B. Encrypt non-sensitive data and sensitive data with Cloud Key Management Service.
C. Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud External Key Manager.
D. Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud Key Management Service.
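Scheduled rotation and key location are both set when the key is created in Cloud KMS; the names, region, and 90-day period below are assumptions:

```shell
# Regional key ring pins the key material to europe-west1.
gcloud kms keyrings create sensitive-data --location=europe-west1

# Symmetric key with automatic rotation every 90 days.
gcloud kms keys create sensitive-data-key \
  --keyring=sensitive-data \
  --location=europe-west1 \
  --purpose=encryption \
  --rotation-period=90d \
  --next-rotation-time=2030-01-01T00:00:00Z
```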
You are creating an internal App Engine application that needs to access a user's Google Drive on the user's behalf. Your company does not want to rely on the current user's credentials. It also wants to follow Google-recommended practices.
What should you do?
A. Create a new service account, and give all application users the role of Service Account User.
B. Create a new Service account, and add all application users to a Google Group. Give this group the role of Service Account User.
C. Use a dedicated G Suite Admin account, and authenticate the application's operations with these G Suite credentials.
D. Create a new service account, and grant it G Suite domain-wide delegation. Have the application use it to impersonate the user.
An organization receives an increasing number of phishing emails.
Which method should be used to protect employee credentials in this situation?
A. Multifactor Authentication
B. A strict password policy
C. Captcha on login pages
D. Encrypted emails
You manage a mission-critical workload for your organization, which is in a highly regulated industry. The workload uses Compute Engine VMs to analyze and process the sensitive data after it is uploaded to Cloud Storage from the endpoint computers. Your compliance team has detected that this workload does not meet the data protection requirements for sensitive data. You need to meet these requirements:
• Manage the data encryption key (DEK) outside the Google Cloud boundary.
• Maintain full control of encryption keys through a third-party provider.
• Encrypt the sensitive data before uploading it to Cloud Storage.
• Decrypt the sensitive data during processing in the Compute Engine VMs.
• Encrypt the sensitive data in memory while in use in the Compute Engine VMs.
What should you do? (Choose two.)A. Configure Customer Managed Encryption Keys to encrypt the sensitive data before it is uploaded to Cloud Storage, and decrypt the sensitive data after it is downloaded into your VMs.
B. Configure Cloud External Key Manager to encrypt the sensitive data before it is uploaded to Cloud Storage, and decrypt the sensitive data after it is downloaded into your VMs.
C. Create Confidential VMs to access the sensitive data.
D. Migrate the Compute Engine VMs to Confidential VMs to access the sensitive data.
E. Create a VPC Service Controls service perimeter across your existing Compute Engine VMs and Cloud Storage buckets.
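For context on the Confidential VM options above: a Confidential VM keeps data encrypted in memory while in use. A minimal creation command might look like the following sketch (zone, machine type, and image are illustrative; Confidential VMs require a supported machine series such as N2D and a TERMINATE maintenance policy).

```shell
# Create a Confidential VM for processing sensitive data
# (resource names and zone are placeholders).
gcloud compute instances create sensitive-workload-vm \
    --zone=us-central1-a \
    --machine-type=n2d-standard-4 \
    --confidential-compute \
    --maintenance-policy=TERMINATE \
    --image-family=ubuntu-2204-lts \
    --image-project=ubuntu-os-cloud
```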
A company allows every employee to use Google Cloud Platform. Each department has a Google Group, with all department members as group members. If a department member creates a new project, all members of that department should automatically have read-only access to all new project resources. Members of any other department should not have access to the project. You need to configure this behavior.
What should you do to meet these requirements?A. Create a Folder per department under the Organization. For each department's Folder, assign the Project Viewer role to the Google Group related to that department.
B. Create a Folder per department under the Organization. For each department's Folder, assign the Project Browser role to the Google Group related to that department.
C. Create a Project per department under the Organization. For each department's Project, assign the Project Viewer role to the Google Group related to that department.
D. Create a Project per department under the Organization. For each department's Project, assign the Project Browser role to the Google Group related to that department.
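For reference, granting a role at the folder level to a department's Google Group can be sketched with the gcloud CLI; roles granted on a folder are inherited by all projects created under it. The folder ID and group address below are placeholders.

```shell
# Grant read-only access on a department folder to that department's group;
# every project created in the folder inherits the binding.
gcloud resource-manager folders add-iam-policy-binding FOLDER_ID \
    --member="group:engineering-dept@example.com" \
    --role="roles/viewer"
```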
You want data on Compute Engine disks to be encrypted at rest with keys managed by Cloud Key Management Service (KMS). Cloud Identity and Access
Management (IAM) permissions to these keys must be managed in a grouped way because the permissions should be the same for all keys.
What should you do?A. Create a single KeyRing for all persistent disks and all Keys in this KeyRing. Manage the IAM permissions at the Key level.
B. Create a single KeyRing for all persistent disks and all Keys in this KeyRing. Manage the IAM permissions at the KeyRing level.
C. Create a KeyRing per persistent disk, with each KeyRing containing a single Key. Manage the IAM permissions at the Key level.
D. Create a KeyRing per persistent disk, with each KeyRing containing a single Key. Manage the IAM permissions at the KeyRing level.
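To make the key ring versus key distinction concrete: IAM policies in Cloud KMS can be set on either level, and a binding on the key ring applies to every key inside it. A sketch with placeholder names and location:

```shell
# One key ring holding all disk keys (names and location are illustrative).
gcloud kms keyrings create disk-keys --location=us-central1

gcloud kms keys create disk-key-1 \
    --keyring=disk-keys --location=us-central1 --purpose=encryption

# Grant permissions once, at the key ring level,
# so every key in the ring inherits the same access.
gcloud kms keyrings add-iam-policy-binding disk-keys \
    --location=us-central1 \
    --member="group:disk-admins@example.com" \
    --role="roles/cloudkms.cryptoKeyEncrypterDecrypter"
```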
Your organization operates Virtual Machines (VMs) with only private IPs in the Virtual Private Cloud (VPC), with internet access through Cloud NAT. Every day, you must patch all VMs with critical OS updates and provide summary reports.
What should you do?A. Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM and execute OS-specific update commands. Configure a Cloud Scheduler job to apply critical patches daily.
B. Copy the latest patches to the Cloud Storage bucket. Log in to each VM, download the patches from the bucket, and install them.
C. Assign public IPs to VMs. Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM, and configure a daily cron job to run OS updates at night during low-activity periods.
D. Ensure that VM Manager is installed and running on the VMs. In the OS patch management service, configure the patch jobs to update with critical patches daily.
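For context on the VM Manager option: once the OS Config agent is running on the VMs, patch jobs can be launched and reviewed from the gcloud CLI. A sketch (the display name is a placeholder):

```shell
# Run a patch job against all VMs in the project
# (requires VM Manager / the OS Config agent on the instances).
gcloud compute os-config patch-jobs execute \
    --instance-filter-all \
    --display-name="critical-os-updates"

# List past patch jobs for the summary report.
gcloud compute os-config patch-jobs list
```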
Your organization has had a few recent DDoS attacks. You need to authenticate responses to domain name lookups. Which Google Cloud service should you use?A. Cloud DNS with DNSSEC
B. Cloud NAT
C. HTTP(S) Load Balancing
D. Google Cloud Armor
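As background for the DNS question: DNSSEC cryptographically signs DNS responses so resolvers can authenticate them. Enabling it on a Cloud DNS managed zone might look like this sketch (the zone name is a placeholder):

```shell
# Turn on DNSSEC for an existing Cloud DNS managed zone.
gcloud dns managed-zones update example-zone --dnssec-state=on

# View the DS record data needed at the domain registrar.
gcloud dns dns-keys list --zone=example-zone
```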
An organization adopts Google Cloud Platform (GCP) for application hosting services and needs guidance on setting up password requirements for their Cloud
Identity account. The organization has a password policy requirement that corporate employee passwords must have a minimum number of characters.
Which Cloud Identity password guidelines can the organization use to inform their new requirements?A. Set the minimum length for passwords to be 8 characters.
B. Set the minimum length for passwords to be 10 characters.
C. Set the minimum length for passwords to be 12 characters.
D. Set the minimum length for passwords to be 6 characters.
You have been tasked with implementing external web application protection against common web application attacks for a public application on Google Cloud.
You want to validate these policy changes before they are enforced. What service should you use?A. Google Cloud Armor's preconfigured rules in preview mode
B. Prepopulated VPC firewall rules in monitor mode
C. The inherent protections of Google Front End (GFE)
D. Cloud Load Balancing firewall rules
E. VPC Service Controls in dry run mode
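To illustrate the preview-mode concept from this question: Google Cloud Armor rules created with the `--preview` flag are evaluated and logged but not enforced, which lets you validate a WAF policy before turning it on. A sketch with a placeholder policy name:

```shell
# Add a preconfigured XSS rule in preview mode: matches are logged,
# but traffic is not actually blocked until --preview is removed.
gcloud compute security-policies rules create 1000 \
    --security-policy=web-app-policy \
    --expression="evaluatePreconfiguredExpr('xss-v33-stable')" \
    --action=deny-403 \
    --preview
```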
A company is deploying their application on Google Cloud Platform. Company policy requires long-term data to be stored using a solution that automatically replicates data across at least two geographic locations.
Which storage solution are they allowed to use?A. Cloud Bigtable
B. Cloud BigQuery
C. Compute Engine SSD Disk
D. Compute Engine Persistent Disk
The security operations team needs access to the security-related logs for all projects in their organization. They have the following requirements:
✑ Follow the least privilege model by having only view access to logs.
✑ Have access to Admin Activity logs.
✑ Have access to Data Access logs.
✑ Have access to Access Transparency logs.
Which Identity and Access Management (IAM) role should the security operations team be granted?A. roles/logging.privateLogViewer
B. roles/logging.admin
C. roles/viewer
D. roles/logging.viewer
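For reference, a logging role is typically granted to the team at the organization level so it covers all projects. A sketch with placeholder IDs (note that `roles/logging.privateLogViewer` includes Data Access and Access Transparency logs, while `roles/logging.viewer` does not):

```shell
# Grant a log-viewing role to the security operations group
# across the whole organization (IDs are placeholders).
gcloud organizations add-iam-policy-binding ORG_ID \
    --member="group:secops@example.com" \
    --role="roles/logging.privateLogViewer"
```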
Your company’s users access data in a BigQuery table. You want to ensure they can only access the data during working hours.
What should you do?A. Assign a BigQuery Data Viewer role along with an IAM condition that limits the access to specified working hours.
B. Run a gsutil script that assigns a BigQuery Data Viewer role, and remove it only during the specified working hours.
C. Assign a BigQuery Data Viewer role to a service account that adds and removes the users daily during the specified working hours.
D. Configure Cloud Scheduler so that it triggers a Cloud Functions instance that modifies the organizational policy constraint for BigQuery during the specified working hours.
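To make the IAM-condition approach concrete: IAM Conditions use CEL expressions over attributes such as `request.time`, so a role grant can be limited to certain hours. A sketch with placeholder project, group, and time zone:

```shell
# Grant BigQuery Data Viewer only between 09:00 and 17:00 in the given
# time zone; outside that window the binding does not apply.
gcloud projects add-iam-policy-binding my-project \
    --member="group:analysts@example.com" \
    --role="roles/bigquery.dataViewer" \
    --condition='expression=request.time.getHours("Europe/Berlin") >= 9 && request.time.getHours("Europe/Berlin") < 17,title=working-hours'
```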
You control network traffic for a folder in your Google Cloud environment. Your folder includes multiple projects and Virtual Private Cloud (VPC) networks. You want to enforce on the folder level that egress connections are limited only to IP range 10.58.5.0/24 and only from the VPC network “dev-vpc”. You want to minimize implementation and maintenance effort.
What should you do?A. 1. Leave the network configuration of the VMs in scope unchanged. 2. Create a new project including a new VPC network “new-vpc”. 3. Deploy a network appliance in “new-vpc” to filter access requests and only allow egress connections from “dev-vpc” to 10.58.5.0/24.
B. 1. Leave the network configuration of the VMs in scope unchanged. 2. Enable Cloud NAT for “dev-vpc” and restrict the target range in Cloud NAT to 10.58.5.0/24.
C. 1. Attach external IP addresses to the VMs in scope. 2. Define and apply a hierarchical firewall policy on folder level to deny all egress connections and to allow egress to IP range 10.58.5.0/24 from network dev-vpc.
D. 1. Attach external IP addresses to the VMs in scope. 2. Configure a VPC Firewall rule in “dev-vpc” that allows egress connectivity to IP range 10.58.5.0/24 for all source addresses in this network.
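As background on hierarchical firewall policies: they are created as standalone resources, populated with rules, and then associated with a folder so all projects underneath inherit them. A sketch with placeholder folder, project, and policy IDs:

```shell
# Create a firewall policy scoped to a folder (IDs are placeholders).
gcloud compute firewall-policies create \
    --short-name=egress-restrictions --folder=FOLDER_ID

# Allow egress to 10.58.5.0/24 only from the dev-vpc network.
gcloud compute firewall-policies rules create 1000 \
    --firewall-policy=POLICY_ID \
    --direction=EGRESS --action=allow \
    --dest-ip-ranges=10.58.5.0/24 \
    --target-resources=projects/dev-project/global/networks/dev-vpc

# Deny all other egress at lower priority.
gcloud compute firewall-policies rules create 2000 \
    --firewall-policy=POLICY_ID \
    --direction=EGRESS --action=deny \
    --dest-ip-ranges=0.0.0.0/0

# Attach the policy to the folder so every project inherits it.
gcloud compute firewall-policies associations create \
    --firewall-policy=POLICY_ID --folder=FOLDER_ID
```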
Access Full Google Professional Cloud Security Engineer Dump Free
Looking for even more practice questions? Click here to access the complete Google Professional Cloud Security Engineer Dump Free collection, offering hundreds of questions across all exam objectives.
We regularly update our content to ensure accuracy and relevance—so be sure to check back for new material.
Begin your certification journey today with our Google Professional Cloud Security Engineer dump free questions — and get one step closer to exam success!