Google Professional Cloud Security Engineer Practice Questions Free – 50 Exam-Style Questions to Sharpen Your Skills
Are you preparing for the Google Professional Cloud Security Engineer certification exam? Kickstart your success with our Google Professional Cloud Security Engineer Practice Questions Free – a carefully selected set of 50 real exam-style questions to help you test your knowledge and identify areas for improvement.
Practicing with Google Professional Cloud Security Engineer practice questions free gives you a powerful edge by allowing you to:
- Understand the exam structure and question formats
- Discover your strong and weak areas
- Build the confidence you need for test day success
Below, you will find 50 free Google Professional Cloud Security Engineer practice questions designed to match the real exam in both difficulty and topic coverage. They’re ideal for self-assessment or final review. You can click on each question to explore the details.
You are working with protected health information (PHI) for an electronic health record system. The privacy officer is concerned that sensitive data is stored in the analytics system. You are tasked with anonymizing the sensitive data in a way that is not reversible. Also, the anonymized data should not preserve the character set and length. Which Google Cloud solution should you use?
A. Cloud Data Loss Prevention with deterministic encryption using AES-SIV
B. Cloud Data Loss Prevention with format-preserving encryption
C. Cloud Data Loss Prevention with cryptographic hashing
D. Cloud Data Loss Prevention with Cloud Key Management Service wrapped cryptographic keys
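Study note: to see what an irreversible, non-format-preserving transformation looks like in practice, here is a minimal sketch using the google-cloud-dlp Python client. The project ID, transient key name, info type, and sample text are placeholder assumptions; the call itself follows the documented deidentify_content pattern.

```python
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()
parent = "projects/my-project/locations/global"  # hypothetical project

# CryptoHashConfig produces a one-way hash: not reversible, and the output
# does not preserve the character set or length of the input.
deidentify_config = {
    "info_type_transformations": {
        "transformations": [
            {
                "primitive_transformation": {
                    "crypto_hash_config": {
                        "crypto_key": {"transient": {"name": "phi-hash-key"}}
                    }
                }
            }
        ]
    }
}

response = dlp.deidentify_content(
    request={
        "parent": parent,
        "inspect_config": {"info_types": [{"name": "US_SOCIAL_SECURITY_NUMBER"}]},
        "deidentify_config": deidentify_config,
        "item": {"value": "Patient SSN: 123-45-6789"},
    }
)
print(response.item.value)  # the SSN is replaced by a hash value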
You are implementing data protection by design and in accordance with GDPR requirements. As part of design reviews, you are told that you need to manage the encryption key for a solution that includes workloads for Compute Engine, Google Kubernetes Engine, Cloud Storage, BigQuery, and Pub/Sub. Which option should you choose for this implementation?
A. Cloud External Key Manager
B. Customer-managed encryption keys
C. Customer-supplied encryption keys
D. Google default encryption
Your team sets up a Shared VPC Network where project co-vpc-prod is the host project. Your team has configured the firewall rules, subnets, and VPN gateway on the host project. They need to enable Engineering Group A to attach a Compute Engine instance to only the 10.1.1.0/24 subnet. What should your team grant to Engineering Group A to meet this requirement?
A. Compute Network User Role at the host project level.
B. Compute Network User Role at the subnet level.
C. Compute Shared VPC Admin Role at the host project level.
D. Compute Shared VPC Admin Role at the service project level.
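Study note: the detail being tested here is that the Compute Network User role can be bound on an individual subnet of the Shared VPC host project, which limits a group to attaching instances to that subnet only. Below is a rough sketch with the google-cloud-compute client; the project, region, subnet, and group names are hypothetical.

```python
from google.cloud import compute_v1

client = compute_v1.SubnetworksClient()

# Read the current IAM policy on the subnet (names are placeholders).
policy = client.get_iam_policy(
    project="co-vpc-prod", region="us-central1", resource="subnet-10-1-1-0"
)

# Add Engineering Group A as a Network User on this subnet only.
policy.bindings.append(
    compute_v1.Binding(
        role="roles/compute.networkUser",
        members=["group:eng-group-a@example.com"],
    )
)

client.set_iam_policy(
    project="co-vpc-prod",
    region="us-central1",
    resource="subnet-10-1-1-0",
    region_set_policy_request_resource=compute_v1.RegionSetPolicyRequest(policy=policy),
)
```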
You are on your company's development team. You noticed that your web application hosted in staging on GKE dynamically includes user data in web pages without first properly validating the inputted data. This could allow an attacker to execute gibberish commands and display arbitrary content in a victim user's browser in a production environment. How should you prevent and fix this vulnerability?
A. Use Cloud IAP based on IP address or end-user device attributes to prevent and fix the vulnerability.
B. Set up an HTTPS load balancer, and then use Cloud Armor for the production environment to prevent the potential XSS attack.
C. Use Web Security Scanner to validate the usage of an outdated library in the code, and then use a secured version of the included library.
D. Use Web Security Scanner in staging to simulate an XSS injection attack, and then use a templating system that supports contextual auto-escaping.
You need to set up a Cloud Interconnect connection between your company's on-premises data center and VPC host network. You want to make sure that on-premises applications can only access Google APIs over the Cloud Interconnect and not through the public internet. You are required to only use APIs that are supported by VPC Service Controls to mitigate against exfiltration risk to non-supported APIs. How should you configure the network?
A. Enable Private Google Access on the regional subnets and global dynamic routing mode.
B. Set up a Private Service Connect endpoint IP address with the API bundle of “all-apis”, which is advertised as a route over the Cloud Interconnect connection.
C. Use private.googleapis.com to access Google APIs using a set of IP addresses only routable from within Google Cloud, which are advertised as routes over the connection.
D. Use restricted.googleapis.com to access Google APIs using a set of IP addresses only routable from within Google Cloud, which are advertised as routes over the Cloud Interconnect connection.
You work for a large organization where each business unit has thousands of users. You need to delegate management of access control permissions to each business unit. You have the following requirements: • Each business unit manages access controls for their own projects. • Each business unit manages access control permissions at scale. • Business units cannot access other business units’ projects. • Users lose their access if they move to a different business unit or leave the company. • Users and access control permissions are managed by the on-premises directory service. What should you do? (Choose two.)
A. Use VPC Service Controls to create perimeters around each business unit’s project.
B. Organize projects in folders, and assign permissions to Google groups at the folder level.
C. Group business units based on organizational units (OUs), and manage permissions based on OUs.
D. Create a project naming convention, and use Google’s IAM Conditions to manage access based on the prefix of project names.
E. Use Google Cloud Directory Sync to synchronize users and group memberships in Cloud Identity.
You are a security engineer at a finance company. Your organization plans to store data on Google Cloud, but your leadership team is worried about the security of their highly sensitive data. Specifically, your company is concerned about internal Google employees' ability to access your company's data on Google Cloud. What solution should you propose?
A. Use customer-managed encryption keys.
B. Use Google’s Identity and Access Management (IAM) service to manage access controls on Google Cloud.
C. Enable Admin activity logs to monitor access to resources.
D. Enable Access Transparency logs with Access Approval requests for Google employees.
You manage a BigQuery analytical data warehouse in your organization. You want to keep data for all your customers in a common table while you also restrict query access based on rows and columns permissions. Non-query operations should not be supported. What should you do? (Choose two.)
A. Create row-level access policies to restrict the result data when you run queries with the filter expression set to TRUE.
B. Configure column-level encryption by using Authenticated Encryption with Associated Data (AEAD) functions with Cloud Key Management Service (KMS) to control access to columns at query runtime.
C. Create row-level access policies to restrict the result data when you run queries with the filter expression set to FALSE.
D. Configure dynamic data masking rules to control access to columns at query runtime.
E. Create column-level policy tags to control access to columns at query runtime.
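Study note: if you have not used row-level security in BigQuery before, row access policies are created with DDL and evaluated at query time, while column access is typically governed with policy tags. A minimal sketch of the row-policy side, run through the google-cloud-bigquery client (project, dataset, table, and group names are hypothetical):

```python
from google.cloud import bigquery

client = bigquery.Client(project="analytics-prod")  # hypothetical project

ddl = """
CREATE ROW ACCESS POLICY emea_only
ON `analytics-prod.warehouse.customers`
GRANT TO ("group:emea-analysts@example.com")
FILTER USING (region = "EMEA")
"""

# Analysts in the granted group only see rows where the filter evaluates to TRUE.
client.query(ddl).result()
```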
A business unit at a multinational corporation signs up for GCP and starts moving workloads into GCP. The business unit creates a Cloud Identity domain with an organizational resource that has hundreds of projects. Your team becomes aware of this and wants to take over managing permissions and auditing the domain resources. Which type of access should your team grant to meet this requirement?
A. Organization Administrator
B. Security Reviewer
C. Organization Role Administrator
D. Organization Policy Administrator
Your security team wants to implement a defense-in-depth approach to protect sensitive data stored in a Cloud Storage bucket. Your team has the following requirements: • The Cloud Storage bucket in Project A can only be readable from Project B. • The Cloud Storage bucket in Project A cannot be accessed from outside the network. • Data in the Cloud Storage bucket cannot be copied to an external Cloud Storage bucket. What should the security team do?
A. Enable domain restricted sharing in an organization policy, and enable uniform bucket-level access on the Cloud Storage bucket.
B. Enable VPC Service Controls, create a perimeter around Projects A and B, and include the Cloud Storage API in the Service Perimeter configuration.
C. Enable Private Access in both Project A and B’s networks with strict firewall rules that allow communication between the networks.
D. Enable VPC Peering between Project A and B’s networks with strict firewall rules that allow communication between the networks.
Your company must follow industry-specific regulations. Therefore, you need to enforce customer-managed encryption keys (CMEK) for all new Cloud Storage resources in the organization called org1. What command should you execute?
A. • organization policy: constraints/gcp.restrictStorageNonCmekServices • binding at: org1 • policy type: allow • policy value: all supported services
B. • organization policy: constraints/gcp.restrictNonCmekServices • binding at: org1 • policy type: deny • policy value: storage.googleapis.com
C. • organization policy: constraints/gcp.restrictStorageNonCmekServices • binding at: org1 • policy type: deny • policy value: storage.googleapis.com
D. • organization policy: constraints/gcp.restrictNonCmekServices • binding at: org1 • policy type: allow • policy value: storage.googleapis.com
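Study note: constraints/gcp.restrictNonCmekServices is a list constraint whose denied values name the services that must use CMEK. The gcloud CLI is the usual way to set it, but the same policy can be expressed through the google-cloud-org-policy Python client, sketched below with a placeholder organization ID.

```python
from google.cloud import orgpolicy_v2

client = orgpolicy_v2.OrgPolicyClient()
org = "organizations/123456789012"  # placeholder organization ID for org1

policy = orgpolicy_v2.Policy(
    name=f"{org}/policies/gcp.restrictNonCmekServices",
    spec=orgpolicy_v2.PolicySpec(
        rules=[
            orgpolicy_v2.PolicySpec.PolicyRule(
                values=orgpolicy_v2.PolicySpec.PolicyRule.StringValues(
                    # Deny non-CMEK usage for Cloud Storage: new resources in this
                    # service must specify a customer-managed key.
                    denied_values=["storage.googleapis.com"]
                )
            )
        ]
    ),
)

client.create_policy(request={"parent": org, "policy": policy})
```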
You need to create a VPC that enables your security team to control network resources such as firewall rules. How should you configure the network to allow for separation of duties for network resources?
A. Set up multiple VPC networks, and set up multi-NIC virtual appliances to connect the networks.
B. Set up VPC Network Peering, and allow developers to peer their network with a Shared VPC.
C. Set up a VPC in a project. Assign the Compute Network Admin role to the security team, and assign the Compute Admin role to the developers.
D. Set up a Shared VPC where the security team manages the firewall rules, and share the network with developers via service projects.
Your organization's Google Cloud VMs are deployed via an instance template that configures them with a public IP address in order to host web services for external users. The VMs reside in a service project that is attached to a host (VPC) project containing one custom Shared VPC for the VMs. You have been asked to reduce the exposure of the VMs to the internet while continuing to service external users. You have already recreated the instance template without a public IP address configuration to launch the managed instance group (MIG). What should you do?
A. Deploy a Cloud NAT Gateway in the service project for the MIG.
B. Deploy a Cloud NAT Gateway in the host (VPC) project for the MIG.
C. Deploy an external HTTP(S) load balancer in the service project with the MIG as a backend.
D. Deploy an external HTTP(S) load balancer in the host (VPC) project with the MIG as a backend.
A service account key has been publicly exposed on multiple public code repositories. After reviewing the logs, you notice that the keys were used to generate short-lived credentials. You need to immediately remove access with the service account. What should you do?
A. Delete the compromised service account.
B. Disable the compromised service account key.
C. Wait until the service account credentials expire automatically.
D. Rotate the compromised service account key.
Your company recently published a security policy to minimize the usage of service account keys. On-premises Windows-based applications are interacting with Google Cloud APIs. You need to implement Workload Identity Federation (WIF) with your identity provider on-premises. What should you do?
A. Set up a workload identity pool with your corporate Active Directory Federation Service (ADFS). Configure a rule to let principals in the pool impersonate the Google Cloud service account.
B. Set up a workload identity pool with your corporate Active Directory Federation Service (ADFS). Let all principals in the pool impersonate the Google Cloud service account.
C. Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Configure a rule to let principals in the pool impersonate the Google Cloud service account.
D. Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Let all principals in the pool impersonate the Google Cloud service account.
Your organization's record data exists in Cloud Storage. You must retain all record data for at least seven years. This policy must be permanent. What should you do?
A. 1. Identify buckets with record data. 2. Apply a retention policy, and set it to retain for seven years. 3. Monitor the bucket by using log-based alerts to ensure that no modifications to the retention policy occur.
B. 1. Identify buckets with record data. 2. Apply a retention policy, and set it to retain for seven years. 3. Remove any Identity and Access Management (IAM) roles that contain the storage.buckets.update permission.
C. 1. Identify buckets with record data. 2. Enable the Bucket Policy Only setting to ensure that data is retained. 3. Enable Bucket Lock.
D. 1. Identify buckets with record data. 2. Apply a retention policy, and set it to retain for seven years. 3. Enable Bucket Lock.
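Study note: because retention policies and Bucket Lock come up repeatedly on the exam, here is a small sketch with the google-cloud-storage client. The bucket name is a placeholder; note that locking a retention policy is permanent and cannot be undone.

```python
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("records-archive")  # hypothetical bucket

# Set a seven-year retention period (expressed in seconds) on the bucket.
bucket.retention_period = 7 * 365 * 24 * 60 * 60
bucket.patch()

# Lock the policy: once locked, the retention period can never be reduced or removed.
bucket.lock_retention_policy()
```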
You want to evaluate your organization's Google Cloud instance for PCI compliance. You need to identify Google's inherent controls. Which document should you review to find the information?
A. Google Cloud Platform: Customer Responsibility Matrix
B. PCI DSS Requirements and Security Assessment Procedures
C. PCI SSC Cloud Computing Guidelines
D. Product documentation for Compute Engine
A large financial institution is moving its Big Data analytics to Google Cloud Platform. They want to have maximum control over the encryption process of data stored at rest in BigQuery. What technique should the institution use?
A. Use Cloud Storage as a federated Data Source.
B. Use a Cloud Hardware Security Module (Cloud HSM).
C. Customer-managed encryption keys (CMEK).
D. Customer-supplied encryption keys (CSEK).
You have numerous private virtual machines on Google Cloud. You occasionally need to manage the servers through Secure Shell (SSH) from a remote location. You want to configure remote access to the servers in a manner that optimizes security and cost efficiency. What should you do?
A. Create a site-to-site VPN from your corporate network to Google Cloud.
B. Configure server instances with public IP addresses. Create a firewall rule to only allow traffic from your corporate IPs.
C. Create a firewall rule to allow access from the Identity-Aware Proxy (IAP) IP range. Grant the role of an IAP-secured Tunnel User to the administrators.
D. Create a jump host instance with public IP. Manage the instances by connecting through the jump host.
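Study note: IAP TCP forwarding originates from Google's published 35.235.240.0/20 range, so one ingress rule plus the IAP-secured Tunnel User role lets administrators SSH to instances that have no public IPs. A rough sketch with the google-cloud-compute client; the project name and target tag are hypothetical.

```python
from google.cloud import compute_v1

firewall = compute_v1.Firewall(
    name="allow-ssh-from-iap",
    network="global/networks/default",
    direction="INGRESS",
    source_ranges=["35.235.240.0/20"],  # Google's published IAP TCP forwarding range
    allowed=[compute_v1.Allowed(I_p_protocol="tcp", ports=["22"])],
    target_tags=["ssh-via-iap"],  # hypothetical tag on the private VMs
)

operation = compute_v1.FirewallsClient().insert(
    project="my-project", firewall_resource=firewall
)
operation.result()  # wait for the rule to be created
```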
You are auditing all your Google Cloud resources in the production project. You want to identify all principals who can change firewall rules. What should you do?
A. Use Policy Analyzer to query the permissions compute.firewalls.get or compute.firewalls.list.
B. Use Firewall Insights to understand your firewall rules usage patterns.
C. Reference the Security Health Analytics – Firewall Vulnerability Findings in the Security Command Center.
D. Use Policy Analyzer to query the permissions compute.firewalls.create or compute.firewalls.update or compute.firewalls.delete.
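Study note: Policy Analyzer is surfaced through the Cloud Asset Inventory API, so the same query can be scripted. A sketch with the google-cloud-asset client, using a placeholder project as the scope:

```python
from google.cloud import asset_v1

client = asset_v1.AssetServiceClient()

query = asset_v1.IamPolicyAnalysisQuery(
    scope="projects/prod-project",  # hypothetical production project
    access_selector=asset_v1.IamPolicyAnalysisQuery.AccessSelector(
        permissions=[
            "compute.firewalls.create",
            "compute.firewalls.update",
            "compute.firewalls.delete",
        ]
    ),
)

response = client.analyze_iam_policy(
    request=asset_v1.AnalyzeIamPolicyRequest(analysis_query=query)
)

# Each result lists a binding (role plus members) that grants one of the permissions.
for result in response.main_analysis.analysis_results:
    print(result.iam_binding.role, list(result.iam_binding.members))
```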
Your organization wants full control of the keys used to encrypt data at rest in their Google Cloud environments. Keys must be generated and stored outside of Google and integrate with many Google Services including BigQuery. What should you do?
A. Use customer-supplied encryption keys (CSEK) with keys generated on trusted external systems. Provide the raw CSEK as part of the API call.
B. Create a KMS key that is stored on a Google managed FIPS 140-2 level 3 Hardware Security Module (HSM). Manage the Identity and Access Management (IAM) permissions settings, and set up the key rotation period.
C. Use Cloud External Key Management (EKM) that integrates with an external Hardware Security Module (HSM) system from supported vendors.
D. Create a Cloud Key Management Service (KMS) key with imported key material. Wrap the key for protection during import. Import the key generated on a trusted system in Cloud KMS.
You are migrating an on-premises data warehouse to BigQuery, Cloud SQL, and Cloud Storage. You need to configure security services in the data warehouse. Your company compliance policies mandate that the data warehouse must: • Protect data at rest with full lifecycle management on cryptographic keys. • Implement a separate key management provider from data management. • Provide visibility into all encryption key requests. What services should be included in the data warehouse implementation? (Choose two.)
A. Customer-managed encryption keys
B. Customer-Supplied Encryption Keys
C. Key Access Justifications
D. Access Transparency and Approval
E. Cloud External Key Manager
Your organization wants to protect all workloads that run on Compute Engine VMs to ensure that the instances weren't compromised by boot-level or kernel-level malware. Also, you need to ensure that data in use on the VM cannot be read by the underlying host system by using a hardware-based solution. What should you do?
A. 1. Use Google Shielded VM including secure boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring. 2. Create a Cloud Run function to check for the VM settings, generate metrics, and run the function regularly.
B. 1. Activate Virtual Machine Threat Detection in Security Command Center (SCC) Premium. 2. Monitor the findings in SCC.
C. 1. Use Google Shielded VM including secure boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring. 2. Activate Confidential Computing. 3. Enforce these actions by using organization policies.
D. 1. Use secure hardened images from the Google Cloud Marketplace. 2. When deploying the images, activate the Confidential Computing option. 3. Enforce the use of the correct images and Confidential Computing by using organization policies.
Your company plans to move most of its IT infrastructure to Google Cloud. They want to leverage their existing on-premises Active Directory as an identity provider for Google Cloud. Which two steps should you take to integrate the company's on-premises Active Directory with Google Cloud and configure access management? (Choose two.)
A. Use Identity Platform to provision users and groups to Google Cloud.
B. Use Cloud Identity SAML integration to provision users and groups to Google Cloud.
C. Install Google Cloud Directory Sync and connect it to Active Directory and Cloud Identity.
D. Create Identity and Access Management (IAM) roles with permissions corresponding to each Active Directory group.
E. Create Identity and Access Management (IAM) groups with permissions corresponding to each Active Directory group.
A customer deploys an application to App Engine and needs to check for Open Web Application Security Project (OWASP) vulnerabilities. Which service should be used to accomplish this?
A. Cloud Armor
B. Google Cloud Audit Logs
C. Web Security Scanner
D. Anomaly Detection
You are the project owner for a regulated workload that runs in a project you own and manage as an Identity and Access Management (IAM) admin. For an upcoming audit, you need to provide access reviews evidence. Which tool should you use?
A. Policy Troubleshooter
B. Policy Analyzer
C. IAM Recommender
D. Policy Simulator
Your organization wants to be General Data Protection Regulation (GDPR) compliant. You want to ensure that your DevOps teams can only create Google Cloud resources in the Europe regions. What should you do?
A. Use Identity-Aware Proxy (IAP) with Access Context Manager to restrict the location of Google Cloud resources.
B. Use the org policy constraint ‘Google Cloud Platform – Resource Location Restriction’ on your Google Cloud organization node.
C. Use the org policy constraint ‘Restrict Resource Service Usage’ on your Google Cloud organization node.
D. Use Identity and Access Management (IAM) custom roles to ensure that your DevOps team can only create resources in the Europe regions.
For compliance reporting purposes, the internal audit department needs you to provide the list of virtual machines (VMs) that have critical operating system (OS) security updates available, but not installed. You must provide this list every six months, and you want to perform this task quickly. What should you do?
A. Run a Security Command Center security scan on all VMs to extract a list of VMs with critical OS vulnerabilities every six months.
B. Run a gcloud CLI command from the Command Line Interface (CLI) to extract the VM’s OS version information every six months.
C. Ensure that the Cloud Logging agent is installed on all VMs, and extract the OS last update log date every six months.
D. Ensure the OS Config agent is installed on all VMs and extract the patch status dashboard every six months.
Your team needs to obtain a unified log view of all development cloud projects in your SIEM. The development projects are under the NONPROD organization folder with the test and pre-production projects. The development projects share the ABC-BILLING billing account with the rest of the organization. Which logging export strategy should you use to meet the requirements?
A. 1. Export logs to a Cloud Pub/Sub topic with folders/NONPROD parent and includeChildren property set to True in a dedicated SIEM project. 2. Subscribe SIEM to the topic.
B. 1. Create a Cloud Storage sink with billingAccounts/ABC-BILLING parent and includeChildren property set to False in a dedicated SIEM project. 2. Process Cloud Storage objects in SIEM.
C. 1. Export logs in each dev project to a Cloud Pub/Sub topic in a dedicated SIEM project. 2. Subscribe SIEM to the topic.
D. 1. Create a Cloud Storage sink with a publicly shared Cloud Storage bucket in each project. 2. Process Cloud Storage objects in SIEM.
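Study note: for a sense of what an aggregated, folder-level sink with includeChildren looks like programmatically, here is a sketch using the google-cloud-logging client. The folder ID, SIEM project, and topic name are placeholders; remember to grant the returned writer identity the Pub/Sub Publisher role on the topic.

```python
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import LogSink

client = ConfigServiceV2Client()

sink = LogSink(
    name="nonprod-dev-to-siem",
    destination="pubsub.googleapis.com/projects/siem-project/topics/dev-logs",
    include_children=True,  # pull in logs from every project under the folder
)

created = client.create_sink(
    request={
        "parent": "folders/123456789",  # placeholder NONPROD folder ID
        "sink": sink,
        "unique_writer_identity": True,
    }
)

# Grant this service account roles/pubsub.publisher on the destination topic.
print(created.writer_identity)
```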
You are using Security Command Center (SCC) to protect your workloads and receive alerts for suspected security breaches at your company. You need to detect cryptocurrency mining software. Which SCC service should you use?
A. Virtual Machine Threat Detection
B. Container Threat Detection
C. Rapid Vulnerability Detection
D. Web Security Scanner
A customer has an analytics workload running on Compute Engine that should have limited internet access. Your team created an egress firewall rule to deny (priority 1000) all traffic to the internet. The Compute Engine instances now need to reach out to the public repository to get security updates. What should your team do?
A. Create an egress firewall rule to allow traffic to the CIDR range of the repository with a priority greater than 1000.
B. Create an egress firewall rule to allow traffic to the CIDR range of the repository with a priority less than 1000.
C. Create an egress firewall rule to allow traffic to the hostname of the repository with a priority greater than 1000.
D. Create an egress firewall rule to allow traffic to the hostname of the repository with a priority less than 1000.
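Study note: the key details in this question are that lower priority numbers are evaluated first and that VPC firewall rules match CIDR ranges, not hostnames. A sketch of an egress allow rule that takes precedence over the priority-1000 deny (project, network, and repository range are placeholders):

```python
from google.cloud import compute_v1

allow_updates = compute_v1.Firewall(
    name="allow-egress-security-updates",
    network="global/networks/analytics-vpc",  # hypothetical network
    direction="EGRESS",
    priority=900,  # lower number = higher precedence, so this wins over the 1000 deny
    destination_ranges=["203.0.113.0/24"],  # placeholder CIDR of the package repository
    allowed=[compute_v1.Allowed(I_p_protocol="tcp", ports=["443"])],
)

compute_v1.FirewallsClient().insert(
    project="my-project", firewall_resource=allow_updates
).result()
```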
A customer implements Cloud Identity-Aware Proxy for their ERP system hosted on Compute Engine. Their security team wants to add a security layer so that the ERP systems only accept traffic from Cloud Identity-Aware Proxy. What should the customer do to meet these requirements?
A. Make sure that the ERP system can validate the JWT assertion in the HTTP requests.
B. Make sure that the ERP system can validate the identity headers in the HTTP requests.
C. Make sure that the ERP system can validate the x-forwarded-for headers in the HTTP requests.
D. Make sure that the ERP system can validate the user’s unique identifier headers in the HTTP requests.
Users are reporting an outage on your public-facing application that is hosted on Compute Engine. You suspect that a recent change to your firewall rules is responsible. You need to test whether your firewall rules are working properly. What should you do?
A. Enable Firewall Rules Logging on the latest rules that were changed. Use Logs Explorer to analyze whether the rules are working correctly.
B. Connect to a bastion host in your VPC. Use a network traffic analyzer to determine at which point your requests are being blocked.
C. In a pre-production environment, disable all firewall rules individually to determine which one is blocking user traffic.
D. Enable VPC Flow Logs in your VPC. Use Logs Explorer to analyze whether the rules are working correctly.
You are deploying a web application hosted on Compute Engine. A business requirement mandates that application logs are preserved for 12 years and data is kept within European boundaries. You want to implement a storage solution that minimizes overhead and is cost-effective. What should you do?
A. Create a Cloud Storage bucket to store your logs in the EUROPE-WEST1 region. Modify your application code to ship logs directly to your bucket for increased efficiency.
B. Configure your Compute Engine instances to use the Google Cloud’s operations suite Cloud Logging agent to send application logs to a custom log bucket in the EUROPE-WEST1 region with a custom retention of 12 years.
C. Use a Pub/Sub topic to forward your application logs to a Cloud Storage bucket in the EUROPE-WEST1 region.
D. Configure a custom retention policy of 12 years on your Google Cloud’s operations suite log bucket in the EUROPE-WEST1 region.
You need to enable VPC Service Controls and allow changes to perimeters in existing environments without preventing access to resources. Which VPC Service Controls mode should you use?
A. Cloud Run
B. Native
C. Enforced
D. Dry run
A company is backing up application logs to a Cloud Storage bucket shared with both analysts and the administrator. Analysts should only have access to logs that do not contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible by the administrator. What should you do?
A. Use Cloud Pub/Sub and Cloud Functions to trigger a Data Loss Prevention scan every time a file is uploaded to the shared bucket. If the scan detects PII, have the function move the file into a Cloud Storage bucket that is only accessible by the administrator.
B. Upload the logs to both the shared bucket and the bucket only accessible by the administrator. Create a job trigger using the Cloud Data Loss Prevention API. Configure the trigger to delete any files from the shared bucket that contain PII.
C. On the bucket shared with both the analysts and the administrator, configure Object Lifecycle Management to delete objects that contain any PII.
D. On the bucket shared with both the analysts and the administrator, configure a Cloud Storage Trigger that is only triggered when PII data is uploaded. Use Cloud Functions to capture the trigger and delete such files.
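Study note: the Pub/Sub and Cloud Functions approach in option A is usually a storage-triggered function that inspects the new object with the DLP API and relocates it if findings come back. The sketch below assumes hypothetical project and bucket names and a small, text-based log file; a production version would need to handle large files and scan quotas.

```python
from google.cloud import dlp_v2, storage

dlp = dlp_v2.DlpServiceClient()
gcs = storage.Client()

ADMIN_ONLY_BUCKET = "logs-with-pii-admin-only"  # hypothetical quarantine bucket


def scan_uploaded_log(event, context):
    """Entry point for a Cloud Storage 'finalize' trigger on the shared bucket."""
    source_bucket = gcs.bucket(event["bucket"])
    blob = source_bucket.blob(event["name"])
    sample = blob.download_as_bytes()[:500_000]  # inspect a bounded sample

    response = dlp.inspect_content(
        request={
            "parent": "projects/my-project/locations/global",  # hypothetical project
            "inspect_config": {
                "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PERSON_NAME"}],
                "min_likelihood": dlp_v2.Likelihood.LIKELY,
            },
            "item": {
                "byte_item": {
                    "type_": dlp_v2.ByteContentItem.BytesType.TEXT_UTF8,
                    "data": sample,
                }
            },
        }
    )

    if response.result.findings:
        # PII detected: move the object to the administrator-only bucket.
        source_bucket.copy_blob(blob, gcs.bucket(ADMIN_ONLY_BUCKET), event["name"])
        blob.delete()
```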
When creating a secure container image, which two items should you incorporate into the build if possible? (Choose two.)
A. Ensure that the app does not run as PID 1.
B. Package a single app as a container.
C. Remove any unnecessary tools not needed by the app.
D. Use public container images as a base image for the app.
E. Use many container image layers to hide sensitive information.
Your company's Chief Information Security Officer (CISO) creates a requirement that business data must be stored in specific locations due to regulatory requirements that affect the company's global expansion plans. After working on the details to implement this requirement, you determine the following: • The services in scope are included in the Google Cloud Data Residency Terms. • The business data remains within specific locations under the same organization. • The folder structure can contain multiple data residency locations. You plan to use the Resource Location Restriction organization policy constraint. At which level in the resource hierarchy should you set the constraint?
A. Folder
B. Resource
C. Project
D. Organization
You manage your organization's Security Operations Center (SOC). You currently monitor and detect network traffic anomalies in your Google Cloud VPCs based on packet header information. However, you want the capability to explore network flows and their payload to aid investigations. Which Google Cloud product should you use?
A. Marketplace IDS
B. VPC Flow Logs
C. VPC Service Controls logs
D. Packet Mirroring
E. Google Cloud Armor Deep Packet Inspection
Your organization is transitioning to Google Cloud. You want to ensure that only trusted container images are deployed on Google Kubernetes Engine (GKE) clusters in a project. The containers must be deployed from a centrally managed Container Registry and signed by a trusted authority. What should you do? (Choose two.)
A. Enable Container Threat Detection in the Security Command Center (SCC) for the project.
B. Configure the trusted image organization policy constraint for the project.
C. Create a custom organization policy constraint to enforce Binary Authorization for Google Kubernetes Engine (GKE).
D. Enable PodSecurity standards, and set them to Restricted.
E. Configure the Binary Authorization policy with respective attestations for the project.
You are onboarding new users into Cloud Identity and discover that some users have created consumer user accounts using the corporate domain name. How should you manage these consumer user accounts with Cloud Identity?
A. Use Google Cloud Directory Sync to convert the unmanaged user accounts.
B. Create a new managed user account for each consumer user account.
C. Use the transfer tool for unmanaged user accounts.
D. Configure single sign-on using a customer’s third-party provider.
How should a customer reliably deliver Stackdriver logs from GCP to their on-premises SIEM system?
A. Send all logs to the SIEM system via an existing protocol such as syslog.
B. Configure every project to export all their logs to a common BigQuery DataSet, which will be queried by the SIEM system.
C. Configure Organizational Log Sinks to export logs to a Cloud Pub/Sub Topic, which will be sent to the SIEM via Dataflow.
D. Build a connector for the SIEM to query for all logs in real time from the GCP RESTful JSON APIs.
You manage your organization's Security Operations Center (SOC). You currently monitor and detect network traffic anomalies in your VPCs based on network logs. However, you want to explore your environment using network payloads and headers. Which Google Cloud product should you use?
A. Cloud IDS
B. VPC Service Controls logs
C. VPC Flow Logs
D. Google Cloud Armor
E. Packet Mirroring
A company has redundant mail servers in different Google Cloud Platform regions and wants to route customers to the nearest mail server based on location. How should the company accomplish this?
A. Configure TCP Proxy Load Balancing as a global load balancing service listening on port 995.
B. Create a Network Load Balancer to listen on TCP port 995 with a forwarding rule to forward traffic based on location.
C. Use Cross-Region Load Balancing with an HTTP(S) load balancer to route traffic to the nearest region.
D. Use Cloud CDN to route the mail traffic to the closest origin mail server based on client IP address.
You manage a mission-critical workload for your organization, which is in a highly regulated industry. The workload uses Compute Engine VMs to analyze and process the sensitive data after it is uploaded to Cloud Storage from the endpoint computers. Your compliance team has detected that this workload does not meet the data protection requirements for sensitive data. You need to meet these requirements: • Manage the data encryption key (DEK) outside the Google Cloud boundary. • Maintain full control of encryption keys through a third-party provider. • Encrypt the sensitive data before uploading it to Cloud Storage. • Decrypt the sensitive data during processing in the Compute Engine VMs. • Encrypt the sensitive data in memory while in use in the Compute Engine VMs. What should you do? (Choose two.)
A. Configure Customer Managed Encryption Keys to encrypt the sensitive data before it is uploaded to Cloud Storage, and decrypt the sensitive data after it is downloaded into your VMs.
B. Configure Cloud External Key Manager to encrypt the sensitive data before it is uploaded to Cloud Storage, and decrypt the sensitive data after it is downloaded into your VMs.
C. Create Confidential VMs to access the sensitive data.
D. Migrate the Compute Engine VMs to Confidential VMs to access the sensitive data.
E. Create a VPC Service Controls service perimeter across your existing Compute Engine VMs and Cloud Storage buckets.
You need to audit the network segmentation for your Google Cloud footprint. You currently operate Production and Non-Production infrastructure-as-a-service (IaaS) environments. All your VM instances are deployed without any service account customization. After observing the traffic in your custom network, you notice that all instances can communicate freely, despite tag-based VPC firewall rules with a priority of 1000 in place to segment traffic properly. What are the most likely reasons for this behavior?
A. All VM instances are missing the respective network tags.
B. All VM instances are residing in the same network subnet.
C. All VM instances are configured with the same network route.
D. A VPC firewall rule is allowing traffic between source/targets based on the same service account with priority 999.
E. A VPC firewall rule is allowing traffic between source/targets based on the same service account with priority 1001.
Your company's cloud security policy dictates that VM instances should not have an external IP address. You need to identify the Google Cloud service that will allow VM instances without external IP addresses to connect to the internet to update the VMs. Which service should you use?
A. Identity Aware-Proxy
B. Cloud NAT
C. TCP/UDP Load Balancing
D. Cloud DNS
A company is running their webshop on Google Kubernetes Engine and wants to analyze customer transactions in BigQuery. You need to ensure that no credit card numbers are stored in BigQuery. What should you do?
A. Create a BigQuery view with regular expressions matching credit card numbers to query and delete affected rows.
B. Use the Cloud Data Loss Prevention API to redact related infoTypes before data is ingested into BigQuery.
C. Leverage Security Command Center to scan for the assets of type Credit Card Number in BigQuery.
D. Enable Cloud Identity-Aware Proxy to filter out credit card numbers before storing the logs in BigQuery.
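Study note: for a concrete picture of the DLP approach in option B, here is a minimal de-identification sketch with the google-cloud-dlp client that replaces detected card numbers with the infoType name before the record ever reaches BigQuery. The project ID and sample value are placeholders.

```python
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()

response = dlp.deidentify_content(
    request={
        "parent": "projects/my-project/locations/global",  # hypothetical project
        "inspect_config": {"info_types": [{"name": "CREDIT_CARD_NUMBER"}]},
        "deidentify_config": {
            "info_type_transformations": {
                "transformations": [
                    {"primitive_transformation": {"replace_with_info_type_config": {}}}
                ]
            }
        },
        "item": {"value": "Order 1001 paid with card 4111 1111 1111 1111"},
    }
)

# Prints: Order 1001 paid with card [CREDIT_CARD_NUMBER]
print(response.item.value)
```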
You are in charge of migrating a legacy application from your company datacenters to GCP before the current maintenance contract expires. You do not know what ports the application is using and no documentation is available for you to check. You want to complete the migration without putting your environment at risk. What should you do?
A. Migrate the application into an isolated project using a “Lift & Shift” approach. Enable all internal TCP traffic using VPC Firewall rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.
B. Migrate the application into an isolated project using a “Lift & Shift” approach in a custom network. Disable all traffic within the VPC and look at the Firewall logs to determine what traffic should be allowed for the application to work properly.
C. Refactor the application into a micro-services architecture in a GKE cluster. Disable all traffic from outside the cluster using Firewall Rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.
D. Refactor the application into a micro-services architecture hosted in Cloud Functions in an isolated project. Disable all traffic from outside your project using Firewall Rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.
You are deploying regulated workloads on Google Cloud. The regulation has data residency and data access requirements. It also requires that support is provided from the same geographical location as where the data resides. What should you do?
A. Enable Access Transparency Logging.
B. Deploy Assured Workloads.
C. Deploy resources only to regions permitted by data residency requirements.
D. Use Data Access logging and Access Transparency logging to confirm that no users are accessing data from another region.
Free Access Full Google Professional Cloud Security Engineer Practice Questions Free
Want more hands-on practice? Click here to access the full bank of Google Professional Cloud Security Engineer practice questions free and reinforce your understanding of all exam objectives.
We update our question sets regularly, so check back often for new and relevant content.
Good luck with your Google Professional Cloud Security Engineer certification journey!