Google Professional Cloud Security Engineer Mock Test Free – 50 Realistic Questions to Prepare with Confidence.
Getting ready for your Google Professional Cloud Security Engineer certification exam? Start your preparation the smart way with our Google Professional Cloud Security Engineer Mock Test Free – a carefully crafted set of 50 realistic, exam-style questions to help you practice effectively and boost your confidence.
Using a mock test free for Google Professional Cloud Security Engineer exam is one of the best ways to:
- Familiarize yourself with the actual exam format and question style
- Identify areas where you need more review
- Strengthen your time management and test-taking strategy
Below, you will find 50 free questions from our Google Professional Cloud Security Engineer Mock Test Free resource. These questions are structured to reflect the real exam’s difficulty and content areas, helping you assess your readiness accurately.
An organization is moving applications to Google Cloud while maintaining a few mission-critical applications on-premises. The organization must transfer the data at a bandwidth of at least 50 Gbps. What should they use to ensure secure continued connectivity between sites?
A. Dedicated Interconnect
B. Cloud Router
C. Cloud VPN
D. Partner Interconnect
You are auditing all your Google Cloud resources in the production project. You want to identify all principals who can change firewall rules. What should you do?
A. Use Policy Analyzer to query the permissions compute.firewalls.get or compute.firewalls.list.
B. Use Firewall Insights to understand your firewall rules usage patterns.
C. Reference the Security Health Analytics – Firewall Vulnerability Findings in the Security Command Center.
D. Use Policy Analyzer to query the permissions compute.firewalls.create or compute.firewalls.update or compute.firewalls.delete.
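To see the shape of the correct approach (option D), here is a hedged sketch of a Policy Analyzer query via the Cloud Asset API; the organization ID and exact permission list are placeholders, and the Cloud Asset API must be enabled first:

```shell
# List every principal that can create, update, or delete firewall rules
# anywhere in the organization. 123456789012 is a placeholder org ID.
gcloud asset analyze-iam-policy \
  --organization=123456789012 \
  --permissions='compute.firewalls.create,compute.firewalls.update,compute.firewalls.delete'
```

Note that compute.firewalls.get and compute.firewalls.list (option A) only reveal who can read rules, not who can change them.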
A customer has an analytics workload running on Compute Engine that should have limited internet access. Your team created an egress firewall rule to deny (priority 1000) all traffic to the internet. The Compute Engine instances now need to reach out to the public repository to get security updates. What should your team do?
A. Create an egress firewall rule to allow traffic to the CIDR range of the repository with a priority greater than 1000.
B. Create an egress firewall rule to allow traffic to the CIDR range of the repository with a priority less than 1000.
C. Create an egress firewall rule to allow traffic to the hostname of the repository with a priority greater than 1000.
D. Create an egress firewall rule to allow traffic to the hostname of the repository with a priority less than 1000.
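The key detail here is that a lower priority number means higher precedence, and that VPC firewall rules match IP ranges, not hostnames (ruling out options C and D). A minimal sketch of the allow rule, with the network name and repository CIDR as placeholders:

```shell
# Allow egress to the repository's address range; priority 900 is evaluated
# before the priority-1000 deny-all rule (lower number = higher precedence).
gcloud compute firewall-rules create allow-repo-egress \
  --network=prod-vpc \
  --direction=EGRESS \
  --action=ALLOW \
  --rules=tcp:443 \
  --destination-ranges=203.0.113.0/24 \
  --priority=900
```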
After completing a security vulnerability assessment, you learned that cloud administrators leave Google Cloud CLI sessions open for days. You need to reduce the risk of attackers who might exploit these open sessions by setting these sessions to the minimum duration. What should you do?
A. Set the session duration for the Google session control to one hour.
B. Set the reauthentication frequency for the Google Cloud Session Control to one hour.
C. Set the organization policy constraint constraints/iam.allowServiceAccountCredentialLifetimeExtension to one hour.
D. Set the organization policy constraint constraints/iam.serviceAccountKeyExpiryHours to one hour and inheritFromParent to false.
Which two security characteristics are related to the use of VPC peering to connect two VPC networks? (Choose two.)
A. Central management of routes, firewalls, and VPNs for peered networks
B. Non-transitive peering, where only directly peered networks can communicate
C. Ability to peer networks that belong to different Google Cloud organizations
D. Firewall rules that can be created with a tag from one peered network to another peered network
E. Ability to share specific subnets across peered networks
Your company has deployed an application on Compute Engine. The application is accessible by clients on port 587. You need to balance the load between the different instances running the application. The connection should be secured using TLS, and terminated by the Load Balancer. What type of Load Balancing should you use?
A. Network Load Balancing
B. HTTP(S) Load Balancing
C. TCP Proxy Load Balancing
D. SSL Proxy Load Balancing
You have numerous private virtual machines on Google Cloud. You occasionally need to manage the servers through Secure Socket Shell (SSH) from a remote location. You want to configure remote access to the servers in a manner that optimizes security and cost efficiency. What should you do?
A. Create a site-to-site VPN from your corporate network to Google Cloud.
B. Configure server instances with public IP addresses. Create a firewall rule to only allow traffic from your corporate IPs.
C. Create a firewall rule to allow access from the Identity-Aware Proxy (IAP) IP range. Grant the role of an IAP-secured Tunnel User to the administrators.
D. Create a jump host instance with public IP. Manage the instances by connecting through the jump host.
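For context on option C, a sketch of the IAP TCP-forwarding setup; the network, project, group, and VM names are placeholders:

```shell
# 1. Allow SSH ingress only from Google's published IAP forwarding range.
gcloud compute firewall-rules create allow-iap-ssh \
  --network=prod-vpc --direction=INGRESS --action=ALLOW \
  --rules=tcp:22 --source-ranges=35.235.240.0/20

# 2. Grant the tunnel role to the administrators.
gcloud projects add-iam-policy-binding my-project \
  --member=group:gce-admins@example.com \
  --role=roles/iap.tunnelResourceAccessor

# 3. Connect to a VM that has no public IP address.
gcloud compute ssh my-private-vm --zone=us-central1-a --tunnel-through-iap
```

This avoids both the cost of a VPN and the exposure of public IPs or jump hosts.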
Your company recently published a security policy to minimize the usage of service account keys. On-premises Windows-based applications are interacting with Google Cloud APIs. You need to implement Workload Identity Federation (WIF) with your identity provider on-premises. What should you do?
A. Set up a workload identity pool with your corporate Active Directory Federation Service (ADFS). Configure a rule to let principals in the pool impersonate the Google Cloud service account.
B. Set up a workload identity pool with your corporate Active Directory Federation Service (ADFS). Let all principals in the pool impersonate the Google Cloud service account.
C. Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Configure a rule to let principals in the pool impersonate the Google Cloud service account.
D. Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Let all principals in the pool impersonate the Google Cloud service account.
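A hedged sketch of the Workload Identity Federation flow behind option A; all names, the issuer URI, the attribute condition, and the project number are placeholders for illustration:

```shell
# Create the pool and an OIDC provider backed by the on-prem ADFS endpoint.
gcloud iam workload-identity-pools create on-prem-pool \
  --location=global --display-name="On-prem ADFS pool"

gcloud iam workload-identity-pools providers create-oidc adfs-provider \
  --location=global --workload-identity-pool=on-prem-pool \
  --issuer-uri="https://adfs.example.com/adfs" \
  --attribute-mapping="google.subject=assertion.sub" \
  --attribute-condition="assertion.sub == 'win-app-01'"

# Let only the matching principal impersonate the service account,
# rather than every principal in the pool.
gcloud iam service-accounts add-iam-policy-binding \
  app-sa@my-project.iam.gserviceaccount.com \
  --role=roles/iam.workloadIdentityUser \
  --member="principal://iam.googleapis.com/projects/123456789012/locations/global/workloadIdentityPools/on-prem-pool/subject/win-app-01"
```

The attribute condition and scoped member binding are what distinguish option A from the overly broad option B.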
Your company's Google Cloud organization has about 200 projects and 1,500 virtual machines. There is no uniform strategy for logs and events management, which reduces visibility for your security operations team. You need to design a logs management solution that provides visibility and allows the security team to view the environment's configuration. What should you do?
A. 1. Create a dedicated log sink for each project that is in scope. 2. Use a BigQuery dataset with time partitioning enabled as the destination of the log sinks. 3. Deploy alerts based on log metrics in every project. 4. Grant the role “Monitoring Viewer” to the security operations team in each project.
B. 1. Create one log sink at the organization level that includes all the child resources. 2. Use a Pub/Sub topic as the destination to ingest the logs into the on-premises security information and event management (SIEM) system, and ensure that the right team can access the SIEM. 3. Grant the Viewer role at the organization level to the security operations team.
C. 1. Enable network logs and data access logs for all resources in the “Production” folder. 2. Do not create log sinks to avoid unnecessary costs and latency. 3. Grant the roles “Logs Viewer” and “Browser” at the project level to the security operations team.
D. 1. Create one sink for the “Production” folder that includes child resources, and one sink for the logs ingested at the organization level that excludes child resources. 2. As the destination, use a log bucket with a minimum retention period of 90 days in a project that the security team can access. 3. Grant the security operations team the Security Reviewer role at the organization level.
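An organization-level aggregated sink (option B) can be sketched as follows; the org ID, project, topic, and filter are placeholders:

```shell
# Organization-level sink that includes all child folders and projects
# and streams audit logs to a Pub/Sub topic consumed by the SIEM.
gcloud logging sinks create org-audit-sink \
  pubsub.googleapis.com/projects/siem-project/topics/siem-export \
  --organization=123456789012 \
  --include-children \
  --log-filter='logName:"cloudaudit.googleapis.com"'
```

The sink's writer identity then needs the Pub/Sub Publisher role on the topic.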
Your application is deployed as a highly available, cross-region solution behind a global external HTTP(S) load balancer. You notice significant spikes in traffic from multiple IP addresses, but it is unknown whether the IPs are malicious. You are concerned about your application's availability. You want to limit traffic from these clients over a specified time interval. What should you do?
A. Configure a throttle action by using Google Cloud Armor to limit the number of requests per client over a specified time interval.
B. Configure a rate_based_ban action by using Google Cloud Armor and set the ban_duration_sec parameter to the specified time interval.
C. Configure a firewall rule in your VPC to throttle traffic from the identified IP addresses.
D. Configure a deny action by using Google Cloud Armor to deny the clients that issued too many requests over the specified time interval.
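A hedged sketch of the throttle action in option A; the policy name, thresholds, and rule priority are placeholders:

```shell
# Throttle each client IP to 500 requests per 60 seconds; requests over
# the threshold receive HTTP 429 instead of being outright banned.
gcloud compute security-policies rules create 1000 \
  --security-policy=edge-policy \
  --src-ip-ranges="*" \
  --action=throttle \
  --rate-limit-threshold-count=500 \
  --rate-limit-threshold-interval-sec=60 \
  --conform-action=allow \
  --exceed-action=deny-429 \
  --enforce-on-key=IP
```

Throttling limits traffic without fully blocking clients that may be legitimate, which fits the scenario better than a ban or a plain deny.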
Your organization's record data exists in Cloud Storage. You must retain all record data for at least seven years. This policy must be permanent. What should you do?
A. 1. Identify buckets with record data. 2. Apply a retention policy, and set it to retain for seven years. 3. Monitor the bucket by using log-based alerts to ensure that no modifications to the retention policy occur.
B. 1. Identify buckets with record data. 2. Apply a retention policy, and set it to retain for seven years. 3. Remove any Identity and Access Management (IAM) roles that contain the storage.buckets.update permission.
C. 1. Identify buckets with record data. 2. Enable Bucket Policy Only to ensure that data is retained. 3. Enable Bucket Lock.
D. 1. Identify buckets with record data. 2. Apply a retention policy, and set it to retain for seven years. 3. Enable Bucket Lock.
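The retention-plus-lock combination in option D maps to two gsutil commands (bucket name is a placeholder):

```shell
# Apply a 7-year retention policy, then lock it. Locking is permanent:
# the policy can never be removed, and the period can never be reduced.
gsutil retention set 7y gs://records-bucket
gsutil retention lock gs://records-bucket
```

Only Bucket Lock makes the policy truly irreversible; alerts or IAM changes (options A and B) could still be undone by an administrator.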
In an effort for your company's messaging app to comply with FIPS 140-2, a decision was made to use GCP compute and network services. The messaging app architecture includes a Managed Instance Group (MIG) that controls a cluster of Compute Engine instances. The instances use Local SSDs for data caching and UDP for instance-to-instance communications. The app development team is willing to make any changes necessary to comply with the standard. Which options should you recommend to meet the requirements?
A. Encrypt all cache storage and VM-to-VM communication using the BoringCrypto module.
B. Set Disk Encryption on the Instance Template used by the MIG to customer-managed key and use BoringSSL for all data transit between instances.
C. Change the app instance-to-instance communications from UDP to TCP and enable BoringSSL on clients’ TLS connections.
D. Set Disk Encryption on the Instance Template used by the MIG to Google-managed Key and use BoringSSL library on all instance-to-instance communications.
Your organization has implemented synchronization and SAML federation between Cloud Identity and Microsoft Active Directory. You want to reduce the risk of Google Cloud user accounts being compromised. What should you do?
A. Create a Cloud Identity password policy with strong password settings, and configure 2-Step Verification with security keys in the Google Admin console.
B. Create a Cloud Identity password policy with strong password settings, and configure 2-Step Verification with verification codes via text or phone call in the Google Admin console.
C. Create an Active Directory domain password policy with strong password settings, and configure post-SSO (single sign-on) 2-Step Verification with security keys in the Google Admin console.
D. Create an Active Directory domain password policy with strong password settings, and configure post-SSO (single sign-on) 2-Step Verification with verification codes via text or phone call in the Google Admin console.
You are a consultant for an organization that is considering migrating their data from its private cloud to Google Cloud. The organization's compliance team is not familiar with Google Cloud and needs guidance on how compliance requirements will be met on Google Cloud. One specific compliance requirement is for customer data at rest to reside within specific geographic boundaries. Which option should you recommend for the organization to meet their data residency requirements on Google Cloud?
A. Organization Policy Service constraints
B. Shielded VM instances
C. Access control lists
D. Geolocation access controls
E. Google Cloud Armor
Your company runs a website that will store PII on Google Cloud Platform. To comply with data privacy regulations, this data can only be stored for a specific amount of time and must be fully deleted after this specific period. Data that has not yet reached the time period should not be deleted. You want to automate the process of complying with this regulation. What should you do?
A. Store the data in a single Persistent Disk, and delete the disk at expiration time.
B. Store the data in a single BigQuery table and set the appropriate table expiration time.
C. Store the data in a single Cloud Storage bucket and configure the bucket’s Time to Live.
D. Store the data in a single BigTable table and set an expiration time on the column families.
You are responsible for protecting highly sensitive data in BigQuery. Your operations teams need access to this data, but given privacy regulations, you want to ensure that they cannot read the sensitive fields such as email addresses and first names. These specific sensitive fields should only be available on a need-to-know basis to the Human Resources team. What should you do?
A. Perform data masking with the Cloud Data Loss Prevention API, and store that data in BigQuery for later use.
B. Perform data redaction with the Cloud Data Loss Prevention API, and store that data in BigQuery for later use.
C. Perform data inspection with the Cloud Data Loss Prevention API, and store that data in BigQuery for later use.
D. Perform tokenization for Pseudonymization with the Cloud Data Loss Prevention API, and store that data in BigQuery for later use.
You work for an organization in a regulated industry that has strict data protection requirements. The organization backs up their data in the cloud. To comply with data privacy regulations, this data can only be stored for a specific length of time and must be deleted after this specific period. You want to automate the compliance with this regulation while minimizing storage costs. What should you do?
A. Store the data in a persistent disk, and delete the disk at expiration time.
B. Store the data in a Cloud Bigtable table, and set an expiration time on the column families.
C. Store the data in a BigQuery table, and set the table’s expiration time.
D. Store the data in a Cloud Storage bucket, and configure the bucket’s Object Lifecycle Management feature.
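Option D's Object Lifecycle Management rule can be sketched as a small JSON config (the 365-day age and bucket name are placeholders):

```shell
# Delete objects automatically once they exceed the allowed age.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "Delete"}, "condition": {"age": 365}}
  ]
}
EOF
gsutil lifecycle set lifecycle.json gs://backup-bucket
```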
An organization's security and risk management teams are concerned about where their responsibility lies for certain production workloads they are running in Google Cloud, and where Google's responsibility lies. They mostly run workloads using Google Cloud's platform-as-a-service (PaaS) offerings, primarily App Engine. Which area in the technology stack should they focus on as their primary responsibility when using App Engine?
A. Configuring and monitoring VPC Flow Logs
B. Defending against XSS and SQLi attacks
C. Managing the latest updates and security patches for the Guest OS
D. Encrypting all stored data
For compliance reasons, an organization needs to ensure that in-scope PCI Kubernetes Pods reside on `in-scope` Nodes only. These Nodes can only contain the `in-scope` Pods. How should the organization achieve this objective?
A. Add a nodeSelector field to the pod configuration to only use the Nodes labeled inscope: true.
B. Create a node pool with the label inscope: true and a Pod Security Policy that only allows the Pods to run on Nodes with that label.
C. Place a taint on the Nodes with the label inscope: true and effect NoSchedule and a toleration to match in the Pod configuration.
D. Run all in-scope Pods in the namespace “in-scope-pci”.
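For illustration, a dedicated node pool combining a label and a taint could be created like this; the cluster, pool, and label values are placeholders:

```shell
# Dedicated node pool for PCI workloads: the label lets in-scope Pods
# select these Nodes, and the taint keeps all other Pods off them.
gcloud container node-pools create inscope-pool \
  --cluster=pci-cluster \
  --node-labels=inscope=true \
  --node-taints=inscope=true:NoSchedule
```

In-scope Pods would then need both a nodeSelector matching the label and a toleration matching the taint, giving the bidirectional isolation the question asks for.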
Your company’s users access data in a BigQuery table. You want to ensure they can only access the data during working hours. What should you do?
A. Assign a BigQuery Data Viewer role along with an IAM condition that limits the access to specified working hours.
B. Run a gsutil script that assigns a BigQuery Data Viewer role, and remove it only during the specified working hours.
C. Assign a BigQuery Data Viewer role to a service account that adds and removes the users daily during the specified working hours.
D. Configure Cloud Scheduler so that it triggers a Cloud Functions instance that modifies the organizational policy constraint for BigQuery during the specified working hours.
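The IAM condition in option A could be expressed with a CEL time expression; the project, group, time zone, and hours below are placeholder assumptions:

```shell
# Grant BigQuery Data Viewer only between 09:00 and 17:00 local time.
gcloud projects add-iam-policy-binding my-project \
  --member=group:analysts@example.com \
  --role=roles/bigquery.dataViewer \
  --condition='title=working-hours,expression=request.time.getHours("America/New_York") >= 9 && request.time.getHours("America/New_York") < 17'
```

The condition is evaluated on every request, so no scheduled jobs or scripts are needed.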
A customer wants to make it convenient for their mobile workforce to access a CRM web interface that is hosted on Google Cloud Platform (GCP). The CRM can only be accessed by someone on the corporate network. The customer wants to make it available over the internet. Your team requires an authentication layer in front of the application that supports two-factor authentication. Which GCP product should the customer implement to meet these requirements?
A. Cloud Identity-Aware Proxy
B. Cloud Armor
C. Cloud Endpoints
D. Cloud VPN
For data residency requirements, you want your secrets in Google Cloud's Secret Manager to have payloads only in europe-west1 and europe-west4. Your secrets must be highly available in both regions. What should you do?
A. Create your secret with a user managed replication policy, and choose only compliant locations.
B. Create your secret with an automatic replication policy, and choose only compliant locations.
C. Create two secrets by using Terraform, one in europe-west1 and the other in europe-west4.
D. Create your secret with an automatic replication policy, and create an organizational policy to deny secret creation in non-compliant locations.
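A user-managed replication policy (option A) pins the secret payload to the compliant regions at creation time; the secret name is a placeholder:

```shell
# Payloads exist only in the two listed regions, and Secret Manager
# keeps the secret available in both.
gcloud secrets create payment-api-key \
  --replication-policy=user-managed \
  --locations=europe-west1,europe-west4
```

An automatic replication policy (options B and D) does not let you choose locations, which is why it cannot satisfy the residency requirement.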
Your company must follow industry specific regulations. Therefore, you need to enforce customer-managed encryption keys (CMEK) for all new Cloud Storage resources in the organization called org1. What command should you execute?
A. • organization policy: constraints/gcp.restrictStorageNonCmekServices • binding at: org1 • policy type: allow • policy value: all supported services
B. • organization policy: constraints/gcp.restrictNonCmekServices • binding at: org1 • policy type: deny • policy value: storage.googleapis.com
C. • organization policy: constraints/gcp.restrictStorageNonCmekServices • binding at: org1 • policy type: deny • policy value: storage.googleapis.com
D. • organization policy: constraints/gcp.restrictNonCmekServices • binding at: org1 • policy type: allow • policy value: storage.googleapis.com
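Option B corresponds to denying the Cloud Storage service under the gcp.restrictNonCmekServices list constraint. A hedged sketch using the legacy gcloud syntax, with a placeholder org ID:

```shell
# Services on the denied list must use CMEK; new non-CMEK Cloud Storage
# resources are then rejected across the organization.
gcloud resource-manager org-policies deny \
  gcp.restrictNonCmekServices storage.googleapis.com \
  --organization=123456789012
```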
A customer's internal security team must manage its own encryption keys for encrypting data on Cloud Storage and decides to use customer-supplied encryption keys (CSEK). How should the team complete this task?
A. Upload the encryption key to a Cloud Storage bucket, and then upload the object to the same bucket.
B. Use the gsutil command line tool to upload the object to Cloud Storage, and specify the location of the encryption key.
C. Generate an encryption key in the Google Cloud Platform Console, and upload an object to Cloud Storage using the specified key.
D. Encrypt the object, then use the gsutil command line tool or the Google Cloud Platform Console to upload the object to Cloud Storage.
Your company operates an application instance group that is currently deployed behind a Google Cloud load balancer in us-central1 and is configured to use the Standard Tier network. The infrastructure team wants to expand to a second Google Cloud region, us-east1. You need to set up a single external IP address to distribute new requests to the instance groups in both regions. What should you do?
A. Change the load balancer backend configuration to use network endpoint groups instead of instance groups.
B. Change the load balancer frontend configuration to use the Premium Tier network, and add the new instance group.
C. Create a new load balancer in us-east-2 using the Standard Tier network, and assign a static external IP address.
D. Create a Cloud VPN connection between the two regions, and enable Google Private Access.
You are implementing data protection by design and in accordance with GDPR requirements. As part of design reviews, you are told that you need to manage the encryption key for a solution that includes workloads for Compute Engine, Google Kubernetes Engine, Cloud Storage, BigQuery, and Pub/Sub. Which option should you choose for this implementation?
A. Cloud External Key Manager
B. Customer-managed encryption keys
C. Customer-supplied encryption keys
D. Google default encryption
Your team needs to make sure that a Compute Engine instance does not have access to the internet or to any Google APIs or services. Which two settings must remain disabled to meet these requirements? (Choose two.)
A. Public IP
B. IP Forwarding
C. Private Google Access
D. Static routes
E. IAM Network User Role
A manager wants to start retaining security event logs for 2 years while minimizing costs. You write a filter to select the appropriate log entries. Where should you export the logs?
A. BigQuery datasets
B. Cloud Storage buckets
C. StackDriver logging
D. Cloud Pub/Sub topics
You are tasked with exporting and auditing security logs for login activity events for the Google Cloud console and API calls that modify configurations of Google Cloud resources. Your export must meet the following requirements:
- Export related logs for all projects in the Google Cloud organization.
- Export logs in near real-time to an external SIEM.
What should you do? (Choose two.)
A. Create a Log Sink at the organization level with a Pub/Sub destination.
B. Create a Log Sink at the organization level with the includeChildren parameter, and set the destination to a Pub/Sub topic.
C. Enable Data Access audit logs at the organization level to apply to all projects.
D. Enable Google Workspace audit logs to be shared with Google Cloud in the Admin Console.
E. Ensure that the SIEM processes the AuthenticationInfo field in the audit log entry to gather identity information.
You need to enforce a security policy in your Google Cloud organization that prevents users from exposing objects in their buckets externally. There are currently no buckets in your organization. Which solution should you implement proactively to achieve this goal with the least operational overhead?
A. Create an hourly cron job to run a Cloud Function that finds public buckets and makes them private.
B. Enable the constraints/storage.publicAccessPrevention constraint at the organization level.
C. Enable the constraints/storage.uniformBucketLevelAccess constraint at the organization level.
D. Create a VPC Service Controls perimeter that protects the storage.googleapis.com service in your projects that contains buckets. Add any new project that contains a bucket to the perimeter.
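Enforcing the boolean constraint in option B is a one-time action at the organization level; the org ID is a placeholder:

```shell
# Enforce public access prevention for every current and future bucket
# in the organization.
gcloud resource-manager org-policies enable-enforce \
  storage.publicAccessPrevention \
  --organization=123456789012
```

Because the constraint applies to buckets created later, it works proactively with no ongoing operational effort, unlike the cron-job approach in option A.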
You need to create a VPC that enables your security team to control network resources such as firewall rules. How should you configure the network to allow for separation of duties for network resources?
A. Set up multiple VPC networks, and set up multi-NIC virtual appliances to connect the networks.
B. Set up VPC Network Peering, and allow developers to peer their network with a Shared VPC.
C. Set up a VPC in a project. Assign the Compute Network Admin role to the security team, and assign the Compute Admin role to the developers.
D. Set up a Shared VPC where the security team manages the firewall rules, and share the network with developers via service projects.
An organization is migrating from their current on-premises productivity software systems to G Suite. Some network security controls were in place that were mandated by a regulatory body in their region for their previous on-premises system. The organization's risk team wants to ensure that network security controls are maintained and effective in G Suite. A security architect supporting this migration has been asked to ensure that network security controls are in place as part of the new shared responsibility model between the organization and Google Cloud. What solution would help meet the requirements?
A. Ensure that firewall rules are in place to meet the required controls.
B. Set up Cloud Armor to ensure that network security controls can be managed for G Suite.
C. Network security is built in and is Google Cloud’s responsibility for SaaS products like G Suite.
D. Set up an array of Virtual Private Cloud (VPC) networks to control network security as mandated by the relevant regulation.
A customer is running an analytics workload on Google Cloud Platform (GCP) where Compute Engine instances are accessing data stored on Cloud Storage. Your team wants to make sure that this workload will not be able to access, or be accessed from, the internet. Which two strategies should your team use to meet these requirements? (Choose two.)
A. Configure Private Google Access on the Compute Engine subnet
B. Avoid assigning public IP addresses to the Compute Engine cluster.
C. Make sure that the Compute Engine cluster is running on a separate subnet.
D. Turn off IP forwarding on the Compute Engine instances in the cluster.
E. Configure a Cloud NAT gateway.
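Private Google Access (option A) is enabled per subnet, letting instances without external IPs (option B) still reach Cloud Storage APIs; the subnet name and region are placeholders:

```shell
# Allow instances with only internal IPs on this subnet to reach
# Google APIs such as Cloud Storage.
gcloud compute networks subnets update analytics-subnet \
  --region=us-central1 \
  --enable-private-ip-google-access
```

A Cloud NAT gateway (option E) would reintroduce outbound internet access, which the scenario explicitly forbids.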
A company migrated their entire data center to Google Cloud Platform. It is running thousands of instances across multiple projects managed by different departments. You want to have a historical record of what was running in Google Cloud Platform at any point in time. What should you do?
A. Use Resource Manager on the organization level.
B. Use Forseti Security to automate inventory snapshots.
C. Use Stackdriver to create a dashboard across all projects.
D. Use Security Command Center to view all assets across the organization.
As part of your organization's zero trust strategy, you use Identity-Aware Proxy (IAP) to protect multiple applications. You need to ingest logs into a Security Information and Event Management (SIEM) system so that you are alerted to possible intrusions. Which logs should you analyze?
A. Data Access audit logs
B. Policy Denied audit logs
C. Cloud Identity user log events
D. Admin Activity audit logs
A customer's company has multiple business units. Each business unit operates independently, and each has their own engineering group. Your team wants visibility into all projects created within the company and wants to organize their Google Cloud Platform (GCP) projects based on different business units. Each business unit also requires separate sets of IAM permissions. Which strategy should you use to meet these needs?
A. Create an organization node, and assign folders for each business unit.
B. Establish standalone projects for each business unit, using gmail.com accounts.
C. Assign GCP resources in a project, with a label identifying which business unit owns the resource.
D. Assign GCP resources in a VPC for each business unit to separate network access.
Your company is using Cloud Dataproc for its Spark and Hadoop jobs. You want to be able to create, rotate, and destroy symmetric encryption keys used for the persistent disks used by Cloud Dataproc. Keys can be stored in the cloud. What should you do?
A. Use the Cloud Key Management Service to manage the data encryption key (DEK).
B. Use the Cloud Key Management Service to manage the key encryption key (KEK).
C. Use customer-supplied encryption keys to manage the data encryption key (DEK).
D. Use customer-supplied encryption keys to manage the key encryption key (KEK).
You are exporting application logs to Cloud Storage. You encounter an error message that the log sinks don't support uniform bucket-level access policies. How should you resolve this error?
A. Change the access control model for the bucket
B. Update your sink with the correct bucket destination.
C. Add the roles/logging.logWriter Identity and Access Management (IAM) role to the bucket for the log sink identity.
D. Add the roles/logging.bucketWriter Identity and Access Management (IAM) role to the bucket for the log sink identity.
An organization is evaluating the use of Google Cloud Platform (GCP) for certain IT workloads. A well-established directory service is used to manage user identities and lifecycle management. The organization must continue to use this directory service as the `source of truth` directory for identities. Which solution meets the organization's requirements?
A. Google Cloud Directory Sync (GCDS)
B. Cloud Identity
C. Security Assertion Markup Language (SAML)
D. Pub/Sub
You are working with a client who plans to migrate their data to Google Cloud. You are responsible for recommending an encryption service to manage their encryption keys. You have the following requirements:
- The master key must be rotated at least once every 45 days.
- The solution that stores the master key must be FIPS 140-2 Level 3 validated.
- The master key must be stored in multiple regions within the US for redundancy.
Which solution meets these requirements?
A. Customer-managed encryption keys with Cloud Key Management Service
B. Customer-managed encryption keys with Cloud HSM
C. Customer-supplied encryption keys
D. Google-managed encryption keys
You manage one of your organization's Google Cloud projects (Project A). A VPC Service Control (SC) perimeter is blocking API access requests to this project, including Pub/Sub. A resource running under a service account in another project (Project B) needs to collect messages from a Pub/Sub topic in your project. Project B is not included in a VPC SC perimeter. You need to provide access from Project B to the Pub/Sub topic in Project A using the principle of least privilege. What should you do?
A. Configure an ingress policy for the perimeter in Project A, and allow access for the service account in Project B to collect messages.
B. Create an access level that allows a developer in Project B to subscribe to the Pub/Sub topic that is located in Project A.
C. Create a perimeter bridge between Project A and Project B to allow the required communication between both projects.
D. Remove the Pub/Sub API from the list of restricted services in the perimeter configuration for Project A.
Your organization wants to protect all workloads that run on Compute Engine VMs to ensure that the instances aren't compromised by boot-level or kernel-level malware. You also need to ensure that data in use on the VM cannot be read by the underlying host system, by using a hardware-based solution. What should you do?
A. 1. Use Google Shielded VM, including Secure Boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring. 2. Create a Cloud Run function to check the VM settings, generate metrics, and run the function regularly.
B. 1. Activate Virtual Machine Threat Detection in Security Command Center (SCC) Premium. 2. Monitor the findings in SCC.
C. 1. Use Google Shielded VM, including Secure Boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring. 2. Activate Confidential Computing. 3. Enforce these actions by using organization policies.
D. 1. Use secure hardened images from the Google Cloud Marketplace. 2. When deploying the images, activate the Confidential Computing option. 3. Enforce the use of the correct images and Confidential Computing by using organization policies.
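The Shielded VM and Confidential Computing features in option C map to instance creation flags; the names, zone, and machine type below are placeholder assumptions (Confidential VMs require a supported machine family such as N2D):

```shell
# Shielded VM features defend against boot- and kernel-level malware;
# Confidential Computing encrypts memory in use via hardware (AMD SEV).
gcloud compute instances create secure-vm \
  --zone=us-central1-a \
  --machine-type=n2d-standard-2 \
  --shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring \
  --confidential-compute
```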
Your DevOps team uses Packer to build Compute Engine images by using this process: 1. Create an ephemeral Compute Engine VM. 2. Copy a binary from a Cloud Storage bucket to the VM's file system. 3. Update the VM's package manager. 4. Install external packages from the internet onto the VM. Your security team just enabled the organizational policy, constraints/compute.vmExternalIpAccess, to restrict the usage of public IP addresses on VMs. In response, your DevOps team updated their scripts to remove public IP addresses on the Compute Engine VMs; however, the build pipeline is failing due to connectivity issues. What should you do? (Choose two.)
A. Provision an HTTP load balancer with the VM in an unmanaged instance group to allow inbound connections from the internet to your VM.
B. Provision a Cloud NAT instance in the same VPC and region as the Compute Engine VM.
C. Enable Private Google Access on the subnet that the Compute Engine VM is deployed within.
D. Update the VPC routes to allow traffic to and from the internet.
E. Provision a Cloud VPN tunnel in the same VPC and region as the Compute Engine VM.
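To reason about this question, it helps to separate the two kinds of egress the build needs once the VMs lose their public IPs. The following sketch (hostnames and labels are illustrative, not part of the original question) captures the split: Private Google Access covers traffic to Google APIs such as Cloud Storage, while Cloud NAT covers general internet egress such as public package mirrors.

```python
def egress_path(destination: str) -> str:
    """Return the mechanism that lets a VM without a public IP reach `destination`.

    A simplified model: Private Google Access handles Google API endpoints,
    and Cloud NAT handles everything else on the public internet.
    """
    # A few representative Google API hostnames (not an exhaustive list).
    google_api_hosts = {"storage.googleapis.com", "compute.googleapis.com"}
    if destination in google_api_hosts:
        # Private Google Access routes Google API traffic over Google's
        # network from VMs that have only internal IP addresses.
        return "Private Google Access"
    # Anything else (e.g. a public package mirror) needs Cloud NAT for egress.
    return "Cloud NAT"

# Step 2 of the pipeline (copy a binary from Cloud Storage):
print(egress_path("storage.googleapis.com"))  # Private Google Access
# Step 4 (install external packages from the internet):
print(egress_path("deb.debian.org"))          # Cloud NAT
```

This is why two answers are required: neither mechanism alone covers both steps of the build.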
You need to set up a Cloud Interconnect connection between your company’s on-premises data center and VPC host network. You want to make sure that on-premises applications can only access Google APIs over the Cloud Interconnect and not through the public internet. You are required to only use APIs that are supported by VPC Service Controls to mitigate against exfiltration risk to non-supported APIs. How should you configure the network?
A. Enable Private Google Access on the regional subnets and global dynamic routing mode.
B. Create a CNAME to map *.googleapis.com to restricted.googleapis.com, and create A records for restricted.googleapis.com mapped to 199.36.153.8/30.
C. Use private.googleapis.com to access Google APIs using a set of IP addresses only routable from within Google Cloud, which are advertised as routes over the connection.
D. Use restricted.googleapis.com to access Google APIs using a set of IP addresses only routable from within Google Cloud, which are advertised as routes over the Cloud Interconnect connection.
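For context on the private vs. restricted distinction, the two special domains resolve to small virtual IP ranges that Google documents as 199.36.153.8/30 (private.googleapis.com, all APIs) and 199.36.153.4/30 (restricted.googleapis.com, only APIs that support VPC Service Controls). A quick check with the standard library illustrates why these ranges must be advertised over the Interconnect:

```python
import ipaddress

# Documented VIP ranges for reaching Google APIs from hosts without
# public internet routes:
#   restricted.googleapis.com -> 199.36.153.4/30 (VPC Service Controls-supported APIs only)
#   private.googleapis.com    -> 199.36.153.8/30 (all Google APIs)
RESTRICTED = ipaddress.ip_network("199.36.153.4/30")
PRIVATE = ipaddress.ip_network("199.36.153.8/30")

# Each /30 is a distinct block of four addresses. These ranges are not
# announced on the public internet, so on-premises routers only learn them
# via routes advertised over the Cloud Interconnect connection.
assert RESTRICTED.num_addresses == 4
assert PRIVATE.num_addresses == 4
assert not RESTRICTED.overlaps(PRIVATE)
```

Because the exfiltration requirement limits access to VPC Service Controls-supported APIs, the restricted range is the relevant one here.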
You are a Cloud Identity administrator for your organization. In your Google Cloud environment, groups are used to manage user permissions. Each application team has a dedicated group. Your team is responsible for creating these groups, and the application teams can manage the team members on their own through the Google Cloud console. You must ensure that the application teams can only add users from within your organization to their groups. What should you do?
A. Change the configuration of the relevant groups in the Google Workspace Admin console to prevent external users from being added to the group.
B. Set an Identity and Access Management (IAM) policy that includes a condition that restricts group membership to user principals that belong to your organization.
C. Define an Identity and Access Management (IAM) deny policy that denies the assignment of principals that are outside your organization to the groups in scope.
D. Export the Cloud Identity logs to BigQuery. Configure an alert for external members added to groups. Have the alert trigger a Cloud Function instance that removes the external members from the group.
A large e-retailer is moving its e-commerce website to Google Cloud Platform. The company wants to ensure payment information is encrypted between the customer's browser and GCP when customers check out online. What should they do?
A. Configure an SSL Certificate on an L7 Load Balancer and require encryption.
B. Configure an SSL Certificate on a Network TCP Load Balancer and require encryption.
C. Configure the firewall to allow inbound traffic on port 443, and block all other inbound traffic.
D. Configure the firewall to allow outbound traffic on port 443, and block all other outbound traffic.
You are running applications outside Google Cloud that need access to Google Cloud resources. You are using workload identity federation to grant external identities Identity and Access Management (IAM) roles to eliminate the maintenance and security burden associated with service account keys. You must protect against attempts to spoof another user's identity and gain unauthorized access to Google Cloud resources. What should you do? (Choose two.)
A. Enable data access logs for IAM APIs.
B. Limit the number of external identities that can impersonate a service account.
C. Use a dedicated project to manage workload identity pools and providers.
D. Use immutable attributes in attribute mappings.
E. Limit the resources that a service account can access.
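The spoofing risk in this question hinges on how external identities are mapped to Google principals. A workload identity pool provider uses an attribute mapping; the sketch below (the `assertion.repository` claim is just an illustrative example from a GitHub-style OIDC token) shows why the mapping should key on an immutable claim such as the OIDC `sub` rather than a reassignable one such as an email address:

```python
# Illustrative attribute mapping for a workload identity pool provider,
# in the CEL-style form Google Cloud uses. The key point: map
# google.subject to an immutable claim so a re-issued identity cannot
# impersonate the original principal.
attribute_mapping = {
    "google.subject": "assertion.sub",               # immutable subject identifier
    "attribute.repository": "assertion.repository",  # example custom attribute
}

# A mapping keyed on a mutable attribute is the anti-pattern: if the email
# address is later reassigned to another user, the new owner inherits the
# original principal's access.
bad_mapping = {"google.subject": "assertion.email"}  # avoid: emails can change hands

assert attribute_mapping["google.subject"] == "assertion.sub"
```

Combined with limiting which external identities may impersonate a service account, this closes the main impersonation paths the question describes.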
You are deploying regulated workloads on Google Cloud. The regulation has data residency and data access requirements. It also requires that support is provided from the same geographical location as where the data resides. What should you do?
A. Enable Access Transparency Logging.
B. Deploy Assured Workloads.
C. Deploy resources only to regions permitted by data residency requirements.
D. Use Data Access logging and Access Transparency logging to confirm that no users are accessing data from another region.
You are onboarding new users into Cloud Identity and discover that some users have created consumer user accounts using the corporate domain name. How should you manage these consumer user accounts with Cloud Identity?
A. Use Google Cloud Directory Sync to convert the unmanaged user accounts.
B. Create a new managed user account for each consumer user account.
C. Use the transfer tool for unmanaged user accounts.
D. Configure single sign-on using a customer’s third-party provider.
Your company conducts clinical trials and needs to analyze the results of a recent study that are stored in BigQuery. The interval when the medicine was taken contains start and stop dates. The interval data is critical to the analysis, but specific dates may identify a particular batch and introduce bias. You need to obfuscate the start and end dates for each row and preserve the interval data. What should you do?
A. Use date shifting with the context set to the unique ID of the test subject.
B. Extract the date using TimePartConfig from each date field and append a random month and year.
C. Use bucketing to shift values to a predetermined date based on the initial value.
D. Use the FFX mode of format preserving encryption (FPE) and maintain data consistency.
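The technique this question is probing, date shifting with a per-subject context, can be sketched in a few lines. This is an illustrative model of what Cloud DLP's date-shift transformation with a context field achieves, not DLP's actual implementation; the function name and parameters are invented for the example. Deriving the offset deterministically from the subject ID moves both dates of a row by the same amount, so the interval survives while the absolute dates are obfuscated:

```python
import hashlib
from datetime import date, timedelta

def shift_dates(subject_id: str, start: date, end: date,
                max_days: int = 100) -> tuple[date, date]:
    """Shift both dates by the same offset, derived deterministically from
    the subject ID, so each subject's dates move together and the interval
    between start and end is preserved."""
    digest = hashlib.sha256(subject_id.encode()).digest()
    # Map the hash to an offset in [-max_days, +max_days].
    offset = int.from_bytes(digest[:4], "big") % (2 * max_days + 1) - max_days
    delta = timedelta(days=offset)
    return start + delta, end + delta

s, e = shift_dates("subject-042", date(2023, 3, 1), date(2023, 3, 15))
# The 14-day dosing interval survives the obfuscation:
assert (e - s).days == 14
```

Using the subject ID as context also keeps the shift consistent across all rows for the same subject, which is why option A preserves the analysis while removing batch-identifying dates.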
Access Full Google Professional Cloud Security Engineer Mock Test Free
Want a full-length mock test experience? Click here to unlock the complete Google Professional Cloud Security Engineer Mock Test Free set and get access to hundreds of additional practice questions covering all key topics.
We regularly update our question sets to stay aligned with the latest exam objectives—so check back often for fresh content!
Start practicing with our Google Professional Cloud Security Engineer mock test free today—and take a major step toward exam success!