Practice Test Free

Google Professional Cloud Database Engineer Mock Test Free


Google Professional Cloud Database Engineer Mock Test Free – 50 Realistic Questions to Prepare with Confidence.

Getting ready for your Google Professional Cloud Database Engineer certification exam? Start your preparation the smart way with our Google Professional Cloud Database Engineer Mock Test Free – a carefully crafted set of 50 realistic, exam-style questions to help you practice effectively and boost your confidence.

Taking a free mock test for the Google Professional Cloud Database Engineer exam is one of the best ways to:

  • Familiarize yourself with the actual exam format and question style
  • Identify areas where you need more review
  • Strengthen your time management and test-taking strategy

Below, you will find 50 free questions from our Google Professional Cloud Database Engineer Mock Test Free resource. These questions are structured to reflect the real exam’s difficulty and content areas, helping you assess your readiness accurately.

Question 1

You are designing a highly available (HA) Cloud SQL for PostgreSQL instance that will be used by 100 databases. Each database contains 80 tables that were migrated from your on-premises environment to Google Cloud. The applications that use these databases are located in multiple regions in the US, and you need to ensure that read and write operations have low latency. What should you do?

A. Deploy 2 Cloud SQL instances in the us-central1 region with HA enabled, and create read replicas in us-east1 and us-west1.

B. Deploy 2 Cloud SQL instances in the us-central1 region, and create read replicas in us-east1 and us-west1.

C. Deploy 4 Cloud SQL instances in the us-central1 region with HA enabled, and create read replicas in us-central1, us-east1, and us-west1.

D. Deploy 4 Cloud SQL instances in the us-central1 region, and create read replicas in us-central1, us-east1 and us-west1.

 


Correct Answer: A

Question 2

An analytics team needs to read data out of Cloud SQL for SQL Server and update a table in Cloud Spanner. You need to create a service account and grant least privilege access using predefined roles. What roles should you assign to the service account?

A. roles/cloudsql.viewer and roles/spanner.databaseUser

B. roles/cloudsql.editor and roles/spanner.admin

C. roles/cloudsql.client and roles/spanner.databaseReader

D. roles/cloudsql.instanceUser and roles/spanner.databaseUser

 


Correct Answer: D

Question 3

Your company is migrating all legacy applications to Google Cloud. All on-premises applications are using legacy Oracle 12c databases with Oracle Real Application Cluster (RAC) for high availability (HA) and Oracle Data Guard for disaster recovery. You need a solution that requires minimal code changes, provides the same high availability you have today on-premises, and supports a low latency network for migrated legacy applications. What should you do?

A. Migrate the databases to Cloud Spanner.

B. Migrate the databases to Cloud SQL, and enable a standby database.

C. Migrate the databases to Compute Engine using regional persistent disks.

D. Migrate the databases to Bare Metal Solution for Oracle.

 


Correct Answer: D

Question 4

You support a consumer inventory application that runs on a multi-region instance of Cloud Spanner. A customer opened a support ticket to complain about slow response times. You notice a Cloud Monitoring alert about high CPU utilization. You want to follow Google-recommended practices to address the CPU performance issue. What should you do first?

A. Increase the number of processing units.

B. Modify the database schema, and add additional indexes.

C. Shard data required by the application into multiple instances.

D. Decrease the number of processing units.

 


Correct Answer: A
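Processing units on an existing Cloud Spanner instance can be raised without downtime. A minimal sketch of the scaling command (instance name and target size are illustrative):

```shell
# Scale the Spanner instance up; the change is applied online.
gcloud spanner instances update inventory-instance \
  --processing-units=2000
```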

Question 5

Your organization works with sensitive data that requires you to manage your own encryption keys. You are working on a project that stores that data in a Cloud SQL database. You need to ensure that stored data is encrypted with your keys. What should you do?

A. Export data periodically to a Cloud Storage bucket protected by Customer-Supplied Encryption Keys.

B. Use Cloud SQL Auth proxy.

C. Connect to Cloud SQL using a connection that has SSL encryption.

D. Use customer-managed encryption keys with Cloud SQL.

 


Correct Answer: D

Question 6

Your company's mission-critical, globally available application is supported by a Cloud Spanner database. Experienced users of the application have read and write access to the database, but new users are assigned read-only access to the database. You need to assign the appropriate Cloud Spanner Identity and Access Management (IAM) role to new users being onboarded soon. What roles should you set up?

A. roles/spanner.databaseReader

B. roles/spanner.databaseUser

C. roles/spanner.viewer

D. roles/spanner.backupWriter

 


Correct Answer: A

Question 7

Your project is using Bigtable to store data that should not be accessed from the public internet under any circumstances, even if the requestor has a valid service account key. You need to secure access to this data. What should you do?

A. Use Identity and Access Management (IAM) for Bigtable access control.

B. Use VPC Service Controls to create a trusted network for the Bigtable service.

C. Use customer-managed encryption keys (CMEK).

D. Use Google Cloud Armor to add IP addresses to an allowlist.

 


Correct Answer: B
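A VPC Service Controls perimeter around the Bigtable API can be sketched as follows (the policy ID, project number, and perimeter name are placeholders):

```shell
# Create a service perimeter so Bigtable can only be reached from inside
# the trusted network, even by callers holding valid credentials.
gcloud access-context-manager perimeters create bigtable_perimeter \
  --policy=ACCESS_POLICY_ID \
  --title="Bigtable perimeter" \
  --resources=projects/PROJECT_NUMBER \
  --restricted-services=bigtable.googleapis.com
```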

Question 8

You are managing a set of Cloud SQL databases in Google Cloud. Regulations require that database backups reside in the region where the database is created. You want to minimize operational costs and administrative effort. What should you do?

A. Configure the automated backups to use a regional Cloud Storage bucket as a custom location.

B. Use the default configuration for the automated backups location.

C. Disable automated backups, and create an on-demand backup routine to a regional Cloud Storage bucket.

D. Disable automated backups, and configure serverless exports to a regional Cloud Storage bucket.

 


Correct Answer: A

Question 9

Your organization stores marketing data such as customer preferences and purchase history on Bigtable. The consumers of this database are predominantly data analysts and operations users. You receive a service ticket from the database operations department citing poor database performance between 9 AM-10 AM every day. The application team has confirmed no latency from their logs. A new cohort of pilot users that is testing a dataset loaded from a third-party data provider is experiencing poor database performance. Other users are not affected. You need to troubleshoot the issue. What should you do?

A. Isolate the data analysts and operations user groups to use different Bigtable instances.

B. Check the Cloud Monitoring table/bytes_used metric from Bigtable.

C. Use Key Visualizer for Bigtable.

D. Add more nodes to the Bigtable cluster.

 


Correct Answer: C

Question 10

You work for a financial services company that wants to use fully managed database services. Traffic volume for your consumer services products has increased annually at a constant rate with occasional spikes around holidays. You frequently need to upgrade the capacity of your database. You want to use Cloud Spanner and include an automated method to increase your hardware capacity to support a higher level of concurrency. What should you do?

A. Use linear scaling to implement the Autoscaler-based architecture

B. Use direct scaling to implement the Autoscaler-based architecture.

C. Upgrade the Cloud Spanner instance on a periodic basis during the scheduled maintenance window.

D. Set up alerts that are triggered when Cloud Spanner utilization metrics breach the threshold, and then schedule an upgrade during the scheduled maintenance window.

 


Correct Answer: A

Question 11

Your ecommerce website captures user clickstream data to analyze customer traffic patterns in real time and support personalization features on your website. You plan to analyze this data using big data tools. You need a low-latency solution that can store 8 TB of data and can scale to millions of read and write requests per second. What should you do?

A. Write your data into Bigtable and use Dataproc and the Apache Hbase libraries for analysis.

B. Deploy a Cloud SQL environment with read replicas for improved performance. Use Datastream to export data to Cloud Storage and analyze with Dataproc and the Cloud Storage connector.

C. Use Memorystore to handle your low-latency requirements and for real-time analytics.

D. Stream your data into BigQuery and use Dataproc and the BigQuery Storage API to analyze large volumes of data.

 


Correct Answer: A

Question 12

Your organization is running a low-latency reporting application on Microsoft SQL Server. In addition to the database engine, you are using SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS) in your on-premises environment. You want to migrate your Microsoft SQL Server database instances to Google Cloud. You need to ensure minimal disruption to the existing architecture during migration. What should you do?

A. Migrate to Cloud SQL for SQL Server.

B. Migrate to Cloud SQL for PostgreSQL.

C. Migrate to Compute Engine.

D. Migrate to Google Kubernetes Engine (GKE).

 


Correct Answer: C

Question 13

Your digital-native business runs its database workloads on Cloud SQL. Your website must be globally accessible 24/7. You need to prepare your Cloud SQL instance for high availability (HA). You want to follow Google-recommended practices. What should you do? (Choose two.)

A. Set up manual backups.

B. Create a PostgreSQL database on-premises as the HA option.

C. Configure single zone availability for automated backups.

D. Enable point-in-time recovery.

E. Schedule automated backups.

 


Correct Answer: DE
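Automated backups and point-in-time recovery can both be enabled in place on a Cloud SQL instance. A hedged sketch (instance name and backup window are illustrative):

```shell
# Schedule daily automated backups (start time is UTC).
gcloud sql instances patch prod-instance --backup-start-time=23:00

# Enable point-in-time recovery (PostgreSQL; MySQL uses --enable-bin-log).
gcloud sql instances patch prod-instance --enable-point-in-time-recovery
```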

Question 14

Your company has PostgreSQL databases on-premises and on Amazon Web Services (AWS). You are planning multiple database migrations to Cloud SQL in an effort to reduce costs and downtime. You want to follow Google-recommended practices and use Google native data migration tools. You also want to closely monitor the migrations as part of the cutover strategy. What should you do?

A. Use Database Migration Service to migrate all databases to Cloud SQL.

B. Use Database Migration Service for one-time migrations, and use third-party or partner tools for change data capture (CDC) style migrations.

C. Use data replication tools and CDC tools to enable migration.

D. Use a combination of Database Migration Service and partner tools to support the data migration strategy.

 


Correct Answer: A

Question 15

You are troubleshooting a connection issue with a newly deployed Cloud SQL instance on Google Cloud. While investigating the Cloud SQL Proxy logs, you see the message Error 403: Access Not Configured. What should you do?

A. Check the app.yaml value cloud_sql_instances for a misspelled or incorrect instance connection name.

B. Check whether your service account has cloudsql.instances.connect permission.

C. Enable the Cloud SQL Admin API.

D. Ensure that you are using an external (public) IP address interface.

 


Correct Answer: C

Question 16

You have an application that sends banking events to Bigtable cluster-a in us-east. You decide to add cluster-b in us-central1. Cluster-a replicates data to cluster-b. You need to ensure that Bigtable continues to accept read and write requests if one of the clusters becomes unavailable and that requests are routed automatically to the other cluster. What deployment strategy should you use?

A. Use the default app profile with single-cluster routing.

B. Use the default app profile with multi-cluster routing.

C. Create a custom app profile with multi-cluster routing.

D. Create a custom app profile with single-cluster routing.

 


Correct Answer: C

Question 17

Your organization is migrating 50 TB Oracle databases to Bare Metal Solution for Oracle. Database backups must be available for quick restore. You also need to have backups available for 5 years. You need to design a cost-effective architecture that meets a recovery time objective (RTO) of 2 hours and recovery point objective (RPO) of 15 minutes. What should you do?

A. 1. Create the database on a Bare Metal Solution server with the database running on flash storage. 2. Keep a local backup copy on all flash storage. 3. Keep backups older than one day stored in Actifio OnVault storage.

B. 1. Create the database on a Bare Metal Solution server with the database running on flash storage. 2. Keep a local backup copy on standard storage. 3. Keep backups older than one day stored in Actifio OnVault storage.

C. 1. Create the database on a Bare Metal Solution server with the database running on flash storage. 2. Keep a local backup copy on standard storage. 3. Use the Oracle Recovery Manager (RMAN) backup utility to move backups older than one day to a Coldline Storage bucket.

D. 1. Create the database on a Bare Metal Solution server with the database running on flash storage. 2. Keep a local backup copy on all flash storage. 3. Use the Oracle Recovery Manager (RMAN) backup utility to move backups older than one day to an Archive Storage bucket.

 


Correct Answer: B

Question 18

You are using Compute Engine on Google Cloud and your data center to manage a set of MySQL databases in a hybrid configuration. You need to create replicas to scale reads and to offload part of the management operation. What should you do?

A. Use external server replication.

B. Use Data Migration Service.

C. Use Cloud SQL for MySQL external replica.

D. Use the mysqldump utility and binary logs.

 


Correct Answer: C

Question 19

You are writing an application that will run on Cloud Run and require a database running in the Cloud SQL managed service. You want to secure this instance so that it only receives connections from applications running in your VPC environment in Google Cloud. What should you do?

A. 1. Create your instance with a specified external (public) IP address. 2. Choose the VPC and create firewall rules to allow only connections from Cloud Run into your instance. 3. Use Cloud SQL Auth proxy to connect to the instance.

B. 1. Create your instance with a specified external (public) IP address. 2. Choose the VPC and create firewall rules to allow only connections from Cloud Run into your instance. 3. Connect to the instance using a connection pool to best manage connections to the instance.

C. 1. Create your instance with a specified internal (private) IP address. 2. Choose the VPC with private service connection configured. 3. Configure the Serverless VPC Access connector in the same VPC network as your Cloud SQL instance. 4. Use Cloud SQL Auth proxy to connect to the instance.

D. 1. Create your instance with a specified internal (private) IP address. 2. Choose the VPC with private service connection configured. 3. Configure the Serverless VPC Access connector in the same VPC network as your Cloud SQL instance. 4. Connect to the instance using a connection pool to best manage connections to the instance.

 


Correct Answer: C

Question 20

You are designing an augmented reality game for iOS and Android devices. You plan to use Cloud Spanner as the primary backend database for game state storage and player authentication. You want to track in-game rewards that players unlock at every stage of the game. During the testing phase, you discovered that costs are much higher than anticipated, but the query response times are within the SLA. You want to follow Google-recommended practices. You need the database to be performant and highly available while you keep costs low. What should you do?

A. Manually scale down the number of nodes after the peak period has passed.

B. Use interleaving to co-locate parent and child rows.

C. Use the Cloud Spanner query optimizer to determine the most efficient way to execute the SQL query.

D. Use granular instance sizing in Cloud Spanner and Autoscaler.

 


Correct Answer: D

Question 21

Your organization is running a critical production database on a virtual machine (VM) on Compute Engine. The VM has an ext4-formatted persistent disk for data files. The database will soon run out of storage space. You need to implement a solution that avoids downtime. What should you do?

A. In the Google Cloud Console, increase the size of the persistent disk, and use the resize2fs command to extend the disk.

B. In the Google Cloud Console, increase the size of the persistent disk, and use the fdisk command to verify that the new space is ready to use.

C. In the Google Cloud Console, create a snapshot of the persistent disk, restore the snapshot to a new larger disk, unmount the old disk, mount the new disk, and restart the database service.

D. In the Google Cloud Console, create a new persistent disk attached to the VM, and configure the database service to move the files to the new disk.

 


Correct Answer: A
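An ext4 data disk can be grown online with no downtime. A minimal sketch (disk name, size, zone, and device path are illustrative):

```shell
# Grow the persistent disk while the VM keeps running.
gcloud compute disks resize data-disk --size=2TB --zone=us-central1-a

# On the VM, extend the ext4 filesystem into the new space (no unmount needed).
sudo resize2fs /dev/sdb
```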

Question 22

Your organization has a ticketing system that needs an online marketing analytics and reporting application. You need to select a relational database that can manage hundreds of terabytes of data to support this new application. Which database should you use?

A. Cloud SQL

B. BigQuery

C. Cloud Spanner

D. Bigtable

 


Correct Answer: B

Question 23

You are designing a database strategy for a new web application. You plan to start with a small pilot in one country and eventually expand to millions of users in a global audience. You need to ensure that the application can run 24/7 with minimal downtime for maintenance. What should you do?

A. Use Cloud Spanner in a regional configuration.

B. Use Cloud Spanner in a multi-region configuration.

C. Use Cloud SQL with cross-region replicas.

D. Use highly available Cloud SQL with multiple zones.

 


Correct Answer: B

Question 24

You are working on a new centralized inventory management system to track items available in 200 stores, which each have 500 GB of data. You are planning a gradual rollout of the system to a few stores each week. You need to design an SQL database architecture that minimizes costs and user disruption during each regional rollout and can scale up or down on nights and holidays. What should you do?

A. Use Oracle Real Application Cluster (RAC) databases on Bare Metal Solution for Oracle.

B. Use sharded Cloud SQL instances with one or more stores per database instance.

C. Use a Bigtable cluster with autoscaling.

D. Use Cloud Spanner with a custom autoscaling solution.

 


Correct Answer: D

Question 25

Your company is shutting down their on-premises data center and migrating their Oracle databases using Oracle Real Application Clusters (RAC) to Google Cloud. You want minimal to no changes to the applications during the database migration. What should you do?

A. Migrate the Oracle databases to Cloud Spanner.

B. Migrate the Oracle databases to Compute Engine.

C. Migrate the Oracle databases to Cloud SQL.

D. Migrate the Oracle databases to Bare Metal Solution for Oracle.

 


Correct Answer: D

Question 26

Your company is evaluating Google Cloud database options for a mission-critical global payments gateway application. The application must be available 24/7 to users worldwide, horizontally scalable, and support open source databases. You need to select an automatically shardable, fully managed database with 99.999% availability and strong transactional consistency. What should you do?

A. Select Bare Metal Solution for Oracle.

B. Select Cloud SQL.

C. Select Bigtable.

D. Select Cloud Spanner.

 


Correct Answer: D

Question 27

You are designing a database architecture for a global application that stores information about public parks worldwide. The application uses the database for read-only purposes, and a centralized batch job updates the database nightly. You want to select an open source, SQL-compliant database. What should you do?

A. Use Bigtable with multi-region clusters.

B. Use Memorystore for Redis with multi-zones within a region.

C. Use Cloud SQL for PostgreSQL with cross-region replicas.

D. Use Cloud Spanner with multi-region configuration.

 


Correct Answer: C

Question 28

During an internal audit, you realized that one of your Cloud SQL for MySQL instances does not have high availability (HA) enabled. You want to follow Google-recommended practices to enable HA on your existing instance. What should you do?

A. Create a new Cloud SQL for MySQL instance, enable HA, and use the export and import option to migrate your data.

B. Create a new Cloud SQL for MySQL instance, enable HA, and use Cloud Data Fusion to migrate your data.

C. Use the gcloud sql instances patch command to update your existing Cloud SQL for MySQL instance.

D. Shut down your existing Cloud SQL for MySQL instance, and enable HA.

 


Correct Answer: C
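High availability can be enabled on an existing Cloud SQL instance in place by switching it to regional availability. A hedged sketch (instance name is illustrative):

```shell
# Convert the instance from zonal to regional (HA) availability.
gcloud sql instances patch prod-mysql --availability-type=REGIONAL
```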

Question 29

You are starting a large CSV import into a Cloud SQL for MySQL instance that has many open connections. You checked memory and CPU usage, and sufficient resources are available. You want to follow Google-recommended practices to ensure that the import will not time out. What should you do?

A. Close idle connections or restart the instance before beginning the import operation.

B. Increase the amount of memory allocated to your instance.

C. Ensure that the service account has the Storage Admin role.

D. Increase the number of CPUs for the instance to ensure that it can handle the additional import operation.

 


Correct Answer: A

Question 30

Your company is shutting down their data center and migrating several MySQL and PostgreSQL databases to Google Cloud. Your database operations team is severely constrained by ongoing production releases and the lack of capacity for additional on-premises backups. You want to ensure that the scheduled migrations happen with minimal downtime and that the Google Cloud databases stay in sync with the on-premises data changes until the applications can cut over. What should you do? (Choose two.)

A. Use an external read replica to migrate the databases to Cloud SQL.

B. Use a read replica to migrate the databases to Cloud SQL.

C. Use Database Migration Service to migrate the databases to Cloud SQL.

D. Use a cross-region read replica to migrate the databases to Cloud SQL.

E. Use replication from an external server to migrate the databases to Cloud SQL.

 


Correct Answer: CE

Question 31

You manage a production MySQL database running on Cloud SQL at a retail company. You perform routine maintenance on Sunday at midnight when traffic is slow, but you want to skip routine maintenance during the year-end holiday shopping season. You need to ensure that your production system is available 24/7 during the holidays. What should you do?

A. Define a maintenance window on Sundays between 12 AM and 1 AM, and deny maintenance periods between November 1 and January 15.

B. Define a maintenance window on Sundays between 12 AM and 5 AM, and deny maintenance periods between November 1 and February 15.

C. Build a Cloud Composer job to start a maintenance window on Sundays between 12 AM and 1 AM, and deny maintenance periods between November 1 and January 15.

D. Create a Cloud Scheduler job to start maintenance at 12 AM on Sundays. Pause the Cloud Scheduler job between November 1 and January 15.

 


Correct Answer: A
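Cloud SQL caps a single deny maintenance period at 90 days, which is what separates the two windows in the options. A quick check of the window lengths (dates taken from the options; the year is illustrative):

```python
from datetime import date

# Cloud SQL allows a single deny maintenance period of at most 90 days.
MAX_DENY_DAYS = 90

def deny_period_days(start: date, end: date) -> int:
    """Length of a deny period, counting both endpoints."""
    return (end - start).days + 1

nov_to_jan = deny_period_days(date(2023, 11, 1), date(2024, 1, 15))  # 76 days
nov_to_feb = deny_period_days(date(2023, 11, 1), date(2024, 2, 15))  # 107 days

print(nov_to_jan <= MAX_DENY_DAYS)  # True  -> November 1 to January 15 fits
print(nov_to_feb <= MAX_DENY_DAYS)  # False -> November 1 to February 15 does not
```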

Question 32

You are designing a payments processing application on Google Cloud. The application must continue to serve requests and avoid any user disruption if a regional failure occurs. You need to use AES-256 to encrypt data in the database, and you want to control where you store the encryption key. What should you do?

A. Use Cloud Spanner with a customer-managed encryption key (CMEK).

B. Use Cloud Spanner with default encryption.

C. Use Cloud SQL with a customer-managed encryption key (CMEK).

D. Use Bigtable with default encryption.

 


Correct Answer: A

Question 33

Your organization has a production Cloud SQL for MySQL instance. Your instance is configured with 16 vCPUs and 104 GB of RAM that is running between 90% and 100% CPU utilization for most of the day. You need to scale up the database and add vCPUs with minimal interruption and effort. What should you do?

A. Issue a gcloud sql instances patch command to increase the number of vCPUs.

B. Update a MySQL database flag to increase the number of vCPUs.

C. Issue a gcloud compute instances update command to increase the number of vCPUs.

D. Back up the database, create an instance with additional vCPUs, and restore the database.

 


Correct Answer: A
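Scaling vCPUs and memory on a running Cloud SQL instance is a single patch operation. A hedged sketch (instance name and target sizes are illustrative; the patch triggers a brief restart):

```shell
# Scale the instance up from 16 vCPUs / 104 GB to a larger shape.
gcloud sql instances patch prod-mysql --cpu=24 --memory=156GB
```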

Question 34

You manage a meeting booking application that uses Cloud SQL. During an important launch, the Cloud SQL instance went through a maintenance event that resulted in a downtime of more than 5 minutes and adversely affected your production application. You need to immediately address the maintenance issue to prevent any unplanned events in the future. What should you do?

A. Set your production instance’s maintenance window to non-business hours.

B. Migrate the Cloud SQL instance to Cloud Spanner to avoid any future disruptions due to maintenance.

C. Contact Support to understand why your Cloud SQL instance had a downtime of more than 5 minutes.

D. Use Cloud Scheduler to schedule a maintenance window of no longer than 5 minutes.

 


Correct Answer: A

Question 35

You are managing two different applications: Order Management and Sales Reporting. Both applications interact with the same Cloud SQL for MySQL database. The Order Management application reads and writes to the database 24/7, but the Sales Reporting application is read-only. Both applications need the latest data. You need to ensure that the performance of the Order Management application is not affected by the Sales Reporting application. What should you do?

A. Create a read replica for the Sales Reporting application.

B. Create two separate databases in the instance, and perform dual writes from the Order Management application.

C. Use a Cloud SQL federated query for the Sales Reporting application.

D. Queue up all the requested reports in Pub/Sub, and execute the reports at night.

 


Correct Answer: A

Question 36

You want to migrate an on-premises mission-critical PostgreSQL database to Cloud SQL. The database must be able to withstand a zonal failure with less than five minutes of downtime and still not lose any transactions. You want to follow Google-recommended practices for the migration. What should you do?

A. Take nightly snapshots of the primary database instance, and restore them in a secondary zone.

B. Build a change data capture (CDC) pipeline to read transactions from the primary instance, and replicate them to a secondary instance.

C. Create a read replica in another region, and promote the read replica if a failure occurs.

D. Enable high availability (HA) for the database to make it regional.

 


Correct Answer: D

Question 37

You plan to use Database Migration Service to migrate data from a PostgreSQL on-premises instance to Cloud SQL. You need to identify the prerequisites for creating and automating the task. What should you do? (Choose two.)

A. Drop or disable all users except database administration users.

B. Disable all foreign key constraints on the source PostgreSQL database.

C. Ensure that all PostgreSQL tables have a primary key.

D. Shut down the database before the Data Migration Service task is started.

E. Ensure that pglogical is installed on the source PostgreSQL database.

 


Correct Answer: CE

Question 38

Your organization has an existing app that just went viral. The app uses a Cloud SQL for MySQL backend database that is experiencing slow disk performance while using hard disk drives (HDDs). You need to improve performance and reduce disk I/O wait times. What should you do?

A. Export the data from the existing instance, and import the data into a new instance with solid-state drives (SSDs).

B. Edit the instance to change the storage type from HDD to SSD.

C. Create a high availability (HA) failover instance with SSDs, and perform a failover to the new instance.

D. Create a read replica of the instance with SSDs, and perform a failover to the new instance.

 


Correct Answer: A

Question 39

You host an application in Google Cloud. The application is located in a single region and uses Cloud SQL for transactional data. Most of your users are located in the same time zone and expect the application to be available 7 days a week, from 6 AM to 10 PM. You want to ensure regular maintenance updates to your Cloud SQL instance without creating downtime for your users. What should you do?

A. Configure a maintenance window during a period when no users will be on the system. Control the order of update by setting non-production instances to earlier and production instances to later.

B. Create your database with one primary node and one read replica in the region.

C. Enable maintenance notifications for users, and reschedule maintenance activities to a specific time after notifications have been sent.

D. Configure your Cloud SQL instance with high availability enabled.

 


Correct Answer: A

Question 40

You work in the logistics department. Your data analysis team needs daily extracts from Cloud SQL for MySQL to train a machine learning model. The model will be used to optimize next-day routes. You need to export the data in CSV format. You want to follow Google-recommended practices. What should you do?

A. Use Cloud Scheduler to trigger a Cloud Function that will run a select * from table(s) query to call the cloudsql.instances.export API.

B. Use Cloud Scheduler to trigger a Cloud Function through Pub/Sub to call the cloudsql.instances.export API.

C. Use Cloud Composer to orchestrate an export by calling the cloudsql.instances.export API.

D. Use Cloud Composer to execute a select * from table(s) query and export results.

 


Correct Answer: B
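One way to schedule such an export with Google-managed services is a Cloud Scheduler job that publishes to a Pub/Sub topic, with a Cloud Function subscriber calling the Cloud SQL export API. A sketch of the scheduler side (job name, topic, location, and payload are illustrative):

```shell
# Publish a trigger message every night at 02:00; a Cloud Function
# subscribed to the topic then calls the Cloud SQL export API.
gcloud scheduler jobs create pubsub nightly-export \
  --location=us-central1 \
  --schedule="0 2 * * *" \
  --topic=cloudsql-export \
  --message-body='{"database":"routes"}'
```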

Question 41

Your company is developing a new global transactional application that must be ACID-compliant and have 99.999% availability. You are responsible for selecting the appropriate Google Cloud database to serve as a datastore for this new application. What should you do?

A. Use Firestore.

B. Use Cloud Spanner.

C. Use Cloud SQL.

D. Use Bigtable.

 


Correct Answer: B

Question 42

You need to provision several hundred Cloud SQL for MySQL instances for multiple project teams over a one-week period. You must ensure that all instances adhere to company standards such as instance naming conventions, database flags, and tags. What should you do?

A. Automate instance creation by writing a Dataflow job.

B. Automate instance creation by setting up Terraform scripts.

C. Create the instances using the Google Cloud Console UI.

D. Create clones from a template Cloud SQL instance.

 


Correct Answer: B
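The value of Terraform here is that naming conventions, flags, and labels are encoded once and stamped out for every instance. The sketch below models that standardization in plain Python; the naming scheme and flag values are hypothetical examples of "company standards".

```python
# Sketch: the per-team standardization (naming, database flags, labels)
# that a Terraform module would encode once and apply to hundreds of
# Cloud SQL instances. Convention and flag values are hypothetical.

STANDARD_FLAGS = {"slow_query_log": "on", "long_query_time": "2"}

def instance_config(team: str, env: str, index: int) -> dict:
    name = f"{team}-{env}-mysql-{index:03d}"  # e.g. 'billing-prod-mysql-007'
    return {
        "name": name,
        "database_version": "MYSQL_8_0",
        "database_flags": dict(STANDARD_FLAGS),
        "labels": {"team": team, "env": env, "managed_by": "terraform"},
    }

fleet = [instance_config("billing", "prod", i) for i in range(1, 4)]
```

With infrastructure as code, a drift from the standard is a code-review diff rather than a console misclick repeated several hundred times.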

Question 43

Your team is running a Cloud SQL for MySQL instance with a 5 TB database that must be available 24/7. You need to save database backups on object storage with minimal operational overhead or risk to your production workloads. What should you do?

A. Use Cloud SQL serverless exports.

B. Create a read replica, and then use the mysqldump utility to export each table.

C. Clone the Cloud SQL instance, and then use the mysqldump utility to export the data.

D. Use the mysqldump utility on the primary database instance to export the backup.

 


Correct Answer: A
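A serverless export is requested through the same `instances.export` API, with the `offload` flag set so the export runs on a temporary instance instead of the 24/7 primary. A sketch of that body, with hypothetical bucket and database names:

```python
# Sketch: a serverless export request body for the Cloud SQL Admin API.
# exportContext.offload = True runs the export on a temporary instance,
# so dumping a 5 TB database does not load the 24/7 primary. The bucket
# and database names are hypothetical.

def serverless_export_body(bucket: str, database: str) -> dict:
    return {
        "exportContext": {
            "fileType": "SQL",
            "uri": f"gs://{bucket}/backups/{database}.sql.gz",
            "databases": [database],
            "offload": True,  # the serverless-export switch
        }
    }

body = serverless_export_body("example-backup-bucket", "inventory")
```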

Question 44

Your organization has a security policy to ensure that all Cloud SQL for PostgreSQL databases are secure. You want to protect sensitive data by using a key that meets specific locality or residency requirements. Your organization needs to control the key's lifecycle activities. You need to ensure that data is encrypted at rest and in transit. What should you do?

A. Create the database with Google-managed encryption keys.

B. Create the database with customer-managed encryption keys.

C. Create the database persistent disk with Google-managed encryption keys.

D. Create the database persistent disk with customer-managed encryption keys.

 


Correct Answer: B
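CMEK is specified when the instance is created, by pointing Cloud SQL at a Cloud KMS key whose location and lifecycle the organization controls. A sketch of an `instances.insert` body, with hypothetical project, region, and key names:

```python
# Sketch: an instances.insert body creating a Cloud SQL for PostgreSQL
# instance encrypted with a customer-managed key. Project, region, and
# key ring names are hypothetical placeholders.

def cmek_instance_body(name: str, region: str, kms_key: str) -> dict:
    return {
        "name": name,
        "region": region,
        "databaseVersion": "POSTGRES_15",
        # A Cloud KMS key in a location chosen for residency, whose
        # lifecycle (rotation, disabling, destruction) you control.
        "diskEncryptionConfiguration": {"kmsKeyName": kms_key},
        "settings": {
            "tier": "db-custom-4-16384",
            # Enforce TLS so data is also encrypted in transit.
            "ipConfiguration": {"requireSsl": True},
        },
    }

key = ("projects/example-project/locations/europe-west3/"
       "keyRings/example-ring/cryptoKeys/sql-key")
body = cmek_instance_body("secure-pg", "europe-west3", key)
```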

Question 45

Your company is using Cloud SQL for MySQL with an internal (private) IP address and wants to replicate some tables into BigQuery in near-real time for analytics and machine learning. You need to ensure that replication is fast and reliable and uses Google-managed services. What should you do?

A. Develop a custom data replication service to send data into BigQuery.

B. Use Cloud SQL federated queries.

C. Use Database Migration Service to replicate tables into BigQuery.

D. Use Datastream to capture changes, and use Dataflow to write those changes to BigQuery.

 


Correct Answer: D

Question 46

You want to migrate your PostgreSQL database from another cloud provider to Cloud SQL. You plan on using Database Migration Service and need to assess the impact of any known limitations. What should you do? (Choose two.)

A. Identify whether the database has over 512 tables.

B. Identify all tables that do not have a primary key.

C. Identify all tables that do not have at least one foreign key.

D. Identify whether the source database is encrypted using pgcrypto extension.

E. Identify whether the source database uses customer-managed encryption keys (CMEK).

 


Correct Answer: BD
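Finding tables without a primary key is a simple catalog query on the source database. The sketch below shows one way to phrase it against `information_schema` in PostgreSQL.

```python
# Sketch: a catalog query you could run on the source PostgreSQL database
# to list tables without a primary key ahead of a Database Migration
# Service assessment.

MISSING_PK_SQL = """
SELECT t.table_schema, t.table_name
FROM information_schema.tables AS t
LEFT JOIN information_schema.table_constraints AS tc
       ON tc.table_schema = t.table_schema
      AND tc.table_name   = t.table_name
      AND tc.constraint_type = 'PRIMARY KEY'
WHERE t.table_type = 'BASE TABLE'
  AND t.table_schema NOT IN ('pg_catalog', 'information_schema')
  AND tc.constraint_name IS NULL
ORDER BY 1, 2;
"""
```

Any tables this query returns need a primary key (or another remediation) before change-data-capture replication can keep them in sync.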

Question 47

Your company uses Cloud Spanner for a mission-critical inventory management system that is globally available. You recently loaded stock keeping unit (SKU) and product catalog data from a company acquisition and observed hotspots in the Cloud Spanner database. You want to follow Google-recommended schema design practices to avoid performance degradation. What should you do? (Choose two.)

A. Use an auto-incrementing value as the primary key.

B. Normalize the data model.

C. Promote low-cardinality attributes in multi-attribute primary keys.

D. Promote high-cardinality attributes in multi-attribute primary keys.

E. Use bit-reverse sequential value as the primary key.

 


Correct Answer: DE
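Bit-reversal turns a monotonically increasing sequence into values that scatter across the key space, so consecutive inserts land on different Cloud Spanner splits instead of hotspotting the last one. A minimal sketch of the idea (Spanner can also generate such values natively with bit-reversed-positive sequences):

```python
# Sketch: bit-reversing a monotonically increasing id so consecutive
# inserts scatter across the Cloud Spanner key space instead of
# hotspotting a single split.

def bit_reverse_64(n: int) -> int:
    """Reverse the low 63 bits of a non-negative INT64 value."""
    result = 0
    for _ in range(63):
        result = (result << 1) | (n & 1)
        n >>= 1
    return result

# Consecutive ids 1, 2, 3 map to widely separated keys.
keys = [bit_reverse_64(i) for i in (1, 2, 3)]
```

Because the transformation is its own inverse for values below 2^63, the original sequence number can always be recovered from the stored key.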

Question 48

You have a Cloud SQL instance (DB-1) with two cross-region read replicas (DB-2 and DB-3). During a business continuity test, the primary instance (DB-1) was taken offline and a replica (DB-2) was promoted. The test has concluded and you want to return to the pre-test configuration. What should you do?

A. Bring DB-1 back online.

B. Delete DB-1, and re-create DB-1 as a read replica in the same region as DB-1.

C. Delete DB-2 so that DB-1 automatically reverts to the primary instance.

D. Create DB-4 as a read replica in the same region as DB-1, and promote DB-4 to primary.

 


Correct Answer: D

Question 49

You are setting up a Bare Metal Solution environment. You need to update the operating system to the latest version. You need to connect the Bare Metal Solution environment to the internet so you can receive software updates. What should you do?

A. Set up a static external IP address in your VPC network.

B. Set up bring your own IP (BYOIP) in your VPC.

C. Set up a Cloud NAT gateway on the Compute Engine VM.

D. Set up Cloud NAT service.

 


Correct Answer: C

Question 50

You are evaluating Cloud SQL for PostgreSQL as a possible destination for your on-premises PostgreSQL instances. Geography is becoming increasingly relevant to customer privacy worldwide. Your solution must support data residency requirements and include a strategy to:
  • configure where data is stored
  • control where the encryption keys are stored
  • govern access to data
What should you do?

A. Replicate Cloud SQL databases across different zones.

B. Create a Cloud SQL for PostgreSQL instance on Google Cloud for the data that does not need to adhere to data residency requirements. Keep the data that must adhere to data residency requirements on-premises. Make application changes to support both databases.

C. Allow application access to data only if the users are in the same region as the Google Cloud region for the Cloud SQL for PostgreSQL database.

D. Use features like customer-managed encryption keys (CMEK), VPC Service Controls, and Identity and Access Management (IAM) policies.

 


Correct Answer: D

Access Full Google Professional Cloud Database Engineer Mock Test Free

Want a full-length mock test experience? Click here to unlock the complete Google Professional Cloud Database Engineer Mock Test Free set and get access to hundreds of additional practice questions covering all key topics.

We regularly update our question sets to stay aligned with the latest exam objectives—so check back often for fresh content!

Start practicing with our Google Professional Cloud Database Engineer mock test free today—and take a major step toward exam success!
