Google Professional Cloud Database Engineer Exam Prep Free


Google Professional Cloud Database Engineer Exam Prep Free – 50 Practice Questions to Get You Ready for Exam Day

Getting ready for the Google Professional Cloud Database Engineer certification? Our Google Professional Cloud Database Engineer Exam Prep Free resource includes 50 exam-style questions designed to help you practice effectively and feel confident on test day.

Effective Google Professional Cloud Database Engineer exam prep free is the key to success. With our free practice questions, you can:

  • Get familiar with exam format and question style
  • Identify which topics you’ve mastered—and which need more review
  • Boost your confidence and reduce exam anxiety

Below, you will find 50 realistic Google Professional Cloud Database Engineer Exam Prep Free questions that cover key exam topics. These questions are designed to reflect the structure and challenge level of the actual exam, making them perfect for your study routine.

Question 1

Your company uses Bigtable for a user-facing application that displays a low-latency real-time dashboard. You need to recommend the optimal storage type for this read-intensive database. What should you do?

A. Recommend solid-state drives (SSD).

B. Recommend splitting the Bigtable instance into two instances in order to load balance the concurrent reads.

C. Recommend hard disk drives (HDD).

D. Recommend mixed storage types.

 


Correct Answer: A

Question 2

You are managing a set of Cloud SQL databases in Google Cloud. Regulations require that database backups reside in the region where the database is created. You want to minimize operational costs and administrative effort. What should you do?

A. Configure the automated backups to use a regional Cloud Storage bucket as a custom location.

B. Use the default configuration for the automated backups location.

C. Disable automated backups, and create an on-demand backup routine to a regional Cloud Storage bucket.

D. Disable automated backups, and configure serverless exports to a regional Cloud Storage bucket.

 


Correct Answer: A
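The custom backup location described in option A can be set with a single flag on the instance; a minimal sketch, assuming a hypothetical instance name and region:

```shell
# Pin automated backups to a specific region (instance name and region are placeholders)
gcloud sql instances patch my-instance --backup-location=us-central1
```

Automated backups then continue on their normal schedule, with no extra operational routine to maintain.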

Question 3

You are migrating your 2 TB on-premises PostgreSQL cluster to Compute Engine. You want to set up your new environment in an Ubuntu virtual machine instance in Google Cloud and seed the data to a new instance. You need to plan your database migration to ensure minimum downtime. What should you do?

A. 1. Take a full export while the database is offline. 2. Create a bucket in Cloud Storage. 3. Transfer the dump file to the bucket you just created. 4. Import the dump file into the Google Cloud primary server.

B. 1. Take a full export while the database is offline. 2. Create a bucket in Cloud Storage. 3. Transfer the dump file to the bucket you just created. 4. Restore the backup into the Google Cloud primary server.

C. 1. Take a full backup while the database is online. 2. Create a bucket in Cloud Storage. 3. Transfer the backup to the bucket you just created. 4. Restore the backup into the Google Cloud primary server. 5. Create a recovery.conf file in the $PG_DATA directory. 6. Stop the source database. 7. Transfer the write-ahead logs to the bucket you created before. 8. Start the PostgreSQL service. 9. Wait until the Google Cloud primary server syncs with the running primary server.

D. 1. Take a full export while the database is online. 2. Create a bucket in Cloud Storage. 3. Transfer the dump file and write-ahead logs to the bucket you just created. 4. Restore the dump file into the Google Cloud primary server. 5. Create a recovery.conf file in the $PG_DATA directory. 6. Stop the source database. 7. Transfer the write-ahead logs to the bucket you created before. 8. Start the PostgreSQL service. 9. Wait until the Google Cloud primary server syncs with the running primary server.

 


Correct Answer: C

Question 4

Your company is using Cloud SQL for MySQL with an internal (private) IP address and wants to replicate some tables into BigQuery in near-real time for analytics and machine learning. You need to ensure that replication is fast and reliable and uses Google-managed services. What should you do?

A. Develop a custom data replication service to send data into BigQuery.

B. Use Cloud SQL federated queries.

C. Use Database Migration Service to replicate tables into BigQuery.

D. Use Datastream to capture changes, and use Dataflow to write those changes to BigQuery.

 


Correct Answer: D

Question 5

Your project is using Bigtable to store data that should not be accessed from the public internet under any circumstances, even if the requestor has a valid service account key. You need to secure access to this data. What should you do?

A. Use Identity and Access Management (IAM) for Bigtable access control.

B. Use VPC Service Controls to create a trusted network for the Bigtable service.

C. Use customer-managed encryption keys (CMEK).

D. Use Google Cloud Armor to add IP addresses to an allowlist.

 


Correct Answer: B
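A VPC Service Controls perimeter for Bigtable can be sketched with the Access Context Manager CLI; the policy ID, project number, and perimeter name below are placeholders:

```shell
# Sketch: restrict the Bigtable API inside a service perimeter
gcloud access-context-manager perimeters create bigtable_perimeter \
    --title="bigtable-perimeter" \
    --resources=projects/123456789 \
    --restricted-services=bigtable.googleapis.com \
    --policy=POLICY_ID
```

Unlike IAM alone, the perimeter blocks access from outside the trusted network even when the caller presents valid credentials.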

Question 6

Your company wants to migrate an Oracle-based application to Google Cloud. The application team currently uses Oracle Recovery Manager (RMAN) to back up the database to tape for long-term retention (LTR). You need a cost-effective backup and restore solution that meets a 2-hour recovery time objective (RTO) and a 15-minute recovery point objective (RPO). What should you do?

A. Migrate the Oracle databases to Bare Metal Solution for Oracle, and store backups on tapes on-premises.

B. Migrate the Oracle databases to Bare Metal Solution for Oracle, and use Actifio to store backup files on Cloud Storage using the Nearline Storage class.

C. Migrate the Oracle databases to Bare Metal Solution for Oracle, and back up the Oracle databases to Cloud Storage using the Standard Storage class.

D. Migrate the Oracle databases to Compute Engine, and store backups on tapes on-premises.

 


Correct Answer: B

Question 7

You need to issue a new server certificate because your old one is expiring. You need to avoid a restart of your Cloud SQL for MySQL instance. What should you do in your Cloud SQL instance?

A. Issue a rollback, and download your server certificate.

B. Create a new client certificate, and download it.

C. Create a new server certificate, and download it.

D. Reset your SSL configuration, and download your server certificate.

 


Correct Answer: C
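Creating and rotating to a new server certificate is done with the server CA certificate commands, which do not restart the instance (resetting the SSL configuration does). A sketch with a hypothetical instance name:

```shell
# Sketch of the no-restart rotation flow
gcloud sql ssl server-ca-certs create --instance=my-instance   # issue the upcoming cert
gcloud sql ssl server-ca-certs list --instance=my-instance     # inspect/download certs
gcloud sql ssl server-ca-certs rotate --instance=my-instance   # move clients to the new cert
```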

Question 8

During an internal audit, you realized that one of your Cloud SQL for MySQL instances does not have high availability (HA) enabled. You want to follow Google-recommended practices to enable HA on your existing instance. What should you do?

A. Create a new Cloud SQL for MySQL instance, enable HA, and use the export and import option to migrate your data.

B. Create a new Cloud SQL for MySQL instance, enable HA, and use Cloud Data Fusion to migrate your data.

C. Use the gcloud sql instances patch command to update your existing Cloud SQL for MySQL instance.

D. Shut down your existing Cloud SQL for MySQL instance, and enable HA.

 


Correct Answer: C
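HA can be enabled in place, with no new instance and no export/import; a minimal sketch, assuming a hypothetical instance name:

```shell
# Enable high availability on an existing instance (instance name is a placeholder)
gcloud sql instances patch my-instance --availability-type=REGIONAL
```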

Question 9

You need to migrate a 1 TB PostgreSQL database from a Compute Engine VM to Cloud SQL for PostgreSQL. You want to ensure that there is minimal downtime during the migration. What should you do?

A. Export the data from the existing database, and load the data into a new Cloud SQL database.

B. Use Migrate for Compute Engine to complete the migration.

C. Use Datastream to complete the migration.

D. Use Database Migration Service to complete the migration.

 


Correct Answer: D

Question 10

You are configuring the networking of a Cloud SQL instance. The only application that connects to this database resides on a Compute Engine VM in the same project as the Cloud SQL instance. The VM and the Cloud SQL instance both use the same VPC network, and both have an external (public) IP address and an internal (private) IP address. You want to improve network security. What should you do?

A. Disable and remove the internal IP address assignment.

B. Disable both the external IP address and the internal IP address, and instead rely on Private Google Access.

C. Specify an authorized network with the CIDR range of the VM.

D. Disable and remove the external IP address assignment.

 


Correct Answer: D

Question 11

You are migrating an on-premises application to Google Cloud. The application requires a high availability (HA) PostgreSQL database to support business-critical functions. Your company's disaster recovery strategy requires a recovery time objective (RTO) and recovery point objective (RPO) within 30 minutes of failure. You plan to use a Google Cloud managed service. What should you do to maximize uptime for your application?

A. Deploy Cloud SQL for PostgreSQL in a regional configuration. Create a read replica in a different zone in the same region and a read replica in another region for disaster recovery.

B. Deploy Cloud SQL for PostgreSQL in a regional configuration with HA enabled. Take periodic backups, and use this backup to restore to a new Cloud SQL for PostgreSQL instance in another region during a disaster recovery event.

C. Deploy Cloud SQL for PostgreSQL in a regional configuration with HA enabled. Create a cross-region read replica, and promote the read replica as the primary node for disaster recovery.

D. Migrate the PostgreSQL database to multi-regional Cloud Spanner so that a single region outage will not affect your application. Update the schema to support Cloud Spanner data types, and refactor the application.

 


Correct Answer: C

Question 12

You are managing two different applications: Order Management and Sales Reporting. Both applications interact with the same Cloud SQL for MySQL database. The Order Management application reads and writes to the database 24/7, but the Sales Reporting application is read-only. Both applications need the latest data. You need to ensure that the performance of the Order Management application is not affected by the Sales Reporting application. What should you do?

A. Create a read replica for the Sales Reporting application.

B. Create two separate databases in the instance, and perform dual writes from the Order Management application.

C. Use a Cloud SQL federated query for the Sales Reporting application.

D. Queue up all the requested reports in Pub/Sub, and execute the reports at night.

 


Correct Answer: A
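Creating the read replica for the reporting workload is a single command; a sketch with hypothetical instance names:

```shell
# Sketch: point the read-only reporting app at a replica of the primary
gcloud sql instances create sales-report-replica \
    --master-instance-name=order-management-db
```

The replica serves near-real-time data while isolating the reporting reads from the primary's transactional load.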

Question 13

Your company's mission-critical, globally available application is supported by a Cloud Spanner database. Experienced users of the application have read and write access to the database, but new users are assigned read-only access to the database. You need to assign the appropriate Cloud Spanner Identity and Access Management (IAM) role to new users being onboarded soon. What roles should you set up?

A. roles/spanner.databaseReader

B. roles/spanner.databaseUser

C. roles/spanner.viewer

D. roles/spanner.backupWriter

 


Correct Answer: A
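Granting read-only data access to a new user can be sketched at the database level; all names below are hypothetical:

```shell
# Sketch: grant read-only data access on one Spanner database
gcloud spanner databases add-iam-policy-binding orders-db \
    --instance=prod-instance \
    --member="user:new.user@example.com" \
    --role="roles/spanner.databaseReader"
```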

Question 14

You have a Cloud SQL instance (DB-1) with two cross-region read replicas (DB-2 and DB-3). During a business continuity test, the primary instance (DB-1) was taken offline and a replica (DB-2) was promoted. The test has concluded and you want to return to the pre-test configuration. What should you do?

A. Bring DB-1 back online.

B. Delete DB-1, and re-create DB-1 as a read replica in the same region as DB-1.

C. Delete DB-2 so that DB-1 automatically reverts to the primary instance.

D. Create DB-4 as a read replica in the same region as DB-1, and promote DB-4 to primary.

 


Correct Answer: D

Question 15

You plan to use Database Migration Service to migrate data from a PostgreSQL on-premises instance to Cloud SQL. You need to identify the prerequisites for creating and automating the task. What should you do? (Choose two.)

A. Drop or disable all users except database administration users.

B. Disable all foreign key constraints on the source PostgreSQL database.

C. Ensure that all PostgreSQL tables have a primary key.

D. Shut down the database before the Data Migration Service task is started.

E. Ensure that pglogical is installed on the source PostgreSQL database.

 


Correct Answer: CE
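The primary-key prerequisite in option C can be verified on the source with a catalog query; a sketch (run against the source PostgreSQL instance):

```sql
-- List user tables that lack a primary key
SELECT t.table_schema, t.table_name
FROM information_schema.tables t
LEFT JOIN information_schema.table_constraints c
  ON  c.table_schema = t.table_schema
  AND c.table_name   = t.table_name
  AND c.constraint_type = 'PRIMARY KEY'
WHERE t.table_type = 'BASE TABLE'
  AND t.table_schema NOT IN ('pg_catalog', 'information_schema')
  AND c.constraint_name IS NULL;
```

Any tables returned would need a primary key added (or an alternative replication arrangement) before migration.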

Question 16

You are the DBA of an online tutoring application that runs on a Cloud SQL for PostgreSQL database. You are testing the implementation of the cross-regional failover configuration. The database in region R1 fails over successfully to region R2, and the database becomes available for the application to process data. During testing, certain scenarios of the application work as expected in region R2, but a few scenarios fail with database errors. The application-related database queries, when executed in isolation from Cloud SQL for PostgreSQL in region R2, work as expected. The application performs completely as expected when the database fails back to region R1. You need to identify the cause of the database errors in region R2. What should you do?

A. Determine whether the versions of Cloud SQL for PostgreSQL in regions R1 and R2 are different.

B. Determine whether the database patches of Cloud SQL for PostgreSQL in regions R1 and R2 are different.

C. Determine whether the failover of Cloud SQL for PostgreSQL from region R1 to region R2 is in progress or has completed successfully.

D. Determine whether Cloud SQL for PostgreSQL in region R2 is a near-real-time copy of region R1 but not an exact copy.

 


Correct Answer: D

Question 17

You are configuring a new application that has access to an existing Cloud Spanner database. The new application reads from this database to gather statistics for a dashboard. You want to follow Google-recommended practices when granting Identity and Access Management (IAM) permissions. What should you do?

A. Reuse the existing service account that populates this database.

B. Create a new service account, and grant it the Cloud Spanner Database Admin role.

C. Create a new service account, and grant it the Cloud Spanner Database Reader role.

D. Create a new service account, and grant it the spanner.databases.select permission.

 


Correct Answer: C

Question 18

You are the primary DBA of a Cloud SQL for PostgreSQL database that supports 6 enterprise applications in production. You used Cloud SQL Insights to identify inefficient queries and now need to identify the application that is originating the inefficient queries. You want to follow Google-recommended practices. What should you do?

A. Shut down and restart each application.

B. Write a utility to scan database query logs.

C. Write a utility to scan application logs.

D. Use query tags to add application-centric database monitoring.

 


Correct Answer: D

Question 19

Your company is shutting down their data center and migrating several MySQL and PostgreSQL databases to Google Cloud. Your database operations team is severely constrained by ongoing production releases and the lack of capacity for additional on-premises backups. You want to ensure that the scheduled migrations happen with minimal downtime and that the Google Cloud databases stay in sync with the on-premises data changes until the applications can cut over. What should you do? (Choose two.)

A. Use Database Migration Service to migrate the databases to Cloud SQL.

B. Use a cross-region read replica to migrate the databases to Cloud SQL.

C. Use replication from an external server to migrate the databases to Cloud SQL.

D. Use an external read replica to migrate the databases to Cloud SQL.

E. Use a read replica to migrate the databases to Cloud SQL.

 


Correct Answer: AC

Question 20

You are building an Android game that needs to store data on a Google Cloud serverless database. The database will log user activity, store user preferences, and receive in-game updates. The target audience resides in developing countries that have intermittent internet connectivity. You need to ensure that the game can synchronize game data to the backend database whenever an internet network is available. What should you do?

A. Use Firestore.

B. Use Cloud SQL with an external (public) IP address.

C. Use an in-app embedded database.

D. Use Cloud Spanner.

 


Correct Answer: A

Question 21

You are designing a physician portal app in Node.js. This application will be used in hospitals and clinics that might have intermittent internet connectivity. If a connectivity failure occurs, the app should be able to query the cached data. You need to ensure that the application has scalability, strong consistency, and multi-region replication. What should you do?

A. Use Firestore and ensure that the PersistenceEnabled option is set to true.

B. Use Memorystore for Memcached.

C. Use Pub/Sub to synchronize the changes from the application to Cloud Spanner.

D. Use Table.read with the exactStaleness option to perform a read of rows in Cloud Spanner.

 


Correct Answer: A

Question 22

You have deployed a Cloud SQL for SQL Server instance. In addition, you created a cross-region read replica for disaster recovery (DR) purposes. Your company requires you to maintain and monitor a recovery point objective (RPO) of less than 5 minutes. You need to verify that your cross-region read replica meets the allowed RPO. What should you do?

A. Use Cloud SQL instance monitoring.

B. Use the Cloud Monitoring dashboard with available metrics from Cloud SQL.

C. Use Cloud SQL logs.

D. Use the SQL Server Always On Availability Group dashboard.

 


Correct Answer: B

Question 23

You support a consumer inventory application that runs on a multi-region instance of Cloud Spanner. A customer opened a support ticket to complain about slow response times. You notice a Cloud Monitoring alert about high CPU utilization. You want to follow Google-recommended practices to address the CPU performance issue. What should you do first?

A. Increase the number of processing units.

B. Modify the database schema, and add additional indexes.

C. Shard data required by the application into multiple instances.

D. Decrease the number of processing units.

 


Correct Answer: A

Question 24

Your organization operates in a highly regulated industry. Separation of concerns (SoC) and the security principle of least privilege (PoLP) are critical. The operations team consists of:
Person A, a database administrator.
Person B, an analyst who generates metric reports.
Application C, which is responsible for automatic backups.
You need to assign roles to team members for Cloud Spanner. Which roles should you assign?

A. roles/spanner.databaseAdmin for Person A; roles/spanner.databaseReader for Person B; roles/spanner.backupWriter for Application C

B. roles/spanner.databaseAdmin for Person A; roles/spanner.databaseReader for Person B; roles/spanner.backupAdmin for Application C

C. roles/spanner.databaseAdmin for Person A; roles/spanner.databaseUser for Person B; roles/spanner.databaseReader for Application C

D. roles/spanner.databaseAdmin for Person A; roles/spanner.databaseUser for Person B; roles/spanner.backupWriter for Application C

 


Correct Answer: A

Question 25

Your company is shutting down their on-premises data center and migrating their Oracle databases using Oracle Real Application Clusters (RAC) to Google Cloud. You want minimal to no changes to the applications during the database migration. What should you do?

A. Migrate the Oracle databases to Cloud Spanner.

B. Migrate the Oracle databases to Compute Engine.

C. Migrate the Oracle databases to Cloud SQL.

D. Migrate the Oracle databases to Bare Metal Solution for Oracle.

 


Correct Answer: D

Question 26

Your company is migrating the existing infrastructure for a highly transactional application to Google Cloud. You have several databases in a MySQL database instance and need to decide how to transfer the data to Cloud SQL. You need to minimize the downtime for the migration of your 500 GB instance. What should you do?

A. 1. Create a Cloud SQL for MySQL instance for your databases, and configure Datastream to stream your database changes to Cloud SQL. 2. Select the Backfill historical data check box on your stream configuration to initiate Datastream to backfill any data that is out of sync between the source and destination. 3. Delete your stream when all changes are moved to Cloud SQL for MySQL, and update your application to use the new instance.

B. 1. Create a migration job using Database Migration Service. 2. Set the migration job type to Continuous, and allow the databases to complete the full dump phase and start sending data in change data capture (CDC) mode. 3. Wait for the replication delay to minimize, initiate a promotion of the new Cloud SQL instance, and wait for the migration job to complete. 4. Update your application connections to the new instance.

C. 1. Create a migration job using Database Migration Service. 2. Set the migration job type to One-time, and perform this migration during a maintenance window. 3. Stop all write workloads to the source database and initiate the dump. Wait for the dump to be loaded into the Cloud SQL destination database and the destination database to be promoted to the primary database. 4. Update your application connections to the new instance.

D. 1. Use the mysqldump utility to manually initiate a backup of MySQL during the application maintenance window. 2. Move the files to Cloud Storage, and import each database into your Cloud SQL instance. 3. Continue to dump each database until all the databases are migrated. 4. Update your application connections to the new instance.

 


Correct Answer: B

Question 27

You have a large Cloud SQL for PostgreSQL instance. The database instance is not mission-critical, and you want to minimize operational costs. What should you do to lower the cost of backups in this environment?

A. Set the automated backups to occur every other day to lower the frequency of backups.

B. Change the storage tier of the automated backups from solid-state drive (SSD) to hard disk drive (HDD).

C. Select a different region to store your backups.

D. Reduce the number of automated backups that are retained to two (2).

 


Correct Answer: D

Question 28

You are configuring a brand new Cloud SQL for PostgreSQL database instance in Google Cloud. Your application team wants you to deploy one primary instance, one standby instance, and one read replica instance. You need to ensure that you are following Google-recommended practices for high availability. What should you do?

A. Configure the primary instance in zone A, the standby instance in zone C, and the read replica in zone B, all in the same region.

B. Configure the primary and standby instances in zone A and the read replica in zone B, all in the same region.

C. Configure the primary instance in one region, the standby instance in a second region, and the read replica in a third region.

D. Configure the primary, standby, and read replica instances in zone A, all in the same region.

 


Correct Answer: A

Question 29

Your DevOps team is using Terraform to deploy applications and Cloud SQL databases. After every new application change is rolled out, the environment is torn down and recreated, and the persistent database layer is lost. You need to prevent the database from being dropped. What should you do?

A. Set Terraform deletion_protection to true.

B. Rerun terraform apply.

C. Create a read replica.

D. Use point-in-time-recovery (PITR) to recover the database.

 


Correct Answer: A
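In Terraform's Google provider, deletion_protection is an argument on the instance resource itself; a minimal sketch, with hypothetical names and tier:

```hcl
# Sketch: Terraform refuses to destroy this Cloud SQL instance while the flag is true
resource "google_sql_database_instance" "app_db" {
  name                = "app-db"
  database_version    = "POSTGRES_14"
  region              = "us-central1"
  deletion_protection = true   # plan/apply that would delete the instance now fails

  settings {
    tier = "db-custom-2-7680"
  }
}
```

With this set, a teardown-and-recreate workflow errors out at plan time instead of silently dropping the persistent database layer.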

Question 30

You are a DBA of Cloud SQL for PostgreSQL. You want the applications to have password-less authentication for read and write access to the database. Which authentication mechanism should you use?

A. Use Identity and Access Management (IAM) authentication.

B. Use Managed Active Directory authentication.

C. Use Cloud SQL federated queries.

D. Use PostgreSQL database’s built-in authentication.

 


Correct Answer: A
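Enabling IAM database authentication is a two-step sketch: turn on the instance flag, then register an IAM principal as a database user. Names below are hypothetical, and note that the --database-flags value replaces any flags already set, so include existing flags as well:

```shell
# Sketch: enable IAM authentication on the instance
gcloud sql instances patch my-instance \
    --database-flags=cloudsql.iam_authentication=on

# Sketch: add a service account as an IAM database user
gcloud sql users create my-sa@my-project.iam \
    --instance=my-instance \
    --type=cloud_iam_service_account
```

Applications then authenticate with IAM credentials instead of a stored database password.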

Question 31

Your company wants to move to Google Cloud. Your current data center is closing in six months. You are running a large, highly transactional Oracle application footprint on VMware. You need to design a solution with minimal disruption to the current architecture and provide ease of migration to Google Cloud. What should you do?

A. Migrate applications and Oracle databases to Google Cloud VMware Engine (VMware Engine).

B. Migrate applications and Oracle databases to Compute Engine.

C. Migrate applications to Cloud SQL.

D. Migrate applications and Oracle databases to Google Kubernetes Engine (GKE).

 


Correct Answer: A

Question 32

You recently launched a new product to the US market. You currently have two Bigtable clusters in one US region to serve all the traffic. Your marketing team is planning an immediate expansion to APAC. You need to roll out the regional expansion while implementing high availability according to Google-recommended practices. What should you do?

A. Maintain a target of 23% CPU utilization by locating: cluster-a in zone us-central1-a, cluster-b in zone europe-west1-d, cluster-c in zone asia-east1-b.

B. Maintain a target of 23% CPU utilization by locating: cluster-a in zone us-central1-a, cluster-b in zone us-central1-b, cluster-c in zone us-east1-a.

C. Maintain a target of 35% CPU utilization by locating: cluster-a in zone us-central1-a, cluster-b in zone australia-southeast1-a, cluster-c in zone europe-west1-d, cluster-d in zone asia-east1-b.

D. Maintain a target of 35% CPU utilization by locating: cluster-a in zone us-central1-a, cluster-b in zone us-central2-a, cluster-c in zone asia-northeast1-b, cluster-d in zone asia-east1-b.

 


Correct Answer: D

Question 33

Your organization needs to migrate a critical, on-premises MySQL database to Cloud SQL for MySQL. The on-premises database is on a version of MySQL that is supported by Cloud SQL and uses the InnoDB storage engine. You need to migrate the database while preserving transactions and minimizing downtime. What should you do?

A. 1. Use Database Migration Service to connect to your on-premises database, and choose continuous replication. 2. After the on-premises database is migrated, promote the Cloud SQL for MySQL instance, and connect applications to your Cloud SQL instance.

B. 1. Build a Cloud Data Fusion pipeline for each table to migrate data from the on-premises MySQL database to Cloud SQL for MySQL. 2. Schedule downtime to run each Cloud Data Fusion pipeline. 3. Verify that the migration was successful. 4. Re-point the applications to the Cloud SQL for MySQL instance.

C. 1. Pause the on-premises applications. 2. Use the mysqldump utility to dump the database content in compressed format. 3. Run gsutil -m to move the dump file to Cloud Storage. 4. Use the Cloud SQL for MySQL import option. 5. After the import operation is complete, re-point the applications to the Cloud SQL for MySQL instance.

D. 1. Pause the on-premises applications. 2. Use the mysqldump utility to dump the database content in CSV format. 3. Run gsutil -m to move the dump file to Cloud Storage. 4. Use the Cloud SQL for MySQL import option. 5. After the import operation is complete, re-point the applications to the Cloud SQL for MySQL instance.

 


Correct Answer: A

Question 34

Your organization is migrating 50 TB Oracle databases to Bare Metal Solution for Oracle. Database backups must be available for quick restore. You also need to have backups available for 5 years. You need to design a cost-effective architecture that meets a recovery time objective (RTO) of 2 hours and recovery point objective (RPO) of 15 minutes. What should you do?

A. 1. Create the database on a Bare Metal Solution server with the database running on flash storage. 2. Keep a local backup copy on all flash storage. 3. Keep backups older than one day stored in Actifio OnVault storage.

B. 1. Create the database on a Bare Metal Solution server with the database running on flash storage. 2. Keep a local backup copy on standard storage. 3. Keep backups older than one day stored in Actifio OnVault storage.

C. 1. Create the database on a Bare Metal Solution server with the database running on flash storage. 2. Keep a local backup copy on standard storage. 3. Use the Oracle Recovery Manager (RMAN) backup utility to move backups older than one day to a Coldline Storage bucket.

D. 1. Create the database on a Bare Metal Solution server with the database running on flash storage. 2. Keep a local backup copy on all flash storage. 3. Use the Oracle Recovery Manager (RMAN) backup utility to move backups older than one day to an Archive Storage bucket.

 


Correct Answer: B

Question 35

You are responsible for designing a new database for an airline ticketing application in Google Cloud. This application must be able to:
Work with transactions and offer strong consistency.
Work with structured and semi-structured (JSON) data.
Scale transparently to multiple regions globally as the operation grows.
You need a Google Cloud database that meets all the requirements of the application. What should you do?

A. Use Cloud SQL for PostgreSQL with both cross-region read replicas.

B. Use Cloud Spanner in a multi-region configuration.

C. Use Firestore in Datastore mode.

D. Use a Bigtable instance with clusters in multiple regions.

 


Correct Answer: B

Question 36

You are choosing a new database backend for an existing application. The current database is running PostgreSQL on an on-premises VM and is managed by a database administrator and operations team. The application data is relational and has light traffic. You want to minimize costs and the migration effort for this application. What should you do?

A. Migrate the existing database to Firestore.

B. Migrate the existing database to Cloud SQL for PostgreSQL.

C. Migrate the existing database to Cloud Spanner.

D. Migrate the existing database to PostgreSQL running on Compute Engine.

 


Correct Answer: B

Question 37

You are managing a mission-critical Cloud SQL for PostgreSQL instance. Your application team is running important transactions on the database when another DBA starts an on-demand backup. You want to verify the status of the backup. What should you do?

A. Check the cloudsql.googleapis.com/postgres.log instance log.

B. Perform the gcloud sql operations list command.

C. Use Cloud Audit Logs to verify the status.

D. Use the Google Cloud Console.

 


Correct Answer: B
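As a quick reference, option B's approach can be sketched from the command line. This is a minimal example assuming a hypothetical instance named my-instance in the current project:

```shell
# List recent operations for the instance; an in-progress on-demand
# backup appears here with its status (PENDING, RUNNING, or DONE)
gcloud sql operations list --instance=my-instance --limit=5

# Optionally block until a specific operation finishes
# (OPERATION_ID comes from the list output above)
gcloud sql operations wait OPERATION_ID --timeout=unlimited
```

The operations list covers backups, exports, imports, and maintenance actions, so it is the general-purpose way to track any long-running Cloud SQL operation.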

Question 38

Your organization is running a Firestore-backed Firebase app that serves the same top ten news stories on a daily basis to a large global audience. You want to optimize content delivery while decreasing cost and latency. What should you do?

A. Enable serializable isolation in the Firebase app.

B. Deploy a US multi-region Firestore location.

C. Build a Firestore bundle, and deploy bundles to Cloud CDN.

D. Create a Firestore index on the news story date.

 


Correct Answer: C

Question 39

You are troubleshooting a connection issue with a newly deployed Cloud SQL instance on Google Cloud. While investigating the Cloud SQL Proxy logs, you see the message Error 403: Access Not Configured. What should you do?

A. Check the app.yaml value cloud_sql_instances for a misspelled or incorrect instance connection name.

B. Check whether your service account has cloudsql.instances.connect permission.

C. Enable the Cloud SQL Admin API.

D. Ensure that you are using an external (public) IP address interface.

 


Correct Answer: C

Question 40

Your team is running a Cloud SQL for MySQL instance with a 5 TB database that must be available 24/7. You need to save database backups on object storage with minimal operational overhead or risk to your production workloads. What should you do?

A. Use Cloud SQL serverless exports.

B. Create a read replica, and then use the mysqldump utility to export each table.

C. Clone the Cloud SQL instance, and then use the mysqldump utility to export the data.

D. Use the mysqldump utility on the primary database instance to export the backup.

 


Correct Answer: A
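Option A's serverless export can be illustrated with a short sketch. The instance and bucket names below are hypothetical:

```shell
# The --offload flag makes this a serverless export: the work runs on a
# temporary instance, so the 24/7 production workload is not impacted
gcloud sql export sql my-instance "gs://my-backup-bucket/backup-$(date +%F).sql" \
  --database=mydb \
  --offload
```

Without --offload, the export runs on the primary instance itself, which is exactly the production risk the question asks you to avoid.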

Question 41

You are running a transactional application on Cloud SQL for PostgreSQL in Google Cloud. The database is running in a high availability configuration within one region. You have encountered issues with data and want to restore to the last known pristine version of the database. What should you do?

A. Create a clone database from a read replica database, and restore the clone in the same region.

B. Create a clone database from a read replica database, and restore the clone into a different zone.

C. Use the Cloud SQL point-in-time recovery (PITR) feature. Restore the copy from two hours ago to a new database instance.

D. Use the Cloud SQL database import feature. Import last week’s dump file from Cloud Storage.

 


Correct Answer: C
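For reference, point-in-time recovery in Cloud SQL is performed by cloning the instance at a chosen timestamp. A sketch with hypothetical instance names:

```shell
# PITR creates a NEW instance from the source instance's state at the
# given UTC timestamp (RFC 3339 format); the source is left untouched
gcloud sql instances clone my-instance my-instance-recovered \
  --point-in-time '2024-01-15T10:00:00.000Z'
```

PITR requires that automated backups and transaction log retention were already enabled on the source instance before the incident.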

Question 42

Your organization has a production Cloud SQL for MySQL instance. Your instance is configured with 16 vCPUs and 104 GB of RAM that is running between 90% and 100% CPU utilization for most of the day. You need to scale up the database and add vCPUs with minimal interruption and effort. What should you do?

A. Issue a gcloud sql instances patch command to increase the number of vCPUs.

B. Update a MySQL database flag to increase the number of vCPUs.

C. Issue a gcloud compute instances update command to increase the number of vCPUs.

D. Back up the database, create an instance with additional vCPUs, and restore the database.

 


Correct Answer: A
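Option A's patch command can be sketched as follows, with hypothetical instance name and target sizes:

```shell
# Scale the instance in place; the instance restarts briefly when the
# new machine configuration is applied
gcloud sql instances patch my-instance --cpu=24 --memory=156GB
```

This is the minimal-effort path: no data movement, no new instance, just a short restart while the machine type changes.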

Question 43

Your digital-native business runs its database workloads on Cloud SQL. Your website must be globally accessible 24/7. You need to prepare your Cloud SQL instance for high availability (HA). You want to follow Google-recommended practices. What should you do? (Choose two.)

A. Set up manual backups.

B. Create a PostgreSQL database on-premises as the HA option.

C. Configure single zone availability for automated backups.

D. Enable point-in-time recovery.

E. Schedule automated backups.

 


Correct Answer: DE
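The practices in options D and E (point-in-time recovery and automated backups) are both enabled with instance patch flags. A sketch assuming a hypothetical PostgreSQL instance named my-instance:

```shell
# Schedule daily automated backups, starting at 23:00 UTC
gcloud sql instances patch my-instance --backup-start-time=23:00

# Enable point-in-time recovery (PostgreSQL flag shown;
# MySQL instances use --enable-bin-log instead)
gcloud sql instances patch my-instance --enable-point-in-time-recovery
```

Together these give you scheduled recovery points plus the transaction logs needed to restore to any moment between them.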

Question 44

Your organization has an existing app that just went viral. The app uses a Cloud SQL for MySQL backend database that is experiencing slow disk performance while using hard disk drives (HDDs). You need to improve performance and reduce disk I/O wait times. What should you do?

A. Export the data from the existing instance, and import the data into a new instance with solid-state drives (SSDs).

B. Edit the instance to change the storage type from HDD to SSD.

C. Create a high availability (HA) failover instance with SSDs, and perform a failover to the new instance.

D. Create a read replica of the instance with SSDs, and perform a failover to the new instance.

 


Correct Answer: A
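Option A's export/import flow can be sketched in three gcloud steps. Instance names, bucket, tier, and region below are all hypothetical; the storage type of an existing instance cannot be edited in place:

```shell
# 1. Export the data from the HDD-backed instance to Cloud Storage
gcloud sql export sql old-hdd-instance gs://my-bucket/dump.sql --database=mydb

# 2. Create a replacement instance backed by SSD storage
gcloud sql instances create new-ssd-instance \
  --database-version=MYSQL_8_0 \
  --storage-type=SSD \
  --tier=db-n1-standard-4 \
  --region=us-central1

# 3. Import the dump into the new SSD instance
gcloud sql import sql new-ssd-instance gs://my-bucket/dump.sql --database=mydb
```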

Question 45

You manage a meeting booking application that uses Cloud SQL. During an important launch, the Cloud SQL instance went through a maintenance event that resulted in a downtime of more than 5 minutes and adversely affected your production application. You need to immediately address the maintenance issue to prevent any unplanned events in the future. What should you do?

A. Set your production instance’s maintenance window to non-business hours.

B. Migrate the Cloud SQL instance to Cloud Spanner to avoid any future disruptions due to maintenance.

C. Contact Support to understand why your Cloud SQL instance had a downtime of more than 5 minutes.

D. Use Cloud Scheduler to schedule a maintenance window of no longer than 5 minutes.

 


Correct Answer: A

Question 46

You are managing a small Cloud SQL instance for developers to do testing. The instance is not critical and has a recovery point objective (RPO) of several days. You want to minimize ongoing costs for this instance. What should you do?

A. Take no backups, and turn off transaction log retention.

B. Take one manual backup per day, and turn off transaction log retention.

C. Turn on automated backup, and turn off transaction log retention.

D. Turn on automated backup, and turn on transaction log retention.

 


Correct Answer: B
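A manual (on-demand) backup, as in option B, is a single gcloud call that can be scheduled once per day from cron or Cloud Scheduler. The instance name is hypothetical:

```shell
# One on-demand backup of the dev instance; unlike automated backups,
# manual backups are retained until you explicitly delete them
gcloud sql backups create --instance=dev-instance
```

With transaction log retention off and no automated backup schedule, this keeps storage costs close to the minimum while still meeting a multi-day RPO.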

Question 47

You are managing a Cloud SQL for PostgreSQL instance in Google Cloud. You need to test the high availability of your Cloud SQL instance by performing a failover. You want to use the gcloud command. What should you do?

A. Use gcloud sql instances failover .

B. Use gcloud sql instances failover .

C. Use gcloud sql instances promote-replica .

D. Use gcloud sql instances promote-replica .

 


Correct Answer: A
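The failover command from options A and B is run against the primary instance of an HA (regional) configuration. A sketch with a hypothetical instance name:

```shell
# Trigger a manual failover of a high-availability instance; pass the
# PRIMARY instance's name, not the standby's
gcloud sql instances failover my-ha-instance
```

By contrast, promote-replica converts a read replica into a standalone primary, which is a replica promotion rather than an HA failover test.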

Question 48

Your company is developing a global ecommerce website on Google Cloud. Your development team is working on a shopping cart service that is durable and elastically scalable with live traffic. Business disruptions from unplanned downtime are expected to be less than 5 minutes per month. In addition, the application needs to have very low latency writes. You need a data storage solution that has high write throughput and provides 99.99% uptime. What should you do?

A. Use Cloud SQL for data storage.

B. Use Cloud Spanner for data storage.

C. Use Memorystore for data storage.

D. Use Bigtable for data storage.

 


Correct Answer: B

Question 49

Your company has PostgreSQL databases on-premises and on Amazon Web Services (AWS). You are planning multiple database migrations to Cloud SQL in an effort to reduce costs and downtime. You want to follow Google-recommended practices and use Google native data migration tools. You also want to closely monitor the migrations as part of the cutover strategy. What should you do?

A. Use Database Migration Service to migrate all databases to Cloud SQL.

B. Use Database Migration Service for one-time migrations, and use third-party or partner tools for change data capture (CDC) style migrations.

C. Use data replication tools and CDC tools to enable migration.

D. Use a combination of Database Migration Service and partner tools to support the data migration strategy.

 


Correct Answer: A

Question 50

You finished migrating an on-premises MySQL database to Cloud SQL. You want to ensure that the daily export of a table, which was previously a cron job running on the database server, continues. You want the solution to minimize cost and operations overhead. What should you do?

A. Use Cloud Scheduler and Cloud Functions to run the daily export.

B. Create a streaming Dataflow job to export the table.

C. Set up Cloud Composer, and create a task to export the table daily.

D. Run the cron job on a Compute Engine instance to continue the export.

 


Correct Answer: A
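Option A's Cloud Scheduler half can be sketched with a single gcloud command. The job name, function URL, and service account below are hypothetical; the HTTP target would be a Cloud Function that calls the Cloud SQL Admin API to export the table:

```shell
# Invoke the export function once a day at 02:00, authenticating the
# request with an OIDC token for the given service account
gcloud scheduler jobs create http daily-table-export \
  --schedule="0 2 * * *" \
  --uri="https://us-central1-my-project.cloudfunctions.net/export-table" \
  --http-method=POST \
  --oidc-service-account-email=exporter@my-project.iam.gserviceaccount.com
```

This pairing is fully serverless, so there is no always-on VM to pay for or patch, which is what minimizes both cost and operational overhead.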

Access Full Google Professional Cloud Database Engineer Exam Prep Free

Want to go beyond these 50 questions? Click here to unlock a full set of Google Professional Cloud Database Engineer exam prep free questions covering every domain tested on the exam.

We continuously update our content to ensure you have the most current and effective prep materials.

Good luck with your Google Professional Cloud Database Engineer certification journey!
