Google Professional Cloud Database Engineer Practice Questions Free

Google Professional Cloud Database Engineer Practice Questions Free – 50 Exam-Style Questions to Sharpen Your Skills

Are you preparing for the Google Professional Cloud Database Engineer certification exam? Kickstart your success with our Google Professional Cloud Database Engineer Practice Questions Free – a carefully selected set of 50 real exam-style questions to help you test your knowledge and identify areas for improvement.

Practicing with Google Professional Cloud Database Engineer practice questions free gives you a powerful edge by allowing you to:

  • Understand the exam structure and question formats
  • Discover your strong and weak areas
  • Build the confidence you need for test day success

Below, you will find 50 free Google Professional Cloud Database Engineer practice questions designed to match the real exam in both difficulty and topic coverage. They’re ideal for self-assessment or final review. Click on each question to explore the details.

Question 1

You manage a production MySQL database running on Cloud SQL at a retail company. You perform routine maintenance on Sunday at midnight when traffic is slow, but you want to skip routine maintenance during the year-end holiday shopping season. You need to ensure that your production system is available 24/7 during the holidays. What should you do?

A. Define a maintenance window on Sundays between 12 AM and 1 AM, and deny maintenance periods between November 1 and January 15.

B. Define a maintenance window on Sundays between 12 AM and 5 AM, and deny maintenance periods between November 1 and February 15.

C. Build a Cloud Composer job to start a maintenance window on Sundays between 12 AM and 1 AM, and deny maintenance periods between November 1 and January 15.

D. Create a Cloud Scheduler job to start maintenance at 12 AM on Sundays. Pause the Cloud Scheduler job between November 1 and January 15.

 


Correct Answer: A
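A maintenance window combined with a deny maintenance period can be sketched with the gcloud CLI; the instance name and dates below are placeholders:

```shell
# Sketch: set a weekly maintenance window on a Cloud SQL instance.
gcloud sql instances patch my-instance \
  --maintenance-window-day=SUN \
  --maintenance-window-hour=0

# Skip routine maintenance during the holiday season.
# Note: a deny maintenance period can span at most 90 days.
gcloud sql instances patch my-instance \
  --deny-maintenance-period-start-date=2025-11-01 \
  --deny-maintenance-period-end-date=2026-01-15 \
  --deny-maintenance-period-time=00:00:00
```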

Question 2

You are starting a large CSV import into a Cloud SQL for MySQL instance that has many open connections. You checked memory and CPU usage, and sufficient resources are available. You want to follow Google-recommended practices to ensure that the import will not time out. What should you do?

A. Close idle connections or restart the instance before beginning the import operation.

B. Increase the amount of memory allocated to your instance.

C. Ensure that the service account has the Storage Admin role.

D. Increase the number of CPUs for the instance to ensure that it can handle the additional import operation.

 


Correct Answer: A

Question 3

You want to migrate your on-premises PostgreSQL database to Compute Engine. You need to migrate this database with the minimum downtime possible. What should you do?

A. Perform a full backup of your on-premises PostgreSQL, and then, in the migration window, perform an incremental backup.

B. Create a read replica on Cloud SQL, and then promote it to a read/write standalone instance.

C. Use Database Migration Service to migrate your database.

D. Create a hot standby on Compute Engine, and use PgBouncer to switch over the connections.

 


Correct Answer: D

Question 4

Your organization works with sensitive data that requires you to manage your own encryption keys. You are working on a project that stores that data in a Cloud SQL database. You need to ensure that stored data is encrypted with your keys. What should you do?

A. Export data periodically to a Cloud Storage bucket protected by Customer-Supplied Encryption Keys.

B. Use Cloud SQL Auth proxy.

C. Connect to Cloud SQL using a connection that has SSL encryption.

D. Use customer-managed encryption keys with Cloud SQL.

 


Correct Answer: D

Question 5

You are responsible for designing a new database for an airline ticketing application in Google Cloud. This application must be able to:
  • Work with transactions and offer strong consistency.
  • Work with structured and semi-structured (JSON) data.
  • Scale transparently to multiple regions globally as the operation grows.
You need a Google Cloud database that meets all the requirements of the application. What should you do?

A. Use Cloud SQL for PostgreSQL with both cross-region read replicas.

B. Use Cloud Spanner in a multi-region configuration.

C. Use Firestore in Datastore mode.

D. Use a Bigtable instance with clusters in multiple regions.

 


Correct Answer: B

Question 6

You are using Compute Engine on Google Cloud and your data center to manage a set of MySQL databases in a hybrid configuration. You need to create replicas to scale reads and to offload part of the management operation. What should you do?

A. Use external server replication.

B. Use Data Migration Service.

C. Use Cloud SQL for MySQL external replica.

D. Use the mysqldump utility and binary logs.

 


Correct Answer: A

Question 7

You are configuring a new application that has access to an existing Cloud Spanner database. The new application reads from this database to gather statistics for a dashboard. You want to follow Google-recommended practices when granting Identity and Access Management (IAM) permissions. What should you do?

A. Reuse the existing service account that populates this database.

B. Create a new service account, and grant it the Cloud Spanner Database Admin role.

C. Create a new service account, and grant it the Cloud Spanner Database Reader role.

D. Create a new service account, and grant it the spanner.databases.select permission.

 


Correct Answer: C
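Creating a dedicated read-only service account for the dashboard might look like the following; the account, project, instance, and database names are placeholders:

```shell
# Create a new service account for the dashboard application.
gcloud iam service-accounts create dashboard-reader \
  --display-name="Dashboard read-only access"

# Grant the Cloud Spanner Database Reader role on the specific database.
gcloud spanner databases add-iam-policy-binding my-database \
  --instance=my-instance \
  --member="serviceAccount:dashboard-reader@my-project.iam.gserviceaccount.com" \
  --role="roles/spanner.databaseReader"
```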

Question 8

Your company is developing a global ecommerce website on Google Cloud. Your development team is working on a shopping cart service that is durable and elastically scalable with live traffic. Business disruptions from unplanned downtime are expected to be less than 5 minutes per month. In addition, the application needs to have very low latency writes. You need a data storage solution that has high write throughput and provides 99.99% uptime. What should you do?

A. Use Cloud SQL for data storage.

B. Use Cloud Spanner for data storage.

C. Use Memorystore for data storage.

D. Use Bigtable for data storage.

 


Correct Answer: B

Question 9

You are running a large, highly transactional application on Oracle Real Application Cluster (RAC) that is multi-tenant and uses shared storage. You need a solution that ensures high-performance throughput and a low-latency connection between applications and databases. The solution must also support existing Oracle features and provide ease of migration to Google Cloud. What should you do?

A. Migrate to Compute Engine.

B. Migrate to Bare Metal Solution for Oracle.

C. Migrate to Google Kubernetes Engine (GKE).

D. Migrate to Google Cloud VMware Engine.

 


Correct Answer: B

Question 10

Your customer is running a MySQL database on-premises with read replicas. The nightly incremental backups are expensive and add maintenance overhead. You want to follow Google-recommended practices to migrate the database to Google Cloud, and you need to ensure minimal downtime. What should you do?

A. Create a Google Kubernetes Engine (GKE) cluster, install MySQL on the cluster, and then import the dump file.

B. Use the mysqldump utility to take a backup of the existing on-premises database, and then import it into Cloud SQL.

C. Create a Compute Engine VM, install MySQL on the VM, and then import the dump file.

D. Create an external replica, and use Cloud SQL to synchronize the data to the replica.

 


Correct Answer: D

Question 11

You are managing a set of Cloud SQL databases in Google Cloud. Regulations require that database backups reside in the region where the database is created. You want to minimize operational costs and administrative effort. What should you do?

A. Configure the automated backups to use a regional Cloud Storage bucket as a custom location.

B. Use the default configuration for the automated backups location.

C. Disable automated backups, and create an on-demand backup routine to a regional Cloud Storage bucket.

D. Disable automated backups, and configure serverless exports to a regional Cloud Storage bucket.

 


Correct Answer: A
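Pinning automated backups to a specific region is a one-line change; the instance name and region below are placeholders:

```shell
# Store automated backups in the instance's own region
# instead of the default multi-region location.
gcloud sql instances patch my-instance \
  --backup-location=us-central1
```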

Question 12

You have a large Cloud SQL for PostgreSQL instance. The database instance is not mission-critical, and you want to minimize operational costs. What should you do to lower the cost of backups in this environment?

A. Set the automated backups to occur every other day to lower the frequency of backups.

B. Change the storage tier of the automated backups from solid-state drive (SSD) to hard disk drive (HDD).

C. Select a different region to store your backups.

D. Reduce the number of automated backups that are retained to two (2).

 


Correct Answer: D

Question 13

Your organization operates in a highly regulated industry. Separation of concerns (SoC) and the security principle of least privilege (PoLP) are critical. The operations team consists of:
  • Person A, a database administrator
  • Person B, an analyst who generates metric reports
  • Application C, which is responsible for automatic backups
You need to assign roles to team members for Cloud Spanner. Which roles should you assign?

A. roles/spanner.databaseAdmin for Person A; roles/spanner.databaseReader for Person B; roles/spanner.backupWriter for Application C

B. roles/spanner.databaseAdmin for Person A; roles/spanner.databaseReader for Person B; roles/spanner.backupAdmin for Application C

C. roles/spanner.databaseAdmin for Person A; roles/spanner.databaseUser for Person B; roles/spanner.databaseReader for Application C

D. roles/spanner.databaseAdmin for Person A; roles/spanner.databaseUser for Person B; roles/spanner.backupWriter for Application C

 


Correct Answer: A

Question 14

Your organization is running a low-latency reporting application on Microsoft SQL Server. In addition to the database engine, you are using SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS) in your on-premises environment. You want to migrate your Microsoft SQL Server database instances to Google Cloud. You need to ensure minimal disruption to the existing architecture during migration. What should you do?

A. Migrate to Cloud SQL for SQL Server.

B. Migrate to Cloud SQL for PostgreSQL.

C. Migrate to Compute Engine.

D. Migrate to Google Kubernetes Engine (GKE).

 


Correct Answer: C

Question 15

You work for a financial services company that wants to use fully managed database services. Traffic volume for your consumer services products has increased annually at a constant rate with occasional spikes around holidays. You frequently need to upgrade the capacity of your database. You want to use Cloud Spanner and include an automated method to increase your hardware capacity to support a higher level of concurrency. What should you do?

A. Use linear scaling to implement the Autoscaler-based architecture.

B. Use direct scaling to implement the Autoscaler-based architecture.

C. Upgrade the Cloud Spanner instance on a periodic basis during the scheduled maintenance window.

D. Set up alerts that are triggered when Cloud Spanner utilization metrics breach the threshold, and then schedule an upgrade during the scheduled maintenance window.

 


Correct Answer: A

Question 16

You are the DBA of an online tutoring application that runs on a Cloud SQL for PostgreSQL database. You are testing the implementation of the cross-regional failover configuration. The database in region R1 fails over successfully to region R2, and the database becomes available for the application to process data. During testing, certain scenarios of the application work as expected in region R2, but a few scenarios fail with database errors. The application-related database queries, when executed in isolation from Cloud SQL for PostgreSQL in region R2, work as expected. The application performs completely as expected when the database fails back to region R1. You need to identify the cause of the database errors in region R2. What should you do?

A. Determine whether the versions of Cloud SQL for PostgreSQL in regions R1 and R2 are different.

B. Determine whether the database patches of Cloud SQL for PostgreSQL in regions R1 and R2 are different.

C. Determine whether the failover of Cloud SQL for PostgreSQL from region R1 to region R2 is in progress or has completed successfully.

D. Determine whether Cloud SQL for PostgreSQL in region R2 is a near-real-time copy of region R1 but not an exact copy.

 


Correct Answer: D

Question 17

Your application uses Cloud SQL for MySQL. Your users run reports that rely on near-real-time data; however, the additional analytics workload caused excessive load on the primary database. You created a read replica for the analytics workloads, but now your users are complaining about the lag in data changes and that their reports are still slow. You need to improve the report performance and shorten the lag in data replication without making changes to the current reports. Which two approaches should you implement? (Choose two.)

A. Create secondary indexes on the replica.

B. Create additional read replicas, and partition your analytics users to use different read replicas.

C. Disable replication on the read replica, and set the flag for parallel replication on the read replica. Re-enable replication and optimize performance by setting flags on the primary instance.

D. Disable replication on the primary instance, and set the flag for parallel replication on the primary instance. Re-enable replication and optimize performance by setting flags on the read replica.

E. Move your analytics workloads to BigQuery, and set up a streaming pipeline to move data and update BigQuery.

 


Correct Answer: BE

Question 18

You support a consumer inventory application that runs on a multi-region instance of Cloud Spanner. A customer opened a support ticket to complain about slow response times. You notice a Cloud Monitoring alert about high CPU utilization. You want to follow Google-recommended practices to address the CPU performance issue. What should you do first?

A. Increase the number of processing units.

B. Modify the database schema, and add additional indexes.

C. Shard data required by the application into multiple instances.

D. Decrease the number of processing units.

 


Correct Answer: A
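Adding compute capacity is a single online operation; the instance name and target size below are placeholders:

```shell
# Increase Cloud Spanner compute capacity to relieve CPU pressure.
# Processing units can be changed without downtime.
gcloud spanner instances update my-instance \
  --processing-units=2000
```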

Question 19

You need to redesign the architecture of an application that currently uses Cloud SQL for PostgreSQL. The users of the application complain about slow query response times. You want to enhance your application architecture to offer sub-millisecond query latency. What should you do?

A. Configure Firestore, and modify your application to offload queries.

B. Configure Bigtable, and modify your application to offload queries.

C. Configure Cloud SQL for PostgreSQL read replicas to offload queries.

D. Configure Memorystore, and modify your application to offload queries.

 


Correct Answer: D

Question 20

Your company wants to migrate its MySQL, PostgreSQL, and Microsoft SQL Server on-premises databases to Google Cloud. You need a solution that provides near-zero downtime, requires no application changes, and supports change data capture (CDC). What should you do?

A. Use the native export and import functionality of the source database.

B. Create a database on Google Cloud, and use database links to perform the migration.

C. Create a database on Google Cloud, and use Dataflow for database migration.

D. Use Database Migration Service.

 


Correct Answer: D

Question 21

Your company is shutting down their data center and migrating several MySQL and PostgreSQL databases to Google Cloud. Your database operations team is severely constrained by ongoing production releases and the lack of capacity for additional on-premises backups. You want to ensure that the scheduled migrations happen with minimal downtime and that the Google Cloud databases stay in sync with the on-premises data changes until the applications can cut over. What should you do? (Choose two.)

A. Use Database Migration Service to migrate the databases to Cloud SQL.

B. Use a cross-region read replica to migrate the databases to Cloud SQL.

C. Use replication from an external server to migrate the databases to Cloud SQL.

D. Use an external read replica to migrate the databases to Cloud SQL.

E. Use a read replica to migrate the databases to Cloud SQL.

 


Correct Answer: AC

Question 22

You need to migrate existing databases from Microsoft SQL Server 2016 Standard Edition on a single Windows Server 2019 Datacenter Edition to a single Cloud SQL for SQL Server instance. During the discovery phase of your project, you notice that your on-premises server peaks at around 25,000 read IOPS. You need to ensure that your Cloud SQL instance is sized appropriately to maximize read performance. What should you do?

A. Create a SQL Server 2019 Standard on Standard machine type with 4 vCPUs, 15 GB of RAM, and 800 GB of solid-state drive (SSD).

B. Create a SQL Server 2019 Standard on High Memory machine type with at least 16 vCPUs, 104 GB of RAM, and 200 GB of SSD.

C. Create a SQL Server 2019 Standard on High Memory machine type with 16 vCPUs, 104 GB of RAM, and 4 TB of SSD.

D. Create a SQL Server 2019 Enterprise on High Memory machine type with 16 vCPUs, 104 GB of RAM, and 500 GB of SSD.

 


Correct Answer: C

Question 23

Your company uses Bigtable for a user-facing application that displays a low-latency real-time dashboard. You need to recommend the optimal storage type for this read-intensive database. What should you do?

A. Recommend solid-state drives (SSD).

B. Recommend splitting the Bigtable instance into two instances in order to load balance the concurrent reads.

C. Recommend hard disk drives (HDD).

D. Recommend mixed storage types.

 


Correct Answer: A

Question 24

Your DevOps team is using Terraform to deploy applications and Cloud SQL databases. After every new application change is rolled out, the environment is torn down and recreated, and the persistent database layer is lost. You need to prevent the database from being dropped. What should you do?

A. Set Terraform deletion_protection to true.

B. Rerun terraform apply.

C. Create a read replica.

D. Use point-in-time-recovery (PITR) to recover the database.

 


Correct Answer: A
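In Terraform, the protection is an argument on the Cloud SQL resource itself; the resource name, instance name, and tier here are illustrative:

```hcl
resource "google_sql_database_instance" "app_db" {
  name             = "app-db" # placeholder name
  database_version = "MYSQL_8_0"
  region           = "us-central1"

  settings {
    tier = "db-custom-2-7680"
  }

  # Prevents `terraform destroy` (and plan-driven recreation)
  # from dropping the instance.
  deletion_protection = true
}
```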

Question 25

Your digital-native business runs its database workloads on Cloud SQL. Your website must be globally accessible 24/7. You need to prepare your Cloud SQL instance for high availability (HA). You want to follow Google-recommended practices. What should you do? (Choose two.)

A. Set up manual backups.

B. Create a PostgreSQL database on-premises as the HA option.

C. Configure single zone availability for automated backups.

D. Enable point-in-time recovery.

E. Schedule automated backups.

 


Correct Answer: DE
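Automated backups and point-in-time recovery can be enabled together; the instance name and backup window below are placeholders:

```shell
# Schedule daily automated backups and enable point-in-time recovery.
# (PostgreSQL flag shown; MySQL instances use --enable-bin-log instead.)
gcloud sql instances patch my-instance \
  --backup-start-time=23:00 \
  --enable-point-in-time-recovery
```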

Question 26

Your team recently released a new version of a highly consumed application to accommodate additional user traffic. Shortly after the release, you received an alert from your production monitoring team that there is consistently high replication lag between your primary instance and the read replicas of your Cloud SQL for MySQL instances. You need to resolve the replication lag. What should you do?

A. Identify and optimize slow running queries, or set parallel replication flags.

B. Stop all running queries, and re-create the replicas.

C. Edit the primary instance to upgrade to a larger disk, and increase vCPU count.

D. Edit the primary instance to add additional memory.

 


Correct Answer: A

Question 27

You are managing a mission-critical Cloud SQL for PostgreSQL instance. Your application team is running important transactions on the database when another DBA starts an on-demand backup. You want to verify the status of the backup. What should you do?

A. Check the cloudsql.googleapis.com/postgres.log instance log.

B. Perform the gcloud sql operations list command.

C. Use Cloud Audit Logs to verify the status.

D. Use the Google Cloud Console.

 


Correct Answer: B
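Checking the state of a running backup from the command line might look like this; the instance name is a placeholder:

```shell
# List recent operations (including on-demand backups) for the instance;
# the STATUS column shows PENDING, RUNNING, or DONE.
gcloud sql operations list \
  --instance=my-instance \
  --limit=10
```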

Question 28

Your team is running a Cloud SQL for MySQL instance with a 5 TB database that must be available 24/7. You need to save database backups on object storage with minimal operational overhead or risk to your production workloads. What should you do?

A. Use Cloud SQL serverless exports.

B. Create a read replica, and then use the mysqldump utility to export each table.

C. Clone the Cloud SQL instance, and then use the mysqldump utility to export the data.

D. Use the mysqldump utility on the primary database instance to export the backup.

 


Correct Answer: A
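A serverless export offloads the dump to a temporary instance so the production primary is untouched; the bucket and database names are placeholders:

```shell
# Export to Cloud Storage using a temporary (serverless) instance.
# --offload keeps the export load off the production primary.
gcloud sql export sql my-instance \
  gs://my-backup-bucket/backup.sql.gz \
  --database=inventory \
  --offload
```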

Question 29

You are managing multiple applications connecting to a database on Cloud SQL for PostgreSQL. You need to be able to monitor database performance to easily identify applications with long-running and resource-intensive queries. What should you do?

A. Use log messages produced by Cloud SQL.

B. Use Query Insights for Cloud SQL.

C. Use the Cloud Monitoring dashboard with available metrics from Cloud SQL.

D. Use Cloud SQL instance monitoring in the Google Cloud Console.

 


Correct Answer: B

Question 30

You are migrating an on-premises application to Google Cloud. The application requires a high availability (HA) PostgreSQL database to support business-critical functions. Your company's disaster recovery strategy requires a recovery time objective (RTO) and recovery point objective (RPO) within 30 minutes of failure. You plan to use a Google Cloud managed service. What should you do to maximize uptime for your application?

A. Deploy Cloud SQL for PostgreSQL in a regional configuration. Create a read replica in a different zone in the same region and a read replica in another region for disaster recovery.

B. Deploy Cloud SQL for PostgreSQL in a regional configuration with HA enabled. Take periodic backups, and use this backup to restore to a new Cloud SQL for PostgreSQL instance in another region during a disaster recovery event.

C. Deploy Cloud SQL for PostgreSQL in a regional configuration with HA enabled. Create a cross-region read replica, and promote the read replica as the primary node for disaster recovery.

D. Migrate the PostgreSQL database to multi-regional Cloud Spanner so that a single region outage will not affect your application. Update the schema to support Cloud Spanner data types, and refactor the application.

 


Correct Answer: C

Question 31

Your organization is running a Firestore-backed Firebase app that serves the same top ten news stories on a daily basis to a large global audience. You want to optimize content delivery while decreasing cost and latency. What should you do?

A. Enable serializable isolation in the Firebase app.

B. Deploy a US multi-region Firestore location.

C. Build a Firestore bundle, and deploy bundles to Cloud CDN.

D. Create a Firestore index on the news story date.

 


Correct Answer: C

Question 32

Your organization has strict policies on tracking rollouts to production and periodically shares this information with external auditors to meet compliance requirements. You need to enable auditing on several Cloud Spanner databases. What should you do?

A. Use replication to roll out changes to higher environments.

B. Use backup and restore to roll out changes to higher environments.

C. Use Liquibase to roll out changes to higher environments.

D. Manually capture detailed DBA audit logs when changes are rolled out to higher environments.

 


Correct Answer: C

Question 33

You are working on a new centralized inventory management system to track items available in 200 stores, which each have 500 GB of data. You are planning a gradual rollout of the system to a few stores each week. You need to design an SQL database architecture that minimizes costs and user disruption during each regional rollout and can scale up or down on nights and holidays. What should you do?

A. Use Oracle Real Application Cluster (RAC) databases on Bare Metal Solution for Oracle.

B. Use sharded Cloud SQL instances with one or more stores per database instance.

C. Use a Bigtable cluster with autoscaling.

D. Use Cloud Spanner with a custom autoscaling solution.

 


Correct Answer: B

Question 34

Your organization needs to migrate a critical, on-premises MySQL database to Cloud SQL for MySQL. The on-premises database is on a version of MySQL that is supported by Cloud SQL and uses the InnoDB storage engine. You need to migrate the database while preserving transactions and minimizing downtime. What should you do?

A. 1. Use Database Migration Service to connect to your on-premises database, and choose continuous replication. 2. After the on-premises database is migrated, promote the Cloud SQL for MySQL instance, and connect applications to your Cloud SQL instance.

B. 1. Build a Cloud Data Fusion pipeline for each table to migrate data from the on-premises MySQL database to Cloud SQL for MySQL. 2. Schedule downtime to run each Cloud Data Fusion pipeline. 3. Verify that the migration was successful. 4. Re-point the applications to the Cloud SQL for MySQL instance.

C. 1. Pause the on-premises applications. 2. Use the mysqldump utility to dump the database content in compressed format. 3. Run gsutil -m to move the dump file to Cloud Storage. 4. Use the Cloud SQL for MySQL import option. 5. After the import operation is complete, re-point the applications to the Cloud SQL for MySQL instance.

D. 1. Pause the on-premises applications. 2. Use the mysqldump utility to dump the database content in CSV format. 3. Run gsutil -m to move the dump file to Cloud Storage. 4. Use the Cloud SQL for MySQL import option. 5. After the import operation is complete, re-point the applications to the Cloud SQL for MySQL instance.

 


Correct Answer: A

Question 35

Your team is building an application that stores and analyzes streaming time series financial data. You need a database solution that can perform time series-based scans with sub-second latency. The solution must scale into the hundreds of terabytes and be able to write up to 10k records per second and read up to 200 MB per second. What should you do?

A. Use Firestore.

B. Use Bigtable.

C. Use BigQuery.

D. Use Cloud Spanner.

 


Correct Answer: B

Question 36

Your online delivery business that primarily serves retail customers uses Cloud SQL for MySQL for its inventory and scheduling application. The required recovery time objective (RTO) and recovery point objective (RPO) must be in minutes rather than hours as a part of your high availability and disaster recovery design. You need a high availability configuration that can recover without data loss during a zonal or a regional failure. What should you do?

A. Set up all read replicas in a different region using asynchronous replication.

B. Set up all read replicas in the same region as the primary instance with synchronous replication.

C. Set up read replicas in different zones of the same region as the primary instance with synchronous replication, and set up read replicas in different regions with asynchronous replication.

D. Set up read replicas in different zones of the same region as the primary instance with asynchronous replication, and set up read replicas in different regions with synchronous replication.

 


Correct Answer: C

Question 37

Your company is using Cloud SQL for MySQL with an internal (private) IP address and wants to replicate some tables into BigQuery in near-real time for analytics and machine learning. You need to ensure that replication is fast and reliable and uses Google-managed services. What should you do?

A. Develop a custom data replication service to send data into BigQuery.

B. Use Cloud SQL federated queries.

C. Use Database Migration Service to replicate tables into BigQuery.

D. Use Datastream to capture changes, and use Dataflow to write those changes to BigQuery.

 


Correct Answer: D

Question 38

You are designing a database strategy for a new web application. You plan to start with a small pilot in one country and eventually expand to millions of users in a global audience. You need to ensure that the application can run 24/7 with minimal downtime for maintenance. What should you do?

A. Use Cloud Spanner in a regional configuration.

B. Use Cloud Spanner in a multi-region configuration.

C. Use Cloud SQL with cross-region replicas.

D. Use highly available Cloud SQL with multiple zones.

 


Correct Answer: B

Question 39

Your organization is running a MySQL workload in Cloud SQL. Suddenly you see a degradation in database performance. You need to identify the root cause of the performance degradation. What should you do?

A. Use Logs Explorer to analyze log data.

B. Use Cloud Monitoring to monitor CPU, memory, and storage utilization metrics.

C. Use Error Reporting to count, analyze, and aggregate the data.

D. Use Cloud Debugger to inspect the state of an application.

 


Correct Answer: B

Question 40

Your company uses the Cloud SQL out-of-disk recommender to analyze the storage utilization trends of production databases over the last 30 days. Your database operations team uses these recommendations to proactively monitor storage utilization and implement corrective actions. You receive a recommendation that the instance is likely to run out of disk space. What should you do to address this storage alert?

A. Normalize the database to the third normal form.

B. Compress the data using a different compression algorithm.

C. Manually or automatically increase the storage capacity.

D. Create another schema to load older data.

 


Correct Answer: C
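Automatic storage growth is a per-instance setting; the instance name below is a placeholder:

```shell
# Let Cloud SQL grow the disk automatically as it approaches capacity,
# so the instance never runs out of storage.
gcloud sql instances patch my-instance \
  --storage-auto-increase
```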

Question 41

You want to migrate an on-premises mission-critical PostgreSQL database to Cloud SQL. The database must be able to withstand a zonal failure with less than five minutes of downtime and still not lose any transactions. You want to follow Google-recommended practices for the migration. What should you do?

A. Take nightly snapshots of the primary database instance, and restore them in a secondary zone.

B. Build a change data capture (CDC) pipeline to read transactions from the primary instance, and replicate them to a secondary instance.

C. Create a read replica in another region, and promote the read replica if a failure occurs.

D. Enable high availability (HA) for the database to make it regional.

 


Correct Answer: D
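Option D corresponds to Cloud SQL's regional (high-availability) configuration, which keeps a synchronous standby in a second zone of the same region. A provisioning sketch, with a hypothetical instance name, tier, and region:

```shell
# Create an HA (regional) Cloud SQL for PostgreSQL instance; a standby in
# another zone of us-central1 takes over automatically on zonal failure.
gcloud sql instances create critical-pg \
  --database-version=POSTGRES_14 \
  --tier=db-custom-4-16384 \
  --region=us-central1 \
  --availability-type=REGIONAL
```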

Question 42

You need to perform a one-time migration of data from a running Cloud SQL for MySQL instance in the us-central1 region to a new Cloud SQL for MySQL instance in the us-east1 region. You want to follow Google-recommended practices to minimize performance impact on the currently running instance. What should you do?

A. Create and run a Dataflow job that uses JdbcIO to copy data from one Cloud SQL instance to another.

B. Create two Datastream connection profiles, and use them to create a stream from one Cloud SQL instance to another.

C. Create a SQL dump file in Cloud Storage using a temporary instance, and then use that file to import into a new instance.

D. Create a CSV file by running the SQL statement SELECT…INTO OUTFILE, copy the file to a Cloud Storage bucket, and import it into a new instance.

 


Correct Answer: C
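The offloaded (serverless) export that option C relies on spins up a temporary instance so the dump does not load the running primary. A sketch, assuming hypothetical instance names, bucket, and database:

```shell
# Serverless export: the --offload flag uses a temporary instance, avoiding
# performance impact on the source (names and bucket are illustrative).
gcloud sql export sql source-mysql gs://my-migration-bucket/dump.sql.gz \
  --database=appdb \
  --offload

# Import the dump into the new instance in us-east1.
gcloud sql import sql target-mysql gs://my-migration-bucket/dump.sql.gz \
  --database=appdb
```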

Question 43

You are a DBA on a Cloud Spanner instance with multiple databases. You need to assign these privileges to all members of the application development team on a specific database:
Can read tables, views, and DDL
Can write rows to the tables
Can add columns and indexes
Cannot drop the database
What should you do?

A. Assign the Cloud Spanner Database Reader and Cloud Spanner Backup Writer roles.

B. Assign the Cloud Spanner Database Admin role.

C. Assign the Cloud Spanner Database User role.

D. Assign the Cloud Spanner Admin role.

 


Correct Answer: C
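Granting roles/spanner.databaseUser at the database level (rather than on the instance or project) scopes the team's read, write, and schema-change access to just that database, without the ability to drop it. A sketch with a hypothetical database, instance, and group:

```shell
# Bind the Database User role on one database only; all names are illustrative.
gcloud spanner databases add-iam-policy-binding appdb \
  --instance=prod-spanner \
  --member="group:app-dev-team@example.com" \
  --role="roles/spanner.databaseUser"
```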

Question 44

Your project is using Bigtable to store data that should not be accessed from the public internet under any circumstances, even if the requestor has a valid service account key. You need to secure access to this data. What should you do?

A. Use Identity and Access Management (IAM) for Bigtable access control.

B. Use VPC Service Controls to create a trusted network for the Bigtable service.

C. Use customer-managed encryption keys (CMEK).

D. Use Google Cloud Armor to add IP addresses to an allowlist.

 


Correct Answer: B
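A VPC Service Controls perimeter blocks Bigtable API access from outside the perimeter even when the caller presents valid credentials, which is exactly the gap IAM alone cannot close. A minimal sketch; the perimeter name, project number, and policy ID are placeholders:

```shell
# Restrict the Bigtable API to a trusted perimeter
# (perimeter name, project number, and policy ID are hypothetical).
gcloud access-context-manager perimeters create bigtable_perimeter \
  --title="Bigtable perimeter" \
  --resources=projects/123456789012 \
  --restricted-services=bigtable.googleapis.com \
  --policy=987654321
```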

Question 45

Your ecommerce application connecting to your Cloud SQL for SQL Server is expected to have additional traffic due to the holiday weekend. You want to follow Google-recommended practices to set up alerts for CPU and memory metrics so you can be notified by text message at the first sign of potential issues. What should you do?

A. Use a Cloud Function to pull CPU and memory metrics from your Cloud SQL instance and to call a custom service to send alerts.

B. Use Error Reporting to monitor CPU and memory metrics and to configure SMS notification channels.

C. Use Cloud Logging to set up a log sink for CPU and memory metrics and to configure a sink destination to send a message to Pub/Sub.

D. Use Cloud Monitoring to set up an alerting policy for CPU and memory metrics and to configure SMS notification channels.

 


Correct Answer: D
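Cloud Monitoring alerting policies (option D's approach) can be created in the console or from a policy definition file, with SMS notification channels configured in Cloud Monitoring and referenced by the policy. A hedged sketch; the file name is hypothetical, and the policies command currently sits in the alpha release track:

```shell
# Create an alerting policy for CPU and memory conditions from a JSON
# definition (cpu-memory-policy.json is an illustrative file name).
gcloud alpha monitoring policies create \
  --policy-from-file=cpu-memory-policy.json
```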

Question 46

You are managing a Cloud SQL for MySQL environment in Google Cloud. You have deployed a primary instance in Zone A and a read replica instance in Zone B, both in the same region. You are notified that the replica instance in Zone B was unavailable for 10 minutes. You need to ensure that the read replica instance is still working. What should you do?

A. Use the Google Cloud Console or gcloud CLI to manually create a new clone database.

B. Use the Google Cloud Console or gcloud CLI to manually create a new failover replica from backup.

C. Verify that the new replica is created automatically.

D. Start the original primary instance and resume replication.

 


Correct Answer: C

Question 47

You are designing a new gaming application that uses a highly transactional relational database to store player authentication and inventory data in Google Cloud. You want to launch the game in multiple regions. What should you do?

A. Use Cloud Spanner to deploy the database.

B. Use Bigtable with clusters in multiple regions to deploy the database.

C. Use BigQuery to deploy the database.

D. Use Cloud SQL with a regional read replica to deploy the database.

 


Correct Answer: A

Question 48

During an internal audit, you realized that one of your Cloud SQL for MySQL instances does not have high availability (HA) enabled. You want to follow Google-recommended practices to enable HA on your existing instance. What should you do?

A. Create a new Cloud SQL for MySQL instance, enable HA, and use the export and import option to migrate your data.

B. Create a new Cloud SQL for MySQL instance, enable HA, and use Cloud Data Fusion to migrate your data.

C. Use the gcloud sql instances patch command to update your existing Cloud SQL for MySQL instance.

D. Shut down your existing Cloud SQL for MySQL instance, and enable HA.

 


Correct Answer: C
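The in-place approach that option C describes is a single patch converting the instance to the regional (HA) configuration; note that the change triggers a brief restart. The instance name is illustrative:

```shell
# Enable HA on an existing instance in place (instance name is hypothetical).
gcloud sql instances patch prod-mysql --availability-type=REGIONAL
```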

Question 49

Your ecommerce website captures user clickstream data to analyze customer traffic patterns in real time and support personalization features on your website. You plan to analyze this data using big data tools. You need a low-latency solution that can store 8 TB of data and can scale to millions of read and write requests per second. What should you do?

A. Write your data into Bigtable and use Dataproc and the Apache HBase libraries for analysis.

B. Deploy a Cloud SQL environment with read replicas for improved performance. Use Datastream to export data to Cloud Storage and analyze with Dataproc and the Cloud Storage connector.

C. Use Memorystore to handle your low-latency requirements and for real-time analytics.

D. Stream your data into BigQuery and use Dataproc and the BigQuery Storage API to analyze large volumes of data.

 


Correct Answer: A
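A Bigtable deployment of the kind option A describes scales read/write throughput horizontally by adding nodes to a cluster. A provisioning sketch with a hypothetical instance ID, cluster ID, zone, and node count:

```shell
# Create a Bigtable instance sized for heavy clickstream traffic
# (instance, cluster, zone, and node count are illustrative).
gcloud bigtable instances create clickstream \
  --display-name="Clickstream store" \
  --cluster-config=id=clickstream-c1,zone=us-central1-b,nodes=10
```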

Question 50

You plan to use Database Migration Service to migrate data from a PostgreSQL on-premises instance to Cloud SQL. You need to identify the prerequisites for creating and automating the task. What should you do? (Choose two.)

A. Drop or disable all users except database administration users.

B. Disable all foreign key constraints on the source PostgreSQL database.

C. Ensure that all PostgreSQL tables have a primary key.

D. Shut down the database before the Database Migration Service task is started.

E. Ensure that pglogical is installed on the source PostgreSQL database.

 


Correct Answer: CE
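Both prerequisites can be checked directly on the source server before creating the migration job. The connection parameters below are placeholders; the queries confirm that the pglogical extension is available and list user tables that lack a primary key:

```shell
# Verify migration prerequisites on the source PostgreSQL instance
# (host, database, and user are hypothetical placeholders).
psql "host=SOURCE_HOST dbname=appdb user=migration_user" <<'SQL'
-- Is pglogical installed/available on this server?
SELECT name, default_version
FROM pg_available_extensions
WHERE name = 'pglogical';

-- Which ordinary tables have no primary key constraint?
SELECT n.nspname, c.relname
FROM pg_class c
JOIN pg_namespace n ON n.oid = c.relnamespace
WHERE c.relkind = 'r'
  AND n.nspname NOT IN ('pg_catalog', 'information_schema')
  AND NOT EXISTS (
    SELECT 1 FROM pg_constraint con
    WHERE con.conrelid = c.oid AND con.contype = 'p'
  );
SQL
```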

Free Access Full Google Professional Cloud Database Engineer Practice Questions Free

Want more hands-on practice? Click here to access the full bank of Google Professional Cloud Database Engineer practice questions free and reinforce your understanding of all exam objectives.

We update our question sets regularly, so check back often for new and relevant content.

Good luck with your Google Professional Cloud Database Engineer certification journey!
