Exam Code | Professional-Cloud-Developer |
Exam Name | Google Certified Professional - Cloud Developer |
Questions | 265 Questions & Answers with Explanations |
Update Date | November 08, 2024 |
Price |
Was: |
Are you ready to take your career to the next level with Google Certified Professional - Cloud Developer? At Prep4Certs, we're dedicated to helping you achieve your goals by providing high-quality Professional-Cloud-Developer Dumps and resources for a wide range of certification exams.
At Prep4Certs, we're committed to your success in the Google Professional-Cloud-Developer exam. Our comprehensive study materials and resources are designed to equip you with the knowledge and skills needed to ace the exam with confidence:
Start Your Certification Journey Today
Whether you're looking to advance your career, expand your skill set, or pursue new opportunities, Prep4Certs is here to support you on your certification journey. Explore our comprehensive study materials, take your exam preparation to the next level, and unlock new possibilities for professional growth and success.
Ready to achieve your certification goals? Begin your journey with Prep4Certs today!
You have an application running in App Engine. Your application is instrumented with Stackdriver Trace. The /product-details request reports details about four known unique products at /sku-details as shown below. You want to reduce the time it takes for the request to complete. What should you do?
A. Increase the size of the instance class.
B. Change the Persistent Disk type to SSD.
C. Change /product-details to perform the requests in parallel.
D. Store the /sku-details information in a database, and replace the webservice call with a database query.
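The idea behind option C can be sketched with a simulated `/sku-details` call (the `fetch_sku_details` helper and its 0.1 s latency are hypothetical stand-ins for the real web-service request):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch_sku_details(sku):
    """Hypothetical stand-in for one /sku-details call; the real handler
    would issue an HTTP request instead of sleeping."""
    time.sleep(0.1)  # simulated network latency
    return {"sku": sku, "details": f"details-for-{sku}"}

SKUS = ["sku-1", "sku-2", "sku-3", "sku-4"]

# Sequential: total latency is roughly the SUM of the four calls.
start = time.perf_counter()
sequential = [fetch_sku_details(s) for s in SKUS]
sequential_time = time.perf_counter() - start

# Parallel: total latency is roughly the SLOWEST single call.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=len(SKUS)) as pool:
    parallel = list(pool.map(fetch_sku_details, SKUS))
parallel_time = time.perf_counter() - start

print(f"sequential: {sequential_time:.2f}s, parallel: {parallel_time:.2f}s")
```

Because the four SKU lookups are independent, issuing them concurrently cuts the request's critical path to approximately one lookup's latency.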
Your company has deployed a new API to App Engine Standard environment. During testing, the API is not behaving as expected. You want to monitor the application over time to diagnose the problem within the application code without redeploying the application. Which tool should you use?
A. Stackdriver Trace
B. Stackdriver Monitoring
C. Stackdriver Debug Snapshots
D. Stackdriver Debug Logpoints
You have been tasked with planning the migration of your company's application from on-premises to Google Cloud. Your company's monolithic application is an ecommerce website. The application will be migrated to microservices deployed on Google Cloud in stages. The majority of your company's revenue is generated through online sales, so it is important to minimize risk during the migration. You need to prioritize features and select the first functionality to migrate. What should you do?
A. Migrate the Product catalog, which has integrations to the frontend and product database.
B. Migrate Payment processing, which has integrations to the frontend, order database, and third-party payment vendor.
C. Migrate Order fulfillment, which has integrations to the order database, inventory system, and third-party shipping vendor.
D. Migrate the Shopping cart, which has integrations to the frontend, cart database, inventory system, and payment processing system.
Your team develops services that run on Google Cloud. You need to build a data processing service and will use Cloud Functions. The data to be processed by the function is sensitive. You need to ensure that invocations can only happen from authorized services and follow Google-recommended best practices for securing functions. What should you do?
A. Enable Identity-Aware Proxy in your project. Secure function access using its permissions.
B. Create a service account with the Cloud Functions Viewer role. Use that service account to invoke the function.
C. Create a service account with the Cloud Functions Invoker role. Use that service account to invoke the function.
D. Create an OAuth 2.0 client ID for your calling service in the same project as the function you want to secure. Use those credentials to invoke the function.
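With option C, a calling service running on Google Cloud obtains an ID token for the function's URL from the metadata server and presents it as a bearer token. A sketch of the request construction follows; the function URL and project name are hypothetical, and the actual network calls (which require a GCP runtime with the invoker service account attached) are shown only as comments:

```python
import urllib.parse
import urllib.request

# Metadata-server endpoint that mints an ID token for a given audience.
METADATA_TOKEN_URL = (
    "http://metadata.google.internal/computeMetadata/v1/"
    "instance/service-accounts/default/identity"
)

def identity_token_request(function_url):
    """Build the metadata-server request whose response is an ID token
    with the target Cloud Function's URL as the audience."""
    url = METADATA_TOKEN_URL + "?" + urllib.parse.urlencode(
        {"audience": function_url}
    )
    return urllib.request.Request(url, headers={"Metadata-Flavor": "Google"})

# Example function URL (hypothetical).
FUNCTION_URL = "https://us-central1-my-project.cloudfunctions.net/process-data"
req = identity_token_request(FUNCTION_URL)

# On a GCP runtime the token would be fetched and used like this:
#   token = urllib.request.urlopen(req).read().decode()
#   urllib.request.urlopen(urllib.request.Request(
#       FUNCTION_URL, headers={"Authorization": f"Bearer {token}"}))
```

The service account doing the calling must hold `roles/cloudfunctions.invoker` on the function for the token to be accepted.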
You have an HTTP Cloud Function that is called via POST. Each submission’s request body has a flat, unnested JSON structure containing numeric and text data. After the Cloud Function completes, the collected data should be immediately available for ongoing and complex analytics by many users in parallel. How should you persist the submissions?
A. Directly persist each POST request’s JSON data into Datastore.
B. Transform the POST request’s JSON data, and stream it into BigQuery.
C. Transform the POST request’s JSON data, and store it in a regional Cloud SQL cluster.
D. Persist each POST request’s JSON data as an individual file within Cloud Storage, with the file name containing the request identifier.
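Option B's "transform and stream" step can be sketched as a small function that turns one flat JSON body into a row dict ready for a streaming insert; the payload, table name, and the use of `request_id` as a dedup key are all illustrative assumptions:

```python
import json

# Hypothetical flat, unnested JSON body from one POST request.
RAW_BODY = '{"order_id": "A-1001", "amount": 42.5, "items": 3, "region": "EMEA"}'

def to_bigquery_row(body, request_id):
    """Transform a flat JSON submission into a row dict suitable for a
    streaming insert (e.g. the BigQuery client's insert_rows_json)."""
    row = json.loads(body)
    row["request_id"] = request_id  # dedup key is an assumption
    return row

row = to_bigquery_row(RAW_BODY, "req-001")
# With the google-cloud-bigquery client (not shown), the row would be
# streamed with something like:
#   errors = client.insert_rows_json("my_dataset.submissions", [row])
print(row)
```

Once streamed, the rows are queryable in BigQuery within seconds, which satisfies the "immediately available for complex analytics by many users in parallel" requirement.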
Your application is running on Compute Engine and is showing sustained failures for a small number of requests. You have narrowed the cause down to a single Compute Engine instance, but the instance is unresponsive to SSH. What should you do next?
A. Reboot the machine.
B. Enable and check the serial port output.
C. Delete the machine and create a new one.
D. Take a snapshot of the disk and attach it to a new machine.
Your application is built as a custom machine image. You have multiple unique deployments of the machine image. Each deployment is a separate managed instance group with its own template. Each deployment requires a unique set of configuration values. You want to provide these unique values to each deployment but use the same custom machine image in all deployments. You want to use out-of-the-box features of Compute Engine. What should you do?
A. Place the unique configuration values in the persistent disk.
B. Place the unique configuration values in a Cloud Bigtable table.
C. Place the unique configuration values in the instance template startup script.
D. Place the unique configuration values in the instance template instance metadata.
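Option D looks roughly like the following command fragment; the image, template, and metadata key names are examples, not taken from the question:

```shell
# Each deployment's instance template carries its own metadata values,
# while every template references the same custom machine image.
gcloud compute instance-templates create deployment-a-template \
    --image=my-custom-image --image-project=my-project \
    --metadata=env=staging,feature-flags=checkout-v2

# At boot, instances read their configuration from the metadata server:
curl -s -H "Metadata-Flavor: Google" \
    "http://metadata.google.internal/computeMetadata/v1/instance/attributes/env"
```

Instance metadata is an out-of-the-box Compute Engine feature, so no per-deployment image rebuilds or startup-script edits are needed.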
You recently developed a new service on Cloud Run. The new service authenticates using a custom service and then writes transactional information to a Cloud Spanner database. You need to verify that your application can support up to 5,000 read and 1,000 write transactions per second while identifying any bottlenecks that occur. Your test infrastructure must be able to autoscale. What should you do?
A. Build a test harness to generate requests and deploy it to Cloud Run. Analyze the VPC Flow Logs using Cloud Logging.
B. Create a Google Kubernetes Engine cluster running the Locust or JMeter images to dynamically generate load tests. Analyze the results using Cloud Trace.
C. Create a Cloud Task to generate a test load. Use Cloud Scheduler to run 60,000 Cloud Task transactions per minute for 10 minutes. Analyze the results using Cloud Monitoring.
D. Create a Compute Engine instance that uses a LAMP stack image from the Marketplace, and use Apache Bench to generate load tests against the service. Analyze the results using Cloud Trace.
Your company is planning to migrate its on-premises Hadoop environment to the cloud. Increasing storage costs and the maintenance burden of data stored in HDFS are major concerns for your company. You also want to make minimal changes to existing data analytics jobs and the existing architecture. How should you proceed with the migration?
A. Migrate your data stored in Hadoop to BigQuery. Change your jobs to source their information from BigQuery instead of the on-premises Hadoop environment.
B. Create Compute Engine instances with HDD instead of SSD to save costs. Then perform a full migration of your existing environment into the new one in Compute Engine instances.
C. Create a Cloud Dataproc cluster on Google Cloud Platform, and then migrate your Hadoop environment to the new Cloud Dataproc cluster. Move your HDFS data into larger HDD disks to save on storage costs.
D. Create a Cloud Dataproc cluster on Google Cloud Platform, and then migrate your Hadoop code objects to the new cluster. Move your data to Cloud Storage and leverage the Cloud Dataproc connector to run jobs on that data.
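Option D typically involves two steps, sketched below; the bucket, cluster, region, and job class names are placeholders:

```shell
# 1. Copy HDFS data to Cloud Storage (run from the on-premises cluster).
hadoop distcp hdfs:///user/data gs://my-migration-bucket/data

# 2. Submit the existing job on Dataproc, reading from Cloud Storage via
#    the Cloud Storage connector -- usually the only code change is
#    swapping the hdfs:// input URI for a gs:// one.
gcloud dataproc jobs submit spark \
    --cluster=my-dataproc-cluster --region=us-central1 \
    --class=com.example.AnalyticsJob \
    --jars=gs://my-migration-bucket/jars/analytics.jar \
    -- gs://my-migration-bucket/data
```

This keeps the Hadoop/Spark job code nearly untouched while replacing HDFS storage (and its maintenance cost) with Cloud Storage.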
Your data is stored in Cloud Storage buckets. Fellow developers have reported that data downloaded from Cloud Storage is resulting in slow API performance. You want to research the issue to provide details to the GCP support team. Which command should you run?
A. gsutil test -o output.json gs://my-bucket
B. gsutil perfdiag -o output.json gs://my-bucket
C. gcloud compute scp example-instance:~/test-data -o output.json gs://my-bucket
D. gcloud services test -o output.json gs://my-bucket
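A fuller form of the option B command might look like this; the object count and size values are examples:

```shell
# Run a Cloud Storage performance diagnostic and write the machine-readable
# report to a file that can be attached to the support case.
gsutil perfdiag -o output.json -n 10 -s 1048576 gs://my-bucket
```

The `-o` flag writes the diagnostic results as JSON, which is the format Google Cloud support expects when investigating bucket performance issues.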