
Google Professional Cloud Database Engineer Exam

Google Cloud Platform (GCP) is one of the fastest-growing public clouds. The Professional Cloud Database Engineer certification helps you advance your career in cloud computing and learn about GCP's database offerings.

Do you want to learn how to store terabyte- to petabyte-scale data, structured or semi-structured, in a GCP environment?
Do you want to learn where billion-user applications like YouTube and Gmail store their data?
Do you want to learn about databases that can handle billions of requests per second?
Do you want to learn about Google Cloud database products such as Cloud SQL, Spanner, Datastore, Firestore, Bigtable, and AlloyDB?
If yes, you are in the right place.
The cloud is the future, and GCP is one of its fastest-growing platforms.
According to Google, 87% of Google Cloud certified individuals are more confident about their cloud skills, and more than 1 in 4 took on more responsibility or leadership roles at work.
The Professional Cloud Database Engineer certification is a worthwhile investment of time and energy to deepen your knowledge of GCP databases.

Professional Cloud Database Engineer
A Professional Cloud Database Engineer is a database professional with two years of Google Cloud experience and five years of overall database and IT experience. The Professional Cloud Database Engineer designs, creates, manages, and troubleshoots Google Cloud databases used by applications to store and retrieve data. The Professional Cloud Database Engineer should be comfortable translating business and technical requirements into scalable and cost-effective database solutions.

The Professional Cloud Database Engineer exam assesses your ability to:
Design scalable and highly available cloud database solutions
Manage a solution that can span multiple database solutions
Migrate data solutions
Deploy scalable and highly available databases in Google Cloud

About this certification exam
Length: 2 hours
Registration fee: $200 (plus tax where applicable)
Language: English
Exam format: 50-60 multiple choice and multiple select questions

Exam Delivery Method:
a. Take the online-proctored exam from a remote location.
b. Take the onsite-proctored exam at a testing center.
Prerequisites: None

Recommended experience: 5+ years of overall database and IT experience, including 2 years of hands-on experience working with Google Cloud database solutions


Exam overview
1. Get real world experience
Before attempting the Cloud Database Engineer exam, it’s recommended that you have 2+ years of hands-on experience working with Google Cloud database solutions. Ready to start building? Explore the Google Cloud Free Tier for free usage (up to monthly limits) of select products.

Try Google Cloud Free Tier

2. Understand what’s on the exam
The exam guide contains a complete list of topics that may be included on the exam, helping you determine if your skills align with the exam.

3. Review the sample questions
Review the sample questions to familiarize yourself with the format of exam questions and example content that may be covered on the Cloud Database Engineer exam.

4. Round out your skills with training
Follow the learning path

Prepare for the exam by following the Cloud Database Engineer learning path. Explore online training, in-person classes, hands-on labs, and other resources from Google Cloud.

Additional resources
Your Google Cloud database options, explained
Database modernization solutions
Database migration solutions
In-depth discussions on the concepts and critical components of Google Cloud
Google Cloud documentation
Google Cloud solutions

5. Schedule an exam
Register and select whether to take the exam remotely or at a nearby testing center.
Review exam terms and conditions and data sharing policies.

QUESTION 1
You are developing a new application on a VM that is on your corporate network. The application will
use Java Database Connectivity (JDBC) to connect to Cloud SQL for PostgreSQL. Your Cloud SQL
instance is configured with IP address 192.168.3.48, and SSL is disabled. You want to ensure that your
application can access your database instance without requiring configuration changes to your
database. What should you do?

A. Define a connection string using your Google username and password to point to the external
(public) IP address of your Cloud SQL instance.

B. Define a connection string using a database username and password to point to the internal
(private) IP address of your Cloud SQL instance.

C. Define a connection string using Cloud SQL Auth proxy configured with a service account to point
to the internal (private) IP address of your Cloud SQL instance.

D. Define a connection string using Cloud SQL Auth proxy configured with a service account to point
to the external (public) IP address of your Cloud SQL instance.

Answer: C
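
To make option C concrete, here is a minimal sketch, assuming a hypothetical instance connection name my-project:us-central1:pg-instance and database appdb. The Cloud SQL Auth Proxy (v2) runs next to the application and forwards a local port to the instance's private IP, so nothing changes on the database itself:

# Run the Auth Proxy with a service account key, forcing the private IP path
# (the connection name and key file are placeholders):
./cloud-sql-proxy --credentials-file=sa-key.json --private-ip --port 5432 \
    my-project:us-central1:pg-instance
# The application then points its JDBC URL at the local listener:
# jdbc:postgresql://127.0.0.1:5432/appdb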

QUESTION 2
Your digital-native business runs its database workloads on Cloud SQL. Your website must be globally
accessible 24/7. You need to prepare your Cloud SQL instance for high availability (HA). You want to
follow Google-recommended practices. What should you do? (Choose two.)

A. Set up manual backups.
B. Create a PostgreSQL database on-premises as the HA option.
C. Configure single zone availability for automated backups.
D. Enable point-in-time recovery.
E. Schedule automated backups.

Answer: D, E
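
As a rough illustration of these recommendations, the sketch below creates a regional (highly available) Cloud SQL for PostgreSQL instance with automated backups and point-in-time recovery enabled; the instance name, version, and region are hypothetical:

gcloud sql instances create pg-prod \
    --database-version=POSTGRES_14 \
    --region=us-central1 \
    --availability-type=REGIONAL \
    --backup-start-time=02:00 \
    --enable-point-in-time-recovery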

QUESTION 3
Your company wants to move to Google Cloud. Your current data center is closing in six months.
You are running a large, highly transactional Oracle application footprint on VMware.
You need to design a solution with minimal disruption to the current architecture and provide ease of migration to
Google Cloud. What should you do?

A. Migrate applications and Oracle databases to Google Cloud VMware Engine (VMware Engine).
B. Migrate applications and Oracle databases to Compute Engine.
C. Migrate applications to Cloud SQL.
D. Migrate applications and Oracle databases to Google Kubernetes Engine (GKE).

Answer: A

QUESTION 4
Your customer has a global chat application that uses a multi-regional Cloud Spanner instance.
The application has recently experienced degraded performance after a new version of the application
was launched. Your customer asked you for assistance. During initial troubleshooting, you observed
high read latency.
What should you do?

A. Use query parameters to speed up frequently executed queries.
B. Change the Cloud Spanner configuration from multi-region to single region.
C. Use SQL statements to analyze SPANNER_SYS.READ_STATS* tables.
D. Use SQL statements to analyze SPANNER_SYS.QUERY_STATS* tables.

Answer: C
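
For reference, Cloud Spanner exposes its read statistics as queryable tables; a minimal sketch, assuming a hypothetical instance chat-instance and database chat-db:

# List the read shapes with the highest average latency over the last minute:
gcloud spanner databases execute-sql chat-db --instance=chat-instance \
    --sql='SELECT * FROM SPANNER_SYS.READ_STATS_TOP_MINUTE ORDER BY avg_latency_seconds DESC LIMIT 10'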

QUESTION 5
Your company has PostgreSQL databases on-premises and on Amazon Web Services (AWS). You are
planning multiple database migrations to Cloud SQL in an effort to reduce costs and downtime. You
want to follow Google-recommended practices and use Google native data migration tools. You also
want to closely monitor the migrations as part of the cutover strategy. What should you do?

A. Use Database Migration Service to migrate all databases to Cloud SQL.
B. Use Database Migration Service for one-time migrations, and use third-party or partner tools for change data capture (CDC) style migrations.
C. Use data replication tools and CDC tools to enable migration.
D. Use a combination of Database Migration Service and partner tools to support the data migration strategy.

Answer: A

QUESTION 6
You are setting up a Bare Metal Solution environment. You need to update the operating system to
the latest version. You need to connect the Bare Metal Solution environment to the internet so you
can receive software updates. What should you do?

A. Set up a static external IP address in your VPC network.
B. Set up bring your own IP (BYOIP) in your VPC.
C. Set up a Cloud NAT gateway on the Compute Engine VM.
D. Set up Cloud NAT service.

Answer: C
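
A rough sketch of the NAT-gateway VM approach, assuming a hypothetical VPC bms-vpc and subnet bms-subnet that are reachable from the Bare Metal Solution environment:

# Create a VM that is allowed to forward packets (names and zone are placeholders):
gcloud compute instances create bms-nat-gw \
    --zone=us-central1-a --network=bms-vpc --subnet=bms-subnet --can-ip-forward
# On the VM, turn on forwarding and masquerading (interface name ens4 is an assumption):
sudo sysctl -w net.ipv4.ip_forward=1
sudo iptables -t nat -A POSTROUTING -o ens4 -j MASQUERADE

A static route in the VPC pointing at this VM as the next hop would then carry the Bare Metal Solution servers' outbound traffic to the internet.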

Google Professional Google Workspace Administrator Exam

Professional Google Workspace Administrator
A Professional Google Workspace Administrator transforms business objectives into tangible configurations, policies, and security practices as they relate to users, content, and integrations. Through their understanding of their organization’s infrastructure, Google Workspace Administrators enable people to work together, communicate, and access data in a secure and efficient manner. Operating with an engineering and solutions mindset, they use tools, programming languages, and APIs to automate workflows. They look for opportunities to educate end users and increase operational efficiency while advocating for Google Workspace and the Google toolset.

Related job roles: IT systems administrator, cloud solutions engineer, collaboration engineer, systems engineer

The Professional Google Workspace Administrator exam assesses your ability to:
Plan and implement Google Workspace authorization and access
Manage user, resource, and shared drive life cycles
Control and configure Google Workspace services
Configure and manage endpoint access
Monitor organizational operations
Advance Google Workspace adoption and collaboration

About this certification exam
Length: 2 hours
Registration fee: $200 (plus tax where applicable)
Languages: English, Japanese
Exam format: 50-60 multiple choice and multiple select questions

Exam Delivery Method:
a) Take the online-proctored exam from a remote location. Review the online testing requirements.
b) Take the onsite-proctored exam at a testing center. Locate a test center near you.

Prerequisites: None
Recommended experience: 3+ years of industry experience including 1+ year Google Workspace (formerly G Suite) administration experience.

Certification Renewal / Recertification: Candidates must recertify in order to maintain their certification status. Unless explicitly stated in the detailed exam descriptions, all Google Cloud certifications are valid for two years from the date of certification. Recertification is accomplished by retaking the exam during the recertification eligibility time period and achieving a passing score. You may attempt recertification starting 60 days prior to your certification expiration date.

Exam overview
1. Review the exam guide

The exam guide contains a complete list of topics that may be included on the exam. Review the exam guide to determine if your skills align with the topics on the exam.

Section 1: Object management
1.1 Manage user lifecycles with provisioning and deprovisioning processes. Considerations include:
● Adding users (e.g., individual, bulk, automated)
● Removing users (e.g., suspending, deleting, recovering)
● Editing user attributes (e.g., renaming, passwords, aliases)
● Creating administrative roles (e.g., default roles, custom roles)

1.2 Configure shared drives
1.3 Manage calendar resources
1.4 Configure and manage Google Groups for Business. Considerations include:
● Configuring Google Groups
● Adding users to groups
● Implications of current Google Workspace APIs to development efforts
● Using Apps Script to automate tasks


Section 2: Service configuration

2.1 Implement and manage Google Workspace configurations based on corporate policies. Considerations include:
● Managing company profile settings
● Modifying OU policies
● Managing rollout of new Google functionality to end users
● Troubleshooting Google Workspace services (e.g., performance issues for services suite, apps for OUs)
● Scanning email with Data Loss Prevention (DLP)
● Managing content compliance rules
● Configuring security and data region
● Monitoring security health check
● Configuring security settings
● Creating security records
● Designing security integration and addressing objections

2.2 Demonstrate how to set up and configure Gmail. Considerations include:
● Enabling email delegation for an OU
● Managing Gmail archives

Section 3: Troubleshooting

3.1 Troubleshoot user reports of mail delivery problems
3.2 Collect log files or reports needed to engage with support
3.3 Classify and mitigate basic email attacks. Considerations include:
● Configuring attachment compliance
● Configuring blocked senders
● Configuring email allowlist
● Configuring objectionable content
● Configuring phishing settings
● Configuring spam settings
● Managing admin quarantine
● Configuring Secure Transport compliance
● Configuring safety settings

3.4 Troubleshoot workspace access and performance

Section 4: Data access and authentication

4.1 Configure policies for all devices (mobile, desktop, Chrome OS, Meet, Chrome Browser). Considerations include:
● Company-owned vs. personal devices
● Configuring personal device settings (e.g., password, Android, iOS, advanced, device approvals, app management, insights)

4.2 Configure and implement data governance policies
4.3 Describe how to manage third-party applications. Considerations include:
● Configuring third-party SSO for Google Workspace
● Integrating with third-party providers for provisioning
● Integrating third-party marketplace apps to specific OUs in Google Workspace
● Granting API access to applications that need access
● Revoking third-party OAuth access
● Removing connected applications and sites

4.4 Configure user authentication. Considerations include:
● Basic user security controls (e.g., password length enforcement and 2-Step Verification)
● Security aspects of identity, perimeter security, and data protection

Section 5: Support business initiatives

5.1 Use Vault to assist legal teams. Considerations include:
● Setting retention rules and placing legal holds
● Searching your domain’s data by user account, OU, date, or keyword
● Exporting data for additional processing and review
● Running Vault audit reports

5.2 Interpret reports for the business
5.3 Describe how to import and export data


Sample Questions

QUESTION 1
Madeupcorp.com is in the process of migrating from a third-party email system to Google
Workspace. The VP of Marketing is concerned that her team already administers the corporate
AdSense, AdWords, and YouTube channels using their @madeupcorp.com email addresses, but has
not tracked which users have access to which service. You need to ensure that there is no disruption.
What should you do?

A. Run the Transfer Tool for Unmanaged users.
B. Use a Google Form to survey the Marketing department users.
C. Assure the VP that there is no action required to configure Google Workspace.
D. Contact Google Enterprise Support to identify affected users.

Answer: A

QUESTION 2
Your company has an OU that contains your sales team and an OU that contains your market
research team. The sales team is often a target of mass email from legitimate senders, which is
distracting to their job duties. The market research team also receives that email content, but they
want it because it often contains interesting market analysis or competitive intelligence. Constant
Contact is often used as the source of these messages. Your company also uses Constant Contact for
your own mass email marketing. You need to set email controls at the Sales OU without affecting
your own outgoing email or the market research OU.
What should you do?

A. Create a blocked senders list at the Sales OU that contains the mass email sender addresses, but bypass this setting for Constant Contact emails.
B. Create a blocked senders list at the root level, and then an approved senders list at the Market Research OU, both containing the mass email sender addresses.
C. Create a blocked senders list at the Sales OU that contains the mass email sender addresses.
D. Create an approved senders list at the Market Research OU that contains the mass email sender addresses.

Answer: C

QUESTION 3
Your organization is part of a highly regulated industry with a very high turnover. In order to recycle
licenses for new employees and comply with data retention regulations, it has been determined that
certain Google Workspace data should be stored in a separate backup environment.
How should you store data for this situation?

A. Use routing rules to dual-deliver mail to an on-premises SMTP server and Google Workspace.
B. Write a script and use Google Workspace APIs to access and download user data.
C. Use a third-party tool to configure secure backup of Google Workspace data.
D. Train users to use Google Takeout and store their archives locally.

Answer: C

QUESTION 4
Your organization is on Google Workspace Enterprise and allows for external sharing of Google Drive
files to facilitate collaboration with other Google Workspace customers. Recently you have had
several incidents of files and folders being broadly shared with external users and groups. Your chief
security officer needs data on the scope of external sharing and ongoing alerting so that external
access does not have to be disabled.
What two actions should you take to support the chief security officer’s request? (Choose two.)

A. Review who has viewed files using the Google Drive Activity Dashboard.
B. Create an alert from Drive Audit reports to notify of external file sharing.
C. Review total external sharing in the Aggregate Reports section.
D. Create a custom Dashboard for external sharing in the Security Investigation Tool.
E. Automatically block external sharing using DLP rules.

Answer: BD

QUESTION 5
Your organization’s Sales Department uses a generic user account (sales@company.com) to manage
requests. With only one employee responsible for managing the departmental account, you are
tasked with providing the department with the most efficient means to allow multiple employees
various levels of access and manage requests from a common email address.
What should you do?

A. Configure a Google Group as an email list.
B. Delegate email access to department employees.
C. Configure a Google Group as a collaborative inbox.
D. Configure a Google Group, and set the Access Level to Announcement Only.

Answer: C

QUESTION 6
Your employer, a media and entertainment company, wants to provision Google Workspace
Enterprise accounts on your domain for several world-famous celebrities. Leadership is concerned
with ensuring that these VIPs are afforded a high degree of privacy. Only a small group of senior
employees must be able to look up contact information and initiate collaboration with the VIPs using
Google Workspace services such as Docs, Chat, and Calendar.
You are responsible for configuring Google Workspace to meet these requirements. What should you do?

A. In the Users list, find the VIPs and turn off the user setting “Directory Sharing.”
B. Create a Group for the VIPs and their handlers, and set the Group Access Level to Restricted.
C. In Directory Settings, disable Contact Sharing.
D. Create separate Custom Directories for the VIPs and regular employees.

Answer: D

Google Professional Cloud Network Engineer Exam

Professional Cloud Network Engineer
Length: 2 hours
Languages: English
Exam format: 50-60 multiple choice and multiple select questions

Exam delivery method:
a) Take the online-proctored exam from a remote location, review the online testing requirements
b) Take the onsite-proctored exam at a testing center, locate a test center near you


Prerequisites: None
Recommended experience: 3+ years of industry experience including 1+ years designing and managing solutions using Google Cloud

Certification Renewal / Recertification: Candidates must recertify in order to maintain their certification status. Unless explicitly stated in the detailed exam descriptions, all Google Cloud certifications are valid for two years from the date of certification. Recertification is accomplished by retaking the exam during the recertification eligibility time period and achieving a passing score. You may attempt recertification starting 60 days prior to your certification expiration date.

Certification exam guide
A Professional Cloud Network Engineer implements and manages network architectures in Google Cloud. This individual may work on networking or cloud teams with architects who design cloud infrastructure. The Cloud Network Engineer uses the Google Cloud console and/or command line interface, and leverages experience with network services, application and container networking, hybrid and multi-cloud connectivity, implementing VPCs, and security for established network architectures to ensure successful cloud implementations.

The Professional Cloud Network Engineer exam assesses your ability to:
Design, plan, and prototype a Google Cloud network
Implement Virtual Private Cloud (VPC) instances
Configure network services
Implement hybrid interconnectivity
Manage, monitor, and optimize network operations

Exam overview

1. Review the exam guide
The exam guide contains a complete list of topics that may be included on the exam, helping you determine if your skills align with the topics on the exam.

Section 1: Designing, planning, and prototyping a Google Cloud network

1.1 Designing an overall network architecture. Considerations include:
● High availability, failover, and disaster recovery strategies
● DNS strategy (e.g., on-premises, Cloud DNS)
● Security and data exfiltration requirements
● Load balancing
● Applying quotas per project and per VPC
● Hybrid connectivity (e.g., Private Google Access for hybrid connectivity)
● Container networking
● IAM roles
● SaaS, PaaS, and IaaS services
● Microsegmentation for security purposes (e.g., using metadata, tags, service accounts)

1.2 Designing Virtual Private Cloud (VPC) instances. Considerations include:
● IP address management and bring your own IP (BYOIP)
● Standalone vs. Shared VPC
● Multiple vs. single
● Regional vs. multi-regional
● VPC Network Peering
● Firewalls (e.g., service account-based, tag-based)
● Custom routes
● Using managed services (e.g., Cloud SQL, Memorystore)
● Third-party device insertion (NGFW) into VPC using multi-NIC and internal load balancer as a next hop or equal-cost multi-path (ECMP) routes

1.3 Designing a hybrid and multi-cloud network. Considerations include:
● Dedicated Interconnect vs. Partner Interconnect
● Multi-cloud connectivity
● Direct Peering
● IPsec VPN
● Failover and disaster recovery strategy
● Regional vs. global VPC routing mode
● Accessing multiple VPCs from on-premises locations (e.g., Shared VPC, multi-VPC peering topologies)
● Bandwidth and constraints provided by hybrid connectivity solutions
● Accessing Google Services/APIs privately from on-premises locations
● IP address management across on-premises locations and cloud
● DNS peering and forwarding

1.4 Designing an IP addressing plan for Google Kubernetes Engine. Considerations include:
● Public and private cluster nodes
● Control plane public vs. private endpoints
● Subnets and alias IPs
● RFC 1918, non-RFC 1918, and privately used public IP (PUPI) address options

Section 2: Implementing Virtual Private Cloud (VPC) instances

2.1 Configuring VPCs. Considerations include (see the sketch after this list):
● Google Cloud VPC resources (e.g., networks, subnets, firewall rules)
● VPC Network Peering
● Creating a Shared VPC network and sharing subnets with other projects
● Configuring API access to Google services (e.g., Private Google Access, public interfaces)
● Expanding VPC subnet ranges after creation
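
A minimal sketch of two of these tasks, with placeholder project, network, and subnet names: enabling Shared VPC on a host project, attaching a service project, and widening an existing subnet's primary range:

gcloud compute shared-vpc enable host-project-id
gcloud compute shared-vpc associated-projects add service-project-id \
    --host-project=host-project-id
# Expand a subnet range after creation (ranges can grow but not shrink):
gcloud compute networks subnets expand-ip-range my-subnet \
    --region=us-central1 --prefix-length=20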

2.2 Configuring routing. Considerations include:
● Static vs. dynamic routing
● Global vs. regional dynamic routing
● Routing policies using tags and priority
● Internal load balancer as a next hop
● Custom route import/export over VPC Network Peering

2.3 Configuring and maintaining Google Kubernetes Engine clusters. Considerations include (see the sketch after this list):
● VPC-native clusters using alias IPs
● Clusters with Shared VPC
● Creating Kubernetes Network Policies
● Private clusters and private control plane endpoints
● Adding authorized networks for cluster control plane endpoints
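
As a rough sketch of several of these settings together, the command below creates a VPC-native private cluster with a private control-plane endpoint and an authorized internal network; all names and ranges are hypothetical:

gcloud container clusters create private-cluster \
    --zone=us-central1-a \
    --enable-ip-alias \
    --enable-private-nodes \
    --enable-private-endpoint \
    --master-ipv4-cidr=172.16.0.0/28 \
    --enable-master-authorized-networks \
    --master-authorized-networks=10.0.0.0/8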

2.4 Configuring and managing firewall rules. Considerations include:
● Target network tags and service accounts
● Rule priority
● Network protocols
● Ingress and egress rules
● Firewall rule logging
● Firewall Insights
● Hierarchical firewalls

2.5 Implementing VPC Service Controls. Considerations include:
● Creating and configuring access levels and service perimeters
● VPC accessible services
● Perimeter bridges
● Audit logging
● Dry run mode

Section 3: Configuring network services

3.1 Configuring load balancing. Considerations include:
● Backend services and network endpoint groups (NEGs)
● Firewall rules to allow traffic and health checks to backend services
● Health checks for backend services and target instance groups
● Configuring backends and backend services with balancing method (e.g., RPS, CPU, Custom), session affinity, and capacity scaling/scaler
● TCP and SSL proxy load balancers
● Load balancers (e.g., External TCP/UDP Network Load Balancing, Internal TCP/UDP Load Balancing, External HTTP(S) Load Balancing, Internal HTTP(S) Load Balancing)
● Protocol forwarding
● Accommodating workload increases using autoscaling vs. manual scaling

3.2 Configuring Google Cloud Armor policies. Considerations include:
● Security policies
● Web application firewall (WAF) rules (e.g., SQL injection, cross-site scripting, remote file inclusion)
● Attaching security policies to load balancer backends

3.3 Configuring Cloud CDN. Considerations include:
● Enabling and disabling Cloud CDN
● Cache keys
● Invalidating cached objects
● Signed URLs
● Custom origins

3.4 Configuring and maintaining Cloud DNS. Considerations include:
● Managing zones and records
● Migrating to Cloud DNS
● DNS Security Extensions (DNSSEC)
● Forwarding and DNS server policies
● Integrating on-premises DNS with Google Cloud
● Split-horizon DNS
● DNS peering
● Private DNS logging

3.5 Configuring Cloud NAT. Considerations include (see the sketch after this list):
● Addressing
● Port allocations
● Customizing timeouts
● Logging and monitoring
● Restrictions per organization policy constraints
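
A minimal sketch of a Cloud NAT setup with logging enabled, using placeholder router and network names (Cloud NAT always rides on a Cloud Router in the same region):

gcloud compute routers create nat-router --network=my-vpc --region=us-central1
gcloud compute routers nats create my-nat --router=nat-router --region=us-central1 \
    --auto-allocate-nat-external-ips --nat-all-subnet-ip-ranges --enable-logging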

3.6 Configuring network packet inspection. Considerations include:
● Packet Mirroring in single and multi-VPC topologies
● Capturing relevant traffic using Packet Mirroring source and traffic filters
● Routing and inspecting inter-VPC traffic using multi-NIC VMs (e.g., next-generation firewall appliances)
● Configuring an internal load balancer as a next hop for highly available multi-NIC VM routing

Section 4: Implementing hybrid interconnectivity

4.1 Configuring Cloud Interconnect. Considerations include:
● Dedicated Interconnect connections and VLAN attachments
● Partner Interconnect connections and VLAN attachments

4.2 Configuring a site-to-site IPsec VPN. Considerations include:
● High availability VPN (dynamic routing)
● Classic VPN (e.g., route-based routing, policy-based routing)

4.3 Configuring Cloud Router. Considerations include:
● Border Gateway Protocol (BGP) attributes (e.g., ASN, route priority/MED, link-local addresses)
● Custom route advertisements via BGP
● Deploying reliable and redundant Cloud Routers

Section 5: Managing, monitoring, and optimizing network operations

5.1 Logging and monitoring with Google Cloud’s operations suite. Considerations include:
● Reviewing logs for networking components (e.g., VPN, Cloud Router, VPC Service Controls)
● Monitoring networking components (e.g., VPN, Cloud Interconnect connections and interconnect attachments, Cloud Router, load balancers, Google Cloud Armor, Cloud NAT)

5.2 Managing and maintaining security. Considerations include:
● Firewalls (e.g., cloud-based, private)
● Diagnosing and resolving IAM issues (e.g., Shared VPC, security/network admin)

5.3 Maintaining and troubleshooting connectivity issues. Considerations include (see the sketch after this list):
● Draining and redirecting traffic flows with HTTP(S) Load Balancing
● Monitoring ingress and egress traffic using VPC Flow Logs
● Monitoring firewall logs and Firewall Insights
● Managing and troubleshooting VPNs
● Troubleshooting Cloud Router BGP peering issues
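
For instance, VPC Flow Logs and firewall rule logging are both per-resource switches; a minimal sketch with placeholder names:

# Enable VPC Flow Logs on a subnet to monitor ingress and egress traffic:
gcloud compute networks subnets update my-subnet --region=us-central1 --enable-flow-logs
# Enable logging on an existing firewall rule so it shows up in Firewall Insights:
gcloud compute firewall-rules update allow-ssh --enable-logging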

5.4 Monitoring, maintaining, and troubleshooting latency and traffic flow. Considerations include:
● Testing network throughput and latency
● Diagnosing routing issues
● Using Network Intelligence Center to visualize topology, test connectivity, and monitor performance


Sample Questions:


QUESTION 1
You need to restrict access to your Google Cloud load-balanced application so that only specific IP addresses can connect.
What should you do?

A. Create a secure perimeter using the Access Context Manager feature of VPC Service Controls and restrict access to the source IP range of the allowed clients and Google health check IP ranges.
B. Create a secure perimeter using VPC Service Controls, and mark the load balancer as a service restricted to the source IP range of the allowed clients and Google health check IP ranges.
C. Tag the backend instances “application,” and create a firewall rule with target tag “application” and the source IP range of the allowed clients and Google health check IP ranges.
D. Label the backend instances “application,” and create a firewall rule with the target label “application” and the source IP range of the allowed clients and Google health check IP ranges.

Answer: C

Explanation:
https://cloud.google.com/load-balancing/docs/https/setting-up-https#sendtraffic
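
A minimal sketch of answer C, combining a hypothetical client range (203.0.113.0/24) with the documented Google health check ranges 130.211.0.0/22 and 35.191.0.0/16:

gcloud compute firewall-rules create allow-app-clients \
    --network=my-vpc --direction=INGRESS --action=ALLOW --rules=tcp:443 \
    --target-tags=application \
    --source-ranges=203.0.113.0/24,130.211.0.0/22,35.191.0.0/16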


QUESTION 2
Your end users are located in close proximity to us-east1 and europe-west1. Their workloads need to communicate with each other. You want to minimize cost and increase network efficiency.
How should you design this topology?

A. Create 2 VPCs, each with their own regions and individual subnets. Create 2 VPN gateways to establish connectivity between these regions.
B. Create 2 VPCs, each with their own region and individual subnets. Use external IP addresses on the instances to establish connectivity between these regions.
C. Create 1 VPC with 2 regional subnets. Create a global load balancer to establish connectivity between the regions.
D. Create 1 VPC with 2 regional subnets. Deploy workloads in these subnets and have them communicate using private RFC1918 IP addresses.

Answer: D

Explanation:
https://cloud.google.com/vpc/docs/using-vpc#create-auto-network
A VPC network created in auto mode automatically gets one subnet in each Google Cloud region, so workloads in us-east1 and europe-west1 sit in the same network and can communicate over their internal IP addresses even though they are in different regions, riding Google’s global fiber backbone.
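
A minimal sketch of answer D; a single auto mode network yields one subnet per region, and the network name is a placeholder:

gcloud compute networks create demo-net --subnet-mode=auto
# Instances launched in the us-east1 and europe-west1 subnets of demo-net can now
# reach each other on their private RFC 1918 addresses with no gateways or peering.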


QUESTION 3
Your organization is deploying a single project for 3 separate departments. Two of these departments require network connectivity between each other, but the third department should remain in isolation. Your design should create separate network administrative domains between these departments. You want to minimize operational overhead.
How should you design the topology?

A. Create a Shared VPC Host Project and the respective Service Projects for each of the 3 separate departments.
B. Create 3 separate VPCs, and use Cloud VPN to establish connectivity between the two appropriate VPCs.
C. Create 3 separate VPCs, and use VPC peering to establish connectivity between the two appropriate VPCs.
D. Create a single project, and deploy specific firewall rules. Use network tags to isolate access between the departments.

Answer: C

Explanation:
https://cloud.google.com/vpc/docs/vpc-peering
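
A minimal sketch of answer C, peering two hypothetical department VPCs (dept-a and dept-b) while leaving dept-c untouched; peering must be created from both sides before it becomes active:

gcloud compute networks peerings create a-to-b --network=dept-a --peer-network=dept-b
gcloud compute networks peerings create b-to-a --network=dept-b --peer-network=dept-a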


QUESTION 4
You are migrating to Cloud DNS and want to import your BIND zone file.
Which command should you use?

A. gcloud dns record-sets import ZONE_FILE --zone MANAGED_ZONE
B. gcloud dns record-sets import ZONE_FILE --replace-origin-ns --zone MANAGED_ZONE
C. gcloud dns record-sets import ZONE_FILE --zone-file-format --zone MANAGED_ZONE
D. gcloud dns record-sets import ZONE_FILE --delete-all-existing --zone MANAGED_ZONE

Answer: C

Explanation:
https://cloud.google.com/sdk/gcloud/reference/dns/record-sets/import


QUESTION 5
You created a VPC network named Retail in auto mode. You want to create a VPC network named
Distribution and peer it with the Retail VPC.
How should you configure the Distribution VPC?

A. Create the Distribution VPC in auto mode. Peer both the VPCs via network peering.
B. Create the Distribution VPC in custom mode. Use the CIDR range 10.0.0.0. Create the necessary subnets, and then peer them via network peering.
C. Create the Distribution VPC in custom mode. Use the CIDR range 10.128.0.0. Create the necessary subnets, and then peer them via network peering.
D. Rename the default VPC as “Distribution” and peer it via network peering.

Answer: B
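
A rough sketch of answer B: the auto mode Retail VPC already occupies 10.128.0.0/9, so the Distribution VPC uses custom mode with subnets carved from 10.0.0.0/9 to avoid overlap (all names and ranges are placeholders):

gcloud compute networks create distribution --subnet-mode=custom
gcloud compute networks subnets create dist-us-east1 \
    --network=distribution --region=us-east1 --range=10.0.0.0/16
gcloud compute networks peerings create dist-to-retail \
    --network=distribution --peer-network=retail
gcloud compute networks peerings create retail-to-dist \
    --network=retail --peer-network=distribution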

Google Cloud Certified Professional Data Engineer Exam

Professional Data Engineers enable data-driven decision making by collecting, transforming, and publishing data. A Data Engineer should be able to design, build, operationalize, secure, and monitor data processing systems with a particular emphasis on security and compliance; scalability and efficiency; reliability and fidelity; and flexibility and portability. A Data Engineer should also be able to leverage, deploy, and continuously train pre-existing machine learning models.

The Professional Data Engineer exam assesses your ability to:
Design data processing systems
Operationalize machine learning models
Ensure solution quality
Build and operationalize data processing systems

About this certification exam
Length: 2 hours
Registration fee: $200 (plus tax where applicable)
Languages: English, Japanese.
Exam format: Multiple choice and multiple select taken remotely or in person at a test center. Locate a test center near you.

Exam Delivery Method:
a. Take the online-proctored exam from a remote location
b. Take the onsite-proctored exam at a testing center

Prerequisites: None
Recommended experience: 3+ years of industry experience including 1+ years designing and managing solutions using Google Cloud.

Certification Renewal / Recertification: Candidates must recertify in order to maintain their certification status. Unless explicitly stated in the detailed exam descriptions, all Google Cloud certifications are valid for two years from the date of certification. Recertification is accomplished by retaking the exam during the recertification eligibility time period and achieving a passing score. You may attempt recertification starting 60 days prior to your certification expiration date.

Exam overview
1. Review the exam guide

The exam guide contains a complete list of topics that may be included on the exam, helping you determine if your skills align with the topics on the exam.
2. Training

Professional Data Engineer

Certification exam guide
A Professional Data Engineer enables data-driven decision-making by collecting, transforming, and publishing data. A data engineer should be able to design, build, operationalize, secure, and monitor data processing systems with a particular emphasis on security and compliance; scalability and efficiency; reliability and fidelity; and flexibility and portability. A data engineer should also be able to leverage, deploy, and continuously train pre-existing machine learning models.
Section 1. Designing data processing systems

1.1 Selecting the appropriate storage technologies. Considerations include:
Mapping storage systems to business requirements
Data modeling
Trade-offs involving latency, throughput, transactions
Distributed systems
Schema design

1.2 Designing data pipelines. Considerations include:
Data publishing and visualization (e.g., BigQuery)
Batch and streaming data (e.g., Dataflow, Dataproc, Apache Beam, Apache Spark and Hadoop ecosystem, Pub/Sub, Apache Kafka)
Online (interactive) vs. batch predictions
Job automation and orchestration (e.g., Cloud Composer)

1.3 Designing a data processing solution. Considerations include:
Choice of infrastructure
System availability and fault tolerance
Use of distributed systems
Capacity planning
Hybrid cloud and edge computing
Architecture options (e.g., message brokers, message queues, middleware, service-oriented architecture, serverless functions)
At-least-once, in-order, and exactly-once event processing

1.4 Migrating data warehousing and data processing. Considerations include:
Awareness of current state and how to migrate a design to a future state
Migrating from on-premises to cloud (Data Transfer Service, Transfer Appliance, Cloud Networking)
Validating a migration

Section 2. Building and operationalizing data processing systems

2.1 Building and operationalizing storage systems. Considerations include:
Effective use of managed services (Cloud Bigtable, Cloud Spanner, Cloud SQL, BigQuery, Cloud Storage, Datastore, Memorystore)
Storage costs and performance
Life cycle management of data

2.2 Building and operationalizing pipelines. Considerations include:
Data cleansing
Batch and streaming
Transformation
Data acquisition and import
Integrating with new data sources

2.3 Building and operationalizing processing infrastructure. Considerations include:
Provisioning resources
Monitoring pipelines
Adjusting pipelines
Testing and quality control

Section 3. Operationalizing machine learning models

3.1 Leveraging pre-built ML models as a service. Considerations include:
ML APIs (e.g., Vision API, Speech API)
Customizing ML APIs (e.g., AutoML Vision, AutoML Text)
Conversational experiences (e.g., Dialogflow)

3.2 Deploying an ML pipeline. Considerations include:
Ingesting appropriate data
Retraining of machine learning models (AI Platform Prediction and Training, BigQuery ML, Kubeflow, Spark ML)
Continuous evaluation

3.3 Choosing the appropriate training and serving infrastructure. Considerations include:
Distributed vs. single machine
Use of edge compute
Hardware accelerators (e.g., GPU, TPU)

3.4 Measuring, monitoring, and troubleshooting machine learning models. Considerations include:
Machine learning terminology (e.g., features, labels, models, regression, classification, recommendation, supervised and unsupervised learning, evaluation metrics)
Impact of dependencies of machine learning models
Common sources of error (e.g., assumptions about data)

Section 4. Ensuring solution quality

4.1 Designing for security and compliance. Considerations include:
Identity and access management (e.g., Cloud IAM)
Data security (encryption, key management)
Ensuring privacy (e.g., Data Loss Prevention API)
Legal compliance (e.g., Health Insurance Portability and Accountability Act (HIPAA), Children’s Online Privacy Protection Act (COPPA), FedRAMP, General Data Protection Regulation (GDPR))

4.2 Ensuring scalability and efficiency. Considerations include:
Building and running test suites
Pipeline monitoring (e.g., Cloud Monitoring)
Assessing, troubleshooting, and improving data representations and data processing infrastructure
Resizing and autoscaling resources

4.3 Ensuring reliability and fidelity. Considerations include:
Performing data preparation and quality control (e.g., Dataprep)
Verification and monitoring
Planning, executing, and stress testing data recovery (fault tolerance, rerunning failed jobs, performing retrospective re-analysis)
Choosing between ACID, idempotent, eventually consistent requirements

4.4 Ensuring flexibility and portability. Considerations include:
Mapping to current and future business requirements
Designing for data and application portability (e.g., multicloud, data residency requirements)
Data staging, cataloging, and discovery

QUESTION 1
Your company built a TensorFlow neural network model with a large number of neurons and layers.
The model fits well for the training data. However, when tested against new data, it performs poorly.
What method can you employ to address this?

A. Threading
B. Serialization
C. Dropout Methods
D. Dimensionality Reduction

Answer: C

QUESTION 2
You are building a model to make clothing recommendations. You know a user’s fashion preference is likely to
change over time, so you build a data pipeline to stream new data back to the model as it becomes available.
How should you use this data to train the model?

A. Continuously retrain the model on just the new data.
B. Continuously retrain the model on a combination of existing data and the new data.
C. Train on the existing data while using the new data as your test set.
D. Train on the new data while using the existing data as your test set.

Answer: B

QUESTION 3
You designed a database for patient records as a pilot project to cover a few hundred patients in three clinics.
Your design used a single database table to represent all patients and their visits, and you used self-joins to
generate reports. The server resource utilization was at 50%. Since then, the scope of the project has
expanded. The database must now store 100 times more patient records. You can no longer run the reports,
because they either take too long or they encounter errors with insufficient compute resources. How should
you adjust the database design?

A. Add capacity (memory and disk space) to the database server by the order of 200.
B. Shard the tables into smaller ones based on date ranges, and only generate reports with prespecified date ranges.
C. Normalize the master patient-record table into the patient table and the visits table, and create other necessary tables to avoid self-join.
D. Partition the table into smaller tables, with one for each clinic. Run queries against the smaller table pairs,
and use unions for consolidated reports.

Answer: C

QUESTION 4
You create an important report for your large team in Google Data Studio 360. The report uses Google
BigQuery as its data source. You notice that visualizations are not showing data that is less than 1 hour old.
What should you do?

A. Disable caching by editing the report settings.
B. Disable caching in BigQuery by editing table details.
C. Refresh your browser tab showing the visualizations.
D. Clear your browser history for the past hour then reload the tab showing the visualizations.

Answer: A
 


Google Certified Professional Cloud Architect (GCP) Exam

Length: 2 hours
Registration fee: $200 (plus tax where applicable)
Languages: English, Japanese
Exam format: Multiple choice and multiple select, taken remotely or in person at a test center. Locate a test center near you.

Professional Cloud Architects enable organizations to leverage Google Cloud technologies. With a thorough understanding of cloud architecture and Google Cloud, they design, develop, and manage robust, secure, scalable, highly available, and dynamic solutions to drive business objectives.

The Professional Cloud Architect certification exam assesses your ability to:
Design and plan a cloud solution architecture
Manage and provision the cloud solution infrastructure
Design for security and compliance
Analyze and optimize technical and business processes
Manage implementations of cloud architecture
Ensure solution and operations reliability

Exam delivery method:
a. Take the online-proctored exam from a remote location
b. Take the onsite-proctored exam at a testing center
Prerequisites: None

Recommended experience: 3+ years of industry experience including 1+ years designing and managing solutions using Google Cloud

Certification Renewal / Recertification: Candidates must recertify in order to maintain their certification status. Unless explicitly stated in the detailed exam descriptions, all Google Cloud certifications are valid for two years from the date of certification. Recertification is accomplished by retaking the exam during the recertification eligibility time period and achieving a passing score. You may attempt recertification starting 60 days prior to your certification expiration date.

Exam overview
1. Review the exam guide

The exam guide contains a complete list of topics that may be included on the exam, helping you determine if your skills align with the exam.
2. Start training

Professional Cloud Architect

Certification exam guide
A Google Cloud Certified Professional Cloud Architect enables organizations to leverage Google Cloud technologies. Through an understanding of cloud architecture and Google technology, this individual designs, develops, and manages robust, secure, scalable, highly available, and dynamic solutions to drive business objectives. The Cloud Architect should be proficient in all aspects of enterprise cloud strategy, solution design, and architectural best practices. The Cloud Architect should also be experienced in software development methodologies and approaches including multi-tiered distributed applications which span multicloud or hybrid environments.

Case studies
During the exam for the Cloud Architect Certification, some of the questions may refer you to a case study that describes a fictitious business and solution concept. These case studies are intended to provide additional context to help you choose your answer(s). Review the case studies that may be used in the exam.

EHR Healthcare
Helicopter Racing League
Mountkirk Games
TerramEarth

Section 1. Designing and planning a cloud solution architecture

1.1 Designing a solution infrastructure that meets business requirements. Considerations include:
Business use cases and product strategy
Cost optimization
Supporting the application design
Integration with external systems
Movement of data
Design decision trade-offs
Build, buy, modify, or deprecate
Success measurements (e.g., key performance indicators [KPI], return on investment [ROI], metrics)
Compliance and observability

1.2 Designing a solution infrastructure that meets technical requirements. Considerations include:
High availability and failover design
Elasticity of cloud resources with respect to quotas and limits
Scalability to meet growth requirements
Performance and latency

1.3 Designing network, storage, and compute resources. Considerations include:
Integration with on-premises/multicloud environments
Cloud-native networking (VPC, peering, firewalls, container networking)
Choosing data processing technologies
Choosing appropriate storage types (e.g., object, file, databases)
Choosing compute resources (e.g., preemptible, custom machine type, specialized workload)
Mapping compute needs to platform products

1.4 Creating a migration plan (i.e., documents and architectural diagrams). Considerations include:
Integrating solutions with existing systems
Migrating systems and data to support the solution
Software license mapping
Network planning
Testing and proofs of concept
Dependency management planning

1.5 Envisioning future solution improvements. Considerations include:
Cloud and technology improvements
Evolution of business needs
Evangelism and advocacy

Section 2. Managing and provisioning a solution infrastructure

2.1 Configuring network topologies. Considerations include:
Extending to on-premises environments (hybrid networking)
Extending to a multicloud environment that may include Google Cloud to Google Cloud communication
Security protection (e.g. intrusion protection, access control, firewalls)

2.2 Configuring individual storage systems. Considerations include:
Data storage allocation
Data processing/compute provisioning
Security and access management
Network configuration for data transfer and latency
Data retention and data life cycle management
Data growth planning

2.3 Configuring compute systems. Considerations include:
Compute resource provisioning
Compute volatility configuration (preemptible vs. standard)
Network configuration for compute resources (Google Compute Engine, Google Kubernetes Engine, serverless networking)
Infrastructure orchestration, resource configuration, and patch management
Container orchestration

Section 3. Designing for security and compliance

3.1 Designing for security. Considerations include:
Identity and access management (IAM)
Resource hierarchy (organizations, folders, projects)
Data security (key management, encryption, secret management)
Separation of duties (SoD)
Security controls (e.g., auditing, VPC Service Controls, context aware access, organization policy)
Managing customer-managed encryption keys with Cloud Key Management Service
Remote access

3.2 Designing for compliance. Considerations include:
Legislation (e.g., health record privacy, children’s privacy, data privacy, and ownership)
Commercial (e.g., sensitive data such as credit card information handling, personally identifiable information [PII])
Industry certifications (e.g., SOC 2)
Audits (including logs)

Section 4. Analyzing and optimizing technical and business processes

4.1 Analyzing and defining technical processes. Considerations include:
Software development life cycle (SDLC)
Continuous integration / continuous deployment
Troubleshooting / root cause analysis best practices
Testing and validation of software and infrastructure
Service catalog and provisioning
Business continuity and disaster recovery

4.2 Analyzing and defining business processes. Considerations include:
Stakeholder management (e.g. influencing and facilitation)
Change management
Team assessment / skills readiness
Decision-making processes
Customer success management
Cost optimization / resource optimization (capex / opex)

4.3 Developing procedures to ensure reliability of solutions in production (e.g., chaos engineering, penetration testing)

Section 5. Managing implementation

5.1 Advising development/operation team(s) to ensure successful deployment of the solution. Considerations include:
Application development
API best practices
Testing frameworks (load/unit/integration)
Data and system migration and management tooling

5.2 Interacting with Google Cloud programmatically. Considerations include:
Google Cloud Shell
Google Cloud SDK (gcloud, gsutil and bq)
Cloud Emulators (e.g. Cloud Bigtable, Datastore, Spanner, Pub/Sub, Firestore)

Section 6. Ensuring solution and operations reliability

6.1 Monitoring/logging/profiling/alerting solution

6.2 Deployment and release management

6.3 Assisting with the support of deployed solutions

6.4 Evaluating quality control measures



QUESTION 1
The JencoMart security team requires that all Google Cloud Platform infrastructure is deployed using a least
privilege model with separation of duties for administration between production and development resources.
What Google domain and project structure should you recommend?

A. Create two G Suite accounts to manage users: one for development/test/staging and one for production.
Each account should contain one project for every application
B. Create two G Suite accounts to manage users: one with a single project for all development applications
and one with a single project for all production applications
C. Create a single G Suite account to manage users with each stage of each application in its own project
D. Create a single G Suite account to manage users with one project for the development/test/staging
environment and one project for the production environment

Answer: D

QUESTION 2
A few days after JencoMart migrates the user credentials database to Google Cloud Platform and shuts down
the old server, the new database server stops responding to SSH connections. It is still serving database
requests to the application servers correctly.
What three steps should you take to diagnose the problem? (Choose three.)

A. Delete the virtual machine (VM) and disks and create a new one
B. Delete the instance, attach the disk to a new VM, and investigate
C. Take a snapshot of the disk and connect to a new machine to investigate
D. Check inbound firewall rules for the network the machine is connected to
E. Connect the machine to another network with very simple firewall rules and investigate
F. Print the Serial Console output for the instance for troubleshooting, activate the interactive console, and investigate

Answer: C,D,F
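
Options C and F correspond to real gcloud operations; a minimal sketch with placeholder instance, disk, and zone names:

# F: read the serial console output, then enable the interactive serial console:
gcloud compute instances get-serial-port-output db-vm --zone=us-central1-a
gcloud compute instances add-metadata db-vm --zone=us-central1-a \
    --metadata=serial-port-enable=TRUE
# C: snapshot the disk and attach a copy to a fresh VM for offline investigation:
gcloud compute disks snapshot db-disk --zone=us-central1-a \
    --snapshot-names=db-disk-debug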

QUESTION 3
JencoMart has decided to migrate user profile storage to Google Cloud Datastore and the application servers
to Google Compute Engine (GCE). During the migration, the existing infrastructure will need access to Datastore to upload the data.
What service account key-management strategy should you recommend?

A. Provision service account keys for the on-premises infrastructure and for the GCE virtual machines (VMs)
B. Authenticate the on-premises infrastructure with a user account and provision service account keys for the VMs
C. Provision service account keys for the on-premises infrastructure and use Google Cloud Platform (GCP) managed keys for the VMs
D. Deploy a custom authentication service on GCE/Google Kubernetes Engine (GKE) for the on-premises infrastructure and use GCP managed keys for the VMs

Answer: C
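
To ground answer C: only the on-premises side needs an exported key file, while GCE VMs simply run as an attached service account; a minimal sketch with a hypothetical service account:

# Create and download a key only for the on-premises uploader:
gcloud iam service-accounts keys create onprem-key.json \
    --iam-account=datastore-loader@my-project.iam.gserviceaccount.com
# GCE VMs use GCP-managed credentials via an attached service account instead:
gcloud compute instances create app-vm --zone=us-central1-a \
    --service-account=datastore-loader@my-project.iam.gserviceaccount.com \
    --scopes=cloud-platform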


Google Looker Business Analyst Exam

Looker Business Analyst
A Looker Business Analyst uses Looker daily to create and curate content, develop reports, and use visualizations to represent data. A Business Analyst should be able to use Looker to query data and create actionable metrics, validate data accuracy, and apply procedural concepts to identify error sources. A Business Analyst can build Looker dashboards to meet business requirements, deliver reports for data consumers, and use appropriate visualizations to meet analysis requirements. A Business Analyst can apply procedural concepts to curate content for intuitive navigation and control it for security.

The Looker Business Analyst exam assesses your knowledge of:
Scheduling and sharing Looks and dashboards
Table calculations and Looker expressions
Custom and advanced filters
The impacts of pivoting
Best practices around designing dashboards
The fundamentals of caching


About this certification exam
Length: 100 minutes
Registration fee: $250
Languages: English
Exam format: Multiple choice and multiple select taken remotely or in person at a test center. Locate a test center near you.

Exam delivery method:
1. Take the online-proctored exam from a remote location, review the online testing requirements.
2. Take the onsite-proctored exam at a testing center, locate a test center near you.

Prerequisites: None
Recommended experience: Business analysts with 5+ months of experience using Looker for report development, data visualization, and dashboard best practices.

Exam overview

Step 1: Understand what’s on the exam
The exam guide contains a complete list of topics that may be included on the exam. Review the exam guide to determine if your knowledge aligns with the topics on the exam.

Looker Business Analyst

Certification exam guide
A Looker Business Analyst uses Looker daily to create and curate content, develop reports, and use visualizations to represent data. A Business Analyst should be able to use Looker to query data and create actionable metrics, validate data accuracy, and apply procedural concepts to identify error sources. A Business Analyst can build Looker dashboards to meet business requirements, deliver reports for data consumers, and use appropriate visualizations to meet analysis requirements. A Business Analyst can apply procedural concepts to curate content for intuitive navigation, and control it for security.

Section 1: Analyze

1.1 Use Looker Explores to query data and create actionable metrics in a given scenario. For example:
Utilize requirements and create queries using fields (e.g., dimensions, measures, filters, pivots)
Determine additional metrics needed and construct custom metrics using table calculations
Determine how to utilize filters (e.g., standard filters, matches (advanced) filters, and custom filters)
Determine which fields to use with merged results for joining across different Explores and data sources

1.2 Use Looker to validate data accuracy in a given scenario. For example:
Investigate data results to determine accuracy (e.g., using SQL, drilling, A/B testing, comparisons)
Investigate discrepancies by viewing row-level data using Explores (e.g., review individual dimension values that make up the result of a measure)

1.3 Apply procedural concepts to identify error sources. For example:
Utilize Looker’s features to determine the cause of the error (e.g., read error message to get context)
Interpret error message to identify the source (e.g., caused by the database, query, LookML code, permissions, visualizations)

Section 2: Build
2.1 Build dashboards to meet business requirements. For example:

Construct dashboards to meet requirements (e.g., using dashboard filters, merged results)
Apply procedural concepts to design impactful dashboards (e.g., storytelling, tile organization, use of text tiles, amount of data per dashboard)

2.2 Deliver reports for data consumers. For example:
Determine appropriate report delivery methods (e.g., file format, destination, delivery cadence, recipients, scheduling, sending, downloading, test delivery)
Determine appropriate download configurations (e.g., no option for unlimited downloads, table calculations, pivots, lack of permissions, database limitations)

2.3 Use visualization types to meet analysis requirements in a given scenario. For example:
Select appropriate visualizations to illustrate data results (e.g., bar, line, scatter, column, pie)
Determine which visualization settings to use (e.g., conditional formatting, subtotals, double axis, value label format using spreadsheet functions, grouping)

Section 3: Curate

3.1 Apply procedural concepts to curate content for intuitive navigation. For example:
Determine appropriate setups for folders and boards (e.g., structures, subfolders, hierarchy)
Apply naming conventions to identify folders, boards, or other content for users (e.g., clear titles, description fields, naming folders, content, and conventions)

3.2 Apply procedural concepts to control content access for security. For example:
Utilize appropriate Explores based on audience to prevent data leak (e.g., restricting sensitive data to specific users)
Assign folders and boards permissions to organize content based on user groups

Examkingdom Google Looker Business Analyst Exam pdf, Certkingdom Google Looker Business Analyst PDF


Best Google Looker Business Analyst Certification, Google Looker Business Analyst Training at certkingdom.com

Google LookML-Developer Looker LookML Developer Exam

About this certification exam
Length: 100 minutes
Registration fee: $250
Language: English
Exam format: Multiple choice and multiple select taken remotely or in person at a test center. Locate a test center near you.

Exam Delivery Method:
a. Take the online-proctored exam from a remote location, review the online testing requirements.
b. Take the onsite-proctored exam at a testing center: locate a test center near you.

Prerequisites: None
Recommended experience: Business analysts with 5+ months of experience using Looker for report development, data visualization, and dashboard best practices.

Looker LookML Developer
A Looker LookML Developer works with datasets and LookML and is familiar with SQL and BI tools. LookML Developers are proficient in model management, including troubleshooting existing model errors, implementing data security requirements, creating LookML objects, and maintaining LookML project health. LookML Developers design new LookML dimensions and measures and build Explores for users to answer business questions. LookML developers are skilled at quality management, from implementing version control to assessing code quality to utilizing SQL runner for data validation.

The Looker LookML Developer exam assesses your ability to:
Maintain and debug LookML code
Build user-friendly Explores
Design robust models
Define caching policies
Understand various datasets and associated schemas
Use Looker tools such as Looker IDE, SQL Runner, & LookML Validator

Exam overview
Step 1: Understand what’s on the exam


The exam guide contains a complete list of topics that may be included on the exam. Review the exam guide to determine if your knowledge aligns with the topics on the exam.

Certification exam guide
A Looker LookML Developer works with datasets and LookML and is familiar with SQL and BI tools. LookML Developers are proficient in model management, including troubleshooting existing model errors, implementing data security requirements, creating LookML objects, and maintaining LookML project health. LookML Developers design new LookML dimensions and measures, and build Explores for users to answer business questions. LookML developers are skilled at quality management, from implementing version control to assessing code quality to utilizing SQL runner for data validation.

Section 1: Model management

1.1 Troubleshoot errors in existing data models. For example:

Determine error sources
Apply procedural concepts to resolve errors

1.2 Apply procedural concepts to implement data security requirements. For example:


Implement permissions for users
Decide which Looker features to use to implement data security (e.g., access filters, field-level access controls, row-level access controls)

1.3 Analyze data models and business requirements to create LookML objects. For example:

Determine which views and tables to use
Determine how to join views into Explores
Build project-based needs (e.g., data sources, replication, mock reports provided by clients)

1.4 Maintain the health of LookML projects in a given scenario. For example:

Ensure existing content is working (e.g., use Content Validator, audit, search for errors)
Resolve errors

Section 2: Customization

2.1 Design new LookML dimensions or measures with given requirements. For example:

Translate business requirements (specific metrics) into the appropriate LookML structures (e.g., dimensions, measures, and derived tables)
Modify existing project structure to account for new reporting needs
Construct SQL statements to use with new dimensions and measures

2.2 Build Explores for users to answer business questions. For example:

Analyze business requirements and determine LookML code implementation to meet requirements (e.g., models, views, join structures)
Determine which additional features to use to refine data (e.g., sql_always_where, always_filter, showing only certain fields via the hidden and fields parameters)

Section 3: Optimization

3.1 Apply procedural concepts to optimize queries and reports for performance. For example:


Determine which solution to use based on performance implications (e.g., Explores, merged results, derived tables)
Apply procedural concepts to evaluate the performance of queries and reports
Determine which methodology to use to investigate query and report performance (e.g., A/B testing, SQL principles)

3.2 Apply procedural concepts to implement persistent derived tables and caching policies based on requirements. For example:

Determine appropriate caching settings based on data warehouse’s update frequency (e.g., hourly, weekly, based on ETL completion)
Determine when to use persistent derived tables based on runtime and complexity of Explore queries, and on users’ needs
Determine appropriate solutions for improving data availability (e.g., caching query data, persisting tables, combination solutions)

Section 4: Quality

4.1 Implement version control based on given requirements. For example:


Determine appropriate setup for Git branches (e.g., shared branches, pull from remote production)
Reconcile merge conflicts with other developer branches (e.g., manage multiple users)
Validate the pull request process

4.2 Assess code quality. For example:

Resolve validation errors and warnings
Utilize features to increase usability (e.g., descriptions, labels, group labels)
Use appropriate coding for project files (e.g., one view per file)

4.3 Utilize SQL Runner for data validation in a given scenario. For example:

Determine why specific queries return results by looking at the generated SQL in SQL Runner
Resolve inconsistencies found in the system or analysis (e.g., different results than expected, non-unique primary keys)
Optimize SQL queries for cost or efficiency based on business requirements

QUESTION 1
Business users report that they are unable to build useful queries because the list of fields in the Explore is too long to find what they need.
Which three LookML options should a developer use to curate the business user’s experience? (Choose three.)

A. Add a description parameter to each field with context so that users can search key terms.
B. Create a separate project for each business unit containing only the fields that the unit needs.
C. Add a group_label parameter to relevant fields to organize them into logical categories.
D. Use the hidden parameter to remove irrelevant fields from the Explore.
E. Use a derived table to show only the relevant fields.

Answer: A,C,E

QUESTION 2
A user reports that a query run against the orders Explore takes a long time to run. The query includes only
fields from the users view. Data for both views is updated in real time. The developer runs the following query
in SQL Runner and quickly receives results:
SELECT * FROM users.
What should the developer do to improve the performance of the query in the Explore?

A. Create an Explore with users as the base table.
B. Create a persistent derived table from the user’s query.
C. Create an ephemeral derived table from the user’s query.
D. Add persist_for: "24 hours" to the orders Explore.

Answer: A

QUESTION 3
A developer has User Specific Time Zones enabled for a Looker instance, but wants to ensure that queries run
in Looker are as performant as they can be. The developer wants to add a datatype: date parameter to all
dimension_group definitions without time data in a table-based view, so that time conversions don’t occur for these fields.
How can the developer determine to which fields this parameter should be applied through SQL Runner?

A. Open the Explore query in SQL Runner and validate whether removing the conversion from date fields changes the results.
B. Open the Explore query in SQL Runner to determine which fields are converted.
C. Use the CAST function in SQL Runner to ensure that all underlying fields are dates and conversions are not applied.
D. Use the Describe feature in SQL Runner to determine which fields include time data.

Answer: C

Examkingdom Google LookML Developer pdf, Certkingdom Google LookML Developer PDF


Best Google LookML Developer Certification, Google LookML Developer Training at certkingdom.com

Cloud-Digital-Leader Google Cloud Digital Leader Exam

Cloud Digital Leader
A Cloud Digital Leader can distinguish and evaluate the various capabilities of Google Cloud core products and services and how they can be used to achieve desired business goals. A Cloud Digital Leader is well-versed in basic cloud concepts and can demonstrate a broad application of cloud computing knowledge in a variety of applications.

The Cloud Digital Leader exam is job-role independent. The exam assesses the knowledge and skills of individuals who want or are required to understand the purpose and application of Google Cloud products.

The Cloud Digital Leader exam assesses your knowledge in three areas:
General cloud knowledge (approximately 15-25% of the exam)
General Google Cloud knowledge (approximately 25-35% of the exam)
Google Cloud products and services (approximately 45-55% of the exam)

About this certification exam
Length: 90 minutes
Registration fee: $99
Languages: English, Japanese
Exam format: Multiple choice and multiple select

Exam Delivery Method:
a. Take the online-proctored exam from a remote location, review the online testing requirements.
b. Take the onsite-proctored exam at a testing center, locate a test center near you.

Prerequisites: None
Recommended experience: Experience collaborating with technical professionals

Exam overview

Step 1. Understand what’s on the exam
The exam contains multiple choice and multiple-select questions, including real-world technical scenarios to assess your ability to identify the appropriate cloud solutions and Google Cloud products.

The exam guide contains a list of topics that may be assessed on the exam. Review the exam guide to determine if your knowledge aligns with the topics on the exam.

Step 2. Expand your knowledge with training

Cloud Digital Leader
A Cloud Digital Leader can articulate the capabilities of Google Cloud core products and services and how they benefit organizations. The Cloud Digital Leader can also describe common business use cases and how cloud solutions support an enterprise. The Cloud Digital Leader exam is job-role agnostic and does not require hands-on experience with Google Cloud.
Section 1. General cloud knowledge

1.1 Define basic cloud technologies. Considerations include:
Differentiate between traditional infrastructure, public cloud, and private cloud
Define cloud infrastructure ownership
Shared Responsibility Model
Essential characteristics of cloud computing

1.2 Differentiate cloud service models. Considerations include:
Infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS)
Describe the trade-offs between level of management versus flexibility when comparing cloud services
Define the trade-offs between costs versus responsibility
Appropriate implementation and alignment with given budget and resources

1.3 Identify common cloud procurement financial concepts. Considerations include:
Operating expenses (OpEx), capital expenditures (CapEx), and total cost of operations (TCO)
Recognize the relationship between OpEx and CapEx related to networking and compute infrastructure
Summarize the key cost differentiators between cloud and on-premises environments
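To make these terms concrete, here is a small worked comparison using invented numbers (for illustration only): buying a $108,000 server fleet up front is CapEx; depreciated straight-line over three years, it costs about $3,000 per month whether or not the capacity is used. Renting equivalent capacity on demand at, say, $2,200 per month is OpEx, and a three-year TCO comparison would then weigh the $108,000 purchase plus power, space, and staffing against roughly $79,200 ($2,200 × 36) plus any paid support.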

Section 2. General Google Cloud knowledge
2.1 Recognize how Google Cloud meets common compliance requirements. Considerations include:
    Locating current Google Cloud compliance requirements
    Familiarity with Compliance Reports Manager

2.2 Recognize the main elements of Google Cloud resource hierarchy. Considerations include:
    Describe the relationship between organization, folders, projects, and resources

2.3 Describe controlling and optimizing Google Cloud costs. Considerations include:
    Google Cloud billing models and applicability to different service classes
    Define a consumption-based use model
    Application of discounts (e.g., flat-rate, committed-use discounts [CUD], sustained-use discounts [SUD])

2.4 Describe Google Cloud’s geographical segmentation strategy. Considerations include:
    Regions
    Zones
    Regional resources
    Zonal resources
    Multiregional resources

2.5 Define Google Cloud support options. Considerations include:
    Distinguish between billing support, technical support, role-based support, and enterprise support
    Recognize a variety of Service Level Agreement (SLA) applications

Section 3. Google Cloud products and services

3.1 Describe the benefits of Google Cloud virtual machine (VM)-based compute options. Considerations include:
    Compute Engine, Google Cloud VMware Engine, and Bare Metal
    Custom versus standard sizing
    Free, premium, and custom service options
    Attached storage/disk options
    Preemptible VMs

3.2 Identify and evaluate container-based compute options. Considerations include:
    Define the function of a container registry
    Distinguish between VMs, containers, and Google Kubernetes Engine

3.3 Identify and evaluate serverless compute options. Considerations include:
    Define the function and use of App Engine, Cloud Functions, and Cloud Run
    Define rationale for versioning with serverless compute options
    Cost and performance tradeoffs of scale to zero

3.4 Identify and evaluate multiple data management offerings. Considerations include:
    Describe the differences and benefits of Google Cloud’s relational and non-relational database offerings (e.g., Cloud SQL, Cloud Spanner, Cloud Bigtable, BigQuery)
    Describe Google Cloud’s database offerings and how they compare to commercial offerings

3.5 Distinguish between ML/AI offerings. Considerations include:
    Describe the differences and benefits of Google Cloud’s ML/AI offerings and hardware accelerators (e.g., Vision API, AI Platform, TPUs)
    Identify when to train your own model, use a Google Cloud pre-trained model, or build on an existing model

3.6 Differentiate between data movement and data pipelines. Considerations include:
    Describe Google Cloud’s data pipeline offerings (e.g., Pub/Sub, Dataflow, Cloud Data Fusion, BigQuery, Looker)
    Define data ingestion options

3.7 Apply use cases to a high-level Google Cloud architecture. Considerations include:
    Define Google Cloud’s offerings around the Software Development Life Cycle (SDLC)
    Describe Google Cloud’s platform visibility and alerting offerings

3.8 Describe solutions for migrating workloads to Google Cloud. Considerations include:
    Identify data migration options
    Differentiate when to use Migrate for Compute Engine versus Migrate for Anthos
    Distinguish between lift and shift versus application modernization

3.9 Describe networking to on-premises locations. Considerations include:
    Define Software-Defined WAN (SD-WAN)
    Determine the best connectivity option based on networking and security requirements
    Private Google Access

3.10 Define identity and access features. Considerations include:
    Cloud Identity, Google Cloud Directory Sync, and Identity Access Management (IAM)

QUESTION 1
You are migrating workloads to the cloud. The goal of the migration is to serve customers worldwide as quickly
as possible. According to local regulations, certain data is required to be stored in a specific geographic area,
and it can be served worldwide. You need to design the architecture and deployment for your workloads.
What should you do?

A. Select a public cloud provider that is only active in the required geographic area
B. Select a private cloud provider that globally replicates data storage for fast data access
C. Select a public cloud provider that guarantees data location in the required geographic area
D. Select a private cloud provider that is only active in the required geographic area

Answer: D

QUESTION 2
Your organization needs a large amount of extra computing power within the next two weeks.
After those two weeks, the need for the additional resources will end.
Which is the most cost-effective approach?

A. Use a committed use discount to reserve a very powerful virtual machine
B. Purchase one very powerful physical computer
C. Start a very powerful virtual machine without using a committed use discount
D. Purchase multiple physical computers and scale workload across them

Answer: C

QUESTION 3
Your organization needs to plan its cloud infrastructure expenditures.
Which should your organization do?

A. Review cloud resource costs frequently, because costs change often based on use
B. Review cloud resource costs annually as part of planning your organization’s overall budget
C. If your organization uses only cloud resources, infrastructure costs are no longer part of your overall budget
D. Involve fewer people in cloud resource planning than your organization did for on-premises resource planning

Answer: B

Examkingdom Google Cloud-Digital-Leader pdf, Certkingdom Google Cloud-Digital-Leader PDF


Best Google Cloud-Digital-Leader Certification, Google Cloud-Digital-Leader Training at certkingdom.com

Google Associate Android Developer Google Developers Certification – Associate Android Developer (Kotlin and Java Exam)

Earning this certification will be a definitive step on your path to a career as an Android developer.

The Associate Android Developer Exam demonstrates the type of skill that an entry-level Android developer should have as they begin their career. By passing this performance-based exam and earning the Associate Android Developer Certification, you prove that you’re competent and skilled in tasks that a developer typically performs. The exam can now be taken in Kotlin as well as Java.

Android proficiency

The exam is designed to test the skills of an entry-level Android developer. Therefore, to take this exam, you should have this level of proficiency, either through education, self-study, your current job, or a job you have had in the past. Assess your proficiency by reviewing “Exam Content.” If you’d like to take the exam, but feel you need to prepare a bit more, level up your Android knowledge with some great Android training resources.

Language
The exam consists of a coding project and an exit interview. Both of these exam components are available only in English at this time.

Android Studio
You must use the latest version of Android Studio to complete the Associate Android Developer Certification Exam.
Age requirement

If you are under 13, you are not eligible to take the exam or to become certified. If you are between 13-17 years of age, you may test and receive certification with parental consent. If you are participating in a location that requires by law a lower minimum age for entry into such programs, then the minimum age limit for that person will be the stated minimum required age. Individuals 18 years or older are eligible for certification without any age-related restrictions.

ID verification

You must be able to present government-issued photo identification from a non-embargoed country. (See “U.S.-embargoed countries,” below.) For Canada, United States, France, United Kingdom, Ireland, Netherlands, and Switzerland, you may use a driver’s license. For all other countries you must provide a current passport.

You will scan and upload a photo of your ID using your webcam, or you will upload a copy of your ID that you’ve previously scanned in JPG, PNG, or BMP format.

App functionality

Construct full-featured apps, primarily for mobile devices, that use Android’s messaging, multitasking, connectivity, and media services.

User interface
Quickly create apps with clean, effective user interfaces that take advantage of Android’s rich UI frameworks.

Data management
Leverage Android’s effective frameworks and techniques to perform or schedule data retrieval/storage efficiently in a mobile environment.

Debugging
Understand the debugging tools in Android Studio and create more reliable and robust apps.

Testing
Be able to test the execution of a running program with the intent of finding errors and abnormal or unexpected behavior.
 



QUESTION 1
What is a correct part of an Implicit Intent for sharing data implementation?


A. val sendIntent = Intent(this, UploadService::class.java).apply {
putExtra(Intent.EXTRA_TEXT, textMessage)


B. val sendIntent = Intent().apply {
type = Intent.ACTION_SEND;


C. val sendIntent = Intent(this, UploadService::class.java).apply {
data = Uri.parse(fileUrl)


D. val sendIntent = Intent().apply {
action = Intent.ACTION_SEND


Correct Answer: D

Explanation:
Create the text message with a string, set the implicit ACTION_SEND action, and declare the MIME type:

val sendIntent = Intent().apply {
    action = Intent.ACTION_SEND // implicit action; no explicit component is named
    putExtra(Intent.EXTRA_TEXT, textMessage) // the text payload to share
    type = "text/plain" // declares the MIME type so share targets can match
}
// Typically launched with a chooser: startActivity(Intent.createChooser(sendIntent, null))


QUESTION 2
By default, the notification’s text content is truncated to fit one line. If you want your notification to be longer,
for example, to create a larger text area, you can do it in this way:
 

A. var builder = NotificationCompat.Builder(this, CHANNEL_ID)
.setContentText("Much longer text that cannot fit one line…")
.setStyle(NotificationCompat.BigTextStyle()
.bigText("Much longer text that cannot fit one line…"))

B. var builder = NotificationCompat.Builder(this, CHANNEL_ID)
.setContentText("Much longer text that cannot fit one line…")
.setLongText("Much longer text that cannot fit one line…")

C. var builder = NotificationCompat.Builder(this, CHANNEL_ID)
.setContentText("Much longer text that cannot fit one line…")
.setTheme(android.R.style.Theme_LongText);


Correct Answer: A

The same question in its Java form:
What is a correct part of an Implicit Intent for sharing data implementation?

A. Intent sendIntent = new Intent(this, UploadService.class)
sendIntent.putExtra(Intent.EXTRA_TEXT, textMessage);

B. Intent sendIntent = new Intent();
sendIntent.setType(Intent.ACTION_SEND);

C. Intent sendIntent = new Intent(this, UploadService.class)
sendIntent.setData(Uri.parse(fileUrl));

D. Intent sendIntent = new Intent();
sendIntent.setAction(Intent.ACTION_SEND);


Correct Answer: D

QUESTION 3
“workManager” is an instance of WorkManager. Select the correct demonstration of WorkRequest cancellation:
 

A. workManager.enqueue(new OneTimeWorkRequest.Builder(FooWorker.class).build());

B. WorkRequest request = new OneTimeWorkRequest.Builder(FooWorker.class).build();
workManager.enqueue(request);
LiveData<WorkInfo> status = workManager.getWorkInfoByIdLiveData(request.getId());
status.observe(…);

C. WorkRequest request = new OneTimeWorkRequest.Builder(FooWorker.class).build();
workManager.enqueue(request);
workManager.cancelWorkById(request.getId());

D. WorkRequest request1 = new OneTimeWorkRequest.Builder(FooWorker.class).build();
WorkRequest request2 = new OneTimeWorkRequest.Builder(BarWorker.class).build();
WorkRequest request3 = new OneTimeWorkRequest.Builder(BazWorker.class).build();
workManager.beginWith(request1, request2).then(request3).enqueue();

E. WorkRequest request = new OneTimeWorkRequest.Builder(FooWorker.class).build();
workManager.enqueue(request);
workManager.cancelWork(request);
 

Correct Answer: C
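
For reference, a minimal self-contained sketch of the pattern in answer C, with a trivial stand-in for the FooWorker class named in the question:

import android.content.Context;
import androidx.annotation.NonNull;
import androidx.work.OneTimeWorkRequest;
import androidx.work.WorkManager;
import androidx.work.WorkRequest;
import androidx.work.Worker;
import androidx.work.WorkerParameters;

public class CancelWorkExample {

    // Trivial stand-in for the FooWorker referenced in the question.
    public static class FooWorker extends Worker {
        public FooWorker(@NonNull Context context, @NonNull WorkerParameters params) {
            super(context, params);
        }

        @NonNull
        @Override
        public Result doWork() {
            return Result.success();
        }
    }

    public static void enqueueThenCancel(Context context) {
        WorkRequest request = new OneTimeWorkRequest.Builder(FooWorker.class).build();
        WorkManager workManager = WorkManager.getInstance(context);
        workManager.enqueue(request);
        // Cancel by the request's unique ID, as in answer C. Cancellation is
        // best-effort: work that is already running may still complete.
        workManager.cancelWorkById(request.getId());
    }
}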

Actualkey Google Associate Android Developer Exam pdf, Certkingdom Google Associate Android Developer PDF


Best Google Associate Android Developer Certification, Google Associate Android Developer Training at certkingdom.com

Google Cloud Certified Professional Data Engineer Exam

Professional Data Engineer
A Professional Data Engineer enables data-driven decision making by collecting, transforming, and publishing data. A Data Engineer should be able to design, build, operationalize, secure, and monitor data processing systems with a particular emphasis on security and compliance; scalability and efficiency; reliability and fidelity; and flexibility and portability. A Data Engineer should also be able to leverage, deploy, and continuously train pre-existing machine learning models.

The Professional Data Engineer exam assesses your ability to:
Design data processing systems
Build and operationalize data processing systems
Operationalize machine learning models
Ensure solution quality

About this certification exam
Length: 2 hours
Registration fee: $200 (plus tax where applicable)
Languages: English, Japanese.
Exam format: Multiple choice and multiple select, taken in person at a test center. Locate a test center near you.
Prerequisites: None
Recommended experience: 3+ years of industry experience including 1+ years designing and managing solutions using GCP.

Hands-on practice
This exam is designed to test technical skills related to the job role. Hands-on experience is the best preparation for the exam. If you feel you may need more experience or practice, use the hands-on labs available on Qwiklabs as well as the GCP free tier to level up your knowledge and skills.

GCP free tier
GCP always free products
GCP essentials quest
Data engineering quest

4. Practice exam
Check your readiness to take the exam.
Not feeling quite ready? Check out the additional resources listed below and get more hands-on practice with Qwiklabs.

5. Additional resources
In-depth discussions on the concepts and critical components of GCP:
Google Cloud documentation
Google Cloud solutions

6. Schedule your exam
Register and find a location near you.

1. Designing data processing systems
1.1 Selecting the appropriate storage technologies. Considerations include:
Mapping storage systems to business requirements
Data modeling
Tradeoffs involving latency, throughput, transactions
Distributed systems
Schema design

1.2 Designing data pipelines. Considerations include:
Data publishing and visualization (e.g., BigQuery)
Batch and streaming data (e.g., Cloud Dataflow, Cloud Dataproc, Apache Beam, Apache Spark and Hadoop ecosystem, Cloud Pub/Sub, Apache Kafka); a Pub/Sub publishing sketch follows this list
Online (interactive) vs. batch predictions
Job automation and orchestration (e.g., Cloud Composer)
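
To make one of these services concrete, here is a minimal sketch (not part of the exam guide) that publishes a single message with the Cloud Pub/Sub Java client library. The project ID, topic ID, and payload are hypothetical placeholders.

import com.google.cloud.pubsub.v1.Publisher;
import com.google.protobuf.ByteString;
import com.google.pubsub.v1.PubsubMessage;
import com.google.pubsub.v1.TopicName;

public class PublishOnce {
    public static void main(String[] args) throws Exception {
        // Hypothetical project and topic; replace with real resource names.
        TopicName topic = TopicName.of("my-project", "events");
        Publisher publisher = Publisher.newBuilder(topic).build();
        try {
            PubsubMessage message = PubsubMessage.newBuilder()
                    .setData(ByteString.copyFromUtf8("{\"event\":\"example\"}"))
                    .build();
            // publish() is asynchronous; get() blocks until the server
            // acknowledges the message and returns its ID.
            String messageId = publisher.publish(message).get();
            System.out.println("Published message " + messageId);
        } finally {
            publisher.shutdown();
        }
    }
}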

1.3 Designing a data processing solution. Considerations include:
Choice of infrastructure
System availability and fault tolerance
Use of distributed systems
Capacity planning
Hybrid cloud and edge computing
Architecture options (e.g., message brokers, message queues, middleware, service-oriented architecture, serverless functions)
At-least-once, in-order, and exactly-once event processing

1.4 Migrating data warehousing and data processing. Considerations include:
Awareness of current state and how to migrate a design to a future state
Migrating from on-premises to cloud (Data Transfer Service, Transfer Appliance, Cloud Networking)
Validating a migration

2. Building and operationalizing data processing systems

2.1 Building and operationalizing storage systems. Considerations include:
Effective use of managed services (Cloud Bigtable, Cloud Spanner, Cloud SQL, BigQuery, Cloud Storage, Cloud Datastore, Cloud Memorystore); a BigQuery client sketch follows this list
Storage costs and performance
Lifecycle management of data
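
As a concrete taste of one managed service above, here is a minimal sketch that runs a query through the BigQuery Java client library. It assumes Application Default Credentials are configured; the SELECT 1 query is only a placeholder.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FieldValueList;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

public class QueryExample {
    public static void main(String[] args) throws Exception {
        // Client authenticated via Application Default Credentials.
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
        QueryJobConfiguration config =
                QueryJobConfiguration.newBuilder("SELECT 1 AS answer").build(); // placeholder SQL
        TableResult result = bigquery.query(config); // runs the job and waits for results
        for (FieldValueList row : result.iterateAll()) {
            System.out.println(row.get("answer").getLongValue());
        }
    }
}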

2.2 Building and operationalizing pipelines. Considerations include:
Data cleansing
Batch and streaming
Transformation
Data acquisition and import
Integrating with new data sources

2.3 Building and operationalizing processing infrastructure. Considerations include:
Provisioning resources
Monitoring pipelines
Adjusting pipelines
Testing and quality control

3. Operationalizing machine learning models

3.1 Leveraging pre-built ML models as a service. Considerations include:
ML APIs (e.g., Vision API, Speech API); a Vision API sketch follows this list
Customizing ML APIs (e.g., AutoML Vision, AutoML Text)
Conversational experiences (e.g., Dialogflow)
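
For a feel of what calling a pre-built ML API looks like in practice, here is a minimal label-detection sketch using the Vision API Java client; the local image path is a hypothetical placeholder.

import com.google.cloud.vision.v1.AnnotateImageRequest;
import com.google.cloud.vision.v1.AnnotateImageResponse;
import com.google.cloud.vision.v1.BatchAnnotateImagesResponse;
import com.google.cloud.vision.v1.Feature;
import com.google.cloud.vision.v1.Image;
import com.google.cloud.vision.v1.ImageAnnotatorClient;
import com.google.protobuf.ByteString;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Collections;

public class VisionLabels {
    public static void main(String[] args) throws Exception {
        // Hypothetical local file; the API also accepts Cloud Storage URIs.
        ByteString bytes = ByteString.copyFrom(Files.readAllBytes(Paths.get("image.jpg")));
        Image image = Image.newBuilder().setContent(bytes).build();
        Feature feature = Feature.newBuilder().setType(Feature.Type.LABEL_DETECTION).build();
        AnnotateImageRequest request = AnnotateImageRequest.newBuilder()
                .addFeatures(feature)
                .setImage(image)
                .build();
        try (ImageAnnotatorClient client = ImageAnnotatorClient.create()) {
            BatchAnnotateImagesResponse response =
                    client.batchAnnotateImages(Collections.singletonList(request));
            for (AnnotateImageResponse res : response.getResponsesList()) {
                // Print each detected label with its confidence score.
                res.getLabelAnnotationsList().forEach(label ->
                        System.out.println(label.getDescription() + " " + label.getScore()));
            }
        }
    }
}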

3.2 Deploying an ML pipeline. Considerations include:
Ingesting appropriate data
Retraining of machine learning models (Cloud Machine Learning Engine, BigQuery ML, Kubeflow, Spark ML)
Continuous evaluation

3.3 Choosing the appropriate training and serving infrastructure. Considerations include:
Distributed vs. single machine
Use of edge compute
Hardware accelerators (e.g., GPU, TPU)

3.4 Measuring, monitoring, and troubleshooting machine learning models. Considerations include:
Machine learning terminology (e.g., features, labels, models, regression, classification, recommendation, supervised and unsupervised learning, evaluation metrics)
Impact of dependencies of machine learning models
Common sources of error (e.g., assumptions about data)

4. Ensuring solution quality

4.1 Designing for security and compliance. Considerations include:
Identity and access management (e.g., Cloud IAM)
Data security (encryption, key management)
Ensuring privacy (e.g., Data Loss Prevention API)
Legal compliance (e.g., Health Insurance Portability and Accountability Act (HIPAA), Children’s Online Privacy Protection Act (COPPA), FedRAMP, General Data Protection Regulation (GDPR))

4.2 Ensuring scalability and efficiency. Considerations include:
Building and running test suites
Pipeline monitoring (e.g., Stackdriver)
Assessing, troubleshooting, and improving data representations and data processing infrastructure
Resizing and autoscaling resources

4.3 Ensuring reliability and fidelity. Considerations include:
Performing data preparation and quality control (e.g., Cloud Dataprep)
Verification and monitoring
Planning, executing, and stress testing data recovery (fault tolerance, rerunning failed jobs, performing retrospective re-analysis)
Choosing between ACID, idempotent, eventually consistent requirements

4.4 Ensuring flexibility and portability. Considerations include:
Mapping to current and future business requirements
Designing for data and application portability (e.g., multi-cloud, data residency requirements)
Data staging, cataloging, and discovery

QUESTION 1
Your company built a TensorFlow neural-network model with a large number of neurons and layers.
The model fits well for the training data. However, when tested against new data, it performs poorly.
What method can you employ to address this?

A. Threading
B. Serialization
C. Dropout Methods
D. Dimensionality Reduction

Correct Answer: C

QUESTION 2
You are building a model to make clothing recommendations. You know a user’s fashion preference is likely to change over time, so you build a data pipeline to stream new data back to the model as it becomes available.
How should you use this data to train the model?

A. Continuously retrain the model on just the new data.
B. Continuously retrain the model on a combination of existing data and the new data.
C. Train on the existing data while using the new data as your test set.
D. Train on the new data while using the existing data as your test set.

Correct Answer: B

QUESTION 3
You designed a database for patient records as a pilot project to cover a few hundred patients in three clinics.
Your design used a single database table to represent all patients and their visits, and you used self-joins to
generate reports. The server resource utilization was at 50%. Since then, the scope of the project has
expanded. The database must now store 100 times more patient records. You can no longer run the reports,
because they either take too long or they encounter errors with insufficient compute resources.
How should you adjust the database design?

A. Add capacity (memory and disk space) to the database server by a factor of 200.
B. Shard the tables into smaller ones based on date ranges, and only generate reports with prespecified date ranges.
C. Normalize the master patient-record table into the patient table and the visits table, and create other necessary tables to avoid self-join.
D. Partition the table into smaller tables, with one for each clinic. Run queries against the smaller table pairs, and use unions for consolidated reports.

Correct Answer: C

QUESTION 4
You create an important report for your large team in Google Data Studio 360. The report uses Google
BigQuery as its data source. You notice that visualizations are not showing data that is less than 1 hour old.
What should you do?

A. Disable caching by editing the report settings.
B. Disable caching in BigQuery by editing table details.
C. Refresh your browser tab showing the visualizations.
D. Clear your browser history for the past hour then reload the tab showing the visualizations.

Correct Answer: A

QUESTION 5
An external customer provides you with a daily dump of data from their database. The data flows into Google
Cloud Storage (GCS) as comma-separated values (CSV) files. You want to analyze this data in Google
BigQuery, but the data could have rows that are formatted incorrectly or corrupted. How should you build this pipeline?

A. Use federated data sources, and check data in the SQL query.
B. Enable BigQuery monitoring in Google Stackdriver and create an alert.
C. Import the data into BigQuery using the gcloud CLI and set max_bad_records to 0.
D. Run a Google Cloud Dataflow batch pipeline to import the data into BigQuery, and push errors to another dead-letter table for analysis.

Correct Answer: D
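
A minimal sketch of the dead-letter pattern behind answer D, using the Apache Beam Java SDK’s multi-output ParDo. The bucket paths and the five-field validity rule are hypothetical, and a production pipeline would typically write both outputs to BigQuery with BigQueryIO rather than to text files.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollectionTuple;
import org.apache.beam.sdk.values.TupleTag;
import org.apache.beam.sdk.values.TupleTagList;

public class CsvDeadLetterPipeline {
    // Tags that separate clean rows from rows that fail validation.
    static final TupleTag<String> VALID = new TupleTag<String>() {};
    static final TupleTag<String> DEAD_LETTER = new TupleTag<String>() {};

    public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollectionTuple rows = p
                .apply("ReadCsv", TextIO.read().from("gs://example-bucket/daily/*.csv")) // hypothetical path
                .apply("Validate", ParDo.of(new DoFn<String, String>() {
                    @ProcessElement
                    public void processElement(ProcessContext c) {
                        // Hypothetical rule: a well-formed row has exactly five fields.
                        if (c.element().split(",", -1).length == 5) {
                            c.output(c.element()); // main output: clean rows
                        } else {
                            c.output(DEAD_LETTER, c.element()); // side output: bad rows
                        }
                    }
                }).withOutputTags(VALID, TupleTagList.of(DEAD_LETTER)));

        // Keep bad rows in a separate sink for later analysis instead of failing the load.
        rows.get(VALID).apply("WriteValid", TextIO.write().to("gs://example-bucket/clean/rows"));
        rows.get(DEAD_LETTER).apply("WriteDeadLetter", TextIO.write().to("gs://example-bucket/dead/rows"));

        p.run().waitUntilFinish();
    }
}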

 

Actualkey Google Cloud Certified Professional Data Engineer Exam PDF, Certkingdom Google Cloud Certified Professional Data Engineer Exam PDF


Best Google Cloud Certified Professional Data Engineer Exam Certification, Google Cloud Certified Professional Data Engineer Exam Training at certkingdom.com