Deploying and implementing Cymbal Superstore’s cloud recommended solutions
Way to go! You've planned and configured cloud solutions for Cymbal Superstore's application requirements. It's time to think about how you can deploy and implement the resources needed to realize the company's goals. You've worked hard to make sure the resource entities and policies are set up correctly for Cymbal Superstore's cloud architecture. You've also selected cloud products for the applications Cymbal Superstore has decided to migrate to the cloud.

Solution deployment is a critical part of your role. As an Associate Cloud Engineer, you're expected to have the knowledge to implement specific compute solutions, including Compute Engine, Google Kubernetes Engine, Cloud Run, and Cloud Run functions. Understanding the availability, concurrency, connectivity, and access options of these services is key to success as you deploy them to support your needs.

Solutions you implement in Google Cloud will also require data stores. Google Cloud's data solutions include products that utilize relational and NoSQL data structures. There are different products that support transactional and analytical use cases, and some solutions are optimized for low latency and global availability. Properly implementing software-defined networking will ensure your application frontends are accessible and your application backends are secured.

A common DevOps practice is to deploy your infrastructure in a declarative way and keep configuration files in source control. Deploying resources through infrastructure as code reduces human error and speeds up resource allocation. Knowing how to do this in the context of your role as an Associate Cloud Engineer is yet another tool you have at your disposal.

As a review, here are Cymbal Superstore's proposed solutions. Their ecommerce solution is based on container management provided by Google Kubernetes Engine, data provided by the globally available, horizontally scalable capabilities of Spanner, and an external Application Load Balancer for user access.
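The transcript doesn't show the commands behind this architecture, but as a rough sketch, the GKE piece might be provisioned from the CLI like this (the cluster name, zone, machine type, and node count are illustrative assumptions, not values from the course):

```shell
# Sketch: create a GKE cluster for the ecommerce workload.
# All names and sizes here are hypothetical examples.
gcloud container clusters create ecommerce-cluster \
  --zone=us-central1-a \
  --machine-type=e2-standard-4 \
  --num-nodes=3
```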
This use case also has a need for historical sales data to be analyzed by BigQuery, Google Cloud's modern data warehouse implementation.

The transportation management cloud solution monitors Pub/Sub for incoming sensor data, triggers a Cloud Run function as new messages are posted to a specific topic, and starts a Dataflow job to transform data and save it into Bigtable.

Finally, the supply chain application implements managed instance groups in Compute Engine. The backend store for this solution is Cloud SQL. Connectivity between the backend database and the Compute Engine instances is via TCP, internal to the VPC. For the supply chain app, external access will be achieved via a regional HTTPS load balancer.

Three ways you can interact with Google Cloud to work with and deploy services are via the Cloud console, the command line, and programmatically. Let's look at these in a little more detail.

You want to implement a compute instance for the Cymbal Superstore development team to start developing code on. One of the ways you can do this is via the Google Cloud console. The screenshot shows some of the settings you'll need to specify as you create this instance: the name of the instance, the region and zone where the instance resides, the machine configuration, the boot disk, and the network settings and other persistent disks you're going to attach to it.

Cymbal Superstore's supply chain app needs a Cloud SQL backend. Here's an example of how you would do this via the CLI. Notice the parameters required include the name, resources, and region specified. Remember, you can access the CLI by loading the Google Cloud SDK on your local machine. You can also use Cloud Shell, a cloud-based terminal with the gcloud CLI already installed on it.

The transportation management system is using Cloud Run functions. Cloud Run functions gives you the option of deploying your function code from the local directory where it resides.
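The Cloud SQL command referenced above isn't captured in this transcript. A minimal sketch of it might look like the following, where the instance name, database version, tier, and region are hypothetical values chosen for illustration:

```shell
# Sketch: create a Cloud SQL instance for the supply chain backend.
# The instance name, version, tier, and region are illustrative.
gcloud sql instances create supply-chain-db \
  --database-version=MYSQL_8_0 \
  --tier=db-n1-standard-2 \
  --region=us-central1
```

Note how the command specifies the name, the resources (via the tier), and the region, as described in the narration.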
Here's an example of the command to deploy a Cloud Run function with a Pub/Sub trigger from a directory on your local machine. trans_mg_function is going to be the name of the deployed function based on the logic in the directory. The runtime flag specifies the Python runtime you want to use to execute the function. The trigger-topic flag is the Pub/Sub topic you want to monitor. The data sent to your function includes the Pub/Sub event data and metadata.

Let's practice!
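The deploy command described above isn't reproduced in this transcript; a sketch of it might look like the following, where the runtime version and topic name are illustrative assumptions (only the function name, trans_mg_function, comes from the narration):

```shell
# Sketch: deploy the function from the current directory
# with a Pub/Sub trigger. Runtime version and topic name
# are hypothetical; the function name is from the course.
gcloud functions deploy trans_mg_function \
  --runtime=python311 \
  --trigger-topic=sensor-data \
  --source=.
```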