1. Managing containers

Containers improve agility, enhance security, optimize resources, and simplify managing applications in the cloud. Many organizations run a mix of virtual machines and containers, but as their IT infrastructure becomes more complex, they need a way to manage their services and machines. For example, an organization can have millions of containers, and keeping them secure and ensuring that they operate efficiently can require significant oversight and management.

Kubernetes, originally developed by Google, is an open-source platform for managing containerized workloads and services. It makes it easy to orchestrate many containers on many hosts, scale them, and deploy rollouts and rollbacks. This improves application reliability and reduces the time and resources spent on management and operations.

Google Kubernetes Engine, or GKE, is a Google-hosted, managed Kubernetes service in the cloud. A GKE environment consists of multiple machines, specifically Compute Engine instances grouped to form a cluster. GKE clusters can be customized, and they support different machine types, numbers of nodes, and network settings. GKE makes it easy to deploy applications by providing an API and a web-based console: applications can be deployed in minutes and scaled up or down as needed. GKE also provides many features that help monitor applications, manage resources, and troubleshoot problems.

Let's explore how Ubie, a Japan-based healthcare technology startup, reduced its infrastructure costs and maintenance requirements with Google Kubernetes Engine. Founded in 2017, Ubie's goal is to get people the right medical care when they need it, and it does this with products designed for hospitals and individuals. Ubie for Hospital, the company's flagship product, is AI-powered questionnaire software that lets patients provide medical details before an appointment.
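As a rough sketch of how a customizable GKE cluster can be created with the gcloud CLI, the commands below set a node count and machine type explicitly. The cluster name, zone, and sizing values are illustrative placeholders, not details from this lesson:

```shell
# Create a GKE cluster with an explicit node count and machine type.
# "demo-cluster", the zone, and the sizing flags are illustrative assumptions.
gcloud container clusters create demo-cluster \
    --zone=us-central1-a \
    --num-nodes=3 \
    --machine-type=e2-medium

# Fetch credentials so kubectl can talk to the new cluster.
gcloud container clusters get-credentials demo-cluster \
    --zone=us-central1-a
```

The same customization is also available through the web-based console mentioned above; the CLI form is simply easier to script and repeat.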
Ubie initially relied on an alternative cloud to make Ubie for Hospital available in Japan. As the business added new customers, it needed an infrastructure that could support daily deployments and provide a secure gateway to connect Ubie to a wide range of customer networks and settings. Ubie evaluated the available options and decided to use Kubernetes on Google Kubernetes Engine. GKE Autopilot, a mode in which Google fully manages the cluster's infrastructure and bills per pod, presented a compelling option for running Ubie for Hospital more efficiently and cost-effectively. With GKE Autopilot, Ubie could eliminate the need to configure and monitor clusters while paying only for running pods. The shift reduced Ubie's infrastructure costs by 20%, and GKE Autopilot has helped the business eliminate Ubie for Hospital infrastructure maintenance and upgrade tasks that could take hours or even days to complete.

Another popular option for running containerized applications on Google Cloud is Cloud Run, a fully managed serverless platform for deploying and running containerized applications without worrying about the underlying infrastructure. After your application code is containerized and deployed to Cloud Run, Google Cloud takes care of scaling and managing the infrastructure automatically. Cloud Run is ideal for stateless applications that need to scale up and down quickly in response to traffic, which makes it most suitable for simple, lightweight applications such as web applications.

In summary, GKE is ideal when you need a lot of control over a Kubernetes environment and have complex applications to run, while Cloud Run is ideal when you want a simple, fully managed serverless platform that can scale up and down quickly.
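The two managed options described above each come down to a single gcloud command. The sketch below uses placeholder names for the cluster, service, image, and region; they are assumptions for illustration, not values from the lesson:

```shell
# GKE Autopilot: Google manages the cluster's infrastructure, and billing
# is per running pod. "autopilot-cluster" and the region are placeholders.
gcloud container clusters create-auto autopilot-cluster \
    --region=us-central1

# Cloud Run: deploy a container image as a fully managed service that
# scales up and down automatically with traffic. The service name and
# image path are placeholders for your own containerized application.
gcloud run deploy my-web-app \
    --image=gcr.io/my-project/my-web-app:latest \
    --region=us-central1
```

Note the trade-off these commands reflect: with Autopilot you still get a full Kubernetes API to control, while with Cloud Run you hand over everything except the container image itself.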

2. Let's practice!
