
Serverless computing

1. Serverless computing

Another option for modernizing Cloud applications is serverless computing. Serverless computing doesn't mean there's no server; it means that resources like compute power are automatically provisioned in the background as needed. The advantage is that organizations don't pay for compute power unless they're running a query or application. At its simplest, serverless means that a business provides the code for whatever function it wants, and the public Cloud provider does everything else.

Imagine you provide software that helps employees manage their corporate expenses. You want to add a feature that lets users upload an image of their expense receipt. In this case, the ability to upload an image is called a function. You, as the software development company, write the code for that function directly on your public Cloud platform, and the Cloud provider manages everything else. One type of serverless computing solution is called function as a service (FaaS). Some functions are a response to specific events, like file uploads to Cloud storage or changes to database records. You write the code that defines the response to those events, and the Cloud provider does everything else.

Google Cloud offers several serverless computing products. The first is Cloud Run, a fully managed environment for running containerized applications; with this product, you don't have to worry about the underlying infrastructure. Then there is Cloud Run functions, a platform for hosting simple, single-purpose functions that are attached to events emitted from your Cloud infrastructure and services, for example, sending a notification to a mobile device when a new order is placed on a website. There is also App Engine, a service to build and deploy web applications.

Serverless computing has many benefits. Reduced operational costs: the Cloud provider is responsible for the infrastructure and its maintenance.
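To make the event-driven idea concrete, here is a minimal local sketch of a FaaS-style handler reacting to a storage-upload event, like the receipt-upload function described above. The event shape and function name are simplified assumptions for illustration; a real Cloud Run function would receive a CloudEvent via Google's functions-framework, which is not used here.

```python
# Sketch of an event-driven (FaaS-style) handler. The event dict loosely
# resembles a Cloud Storage object-upload notification; field names here
# are illustrative assumptions, not the exact platform payload.

def handle_receipt_upload(event: dict) -> str:
    """React to a (hypothetical) storage-upload event for an expense receipt."""
    bucket = event["bucket"]
    name = event["name"]
    content_type = event.get("contentType", "application/octet-stream")

    if not content_type.startswith("image/"):
        return f"Skipped {name}: not an image"

    # In a real function, the receipt image would be processed here; the
    # platform provisions compute only for the moments this code runs.
    return f"Processed receipt {name} from bucket {bucket}"


# Example event for an uploaded receipt image.
event = {"bucket": "expense-receipts", "name": "receipt-042.png",
         "contentType": "image/png"}
print(handle_receipt_upload(event))
```

The key point the sketch illustrates: you write only this handler; provisioning, scaling, and wiring the event to the code are the provider's job.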
Therefore, the application owner does not need to invest in the infrastructure or the human resources required to manage it. Scalability: serverless computing automatically scales computing resources based on the application's demand; the Cloud provider manages the scaling process, and the application owner only pays for the resources they use. Faster time to market: the need for infrastructure setup and configuration is eliminated, which reduces the time required to deploy applications and lets the application owner focus on writing code and quickly shipping new features. Reduced development costs: the development process is simplified because developers can focus on the application's logic rather than the underlying infrastructure. Improved resilience: serverless computing offers improved resilience and availability, as the Cloud provider automatically manages the infrastructure's failover and disaster recovery capabilities. Pay-per-use pricing: the application owner only pays for the computing resources they use, which eliminates the cost of idle resources and helps optimize spending.

How might an organization benefit from Cloud computing infrastructure like this? Let's explore an example. Specializing in educational technology, Mashme.io provides video collaboration experiences for over 3 million users in 73 countries. Connecting 250 full-HD live video streams in real time is a major technical challenge: latencies need to be kept very low to achieve a face-to-face experience, and continuous integration and deployment is vital to avoid disruptive downtime for global clients. Meanwhile, costs have to be kept to a minimum to keep the solution affordable for a growing startup. To meet those needs, Mashme.io chose Google Kubernetes Engine. "Every teacher we speak to tells us that latency is the most important thing for educational video conferencing," says Mashme.io founder Victor Sanchez Belmar.
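The pay-per-use model described above can be sketched as a simple formula: cost grows with invocations, duration, and memory, and is zero when nothing runs. The prices below are invented round numbers for illustration only, not Google Cloud's actual rates.

```python
# Illustrative pay-per-use cost model (invented example prices, NOT real
# Google Cloud rates): you pay per invocation plus per GB-second of compute.

def serverless_cost(invocations, duration_s, memory_gb,
                    price_per_invocation=0.0000004,
                    price_per_gb_second=0.0000025):
    compute = invocations * duration_s * memory_gb * price_per_gb_second
    requests = invocations * price_per_invocation
    return compute + requests


# 1 million invocations per month, 200 ms each, 256 MB of memory.
monthly = serverless_cost(1_000_000, 0.2, 0.25)
print(monthly)

# Zero traffic means zero cost -- the defining property of pay-per-use.
print(serverless_cost(0, 0.2, 0.25))
```

The second call is the contrast with provisioned infrastructure: an idle VM still bills by the hour, while an idle function bills nothing.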
Low latency means having servers close to every student who connects to Mashme.io. With students connecting from around the world, Google Cloud has the global network to make that happen. The view was that setting up data centers around the world with your own hardware is a good way for a startup to never start. Instead, Mashme.io started on Google's global network with App Engine, before moving to Google Cloud with their own Docker containers, and finally to Google Kubernetes Engine. This let them update their nodes and services almost continuously and without disruption, so students didn't lose an hour, or even a second, of class.

2. Let's practice!
