A common barrier between development teams and getting code out the door is their underlying, server-centric infrastructure. Serverless computing was created to let developers build and run applications without managing the underlying servers.
Every major cloud player offers a serverless computing tool. In 2019, Google Cloud introduced Cloud Run. Unlike AWS Lambda, Azure Functions, and Google Cloud Functions — which are primarily used for serverless code deployment (FaaS) — Cloud Run is used for serverless container deployment.
But what is Cloud Run exactly, and how does it work? Let’s go over the basics.
What Is Cloud Run?
Powered by Knative, Cloud Run is Google Cloud’s answer to serverless container deployment and execution. It allows developers to run pre-built applications by taking a Docker (OCI) container image and running it as a stateless, autoscaling HTTP service.
Unlike the source-based AWS Lambda and Azure Functions, Cloud Run is focused on container-based development, allowing you to run applications serving multiple endpoints on a larger scale and with fewer architectural restrictions.
Cloud Run is a fully managed platform: Google handles provisioning, scaling, and server maintenance for you.
Why Use Cloud Run?
When it comes to server management, development and IT teams spend large chunks of their time preventing and addressing scaling and provisioning problems. Cloud Run eliminates much of this work and frees up developer time with a few key features. The main benefits include:
- Easier deployment: Developers can deploy with a single command without requiring any additional service-specific configuration.
- Advanced scalability: Cloud Run scales automatically based on demand, and you’re only charged for what you use. When there aren’t any requests, it uses zero resources.
- Configuration flexibility: Cloud Run offers a consistent, straightforward developer experience. Every service is packaged as a Docker (OCI) container image, so developers can use any coding language, binary, or framework they want.
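The single-command deployment mentioned above can be sketched with the gcloud CLI. The service name, project, image path, and region below are placeholder values, not prescriptions:

```shell
# Deploy a pre-built container image to Cloud Run (fully managed).
# "my-service" and the image path are hypothetical examples.
gcloud run deploy my-service \
  --image gcr.io/my-project/my-app:latest \
  --platform managed \
  --region us-central1 \
  --allow-unauthenticated
```

On success, gcloud prints the HTTPS URL where the new service is reachable.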
How Does Cloud Run Work?
Cloud Run automates most of the tasks related to managing server resources and overhead. Once your application is packaged in a container image, all you need to do is point Cloud Run at the image containing your pre-built app, and Cloud Run will deploy it.
Before deploying, Cloud Run does require:
- A specified allotment of memory and CPU resources.
- The logic inside the container to be stateless. Any data stored within the container will be deleted unless a persistent volume or other data transfer method is attached.
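The memory and CPU allotment can be set per service at deploy time. One way to sketch this with gcloud (the service name, image path, and values are illustrative):

```shell
# Allocate 512 MiB of memory and 1 vCPU to each container instance.
gcloud run deploy my-service \
  --image gcr.io/my-project/my-app:latest \
  --memory 512Mi \
  --cpu 1
```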
From there, Cloud Run will provision a new app within minutes. While your container is running, Cloud Run automatically adds or removes container instances based on demand. It also provides the user with the service's HTTP endpoint, container usage metrics, and control over the container's billing.
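The container contract described above boils down to a stateless HTTP server that listens on the port Cloud Run supplies via the `PORT` environment variable. A minimal sketch in Python using only the standard library (the greeting text and `make_server` helper are illustrative, not part of any Cloud Run API):

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    """Minimal stateless handler; a real service would route multiple endpoints."""

    def do_GET(self):
        body = b"Hello from Cloud Run!\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # Cloud Run captures stdout/stderr for logging; keep this sketch quiet

def make_server(port=None):
    # Cloud Run injects the listening port through the PORT environment variable.
    if port is None:
        port = int(os.environ.get("PORT", 8080))
    return HTTPServer(("0.0.0.0", port), Handler)

# In the container's entrypoint: make_server().serve_forever()
```

Because the handler keeps no state between requests, Cloud Run can freely add or remove instances of this container without losing data.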
What Is Cloud Run for Anthos?
Also powered by Knative, Cloud Run for Anthos is the same as Cloud Run, except it runs in all the different places supported by Anthos (e.g., on-prem, in a telco edge, colo, Azure, and AWS). Cloud Run for Anthos is an integration that supports serverless workloads and development on Google Kubernetes Engine (GKE). It provides custom machine types, VPC networking, and integration with existing Kubernetes-based solutions.
Why might you deploy via Cloud Run for Anthos over other environments? Two reasons. First, if you have existing Kubernetes clusters, Cloud Run for Anthos gives you Kubernetes-native features like namespacing and pod control. Second, running on GKE offers more customization. But again, it comes back to team and business needs.
Cloud Run is one of many Google Cloud tools that can help you scale your cloud environment quickly and more efficiently.
Interested in learning more? Schedule a consultation with our cloud experts to learn about which cloud tools are the best fit for your business and team.