How to Maximize the Scalability of Your Containerized Environment


Chris Tozzi
May 14, 2018

One main reason to use containers is that they help make apps, services and environments highly scalable. But that doesn’t happen magically. In order to take full advantage of the benefits of containers, you have to build your stack and configure your environment in ways that maximize scalability.

Below, I take a look at some strategies that can ensure that your containers and the software they host are as scalable as they can be.


Defining Scalability

First, though, let’s spend a moment discussing what scalability means, exactly, and what it looks like in practice.

Scalability can take multiple forms:

  • Being able to increase or decrease the capacity of an application without adding or subtracting deployments. For example, perhaps your web app has 10,000 users per day today and you want it to be able to handle 20,000 without creating a new instance of the app. You could scale in this way by assigning more resources to the app.
  • Increasing or decreasing the number of distinct deployments of your app. This is another way to change the capacity of your app, but in this case you change the number of instances rather than the resource allocation of one instance. (A short sketch below shows what this looks like in practice.)
  • Changing the size of the environment that hosts your app (e.g., by increasing or reducing the number of host servers), with or without affecting the capacity of the app itself. This type of scalability targets the hosting infrastructure rather than the app, and it can be important when, for example, you want to consolidate servers.
  • Adding or removing services from an app (if it’s a microservices-based app). An app that is composed of multiple services can scale in the sense of having services added to or subtracted from it. Today, the trend is toward adding more services as monoliths are broken into microservices, and microservices apps are divided still further to become even more modular.
  • Changing the size or number of deployments of the complementary tools and technologies (such as management and security tools) that help to run your app. Your app can’t scale effectively if the tools that you require to run it can’t also scale.

It’s worth emphasizing, too, that scalability is about more than just scaling up. When people talk about the importance of scalability, they often have in mind the ability to add capacity in order to meet increases in demand or guarantee availability. Equally important, however, is the ability to scale down when demand decreases. If you can’t scale down, you end up consuming more resources than you need, which is inefficient from a cost and management standpoint.
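
To make the first two forms of scaling concrete, here is a minimal sketch that uses the official Kubernetes Python client to change a Deployment’s replica count, scaling the same app up when demand spikes and back down when it subsides. The app name, namespace, and replica counts are hypothetical placeholders, and the sketch assumes you already run the app as a Kubernetes Deployment and have a working kubeconfig on hand.

    from kubernetes import client, config

    def scale_deployment(name, namespace, replicas):
        """Set the replica count of a Deployment, scaling the app up or down."""
        # Load credentials from the local kubeconfig; inside a cluster you would
        # call config.load_incluster_config() instead.
        config.load_kube_config()
        apps_v1 = client.AppsV1Api()
        # Patch only the scale subresource so nothing else about the Deployment changes.
        apps_v1.patch_namespaced_deployment_scale(
            name=name,
            namespace=namespace,
            body={"spec": {"replicas": replicas}},
        )

    if __name__ == "__main__":
        # Scale up to absorb a spike in demand (hypothetical app and namespace)...
        scale_deployment("web-app", "production", replicas=10)
        # ...and back down again when demand subsides.
        scale_deployment("web-app", "production", replicas=3)

Vertical scaling, the first form in the list, would instead patch the container’s CPU and memory requests; in practice, an autoscaler such as Kubernetes’ Horizontal Pod Autoscaler can make the replica-count decision for you automatically.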

 

Achieving Scalability When Using Containers

Containers provide a framework for building an environment that can scale in many of the ways described above. However, as I noted in the introduction, achieving that scalability requires more than simply migrating your application to run inside containers and then calling it done. You have to make sure that you architect your environment and technology stack in a way that allows you to take full advantage of the opportunities containers provide to be scalable.

The following tips and best practices can help you get the most scalability out of your containerized environment:

  • Make sure you have flexible hosting infrastructure. Being able to add or subtract compute, networking and storage resources from the infrastructure that hosts your containers is the most basic requirement for achieving scalability. This is generally easy to do if you run containers in the cloud. It’s more difficult when your infrastructure runs on-premises, but it’s feasible there with the right architecture. Virtual machines can help in this respect by providing an agile abstraction layer between your on-premises bare-metal servers and the container host environment.
  • Automate provisioning and deployment. When the provisioning and deployment of your container environment are fully automated, you can scale the number of deployments easily because you don’t have to perform manual configuration before adding a new deployment. (See the sketch following this list.)
  • Use a cloud-based container registry. This isn’t strictly necessary, but it can help you to scale because it provides a central location for retrieving container images. If you can pull images from anywhere, you can move deployments quickly to different infrastructure in order to meet scalability requirements.
  • Make your application as modular as possible. Generally speaking, the more microservices you have inside your app, the easier it will be to scale. That’s because you can scale on a per-service basis in order to optimize resource consumption and minimize disruption to your application when changing resource allocation. Making the application modular requires refactoring or re-architecting the codebase, which takes some doing, but it’s worth it if scalability is a key priority.
  • Use scalable management tools. If your containers run in a massively scalable cloud-based environment, but your orchestrator is hosted on a single bare-metal server, you shoot yourself in the foot because your orchestrator can’t scale as quickly as the rest of the environment. The same holds true for other types of auxiliary tools (registries, security scanners, monitoring applications and so on) that you would use to build a mature, production-ready container environment.
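
To illustrate the automation and registry tips above, here is a minimal sketch, again using the Kubernetes Python client, of the kind of step a CI job could run after pushing a freshly built image: it creates the Deployment if it does not exist and otherwise rolls out the new image, which every node then pulls from the central registry. The image reference, app name, and namespace are hypothetical placeholders.

    from kubernetes import client, config
    from kubernetes.client.rest import ApiException

    def deploy(name, namespace, image, replicas=3):
        """Create the Deployment if it is missing, otherwise roll out the new image."""
        config.load_kube_config()
        apps_v1 = client.AppsV1Api()
        deployment = client.V1Deployment(
            metadata=client.V1ObjectMeta(name=name, labels={"app": name}),
            spec=client.V1DeploymentSpec(
                replicas=replicas,
                selector=client.V1LabelSelector(match_labels={"app": name}),
                template=client.V1PodTemplateSpec(
                    metadata=client.V1ObjectMeta(labels={"app": name}),
                    spec=client.V1PodSpec(
                        containers=[client.V1Container(name=name, image=image)]
                    ),
                ),
            ),
        )
        try:
            # First deployment of this app: create it from scratch.
            apps_v1.create_namespaced_deployment(namespace=namespace, body=deployment)
        except ApiException as exc:
            if exc.status != 409:  # 409 means the Deployment already exists
                raise
            # Subsequent runs: patch in the new image and let Kubernetes roll it out.
            apps_v1.patch_namespaced_deployment(name=name, namespace=namespace, body=deployment)

    if __name__ == "__main__":
        # Hypothetical image reference in a cloud-hosted registry.
        deploy("web-app", "production", "registry.example.com/acme/web-app:1.4.2")

Because the image lives in a registry that is reachable from anywhere, the same call works whether the cluster runs in the cloud or on-premises, which is exactly the portability the registry tip is about.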


Efficient scalability is one of the key selling points of containers. Make sure you take full advantage of this benefit by designing your stack and environment around the scalability opportunities described above.

 

About the Author

Chris Tozzi has worked as a journalist and Linux systems administrator. He has particular interests in open source, agile infrastructure and networking. He is Senior Editor of content and a DevOps Analyst at Fixate IO.

 
