DevOps means a lot of things to a lot of people. But one thing consistent among DevOps fans is that DevOps is a way of operating, not a definition of any particular setup.
This is annoying because it does not give the answers—just ideas. But it’s extremely useful because it allows you to establish the principles by which you architect your delivery chain today, and into the future. Tools like Docker are what bring the principles into reality.
Path to execution
In the early days of DevOps, the pitch was that developers and ITOps could stop being enemies. The path to execution was limited to just a few things:
- Structural changes: Get all teams working together by changing objectives and reporting structure.
- Maxing out existing release automation: leveraging basic release tools to do a lot more, starting with continuous integration (CI) and maybe continuous delivery and deployment (CD). Application Release Automation (ARA) and Application Lifecycle Management (ALM) were on the right path, but they maxed out when it came to production.
These approaches alone were not enough to achieve DevOps. There was clearly a gap in tooling that kept the dream of DevOps from coming to fruition. Rapidly, new tools and new features for old tools began to appear in the market. More advanced release automation was one of the biggest. This new generation of release automation was built bottom-up with DevOps in mind: it could support all the modern release practices, but also offered more flexible artifact management, application security, cloud services, and more.
Yet there was one area where a huge blind spot remained: virtual machines (VMs). Sure, VMs were more flexible than physical servers. But in most environments they were treated the same. Developers and IT were getting along better, but all that meant was that getting a VM took two days instead of two weeks. Still too long. VMs could, in theory, be treated as immutable containers, but they were huge and hard to move, with a lot of host dependencies required to run correctly.
It didn't matter at that time how fast you could ship code. The infrastructure was still a major bottleneck, and the DevOps adherents who bought into the dream of full-stack deployments and immutable infrastructure simply did not have the tools to get there.
Docker goes the final mile
One could be disappointed that containers via LXC were not new, and that developers and DevOps practitioners should have seen the light much sooner. But at the pace developers move, the additional work required to customize LXC into a full containerization solution is far too much for most teams. It is better suited to high-security environments that are set up once and then left alone.
The engineers at dotCloud invested the effort, and built a container solution that was exactly what a developer would want. What later became Docker was a more advanced take on containers, tailored to application development. It spoke developer-speak, and offered interfaces, like a simple-to-use CLI, that made adopting containers and integrating them into delivery chains much easier.
Today, Docker is the solution that takes the idea of full-stack deployments and immutable infrastructure to the next level, and in doing so has made microservices more than a novel idea. DevOps fanatics no longer just need to talk about it. They can show fully automated delivery chains, and demonstrate the death of waterfall and monolithic applications in practice.
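As a rough sketch of what "immutable infrastructure" looks like in Docker terms: the same image artifact is built once and moves unchanged through every environment, rather than servers being patched in place. The image name, tag, and registry host below are hypothetical, for illustration only:

```shell
# Build an immutable image from the application's Dockerfile.
# "myorg/myapp" and "registry.example.com" are made-up names.
docker build -t myorg/myapp:1.0.0 .

# Tag and push the exact same artifact to a registry; staging and
# production both pull this image rather than rebuilding anything.
docker tag myorg/myapp:1.0.0 registry.example.com/myorg/myapp:1.0.0
docker push registry.example.com/myorg/myapp:1.0.0

# Deploying is just running that image; to change it, you ship a new
# tag and replace the container, never mutate a running server.
docker run -d --name myapp -p 8080:8080 registry.example.com/myorg/myapp:1.0.0
```

These commands assume a running Docker daemon and a reachable registry, so they are a sketch of the workflow rather than a runnable script.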
DevOps = Tool happy
The reason adoption has been so prolific is that developers are tinkerers, and they know a good thing when they see it. Because Docker did such a great job with its CLI, developers took very little time to figure out how to use it. Once they pulled, ran, and deployed their first container, they were sold, and enthusiastically shared the news with their peers.
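That first-container experience is literally a couple of commands. `hello-world` is Docker's official demo image, and `nginx` stands in here as an example of running a real service:

```shell
# Fetch an image from Docker Hub (hello-world is the official demo image).
docker pull hello-world

# Run it: the container prints a greeting message and exits.
docker run hello-world

# Run an actual service in the background, mapping container port 80
# to host port 8080 (nginx is used as an illustrative example).
docker run -d -p 8080:80 nginx
```

Both commands require a local Docker daemon and network access to Docker Hub, so treat this as an illustration of the workflow rather than a portable script.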
The negative side of developer technology adoption is that if the tool does not keep up with a cadence of releases and maintain its usefulness, developers can leave quickly. Docker has been able to maintain a rapid release cycle to address issues quickly, and offer even more in terms of integration with other modern tools.
For many, the path to DevOps was hindered by the lack of examples and the inability to execute. Even as the DevOps movement was kicking off, the tools could not keep pace with the concepts. The market stepped up its game, and eventually, advanced tooling like release automation stepped in to make the DevOps mecca possible. But it was not until Docker's advanced incarnation of containers that the full vision of a DevOps environment could come true.
That is not to say that other container technology is not coming. I suspect that cloud providers and tailor-made container technology will be developed. But today, Docker is the darling of the DevOps professional’s delivery chain.
About the Author
Chris Riley (@HoardingInfo) is a technologist who has spent 12 years helping organizations transition from traditional development practices to a modern set of culture, processes and tooling. In addition to being a research analyst, he is an O’Reilly author, regular speaker, and subject matter expert in the areas of DevOps strategy and culture. Chris believes the biggest challenges faced in the tech market are not tools, but rather people and planning.
We’re hiring! Check out the careers page for open positions in Amsterdam, London and San Francisco.
As usual, if you want to stay in the loop, follow us on Twitter @wercker or hop on our public Slack channel. If it's your first time using Wercker, be sure to tweet out your #greenbuilds, and we'll send you some swag!