June 25, 2017

Docker containers make life easier. But they also make it more complicated. This is the Docker paradox. Here’s what it means.

If you’re familiar with Docker, you likely already understand how it simplifies the lives of developers and admins. Containers keep environments consistent. They make deployment faster. They make your apps run more efficiently by reducing environment overhead, thereby reducing the amount of physical infrastructure you have to maintain.

The Docker Paradox

But it’s also undeniable that Docker makes environments much more complicated: containers introduce many more moving parts than virtual machines or bare-metal servers ever did.

Think about it. When you migrate to a Docker environment, you no longer have just apps and physical or virtual servers to worry about. With a Dockerized app, you usually have a series of microservices. You have a container registry. You have an orchestrator. You have the Docker daemon, which runs inside an operating system, which probably runs inside a virtual machine, which runs on a bare-metal server. And you have dozens or hundreds or thousands of individual containers.

That’s a long list of moving parts—and we haven’t even mentioned overlay networks or software-defined storage systems.
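To make that list concrete, consider a minimal Docker Compose file for a hypothetical two-service app (the service and image names here are invented for illustration). Even this small sketch touches most of the moving parts mentioned above:

```yaml
# Hypothetical two-service app. Each line implies another moving part:
# microservices, a registry, networking, storage -- all before an orchestrator.
version: "3.8"
services:
  web:                                     # one microservice (a container at runtime)
    image: registry.example.com/web:1.4    # pulled from a container registry
    depends_on:
      - api
    ports:
      - "8080:80"                          # the networking layer in play
  api:                                     # a second microservice
    image: registry.example.com/api:2.1
    volumes:
      - api-data:/var/lib/api              # the storage layer in play
volumes:
  api-data:
```

Scale this sketch up to dozens of services under an orchestrator such as Kubernetes or Docker Swarm, and the count of things to configure and monitor grows quickly.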

The Docker Paradox in Historical Perspective

If you think about the history of computing, the fact that Docker is more complex than what came before it makes sense. Virtual machines were more complex and more difficult to manage than bare-metal servers.

Going back further, task-switching operating systems were more complicated than single-tasking ones that only let you do one thing at a time. The introduction of networking created new management challenges for computer admins. So did the introduction of cheap persistent storage, peripheral input devices and so on.

In other words, the history of computing has been shaped by a long trend toward more complex systems that provide more flexibility and functionality. Docker is just the latest phase in that trend.

Complexity as the Trade-Off for Agility

Docker’s chief selling point, its agility, derives from that very complexity. If Docker environments weren’t highly distributed and rooted in software-defined infrastructure, they wouldn’t be as massively scalable and reliable as they are.

This means there’s no escaping the Docker paradox. If you want to use Docker to greatest advantage, you have to prepare to manage the complexity that comes with it.

Managing that complexity starts with recognizing that monitoring containerized services, applications, storage, networks and so on requires a different approach than the one you took with conventional infrastructure. Docker is a whole different game.

Christopher Tozzi

Christopher Tozzi has covered technology and business news for nearly a decade, specializing in open source, containers, big data, networking and security. He is currently Senior Editor and DevOps Analyst with Fixate.io and Sweetcode.io.