April 28, 2017

People tend to talk about containers as a radically innovative type of technology. Yet, it might be better to think of containers as merely the next incremental step in a long trend toward modularity, which stretches back decades. Keep reading for an explanation of what I mean.

For many people, container platforms such as Docker are a revolutionary type of technology. True, containers are not actually a very new idea; container technology stretches back decades. But most people only started paying attention to containers when Docker launched in 2013 and began eyeing the enterprise market.

It’s also true, of course, that containers build upon the same ideas as virtual machines. Yet they implement those ideas in a different way: rather than emulating a complete machine with its own guest operating system, containers share the host’s kernel and isolate processes from one another. Containers may look like virtual machines, but they work very differently under the hood.

Viewed from the perspectives outlined above, containers seem pretty novel.

Containers and Modularity

If you measure containers against the longstanding trend toward modularity within the programming and infrastructure worlds, however, they seem much less new, at least from a conceptual point of view.

In other words, programmers and admins for decades have been gravitating toward practices and methodologies that prioritize modular, agile architectures for software and infrastructure design. Consider the following innovations:

  • Object-oriented programming. This programming technique, which in some respects dates back to the 1960s but became mainstream in the 1990s, encourages developers to design apps so that code is organized into modular, portable units.
  • Microkernels. If you were paying attention to operating system design in the late 1980s and early 1990s—or if you have read about it, at least—you know that microkernels were all the rage back then. Microkernel architecture is a way of designing an operating system so that the kernel is broken down into many small, modular programs. It’s the opposite of a monolithic kernel.
  • Software-defined infrastructure. Virtualization and related technologies that entered the enterprise market in the 2000s made it easy to break big, monolithic, bare metal servers into smaller virtual pieces.
  • DevOps. The DevOps movement, which emphasizes small, continuous changes to code as the key to a healthy workflow, in many ways goes hand in hand with Docker. But DevOps actually predated the introduction of Docker containers by a few years. So you can think of it as a precursor to the container revolution.
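
The object-oriented bullet above is easy to make concrete. Here is a minimal sketch (the `Store` interface and `record_event` helper are hypothetical names, not from any particular codebase) of the modularity idea: each class is a self-contained unit behind a narrow interface, so implementations can be swapped without touching the rest of the program.

```python
class Store:
    """Abstract storage interface; callers depend only on this contract."""
    def save(self, key, value):
        raise NotImplementedError
    def load(self, key):
        raise NotImplementedError

class MemoryStore(Store):
    """One interchangeable implementation, hidden behind the interface."""
    def __init__(self):
        self._data = {}
    def save(self, key, value):
        self._data[key] = value
    def load(self, key):
        return self._data[key]

def record_event(store, name):
    # Works with any Store implementation: the module boundary
    # is what makes this code portable and reusable.
    store.save(name, True)

store = MemoryStore()
record_event(store, "deployed")
print(store.load("deployed"))  # True
```

The same boundary lets you later replace `MemoryStore` with, say, a database-backed class, which is the kind of modular substitution the trend described above is all about.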

In many respects, containers represent the next stage in the trend toward modular systems of software and infrastructure design. They expand upon the same principles as the technologies described above.
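
To make the parallel concrete, consider a minimal, hypothetical Dockerfile (the `app.py` script is assumed, not real). Each container image packages one small piece of a system with only what that piece needs, much as a class or a microkernel server encapsulates one job behind a boundary:

```dockerfile
# Hypothetical sketch: one small service packaged as a modular,
# portable unit that carries only its own dependencies.
FROM python:3-alpine
COPY app.py /app/app.py
CMD ["python", "/app/app.py"]
```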

Modularity and the Enterprise

Why does the way you think about containers matter? Because viewing them as simply the next step on an evolutionary path that organizations have been following for years makes it easier to understand how they fit into the enterprise.

If you think of containers as a radically new type of technology, they can seem scarier—if you’re an executive, at least. New technologies generally mean unpredictability, high acquisition costs, compliance challenges and other issues.

But if you think of containers as a step forward on a familiar path—or you can get your boss to think of them that way—then they become easier to sell.

So, if you’re wondering what it will take to make containers mainstream within the enterprise, an emphasis on modularity is a key part of the answer.

Christopher Tozzi

Christopher Tozzi has covered technology and business news for nearly a decade, specializing in open source, containers, big data, networking and security. He is currently Senior Editor and DevOps Analyst with Fixate.io and Sweetcode.io.