Containers are a hot trend, and they are transforming the way companies develop, deploy, and maintain apps. The very existence of ContainerJournal—a site dedicated to news, education, and community building related to containers—is a testament to the strength of the container movement. As great as containers are, though, they don’t work well in a vacuum. You need a solid IT infrastructure in place for containers to work their magic.
Containers are like the engine of a car, and the IT infrastructure they run on is the car itself. It doesn’t do you any good to have a great engine if the car is rusted out and has a broken axle or flat tires. You can have a precision engine with massive horsepower, capable of propelling a Formula One race car, but if you put it in a beat-up Yugo (Google it) it won’t do you much good.
What makes up the infrastructure, then? What is the foundation the containers depend on? It consists of the servers and network resources that run and deliver the cloud apps. That might mean physical servers in an on-premises datacenter, but in most DevOps environments the foundation is built on virtual servers running in the cloud.
There are a variety of cloud providers and virtualization platforms to choose from: Amazon Web Services, Rackspace, and Microsoft Azure are major players in the cloud market, but there are also smaller companies like ProfitBricks that specialize in delivering a cloud platform optimized for containers.
Depending on the size of your organization and the resources and expertise you have available to manage the infrastructure, a cloud platform optimized for containers can be a big advantage. Developers don’t need standalone applications or custom scripts to scale resources, because those capabilities are integrated seamlessly into the ProfitBricks cloud offering.
The other pillar of the container foundation is virtualization. There are a variety of options available for virtualization as well, but the two primary competitors are Microsoft and VMware. Both are working closely with container ecosystems and pursuing container initiatives of their own.
The value of containers lies in their portability, agility, and interoperability across various platforms and operating systems. One thing that helps immensely is the industry-wide partnership behind the Open Container Project (OCP). By establishing standards for containers, OCP prevents businesses from painting themselves into a proprietary corner that limits which container platforms or infrastructure environments they can use.
Jim Zemlin, executive director of the Linux Foundation, explained, “With the Open Container Project, Docker is ensuring that fragmentation won’t destroy the promise of containers. Users, vendors and technologists of all kinds will now be able to collaborate and innovate with the assurance that neutral open governance provides. We applaud Docker and the other founding members for having the will and foresight to get this done.”
OCP provides some peace of mind for organizations developing with containers. If a cloud provider or virtualization platform can’t meet an organization’s needs, the standards developed through OCP ensure that it can take its containers and move relatively seamlessly to a different solution. It is great to have that freedom, because the cloud and virtualization foundation you put your container “engine” into can make a big difference.