November 23, 2017

The open-source world is quickly taking large strides into containerization, specifically consolidating its efforts behind solutions such as Kubernetes.

Containers are increasingly becoming attractive options for running applications, thanks in large part to their speed and versatility. But, there’s a clear generational gap emerging on this front. Many of the early and enthusiastic adopters of Kubernetes, for instance, are smaller, innovative companies building their own platforms. For them, the appeal is clear: If you’re betting that containerization will be the next generation of IT infrastructure, then technologies such as Kubernetes provide the ecosystem of tools an organization needs to operate in that space. Built-in capabilities, including automatic scaling and deployment for applications, make these platforms especially appealing to the innovators.

The larger, more established enterprises are another matter. By and large, enterprises seem downright allergic to containers, with security concerns topping their list of objections. But, these concerns, while understandable, are misplaced. Containers actually give enterprises more, not less, autonomy over their security needs.

Containers Make Enterprises Nervous About Security

The Equifax data breach has crystallized enterprise fears around security, particularly around the wide-ranging implications of a breach caused by inadequate cybersecurity. Consequently, enterprises don’t want to rock the boat, and may see investing in a new technology such as containers as doing just that.

In particular, enterprises may be wary of the relative lack of isolation that goes hand in hand with containers compared to an alternative such as virtual machines. Containers may be isolated from one another by design, but they all share the host operating system’s kernel, so a flaw anywhere in that shared layer can end up compromising every container on the host. As a result, these companies may feel that VMs, which provide an additional layer of isolation to insulate critical assets from breaches and attacks, are more reliably secure than containers.

Containerization Puts Security Back in the Hands of the Enterprise

That feeling, though, doesn’t hold in practice. Enterprises may feel as if VMs are a more secure alternative to containers, but the truth is that containers are no more or less risky than VMs. All IT infrastructure shares the same baseline of security risk: As long as you’re granting users a level of root access, there’s no getting around that vulnerability. In other words, you can’t control for root access as an inherent issue; it’s already baked into the cake. The real security risk you can control is how you produce your containers.

Failing to keep container images regularly updated with the latest security patches leaves vulnerabilities waiting to be exploited, undermining your containers from the start. Rebuilding and redeploying these images at a reasonable cadence helps ensure that you will be well-covered against known security risks.
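In practice, a low-ceremony way to keep images patched is to apply pending security updates at build time and rebuild on a regular schedule. The sketch below is illustrative only: it assumes a Debian/Ubuntu-based image, and the base tag, paths and command are hypothetical placeholders.

```dockerfile
# Illustrative sketch: rebuild this image regularly so the base layer
# and package index pick up the latest security patches.
FROM ubuntu:16.04

# Refresh the package index, apply all pending upgrades (including
# security fixes), then clear the apt cache to keep the image small.
RUN apt-get update && \
    apt-get upgrade -y && \
    rm -rf /var/lib/apt/lists/*

# Install only what the application needs on top of the patched base.
COPY app /opt/app
CMD ["/opt/app/run"]
```

Because a rebuilt image replaces running containers rather than patching them in place, scheduling this rebuild in a CI pipeline keeps the update cadence automatic rather than ad hoc.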

This goes both ways. Security requires a shift in thinking, specifically embracing the idea that containers are short-lived, disposable units, not long-lived, impregnable bubbles. This dual effort of implementing container security updates and adjusting your company’s mindset toward containers and security at large effectively makes containerization a way of giving the enterprise more agency and autonomy over its security. It puts the responsibility for security directly in the hands of the enterprise.

To be sure, this can be scary new ground to break. But, this push for containerization is a good thing in the long run and ensures that people will be responsible for imparting a new way of thinking about security to their dev teams.

IT Must Take Greater Responsibility for Securing Its Containers

That responsibility starts with investing in new architectures, such as OpenStack, to serve as the future infrastructure of your enterprise. From there, it means building one OS cloud that spans multiple clusters, designating clusters to different departments so that everyone has their own environment and IT doesn’t have to make sweeping changes to keep pace with upgrades.
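Where a full cluster per department is more than is needed, a similar separation can be sketched inside a single Kubernetes cluster using namespaces and resource quotas instead. This is a hedged sketch of that lighter-weight alternative, not the article’s exact architecture; the department name and quota figures are purely illustrative.

```yaml
# Illustrative only: carving one cluster into per-department spaces.
apiVersion: v1
kind: Namespace
metadata:
  name: finance              # hypothetical department name
---
apiVersion: v1
kind: ResourceQuota
metadata:
  name: finance-quota
  namespace: finance
spec:
  hard:
    pods: "50"               # caps so one department cannot starve the rest
    requests.cpu: "20"
    requests.memory: 64Gi
```

Each department then upgrades and redeploys within its own boundary, so IT can roll changes out incrementally rather than cluster-wide.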

This responsibility also means getting over the mindset that every time a previous paradigm sunsets, it becomes irrelevant. That’s not true: Mainframes, bare metal and VMs are all still very much relevant today. Accepting the role of containers as part of your IT infrastructure is the biggest step IT must take in ensuring a successful enterprisewide implementation.

Enterprises like predictability and standardized foundations from which to work, and they are wary of investing resources in anything that will too drastically upend their current model for managing security. This is especially true at a time when breaches and compromised security protocols are turning some of the world’s biggest brands into public embarrassments.

But, that wariness is not going to stem the rising tide of containerization. Rather than bucking this trend, enterprises should and must accept the responsibility of having their hands on the wheel when it comes to adopting and securing their containers.

About the Author / Marco Ceppi

Marco Ceppi works on the Product Strategy team at Canonical. He spends most of his time focusing on Cloud, Containers and the Cloud Native landscape. Connect with him on Twitter and LinkedIn.
