Container Q & A with Apcera CEO, Derek Collison

I recently had a chance to speak with Apcera CEO Derek Collison about his views on where the market is around containers. Below is the Q & A of our conversation. Also, please be sure to take a few minutes and help us by taking the new survey we are running on DevOps, ITOps and PaaS, sponsored by Apcera.

1. What tangible ways are you seeing containers disrupt IT from a business perspective?

Containers are emerging as the preferred way for application developers to build and run software, causing them to think of applications in a modern way, where the focus is more on assembling than on reinventing the wheel. However, containers are really a means to an end, and organizations have to remain agile because they can't depend on any technology sticking around forever. Right now containers enable organizations to use server resources more efficiently and achieve greater cost savings than virtual machines, but something bigger and better will follow. That's the nature of technology, and it's therefore imperative to look at modern infrastructure holistically rather than focusing on a single piece of the puzzle.

2. What myths do you think are being propagated about containers? 

There is a myth, or prediction, that containers will replace virtual machines (VMs) entirely. Virtualization enables a physical server to be carved up into a number of smaller virtual machines. A container and a VM can coexist to provide an additional layer of isolation, security and control, and it's much more likely that VMs will become the new legacy app than die off completely. Second, there is an idea floating around that containers are inherently secure by design, which is completely false. Security is one of those words that carries a lot of connotations, and there are endless elements that go into making an application truly secure. Our priority at Apcera is ensuring that trust, policy and governance are at the core of our platform so that a large portion of the security boxes can be checked off.

3. There’s no doubt that there has been an explosion of new technologies impacting IT, with containers being top of mind. How can companies keep pace with the rate of change in technology in order to be successful?

Technology curves are shortening, which means companies need to adapt at a rapid pace, but more importantly, they need to ensure that innovation today enables infrastructure to scale securely for the future. Enterprise IT teams will rely on third parties and partnerships to rapidly learn, use and integrate software, including container technologies.

4. There is a lot of industry debate about how secure containers are – in fact some people argue that containers are inherently secure by nature. Do you agree? And if not, what components do you think are missing?

When a single container is hacked, the attacker has a smaller view of the world than if they hacked a normal server. Even though the attack surface area is reduced by using containers, an organization’s infrastructure is still at risk. The essential component for both containers and infrastructure is deep and explicit policy. There has to be a way to express policy for specific realms of control (e.g. who can launch a process or who can read an audit log) and for specific environmental scopes (dev, production or project X). Policy must live in one place, but pervade everything that teams want to manage, independent of where the decisions are being made (on-prem or in the cloud). And, policy has to be automatically enforced by the system without manual oversight or intervention.
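The policy model described above can be sketched in code. This is a minimal illustrative example, not Apcera's actual API: the realm names, scope names and roles are all hypothetical. It shows the two ideas Collison raises: policy declared once in a single place, keyed by realm of control and environmental scope, and enforced automatically on every request with a default of deny.

```python
# Hypothetical sketch of centralized, automatically enforced policy.
# Realms of control (e.g. "launch_process", "read_audit_log") and
# environmental scopes (e.g. "dev", "production") are illustrative names.

# Single source of truth: (realm, scope) -> set of roles permitted
POLICY = {
    ("launch_process", "dev"): {"developer", "release-manager"},
    ("launch_process", "production"): {"release-manager"},
    ("read_audit_log", "production"): {"auditor"},
}

def is_allowed(role: str, realm: str, scope: str) -> bool:
    """Automatic enforcement: deny unless policy explicitly permits.

    Every decision point (on-prem or cloud) calls this one function,
    so policy lives in one place but pervades everything.
    """
    return role in POLICY.get((realm, scope), set())

# A developer may launch processes in dev, but not in production.
print(is_allowed("developer", "launch_process", "dev"))         # True
print(is_allowed("developer", "launch_process", "production"))  # False
```

The key design choice is that enforcement is a property of the system (one shared check, default deny) rather than of manual oversight, which is what makes the policy pervasive regardless of where a workload runs.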

5. In a recent interview you mentioned that organizations need to be more mindful of assembling systems, as opposed to building them. Can you elaborate on this idea?

While organizations may be building some of their applications, they are assembling more by buying "off the shelf" software, combining open source components and implementing more SaaS services. The software vendors working with these organizations are also adopting containers to create a distribution/deployment unit—similar to how ISVs only embraced VM appliances after enterprises standardized on VMs. A microservices-based architecture and infrastructure not only allows enterprise teams to assemble more and build less, it is also more resilient to failure and easily reusable.

Alan Shimel

As Editor-in-chief of DevOps.com and Container Journal, Alan Shimel is attuned to the world of technology. Alan has founded and helped build several technology ventures, including StillSecure, where he guided the company in bringing innovative and effective networking and security solutions to the marketplace. Shimel is an often-cited personality in the security and technology community and is a sought-after speaker at industry and government conferences and events. In addition to his writing on DevOps.com and Network World, his commentary about the state of technology is followed closely by many industry insiders via his blog and podcast, "Ashimmy, After All These Years" (www.ashimmy.com). Alan has helped build several successful technology companies by combining a strong business background with a deep knowledge of technology. His legal background, long experience in the field, and New York street smarts combine to form a unique personality.

