Container Q & A with Apcera CEO, Derek Collison
I recently had a chance to speak with Apcera CEO Derek Collison about his views on where the market is around containers. Below is the Q & A of our conversation. Also, please take a few minutes to help us by completing the new survey we are running on DevOps, ITOps and PaaS, sponsored by Apcera.
1. What tangible ways are you seeing containers disrupt IT from a business perspective?
Containers are emerging as the preferred way for application developers to build and run software, causing them to think of applications in a modern way, where the focus is more on assembling than on reinventing the wheel. However, containers are really a means to an end, and organizations have to remain agile because they can’t depend on technologies sticking around forever. Right now containers enable organizations to use server resources more efficiently and achieve greater cost savings than virtual machines, but something bigger and better will follow. It’s the nature of technology, and therefore it’s imperative to look at modern infrastructure holistically rather than focusing on a single piece of the puzzle.
2. What myths do you think are being propagated about containers?
There is a myth or prediction that containers will replace virtual machines (VMs) entirely. Virtualization enables a physical server to be carved up into a number of smaller virtual machines. A container and a VM can coexist to provide an additional layer of isolation, security and control, and it’s much more likely that VMs will become the new legacy app than die off completely. Secondly, there is an idea floating around that containers are inherently secure by design, which is completely false. Security is one of those words that carries a lot of connotations, and there are endless elements that go into making an application truly secure. Our priority at Apcera is ensuring that trust, policy and governance are at the core of our platform so that a large portion of the security boxes can be checked off.
3. There’s no doubt that there has been an explosion of new technologies impacting IT, with containers being top of mind. How can companies keep pace with the rate of change in technology in order to be successful?
Technology curves are shortening, which means companies need to adapt at a rapid pace, but more importantly, they need to ensure that innovation today enables infrastructure to scale securely for the future. Enterprise IT teams will rely on third parties and partnerships to rapidly learn, use and integrate software, including container technologies.
4. There is a lot of industry debate about how secure containers are – in fact some people argue that containers are inherently secure by nature. Do you agree? And if not, what components do you think are missing?
When a single container is hacked, the attacker has a smaller view of the world than if they hacked a normal server. Even though the attack surface area is reduced by using containers, an organization’s infrastructure is still at risk. The essential component for both containers and infrastructure is deep and explicit policy. There has to be a way to express policy for specific realms of control (e.g. who can launch a process or who can read an audit log) and for specific environmental scopes (dev, production or project X). Policy must live in one place, but pervade everything that teams want to manage, independent of where the decisions are being made (on-prem or in the cloud). And, policy has to be automatically enforced by the system without manual oversight or intervention.
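The model described above, where rules pair a realm of control with an environmental scope and are enforced automatically from a single policy store, can be sketched in a few lines of code. The rule names, roles, and `is_allowed` function below are hypothetical illustrations for this article, not Apcera's actual policy language:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class PolicyRule:
    """One rule: a realm of control granted to a role within a scope."""
    role: str   # who, e.g. "developer" or "auditor" (hypothetical roles)
    realm: str  # realm of control, e.g. "launch_process", "read_audit_log"
    scope: str  # environmental scope, e.g. "dev", "production", "project-x"


# Policy lives in one place; enforcement can happen anywhere decisions are made.
POLICY = {
    PolicyRule("developer", "launch_process", "dev"),
    PolicyRule("auditor", "read_audit_log", "production"),
}


def is_allowed(role: str, realm: str, scope: str) -> bool:
    """Automatic enforcement: deny unless an explicit rule permits the action."""
    return PolicyRule(role, realm, scope) in POLICY


# A developer may launch processes in dev, but not in production.
print(is_allowed("developer", "launch_process", "dev"))         # True
print(is_allowed("developer", "launch_process", "production"))  # False
```

The point of the sketch is the default-deny check: any action not explicitly granted for that realm and scope is refused by the system itself, with no manual oversight required.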
5. In a recent interview you mentioned that organizations need to be more mindful of assembling systems, as opposed to building them. Can you elaborate on this idea?
While organizations may be building some of their applications, they are assembling more by buying “off the shelf” software, combining open source components and implementing more SaaS services. The software vendors working with these organizations are also adopting containers to create a distribution/deployment unit—similar to how ISVs only embraced VM appliances after enterprises standardized on VMs. A microservices-based architecture and infrastructure not only allows enterprise teams to assemble more and build less, it is also more resilient to failure and more easily reusable.