DockerCon15 – Docker for Grown Ups

What does Docker want to be? An amazing tool built by developers for developers, or the modern enterprise pipeline, where containers drive everything? While containers are not new, there is no question that Docker has taken them to the next level. For the enterprise, however, containers cannot stand alone, and they are still a rather immature tool. At DockerCon15 I got to see how Docker plans to grow up.

 

Attending DockerCon15 was a great experience. The info-hungry audience was too big for a hotel, and just too small for the Moscone Center. At the rate interest is growing, however, I would not be surprised if next year it filled it.

 

The makeup of the crowd was surprising. While I was expecting developers from mid-sized to large companies, I should have known better. Developers have Docker down, but not a lot of ways to justify an event like this. Instead, the show consisted of product teams from large tech companies like Cisco, and IT Ops from traditional businesses such as financial services and healthcare. You could hardly call the audience Docker fanatics, but they see enough value to open up their ears and learn.

 

Setting the Stage: Handicapped for the Enterprise

The perception is that the Docker community and user base is very large. But what are they doing with it? Are they integrating it into the delivery chain?

 

I have yet to encounter a use case that spanned beyond one or two developers. When I talk to companies, including the 20 I surveyed at the show, there is only one clear use case: ad-hoc, developer-by-developer containers. Developers use them as a local library of images to deploy and test on, as well as for some experimental testing. And the widely available hub of existing containers and configurations is a huge plus. This really is just sandboxing, supercharged. But it stops there: the pipeline has not standardized on Docker containers, and rarely do they make it into production.
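As a rough illustration of this sandboxing use case, a developer's day-to-day flow with a local image library might look like the following sketch (the image and tag names are hypothetical):

```shell
# Pull a public base image from the hub into the local library
docker pull ubuntu:14.04

# Build a throwaway test image from the project's Dockerfile
# (the "myapp-test" tag is illustrative)
docker build -t myapp-test .

# Run the test suite inside a disposable container, removed on exit
docker run --rm myapp-test make test

# Review the local library of images accumulated for deploy/test work
docker images
```

Everything here stays on the developer's machine, which is exactly the point of the critique: none of it touches a shared pipeline.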

 

So the next question is: how can you take what is available today and make it work team-wide? How do you make it fit into the enterprise? Or do you just wait? The key areas of functionality where Docker has been lacking for the enterprise are:

 

1.) Security
2.) Global container visibility and metadata
3.) Container management
4.) Release automation
5.) Governance and auditing
6.) Scalability beyond front ends and small applications

 

The lack of some of these is what hackers and lawsuits thrive on, and in the enterprise, careers live and die by it. It is not a small list. However, DockerCon15 had several things to offer: features and strategies that are encouraging for enterprise adoption and address the weak spots above, starting with some key announcements.

 

Docker Grows Up

The keynote was heavily oriented toward the community, and besides the great graphics and giveaways, it was pretty lackluster. However, there were a few things that I believe will help move the Dockerized pipeline forward.

 

  • Notary: Notary is a trusted publishing system that runs a filter on all newly created container instances. This is awesome. It mitigates one of my big fears about containers, which is container sprawl: many containers in the wild whose configurations are unknown or outdated. Notary can be used to make sure a standard is enforced before containers go out the door, so fewer mystery containers will exist. The process of creating filters alone also forces teams to think about security, which educates devs and breaks down some barriers with IT, the only path to a sustainable environment. Notary is currently in beta and has been open-sourced, which I think is both a good and a bad thing. With all the code out there, exploits will be slightly easier to identify, both for committers to fix and for hackers to leverage.

 

  • Docker Machine: Docker Machine is not new, but it is more advanced, with built-in functionality to run Notary automatically on containers.

 

  • Open Source runC and a Docker Open Standard: Docker has partnered with the Linux Foundation, as an independent body, to manage the source code of runC and container standards in general. This is an exciting announcement that will make containers possible on many platforms and allow them to integrate into existing systems, which I believe could be a commercial threat to Docker in the next few years. Docker will feel the pressure to transform into an organization that embraces IT and the enterprise, along with keeping up with developers.

 

  • Networking: Honestly, even the new networking functionality is weak, and not yet close to satisfying the networking requirements of enterprise IT. But it is still nice to see that Docker is thinking about it. Some of the new functionality makes it easier to connect containers, but I suspect a complete solution would still involve software-defined networking (SDN) agents as part of the scripting and provisioning.

 

  • Microsoft + Docker: Containers in some form will make it to Windows Server. What I'm not clear on, nor has it been announced, is how compatible the Windows container system will be with ones running on Linux hosts. It looks like the updates are a combination of changes to the operating system itself (perhaps an LXC look-alike), file-system updates to support isolation, and services to control provisioning and container management. I assume this will be a component or offering in Azure as well.
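To make the Notary announcement concrete: the client-side face of Notary is Docker's content trust, which can be enabled with a single environment variable so that pushes are signed and unsigned pulls are rejected. The following is a sketch based on the feature as announced, not a definitive workflow; the registry path and tag are hypothetical:

```shell
# Enable content trust (backed by Notary) for this shell session
export DOCKER_CONTENT_TRUST=1

# Pushes are now signed with the publisher's keys
docker push registry.example.com/team/webapp:1.0

# Pulls now refuse images that lack a valid signature,
# cutting down on the "mystery containers" mentioned earlier
docker pull registry.example.com/team/webapp:1.0
```

The appeal for the enterprise is that the enforcement point sits in the client and registry, so a team can turn it on without rebuilding its pipeline.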

 

Reliant on Partners

It was not only what Docker had to say that got my attention. In addition to the announcements, there was a lot to be had from the exhibitors. A few of the software vendors in the developer tooling space have clearly identified the enterprise Docker gap. They have jumped on the opportunity to up-level containers and ride the movement's massive wave.

 

Companies like IBM, with Bluemix, have added containers to their pipeline management tool (some might consider it a release automation tool). Log analysis companies like Logentries, Sumo Logic (both customers of mine), and Loggly make it much easier to add system logging to Docker containers and host operating systems.

 

Companies like DCHQ provide a great and much-needed oversight and auditing system, and tools like fugue.it automatically replace containers that don't meet requirements or have outdated configurations.

 

But that is not all it takes. Professional services companies like Contino and Nebulaworks are also going to be needed by most enterprises to assess, plan, and implement containers. Not having the immediate resources or expertise is a huge inhibitor for an enterprise to even get started, let alone started from a holistic point of view.

 

The trend is going to continue, with new partners announcing easier ways to work with Docker and their tools. I would be interested to see companies like OpenLogic and Maven/Sonatype jump in, to add more credibility to what is deployed on the containers.

 

Docker cannot currently survive beyond the ad-hoc developer use case. However, the direction of the Docker product team is good, and the partners are doing a lot to make it possible for enterprises to build that dream pipeline.

 

One thing I still struggle with is all the moving parts. In order to complete the enterprise Docker story you need: log analysis for containers and hosts, auditing, release automation, SDN/networking, infrastructure scripting like Chef and Puppet, professional services, better private library management, and the ability to automatically replace containers on the fly. If you were to build a fully automated pipeline where you are moving only containers, you would need to invest in a lot of oversight, release automation tools, and coding to wire it all up, which adds additional points of failure and a lot to change when updates inevitably come.

 

What I got from DockerCon15 was that the company is serious about going beyond the individual non-paying developer. And when this happens, whether through Docker or some other software vendor, containers will at the very least become the standard for application front ends and PaaS applications, and hopefully for the entire delivery chain.

Chris Riley


Chris Riley (@HoardingInfo) is a technologist and DevOps analyst for Fixate IO. He helps organizations make the transition from traditional development practices to a modern set of culture, tooling, and processes that increase the release frequency and quality of software. He is an O'Reilly author, speaker, and subject matter expert in the areas of DevOps strategy, machine learning, and information management.
