Puppet Automates Managing Docker Container Images at Scale

Building a Docker container image is not especially difficult. But building and deploying hundreds or even thousands of Docker container images as new applications are developed or continuously updated is a whole other matter.

At PuppetConf 2016 this week, Puppet announced the general availability of Puppet Docker Image Build, which extends the Puppet IT automation framework in a way that allows IT organizations to manage container images at scale using the same framework they use to manage the rest of the data center.

Derived from Project Blueshift, an initiative Puppet launched specifically to address container management requirements, Puppet Docker Image Build is designed to enable IT operations teams to build Docker images and group container-based microservices together. Updates to those microservices can then occur simultaneously or in a specific order defined by the IT organization, says Tim Zonca, director of product marketing for Puppet.
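Puppet did not spell out the exact workflow in the announcement, but judging from the image_build module that came out of Project Blueshift, it looks roughly like this: an ordinary Puppet manifest describes what goes inside the image, a small metadata file supplies the Docker-specific settings, and a puppet docker build command produces the image. The file names, keys and resources below are illustrative assumptions, not details taken from the announcement.

# manifest.pp -- ordinary Puppet code describing the image contents (sketch only)
package { 'nginx':
  ensure => installed,
}

file { '/usr/share/nginx/html/index.html':
  ensure  => file,
  content => "Hello from a Puppet-built container image\n",
  require => Package['nginx'],
}

# metadata.yaml (a separate file) would carry the Docker-specific settings, e.g.:
#   cmd: nginx -g 'daemon off;'
#   expose: 80
#   image_name: example/nginx
#
# The image is then built from the manifest with something like:
#   puppet docker build manifest.pp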

While there is clearly a lot of enthusiasm for microservices, Zonca says most organizations have not thought through what will be required to manage what are essentially much more granular sets of IT services subject to frequent change. Developers who employ Docker containers update their applications by replacing entire containers with new ones housing the functions they have added. That represents a major challenge for IT operations teams, which may be confronted both with applications appearing at faster rates than ever and with what amounts to frequent redeployment of the various microservice elements of existing applications, says Zonca.

Puppet Docker Image Build is designed to complement Puppet's existing support for automating the management of container clusters based on Kubernetes, Mesos or platforms from Docker Inc.

Rather than managing containers in isolation, Zonca says it's clear IT organizations will need to be able to manage containers alongside all their legacy IT environments. To make it possible to rise to that challenge comprehensively, Puppet also updated Puppet Enterprise 2016 to add support for integration with the Jenkins continuous integration platform. That integration, he says, now makes it possible for an IT organization to define an entire Jenkins pipeline as code to programmatically create any given workflow.

Collectively, Zonca says, the capabilities Puppet is building into its namesake framework provide IT organizations with more context and situational awareness at a time when the overall IT environment is becoming more complex to manage. Armed with that information, IT organizations can divide duties between developers and IT operations as they see fit, he says.

The degree to which developers will actually be engaged in IT operations varies widely by organization. What is clear is that just about every organization wants developers and IT operations teams working hand in glove to deploy applications faster. The challenge now is finding a way to manage IT at a more granular level that actually makes the overall IT environment more resilient, even though there are technically more moving parts than ever.

Mike Vizard

Mike Vizard is a seasoned IT journalist with over 25 years of experience. He has contributed to IT Business Edge, Channel Insider, Baseline and a variety of other IT titles. Previously, Vizard was editorial director for Ziff-Davis Enterprise and editor-in-chief of CRN and InfoWorld.