November 23, 2017

Docker Inc. plans to give customers the option to deploy one of several open-source serverless computing frameworks on top of the Docker Enterprise Edition (EE) container-as-a-service (CaaS) environment. Rather than being locked into a specific serverless computing framework, IT organizations should be free to choose from multiple frameworks that can be deployed on-premises or in any of several public clouds, says Docker Inc. founder and CTO Solomon Hykes.

Hykes says monolithic approaches to serverless computing, such as the Lambda service provided by Amazon Web Services (AWS), run contrary to any multicloud strategy an IT organization might be trying to implement. Docker Inc. plans to reveal which serverless computing frameworks it intends to support as building blocks on top of Docker EE in the months ahead, he says.

Docker envisions that the functions developed on a serverless framework could either be deployed as containers or accessed as sub-processes via an application programming interface (API), he says.
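A minimal sketch of the first pattern, with a single function wrapped in a small HTTP shim so a framework could invoke it inside a container, might look like the following (the handler and shim names are illustrative, not part of any framework Docker has announced):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle(payload: bytes) -> bytes:
    """The one 'function' this hypothetical serverless container exposes."""
    return b"echo: " + payload

class FunctionShim(BaseHTTPRequestHandler):
    """Tiny HTTP wrapper: each POST body becomes one function invocation."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        result = handle(self.rfile.read(length))
        self.send_response(200)
        self.send_header("Content-Length", str(len(result)))
        self.end_headers()
        self.wfile.write(result)

if __name__ == "__main__":
    # The framework would build this into an image and route requests to it.
    HTTPServer(("", 8080), FunctionShim).serve_forever()
```

Packaged this way, the function's lifecycle (build, schedule, scale to zero) is handled entirely by the container platform rather than by a proprietary runtime.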

Hykes says most of the research and development effort at Docker Inc. will focus on optimizing use cases for Docker Enterprise Edition, the CaaS environment his company is positioning as a lighter-weight alternative to traditional platform-as-a-service (PaaS) environments. In addition to serverless computing, internet of things (IoT) applications represent another emerging opportunity, Hykes says, adding that Docker Inc. will also focus on improving support for graphics processing units (GPUs).

Finally, Hykes envisions Docker EE hosting applications that incorporate machine learning algorithms, noting that Docker Inc. plans to infuse machine learning into the core Docker EE platform itself to automate operations.

In the near future, Hykes says, organizations will find themselves trying to manage millions of objects distributed across a CaaS environment. In some cases, one instance of Docker EE could span more than a billion objects. Machine learning algorithms will be employed to identify patterns and then optimize the overall environment.

To help standardize the management of those environments, Docker Inc. is also collaborating on a Container Storage Interface (CSI) that would create a standard application programming interface for connecting storage systems to both Docker Swarm and Kubernetes.
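The idea behind such a standard is that the orchestrator calls the same set of storage operations regardless of vendor. A rough Python sketch of that shape (the in-memory driver below is purely illustrative; the method names loosely mirror the remote procedure calls in the draft CSI specification):

```python
class InMemoryStorageDriver:
    """Illustrative stand-in for a storage plugin behind a standard
    interface: Swarm or Kubernetes would invoke these same operations
    no matter which vendor supplied the driver."""

    def __init__(self):
        self.volumes = {}   # volume id -> capacity in bytes
        self.mounts = {}    # volume id -> target path

    def create_volume(self, name: str, capacity_bytes: int) -> dict:
        # Provision a volume and report its identity back to the orchestrator.
        self.volumes[name] = capacity_bytes
        return {"volume_id": name, "capacity_bytes": capacity_bytes}

    def node_publish_volume(self, volume_id: str, target_path: str) -> None:
        # Make the volume available at the path the orchestrator requested.
        if volume_id not in self.volumes:
            raise KeyError(volume_id)
        self.mounts[volume_id] = target_path

    def delete_volume(self, volume_id: str) -> None:
        # Tear down the volume and any mount record for it.
        self.volumes.pop(volume_id, None)
        self.mounts.pop(volume_id, None)
```

A real plugin would of course talk to actual storage hardware or a cloud API; the point of the standard interface is that the orchestrator-facing surface stays the same.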

Docker Inc. claims there are now more than 21 million Docker hosts running production applications. It won’t be long before IT organizations will need to address a slew of DevOps challenges spanning both new applications and legacy applications that are being lifted and shifted into the cloud using containers. Once that occurs, IT organizations will be extending those applications using microservices before eventually deconstructing them to run as cloud-native microservices. However, it may take years for that transition to play out.

In the meantime, IT organizations would be well-advised to take a long look at their existing DevOps processes. Adoption of containers on some level is all but inevitable. But instead of managing individual applications, those IT teams are going to need to think in terms of managing fleets of microservices.

Mike Vizard

Mike Vizard is a seasoned IT journalist with over 25 years of experience. He has also contributed to IT Business Edge, Channel Insider, Baseline and a variety of other IT titles. Previously, Vizard was the editorial director for Ziff-Davis Enterprise as well as Editor-in-Chief for CRN and InfoWorld.