IBM Allies with GitLab to Advance AI Development on Red Hat OpenShift Platform

GitLab and IBM today announced that GitLab’s continuous integration/continuous delivery (CI/CD) platform will be integrated into IBM’s Cloud Pak offerings. IBM created Cloud Paks specifically to enable IT teams to build applications that leverage artificial intelligence (AI) capabilities. Cloud Paks are a series of microservices-based frameworks, built using containers, that make it simpler for developers to consume IBM software. As part of this alliance, IBM will provide support for the GitLab Ultimate for IBM Cloud Paks offering.

Vick Kelkar, director of alliances for GitLab, says this latest alliance extends GitLab’s existing relationship with IBM into the realm of applications that use AI technologies made available through the IBM Watson portfolio.

Kelkar says that approach will enable IT organizations to deploy an instance of the GitLab CI/CD platform alongside the Red Hat OpenShift application development and deployment platform, which is based on Kubernetes. AI-enabled applications can then be developed and deployed on multiple clouds, as well as in on-premises IT environments running, for example, an IBM Z series mainframe, Kelkar says.
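As a rough illustration of that pattern, a GitLab pipeline running alongside OpenShift might build a container image and roll it out with the `oc` client. The sketch below is hypothetical, not drawn from IBM or GitLab documentation; the stage names, images, deployment name and namespace are illustrative assumptions:

```yaml
# Hypothetical .gitlab-ci.yml sketch: build an image, then deploy it to OpenShift.
# Deployment name (ai-app), namespace (ai-project), and the OPENSHIFT_* variables
# are illustrative; CI_REGISTRY_IMAGE and CI_COMMIT_SHORT_SHA are GitLab predefined variables.
stages:
  - build
  - deploy

build-image:
  stage: build
  image: quay.io/buildah/stable          # builds container images without a Docker daemon
  script:
    - buildah bud -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - buildah push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"

deploy-openshift:
  stage: deploy
  image: quay.io/openshift/origin-cli    # provides the `oc` command-line client
  script:
    - oc login "$OPENSHIFT_SERVER" --token="$OPENSHIFT_TOKEN"
    - oc set image deployment/ai-app app="$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" -n ai-project
```

Because the deploy stage only needs an authenticated `oc` client, the same pipeline definition can target OpenShift clusters running in a public cloud or on-premises hardware.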

The alliance with GitLab comes as IBM beefs up its application development and deployment prowess. IBM’s November 2020 move to acquire Instana was seen as a way to shore up its application performance management (APM) capabilities. Much of that effort can be traced to the $34 billion IBM spent in 2019 to acquire Red Hat. IBM, of course, has its own application release automation tools, but the alliance with GitLab provides IBM with an integrated DevOps platform that can more flexibly support a wider range of application development and deployment scenarios.

IBM is betting that future applications, to varying degrees, will have embedded AI capabilities, and that most of those AI-enabled applications will be built using containers. While there are many options for embedding AI technologies in applications, it’s clear that as the Red Hat OpenShift platform gains traction, IBM is trying to make it simpler for DevOps teams to embed Watson technologies within their workflows.

In the meantime, CI/CD platform providers are jockeying for position in the race to become the standard on which next-generation applications are built. The biggest challenge, however, may not be the AI technologies themselves as much as melding machine learning operations (MLOps) processes – which data science teams rely on to construct AI models – with the DevOps processes needed to build applications.

At present, most AI models take about six months to create and are updated infrequently. Over time, the rate at which AI models are updated will increase. For the moment, however, each time an AI model is updated, it must be reintegrated into applications that are often updated multiple times a month.
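One hedged way to automate that reintegration is to have the model-training pipeline trigger the downstream application pipeline whenever a new model version is published, using GitLab’s multi-project `trigger` keyword. The project path, helper script, and `MODEL_VERSION` variable below are illustrative assumptions, not part of any announced integration:

```yaml
# Hypothetical fragment of the MLOps project's .gitlab-ci.yml:
# once a model is published, kick off the application's pipeline
# so the new model is rebuilt into the app image.
stages:
  - publish
  - notify

publish-model:
  stage: publish
  script:
    - ./scripts/upload_model.sh "$MODEL_VERSION"   # illustrative helper script

trigger-app-rebuild:
  stage: notify
  trigger:
    project: my-group/ai-application   # hypothetical downstream app project
    branch: main
  variables:
    MODEL_VERSION: "$MODEL_VERSION"    # passed downstream so the app pins the new model
```

In this arrangement the data science team’s pipeline and the DevOps team’s pipeline stay separate, but a model update automatically produces a fresh application build rather than waiting on a manual handoff.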

As AI models start to proliferate across DevOps workflows, a better mechanism for melding these processes will have to be devised.

Mike Vizard

Mike Vizard is a seasoned IT journalist with over 25 years of experience. He has also contributed to IT Business Edge, Channel Insider, Baseline and a variety of other IT titles. Previously, Vizard was the editorial director for Ziff-Davis Enterprise as well as Editor-in-Chief for CRN and InfoWorld.