Red Hat Enlists ABB to Push OpenShift to the Industrial Edge

Red Hat and ABB today announced an alliance to drive adoption of the Kubernetes-based Red Hat OpenShift platform in industrial edge computing environments.

Nick Barcet, senior director for technology strategy within the CTO organization at Red Hat, says this alliance is the latest in what will become a series of partnerships aimed at extending containerized applications to edge computing platforms that have 2GB of RAM for processing data.

ABB has a long history of automating operational technology (OT) processes. The goal is to make it simpler to integrate IT and OT processes in a way that causes the least amount of disruption, says Barcet. The easiest way is via gateways running Red Hat OpenShift and Red Hat Enterprise Linux (RHEL) alongside ABB Ability Edgenius and the ABB Ability Genix Industrial Analytics and AI Suite software that integrate with edge computing devices, he notes.

As edge computing continues to evolve, organizations are moving to process and analyze data in near real-time at the point where it is created and consumed versus transferring every bit of raw data to the cloud or a local data center. Instead, only the aggregated results need to be transferred to applications that, for example, analyze historical data.
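As a rough illustration of that pattern, and not anything specific to the Red Hat or ABB products, the following Python sketch aggregates raw sensor readings locally and forwards only a compact summary upstream. The endpoint URL and the readings themselves are hypothetical placeholders.

```python
# Minimal sketch of edge-side aggregation: raw readings stay local,
# only a compact summary is forwarded upstream. The endpoint URL and
# reading source are hypothetical placeholders.
import json
import statistics
import urllib.request

RAW_READINGS = [21.4, 21.6, 22.1, 21.9, 22.3]  # e.g., a minute of sensor samples
CENTRAL_ENDPOINT = "https://example.com/ingest"  # placeholder, not a real service


def summarize(readings):
    """Reduce raw samples to the aggregate the central application needs."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(statistics.mean(readings), 2),
    }


def send_summary(summary):
    """POST only the summary across the WAN instead of every raw sample."""
    payload = json.dumps(summary).encode("utf-8")
    request = urllib.request.Request(
        CENTRAL_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status


if __name__ == "__main__":
    # In a real deployment, send_summary(summarize(RAW_READINGS)) would post
    # this aggregate upstream; the raw samples never leave the edge device.
    print(summarize(RAW_READINGS))
```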

That approach not only consumes less bandwidth but also enables organizations to reduce the carbon emissions generated by transferring raw data across a wide area network (WAN) to be processed elsewhere, says Barcet.

It’s not yet clear just how many application workloads will be processed at the edge or on what type of edge computing platform. There are already five distinct classes of edge computing platforms that could run the Red Hat OpenShift platform, says Barcet.

The biggest challenge, of course, is making it simpler for OT and IT professionals to collaborate as those applications are rolled out. OT teams typically want to be able to run their applications with as little outside interference from IT teams as possible, says Barcet.

Containers running on Kubernetes clusters make it a lot easier to deploy applications on edge computing platforms with limited compute capabilities. Depending on the type of edge computing platform, those containers can be deployed on bare-metal infrastructure. The challenge that immediately arises is finding a way to centrally manage the fleets of Kubernetes clusters that span an edge computing environment.
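As a rough sketch of what that fleet-management challenge involves, the Python snippet below uses the standard Kubernetes client library to walk every cluster context registered in a kubeconfig and report on one workload in each, which is the kind of per-cluster bookkeeping a fleet tool has to automate at scale. The context names and namespace are hypothetical, and production environments would typically rely on purpose-built tooling such as Red Hat Advanced Cluster Management or a GitOps controller rather than ad hoc scripts.

```python
# Rough sketch of the fleet-management problem: iterate over every edge
# cluster registered in a kubeconfig and report on one workload in each.
# The namespace is hypothetical; purpose-built tools (e.g., Red Hat
# Advanced Cluster Management) automate far more than this.
from kubernetes import client, config

NAMESPACE = "edge-workloads"  # hypothetical namespace used on every edge cluster


def report_deployments(kubeconfig_path=None):
    contexts, _ = config.list_kube_config_contexts(config_file=kubeconfig_path)
    for ctx in contexts:
        name = ctx["name"]
        # Build an API client scoped to this edge cluster's context.
        api_client = config.new_client_from_config(
            config_file=kubeconfig_path, context=name
        )
        apps = client.AppsV1Api(api_client)
        deployments = apps.list_namespaced_deployment(NAMESPACE)
        for dep in deployments.items:
            ready = dep.status.ready_replicas or 0
            wanted = dep.spec.replicas or 0
            print(f"{name}: {dep.metadata.name} {ready}/{wanted} replicas ready")


if __name__ == "__main__":
    report_deployments()
```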

One way or another, containers and Kubernetes clusters are heading toward the edge in increasing numbers. There may even come a day when there are more containers and Kubernetes clusters running at the edge than there are in the cloud.

In the meantime, IT teams familiar with containers should start educating their OT brethren about edge computing’s possibilities. The practice of OT teams deploying an isolated platform that is manually updated every few years is giving way to a new era of IT that will see applications at the edge continuously updated using containers that are simple to rip and replace whenever necessary.

Mike Vizard

Mike Vizard is a seasoned IT journalist with over 25 years of experience. He also contributed to IT Business Edge, Channel Insider, Baseline and a variety of other IT titles. Previously, Vizard was the editorial director for Ziff-Davis Enterprise as well as Editor-in-Chief for CRN and InfoWorld.
