Red Hat Pushes OpenShift to the Network Edge

At Mobile World Congress today, Red Hat extended its push to make the Kubernetes-based Red Hat OpenShift platform a foundation for edge computing, expanding its alliance with NVIDIA and announcing partnerships with Samsung, Airspan, Druid Software and OMRON.

NVIDIA will work with Red Hat using NVIDIA converged accelerators and the NVIDIA Aerial software development kit (SDK) to enable software-defined 5G virtual radio access network (vRAN) deployments on industry-standard servers across hybrid and multi-cloud computing environments. NVIDIA graphics processing units (GPUs), data processing units (DPUs) and converged accelerators, which combine an NVIDIA Ampere GPU with an NVIDIA BlueField-2 processor in a single package, are designed to disaggregate compute, storage and networking by isolating processing tasks.

Similarly, Red Hat is pledging to work with Samsung to create a vRAN platform based on Red Hat OpenShift, Red Hat Enterprise Linux (RHEL), Red Hat Advanced Cluster Management for Kubernetes and the Red Hat Ansible Automation Platform. In addition, Red Hat is working with Airspan and Druid Software to validate interoperability between Red Hat OpenShift and container network functions (CNFs).

Finally, OMRON has committed to using Red Hat OpenShift to employ Kubernetes as the control platform to manage industrial systems and processes.

Azhar Sayeed, global chief architect for Red Hat, said as more workloads are deployed at the network edge, organizations are building new classes of response-and-react applications. These event-driven applications typically invoke cloud resources in near-real-time after data has first been processed and analyzed at the network edge, he noted.
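The pattern Sayeed describes can be sketched in a few lines. The following is a hypothetical illustration of the edge-then-cloud flow, not Red Hat or NVIDIA code: all function names (`analyze_at_edge`, `handle_event`) and the threshold value are invented for the example. Raw readings are analyzed locally at the edge, and only a compact summary is forwarded to a cloud service when an event warrants it.

```python
import statistics

def analyze_at_edge(readings, threshold=75.0):
    """Reduce raw sensor readings to a small summary at the edge."""
    return {
        "mean": statistics.mean(readings),
        "max": max(readings),
        "alert": max(readings) > threshold,
    }

def handle_event(readings, invoke_cloud):
    """Event-driven handler: analyze locally, call the cloud only when needed."""
    summary = analyze_at_edge(readings)
    if summary["alert"]:
        # Only the summary, not the raw data, crosses the network.
        invoke_cloud(summary)
    return summary

# Example usage: a list stands in for a cloud API call (e.g. an HTTPS POST).
sent = []
summary = handle_event([70.2, 71.0, 80.5], sent.append)
```

In this sketch the cloud is invoked only on the alert path, which is what keeps the response-and-react loop fast: most events are absorbed at the edge, and the cloud sees a trickle of summaries rather than the full data stream.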

Deployments of Kubernetes at the edge are still in their early days, but the number of use cases involving stateful applications infused with machine learning algorithms that process large amounts of data is already increasing. As a result, the number of applications that will need to be built and deployed at the edge using DevOps best practices appears set to grow at an exponential pace.

As application workloads become more geographically distributed, it’s also inevitable that organizations will rely more on artificial intelligence to manage IT operations (AIOps), said Sayeed. It’s not going to be feasible for already shorthanded IT teams to manage those environments without relying on AI, he added.

It’s fairly clear that as the relationship between edge and cloud computing continues to evolve, a new era of IT is dawning. There may even come a day when more workloads run at the network edge than in the cloud. Many of those workloads, however, will be able to dynamically invoke additional cloud resources on demand.

In the meantime, IT teams will need to determine how best to organize themselves as they are increasingly augmented by AI. It’s not likely IT professionals will be replaced by AI; instead, they will be able to manage IT at unprecedented levels of scale. The challenge and the opportunity now lie in building and deploying new classes of applications that are, by definition, far more complex than anything that has come before.

Mike Vizard

Mike Vizard is a seasoned IT journalist with over 25 years of experience. He also contributed to IT Business Edge, Channel Insider, Baseline and a variety of other IT titles. Previously, Vizard was the editorial director for Ziff-Davis Enterprise as well as Editor-in-Chief for CRN and InfoWorld.
