Red Hat Bolsters Cloud-Native and Edge Portfolio

Red Hat today at the KubeCon + CloudNativeCon North America conference unfurled a bevy of updates that promise to make it easier both to build and deploy applications on fleets of clusters based on its distribution of Kubernetes and to manage the overall environment.

The latest version of the Red Hat OpenShift platform makes generally available a single-node option for deploying a cluster at the network edge. The company already provides three-node cluster and remote worker node options.

Stu Miniman, director of market insight for cloud platforms at Red Hat, says the single-node option is designed for edge computing deployments that require a lighter-weight way to deploy a full-featured cluster that does not depend on a remote instance of Kubernetes for its control plane.
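For context, a single-node deployment is typically described in the installer's install-config.yaml by collapsing the control plane and worker roles onto one machine. The sketch below is a minimal, hypothetical example under that assumption; the domain, cluster name, disk path and credentials are placeholders, not a verbatim Red Hat reference configuration.

```yaml
# Hypothetical install-config.yaml sketch for a single-node cluster:
# one control-plane replica, zero dedicated workers, bootstrap-in-place
# on a local disk. All values are placeholders.
apiVersion: v1
baseDomain: example.com
metadata:
  name: sno-edge-site
controlPlane:
  name: master
  replicas: 1            # control plane and workloads share this one node
compute:
  - name: worker
    replicas: 0          # no separate worker nodes at the edge site
platform:
  none: {}
bootstrapInPlace:
  installationDisk: /dev/sda   # assumption: single local installation disk
pullSecret: '<pull-secret>'
sshKey: '<ssh-public-key>'
```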

As part of that focus on edge computing, the latest version of Red Hat Advanced Cluster Management for Kubernetes adds a preview of the ability to manage close to 2,000 single-node OpenShift clusters from a single Red Hat Advanced Cluster Management hub. Version 2.4 of Red Hat Advanced Cluster Management for Kubernetes also adds support for dual-stack IPv6 networking, the ability to create templates for implementing policies and a preview of an installer for the management framework that enables zero-touch provisioning. The company is also making available validated patterns for configuring edge computing deployments.
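As an illustration of what a policy looks like in Advanced Cluster Management, the sketch below shows a minimal, hypothetical Policy object that enforces the presence of a namespace on the managed clusters it is placed on; the names and namespaces are invented for the example, and the templating capability mentioned above would layer cluster-specific values on top of a definition like this.

```yaml
# Hypothetical ACM Policy sketch: enforce that an "edge-workloads"
# namespace exists on each managed cluster. Names are illustrative only.
apiVersion: policy.open-cluster-management.io/v1
kind: Policy
metadata:
  name: ensure-edge-namespace      # example policy name
  namespace: rhacm-policies        # example hub namespace
spec:
  remediationAction: enforce       # create the object if it is missing
  disabled: false
  policy-templates:
    - objectDefinition:
        apiVersion: policy.open-cluster-management.io/v1
        kind: ConfigurationPolicy
        metadata:
          name: edge-namespace-config
        spec:
          remediationAction: enforce
          severity: low
          object-templates:
            - complianceType: musthave
              objectDefinition:
                apiVersion: v1
                kind: Namespace
                metadata:
                  name: edge-workloads   # example namespace to enforce
```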

Red Hat also revealed today it has updated several of the tools it provides for building and deploying cloud-native applications. The Red Hat Quarkus framework for building Java applications now includes a continuous testing capability; a revamped command line interface (CLI) that eliminates the need for developers to use Gradle or Maven commands; an updated graphical user interface (GUI); and tools that automatically create, configure and connect to databases, message queues and other backend services. The latest version of Quarkus also provides previews of integrations with Funqy, a portable function application programming interface (API), and Knative, open source software for integrating Kubernetes with serverless computing frameworks.
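To give a sense of the Funqy programming model, the sketch below shows a minimal, hypothetical function written against the io.quarkus.funqy.Funq annotation; the package, class and method names are invented for illustration, and the idea is that the same function remains portable across Knative and other supported targets.

```java
// Hypothetical Funqy function: a single annotated method is exposed
// as a portable function. Names are illustrative only.
package org.acme;

import io.quarkus.funqy.Funq;

public class GreetingFunction {

    @Funq
    public String greet(String name) {
        // Funqy handles marshalling the input and output for the
        // underlying target (e.g., Knative), keeping the code portable.
        return "Hello, " + name;
    }
}
```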

OpenShift GitOps and OpenShift Pipelines have also been updated to provide deeper integration with portable open source pipelines built using the Tekton tools being advanced under the auspices of the Continuous Delivery Foundation. In addition, OpenShift Pipelines is now verified to work in fully managed environments such as Red Hat OpenShift Service on Amazon Web Services (AWS) and Red Hat OpenShift Dedicated. OpenShift GitOps, meanwhile, has been extended to better secure software supply chains with enhanced security controls, simplified multitenancy configurations and the ability to use OpenShift credentials to log into the open source ArgoCD platform.
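For readers unfamiliar with Tekton, a pipeline is declared as a Kubernetes object that chains reusable tasks. The sketch below is a minimal, hypothetical example that clones a repository and builds a container image; it assumes the git-clone and buildah ClusterTasks are available in the cluster, and the parameter values are placeholders.

```yaml
# Hypothetical Tekton Pipeline sketch: fetch source, then build an image.
# Assumes git-clone and buildah ClusterTasks exist; values are placeholders.
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: build-and-push
spec:
  params:
    - name: git-url
      type: string
    - name: image
      type: string
  workspaces:
    - name: shared-workspace
  tasks:
    - name: fetch-source
      taskRef:
        name: git-clone
        kind: ClusterTask
      params:
        - name: url
          value: $(params.git-url)
      workspaces:
        - name: output
          workspace: shared-workspace
    - name: build-image
      runAfter:
        - fetch-source
      taskRef:
        name: buildah
        kind: ClusterTask
      params:
        - name: IMAGE
          value: $(params.image)
      workspaces:
        - name: source
          workspace: shared-workspace
```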

Improvements have also been made to the security of the Red Hat OpenShift Serverless platform, which now includes the encryption of in-flight data through OpenShift Service Mesh. Red Hat has also made available a technology preview of OpenShift Serverless Functions, a development framework and tooling built on top of Knative that the company donated to the Knative project earlier this year.
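For context on Knative itself, a serverless workload is expressed as a Knative Service that scales with request traffic, including down to zero when idle. The sketch below is a minimal, hypothetical example with a placeholder container image and environment variable, not a Red Hat sample.

```yaml
# Hypothetical Knative Service sketch: Knative Serving manages revisions
# and scales the container with traffic. The image is a placeholder.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: greeter
spec:
  template:
    spec:
      containers:
        - image: quay.io/example/greeter:latest
          env:
            - name: GREETING
              value: "Hello from the edge"
```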

Red Hat, which operates as a unit of IBM, has also updated the Red Hat OpenStack Platform to provide tighter integration with the Kubernetes-based Red Hat OpenShift platform. Red Hat OpenStack Platform 16.2 also adds life cycle management tools; the ability to run virtual machines and cloud-native applications in parallel; support for third-generation Intel Xeon Scalable processors; expanded storage options; and support for Open Virtual Networking (OVN).

Finally, earlier this week Red Hat updated its Kube by Example training curriculum to include courses on using operators with Helm, Ansible and Go to package applications and on using open source Konveyor tools to modernize and migrate applications, along with a forthcoming artificial intelligence (AI) and machine learning (ML) for Kubernetes module.

It’s too early to say just how big the fleets of Kubernetes clusters that organizations deploy from the edge to the cloud will become. However, regardless of size, the tools to build, deploy and manage cloud-native applications at that level of distributed scale are finally available.

Mike Vizard

Mike Vizard is a seasoned IT journalist with over 25 years of experience. He has contributed to IT Business Edge, Channel Insider, Baseline and a variety of other IT titles. Previously, Vizard was the editorial director for Ziff-Davis Enterprise as well as Editor-in-Chief for CRN and InfoWorld.
