Containers can cut costs and improve flexibility

Containerization is part of the move to mode 2 computing, says Colin McCabe, director of consulting and training at Red Hat. It is being used not only for newly developed applications but also for deploying packaged software, so it is important to talk to software vendors about their containerization strategy and how it fits in with yours.

More digital-focused organizations such as banks, airlines and retailers are deploying containerization faster, with the goal of getting business ideas into production more quickly, he says. This is especially true when containers are used in combination with platform-as-a-service (PaaS) offerings such as Red Hat OpenShift.

Containers are largely about abstracting the application from the operating system, so that a container (and therefore the application within it) can be moved from one platform to another, for example from a virtual machine on a local server to a PaaS.

Traditionally, virtual machines have been tied to a specific virtualization platform, explains McCabe. While tools have emerged to overcome this problem, moving between platforms isn’t as seamless as people would like. Containers get around this issue, making it easy to move workloads around, whether that’s from physical servers to cloud, from one cloud provider to another, or some other combination.

“The cost of running any application is falling dramatically,” he says, and to take further advantage of this you can get quotes for running a given container on various platforms: the portability of containers makes migration between providers viable.

When considering the adoption of containers, it is important to think about your organization’s position with regard to bimodal IT, he says. Mode 1 is traditional IT; mode 2 covers new projects where modern approaches (such as agile and DevOps) and technologies (eg, containers) can be put to work.

Mode 2 is “where the containers and microservices are going to come to the fore,” says McCabe. Organizations should make a decision that “all new IT projects will use this methodology,” he says, and then invest in the training of their staff.

But McCabe is not recommending the development of high levels of in-house expertise in every area. Rather, he suggests bringing in specialist advice (eg, regarding architecture or engineering) where necessary, and seeking input and assistance from vendors.

The management of containers – especially with regard to security – is very important. McCabe suggests containers should use a hardened version of Linux (eg, using SELinux), and care should be taken to ensure all the settings match the organization’s security policies. “The base Linux container is the piece people need to be concerned about” and it needs to be enterprise-grade, he says.

Microservices – pieces of code that provide a specific function for multiple applications and services – have become an established part of modern IT. Since such an environment consists of loosely coupled services, individual microservices can be moved around in a way that would be difficult or impossible with traditional, tightly coupled systems.
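To make the idea concrete, here is a minimal sketch of such a service in Go. The job it does (quoting an exchange rate) and the hard-coded rates are invented for illustration, not anything McCabe describes; the point is the shape: one small process, one function, exposed over HTTP so any application that can reach it can use it.

```go
package main

// A deliberately small, self-contained microservice: it does one job
// (quoting an exchange rate) and exposes it over HTTP. The endpoint
// name and the fixed rates below are illustrative assumptions only.

import (
	"encoding/json"
	"log"
	"net/http"
)

var rates = map[string]float64{ // invented rates, for illustration
	"AUD": 1.0,
	"USD": 0.65,
	"EUR": 0.60,
}

// convert answers requests such as GET /rate?to=USD with a JSON body.
func convert(w http.ResponseWriter, r *http.Request) {
	to := r.URL.Query().Get("to")
	rate, ok := rates[to]
	if !ok {
		http.Error(w, "unknown currency", http.StatusBadRequest)
		return
	}
	json.NewEncoder(w).Encode(map[string]float64{"rate": rate})
}

func main() {
	http.HandleFunc("/rate", convert)
	log.Fatal(http.ListenAndServe(":8080", nil)) // one process, one function
}
```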

Deploying microservices in microcontainers increases this flexibility, McCabe says, by allowing them to be moved to any platform that supports the container technology and that can reach the service bus used to link the various parts of the system.
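The consumer side is just as loosely coupled. As a sketch (again hypothetical, and using plain HTTP rather than a service bus for simplicity), the only thing a client needs is an address for the service, which the hosting platform can supply however it likes – here via an assumed SERVICE_URL environment variable:

```go
package main

// Hypothetical consumer of the rate service sketched above. The only
// coupling between the two is a URL taken from the environment, so the
// same binary (and the same container) works wherever the platform can
// route it to the service.

import (
	"encoding/json"
	"fmt"
	"log"
	"net/http"
	"os"
)

func main() {
	base := os.Getenv("SERVICE_URL") // e.g. http://rates:8080, set by the platform
	if base == "" {
		base = "http://localhost:8080"
	}
	resp, err := http.Get(base + "/rate?to=USD")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	var out map[string]float64
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		log.Fatal(err)
	}
	fmt.Println("AUD to USD rate:", out["rate"])
}
```

Because nothing else binds the two processes, either side can be redeployed to a different platform without changing the other, provided the address still resolves.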

Stephen Withers

Stephen Withers is one of Australia’s most experienced IT journalists, covering everything from gadgets to enterprise systems. In previous lives he has been an academic, a systems programmer, an IT support manager, and an online services manager. Stephen holds an honours degree in Management Sciences, a PhD in Industrial and Business Studies, and is a senior member of the Australian Computer Society.