How Infor Cloud uses containers

Infor offers more than 45 of its enterprise applications in the cloud. Infor Cloud has achieved in excess of 99.9% historical uptime and serves more than 2800 customers with some 35 million users. Infor Cloud provides a high-performance experience for its end users and rapidly delivers application feature enhancements with the aid of Docker-based deployments.

About four years ago, Infor started moving its cloud deployments from various datacenters to Amazon Web Services. Today, all Infor applications are deployed on AWS and use a deployment architecture that spans three availability zones to ensure high availability for its end users. The use of three AWS availability zones allows Infor to maintain uptime even in the event of a disaster that results in the complete loss of a zone.

Vice president of development operations Amul Merchant said the complete Infor Cloud datacenter is automated, from setting up the infrastructure and deploying applications, right through the complete application lifecycle management process. Infor's completely software-defined data center has minimized variability, increased speed and improved security. The introduction of containers has further optimized DevOps processes by reducing the variability between development and production environments.

While there is an upfront labor cost in creating these end-to-end automated processes for application deployment and lifecycle management, they can be reused for worldwide deployment resulting in efficiency, repeatability, speed and consistency each time. Automation has “a significant benefit in terms of velocity,” he said, and the rapid delivery of enhancements to end users would not be possible without automation. Some developers are initially skeptical that the costs will outweigh the benefits, but are persuaded when they see the results for themselves.

As an example of the use of containers within Infor Cloud, Merchant described the way the Infor BI application is required to scale rapidly to process large numbers – sometimes millions – of business object documents in the form of XML files within a short time. This wide fluctuation in volume coupled with a requirement that processing must be done efficiently and without being a ‘noisy neighbor’ to other tenants meant that a highly reactive microservices-based architecture was appropriate. Each microservice is deployed in its own container, and handles tasks including shredding the XML objects into database records, converting the data into dimensions and building a hierarchy. The use of containerization allows each microservice to be individually and rapidly scaled up or down as required.
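As a minimal sketch of the "shredding" step described above (the element names and record layout are illustrative assumptions, not Infor's actual schema), a microservice of this kind essentially flattens each incoming XML business object into rows suitable for database insertion:

```python
import xml.etree.ElementTree as ET

def shred_order(xml_text):
    """Flatten one XML business object into row dicts, one per line item.

    The element names (Order, OrderID, Line, Item, Qty) are invented
    for illustration; the article does not describe Infor's schemas.
    """
    root = ET.fromstring(xml_text)
    order_id = root.findtext("OrderID")
    rows = []
    for line in root.iter("Line"):
        rows.append({
            "order_id": order_id,
            "item": line.findtext("Item"),
            "qty": int(line.findtext("Qty", default="0")),
        })
    return rows

doc = """
<Order>
  <OrderID>42</OrderID>
  <Line><Item>widget</Item><Qty>3</Qty></Line>
  <Line><Item>gadget</Item><Qty>1</Qty></Line>
</Order>
"""
print(shred_order(doc))
```

Because each such transformation is stateless and self-contained, packaging it in its own container is what lets the shredding stage scale independently of the dimension-building and hierarchy stages.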

Mesos is used for resource management; monitoring servers; tracking memory, CPU, disk and port usage; launching Docker containers; and essentially making the server farm look like one big server. Marathon provides health checks of services, working with Mesos to efficiently utilize the available resources and distribute services across AWS availability zones. ZooKeeper is used for leader election and to ensure high availability of these various components.
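To make the division of labour concrete, a Marathon application definition for one such containerized service might look like the following sketch. The image name, resource figures and health-check thresholds are assumptions for illustration; only the field names come from Marathon's app-definition format:

```python
import json

# Illustrative only: the service id, image, sizes and thresholds below
# are invented, not details reported in the article.
app = {
    "id": "/bi/xml-shredder",
    "instances": 3,          # e.g. one per availability zone
    "cpus": 0.5,
    "mem": 512,
    "container": {
        "type": "DOCKER",
        "docker": {
            "image": "registry.example.com/bi/xml-shredder:1.4",
            "network": "BRIDGE",
        },
    },
    # Marathon replaces tasks that fail this check, giving the
    # service-health behaviour described above.
    "healthChecks": [{
        "protocol": "HTTP",
        "path": "/health",
        "intervalSeconds": 10,
        "maxConsecutiveFailures": 3,
    }],
}
print(json.dumps(app, indent=2))
```

In practice a payload like this is POSTed to Marathon's `/v2/apps` endpoint; Marathon then asks Mesos for matching resource offers and launches the Docker containers across the cluster.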

Since the load on this multi-tenant BI application is highly fluctuating and unpredictable, it was important to use a highly reactive architecture that could respond quickly to changing demands. Docker containers met this requirement as they can be started and stopped in much less time than the five minutes or so it takes to spin up a new EC2 virtual machine. Merchant said “The application architecture being broken down into microservices allowed scaling of individual microservices component as required by the amount of work queued for each component instead of scaling up or down a monolithic application that would have resulted in less resiliency and more wastage of resources.”
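The per-microservice scaling logic Merchant describes can be sketched as a simple function of each component's own queue depth. The thresholds and bounds here are invented for illustration; the point is that each service computes its own instance count rather than scaling the whole application:

```python
import math

def desired_instances(queued_docs, docs_per_instance=1000,
                      min_inst=1, max_inst=50):
    """Pick an instance count for one microservice from its queue depth.

    docs_per_instance, min_inst and max_inst are assumed tuning
    parameters, not figures from the article.
    """
    wanted = math.ceil(queued_docs / docs_per_instance)
    return max(min_inst, min(max_inst, wanted))

# A quiet queue keeps one warm instance; a burst of millions of
# documents scales out to the configured ceiling.
print(desired_instances(0))          # minimum floor
print(desired_instances(2_500_000))  # capped at max_inst
```

The fast start-up of Docker containers is what makes a loop like this responsive: the decision can be acted on in seconds rather than waiting minutes for new virtual machines.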

The combination of microservices architecture and Docker ecosystem of supporting applications “was an easy decision for us,” he said. When building applications for web scale deployment, implementing functions as microservices that run in Docker containers “is almost a no-brainer.”

This approach also enabled continuous delivery by allowing developers to deliver prebuilt containers that could be deployed in staging and production without the traditional problems of variability of configuration between the various environments. “The requirement to deliver containers puts the onus on our developers to deliver working and tested environments,” said Merchant. As a side benefit, developers are heavily engaged in how the production systems work: “everybody feels more ownership.”

Stephen Withers

Stephen Withers is one of Australia's most experienced IT journalists, covering everything from gadgets to enterprise systems. In previous lives he has been an academic, a systems programmer, an IT support manager, and an online services manager. Stephen holds an honours degree in Management Sciences, a PhD in Industrial and Business Studies, and is a senior member of the Australian Computer Society.