Actifio Simplifies Copying Data Using Containers

Actifio today made available an update to its namesake copy data management software that makes it easier to clone databases consisting of terabytes of data using containers.

Chandra Reddy, senior vice president of product marketing, says application development teams increasingly are making use of containers to clone production databases. That approach makes it easier for those teams to work with the data that the applications they are building will need to access.

Actifio 10c adds eight capabilities that make it easier not only to move large amounts of data between databases but also to back up and restore virtual machines, physical servers and databases across both on-premises and cloud computing environments. In addition to on-premises environments, Actifio 10c now supports seven different cloud computing platforms, he notes.

Reddy says Actifio 10c can also deliver solid-state drive (SSD)-class performance at 20% of the cost by applying intelligent read/write caching to object storage after an instant mount and recovery directly from S3-compatible object storage. In many cases, applications now directly access data stored in multiple cloud services using the S3 application programming interface (API), developed by Amazon Web Services (AWS) and since adopted as a de facto standard.
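That portability is largely a property of the API itself: most S3 client libraries can be pointed at any compatible endpoint. The sketch below uses Python's boto3 library to show the pattern; the endpoint URL, credentials, bucket and object names are illustrative placeholders, not references to any particular service.

```python
import boto3

# boto3 can target any S3-compatible endpoint, not just AWS itself.
# The endpoint URL, keys, bucket and object names below are placeholders.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objects.example-cloud.com",  # non-AWS, S3-compatible store
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# The same call works unchanged against AWS S3 and compatible stores alike.
s3.download_file("backup-bucket", "db/clone-image.img", "/tmp/clone.img")
```

Because the calls are identical across providers, only the endpoint and credentials change when data moves between clouds, which is what makes the S3 API a practical lowest common denominator for copy data tools.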

With the rise of containers, Reddy says copying data between platforms is becoming easier. The challenge organizations face now is putting the tools and processes in place to keep track of what data has been copied for what purpose.

Copy data management software is increasingly an extension of the DevOps toolchain, he adds. Many organizations already leverage the REST APIs that Actifio provides to integrate its tools with continuous integration/continuous delivery (CI/CD) platforms such as Jenkins.
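A typical integration looks something like the sketch below, in which a pipeline step requests a fresh database clone over REST before tests run. The endpoint, payload and response fields here are hypothetical stand-ins for illustration, not Actifio's documented API.

```python
import requests

# Hypothetical base URL; the actual routes, payloads and authentication
# scheme of a given copy data management product will differ.
BASE_URL = "https://cdm.example.internal/api"

def provision_clone(session_token: str, app_id: str) -> str:
    """Ask the copy data manager for a fresh database clone for a CI job
    and return the mount point the test suite should connect to."""
    resp = requests.post(
        f"{BASE_URL}/clones",
        headers={"Authorization": f"Bearer {session_token}"},
        json={"application": app_id, "purpose": "ci-test"},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["mount_point"]
```

A Jenkins job would invoke a script like this at the start of a test stage and tear the clone down afterward, so every pipeline run gets disposable production-like data.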

The amount of data that DevOps teams must manage as part of the application development process continues to grow. Most DevOps teams don't have a dedicated member focused solely on data management. Longer-term, it remains unclear who will manage data in the age of DevOps, as the separation of duties between storage administrators, database administrators and DevOps teams is still in flux.

Regardless, each development team needs to be able to copy large amounts of data in a way that doesn’t slow down the application development process. At the same time, many organizations need to be able to show the chain of custody for any set of data to pass audits that are conducted regularly in highly regulated industries. That becomes increasingly difficult to accomplish when DevOps teams accelerate the rate at which applications are being built across an increasingly extended enterprise.

At the same time, the amount of structured data pouring into databases running on containers continues to increase. IT organizations are building more stateful containerized applications that require access to persistent forms of storage. In theory, it should be simple to move databases embedded within containers across platforms, assuming DevOps teams have access to tools that enable them to copy the data residing in those databases. Of course, actual DevOps mileage will vary depending on the quality of the data management tool employed.
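As a minimal sketch of that stateful pattern, the snippet below uses Docker's Python SDK to run a database container whose data lives in a named volume, so the state outlives the container and can be copied or reattached elsewhere. The image, names and password are illustrative, not tied to any specific product.

```python
import docker

client = docker.from_env()

# Create a named volume so the database files persist outside the container.
client.volumes.create(name="pgdata-clone")

# Run Postgres with its data directory bound to the named volume; the
# container can be destroyed and recreated without losing the data.
container = client.containers.run(
    "postgres:15",
    name="db-clone",
    environment={"POSTGRES_PASSWORD": "example"},  # placeholder credential
    volumes={"pgdata-clone": {"bind": "/var/lib/postgresql/data", "mode": "rw"}},
    detach=True,
)
```

Separating the data (the volume) from the runtime (the container) is precisely what lets copy data tools clone or relocate the database independently of the container hosting it.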

Mike Vizard

Mike Vizard is a seasoned IT journalist with over 25 years of experience. He has contributed to IT Business Edge, Channel Insider, Baseline and a variety of other IT titles. Previously, Vizard was editorial director for Ziff-Davis Enterprise and editor-in-chief of CRN and InfoWorld.