IBM’s Container Efforts Powerful, But Pose Security Challenges

IT organizations may not have the requisite security controls to take advantage of new container-related tools and services IBM announced this month.

Traditional on-premises security tools, even those configured to ward off modern-day threats, largely lack the monitoring capabilities to handle the jump in data traffic and the complexity of the new container services launched by IBM, as well as by Amazon, Microsoft and others, says Rani Osnat, vice president of marketing at Aqua Security.

“Traditional security tools that weren’t created to handle containers and microservices lack the visibility and enforcement points to handle them, leaving IT blind to their activity,” he says.

Security aside, IBM sought to entice developers with a number of announcements this month championing boosts in container performance. During the IBM Think 2018 conference last week, for example, IBM said IBM Cloud Private will offer new container versions of its app development, management and testing tools, including API Connect, UrbanCode and Netcool, for multiple clouds and on-premises systems. IBM also said IBM Cloud Private will add support later this year for Windows containers running .NET apps, complementing its existing support for Docker containers and Cloud Foundry.

IBM announced earlier this month that it has begun enabling Kubernetes containers to run as managed services directly on bare-metal servers on cloud networks. In theory, this should improve data processing speeds, because the containers run directly on bare-metal hardware rather than sharing resources with applications running on virtual machines.

However, as mentioned above, IBM's new container services, like other relatively new high-powered architectures such as AWS Fargate and Microsoft Azure Container Instances (ACI), lack the comprehensive security tooling IT organizations require to deploy them, Osnat says.

A host-based endpoint security solution, for example, might be adequate to monitor container processes running on a host, but will lack the application context to analyze and monitor containers running on different infrastructures, he notes.

“Traditional network security solutions may be able to monitor connections between containers and microservices on the same host to some extent, but will also have a hard time coping with ephemeral and highly dynamic network traffic between them on different virtual machines and servers. This is because they may use network overlays that those tools aren’t familiar with,” Osnat says. “The consequence might be lack of visibility and control, as well as false positives banning legitimate traffic, which is just as bad.”

In parallel, network attacks are increasing in number and scope, underscoring the importance of properly vetted security protocols and tools for new container services, however challenging that may be. Large-scale distributed denial-of-service (DDoS) attacks, for example, which have posed a threat for years, are becoming more frequent and destructive, according to the IBM X-Force Threat Intelligence Index 2017. While 100 Gbps attacks were until recently considered exceptional, by 2016 attacks of that magnitude had become almost routine, according to the report.

Security tools specific to container deployments are available, yet they remain inadequate. Open source Kubernetes ships with several built-in security and monitoring capabilities, but its network segmentation and role-based access controls “are not very secure and require a lot of work to configure properly,” Osnat says.
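Configuring Kubernetes’ built-in controls properly typically means writing explicit NetworkPolicy objects rather than relying on the cluster’s default-allow behavior. As an illustrative sketch (the namespace and labels here are hypothetical, not drawn from the article), an operator might first deny all ingress in a namespace, then allowlist only the traffic an app actually needs:

```yaml
# Illustrative default-deny ingress policy; namespace and labels are placeholders.
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: default-deny-ingress
  namespace: payments          # hypothetical namespace
spec:
  podSelector: {}              # empty selector matches all pods in the namespace
  policyTypes:
    - Ingress                  # no ingress rules listed, so all inbound traffic is denied
---
# Explicitly allow only frontend pods to reach the API pods on port 8080.
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: allow-frontend-to-api
  namespace: payments
spec:
  podSelector:
    matchLabels:
      app: payments-api        # hypothetical label
  policyTypes:
    - Ingress
  ingress:
    - from:
        - podSelector:
            matchLabels:
              app: frontend    # hypothetical label
      ports:
        - protocol: TCP
          port: 8080
```

Note that NetworkPolicy objects are only enforced if the cluster’s network plugin supports them, which is one reason such configuration “requires a lot of work” to get right.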

Deploying containers requires rethinking security in many ways compared with on-premises security management. Security monitoring, for example, must be able to handle multiple deployments that scale across different infrastructures simultaneously. Tools for testing, authentication, authorization, hardening and other best practices must be automated and able to handle the complexity of containers, such as when they are deployed across multiple infrastructures, Osnat says.

Enterprise culture and practices are also key. A well-functioning DevOps team, for example, should be able to integrate the necessary security tools and protocols when using a third-party container service, whether that is IBM’s new container tools, cloud Kubernetes on bare metal or AWS Fargate. “I wouldn’t see any issues [with proper DevOps deployments],” notes Holger Mueller, an analyst at Constellation Research. “More security tools for Kubernetes are also being developed.”

B. Cameron Gain

B. Cameron Gain first began writing about technology when he hacked the Commodore 64 family computer in the early 1980s and documented his exploit. Since his misspent youth, he has put his obsession with software development to better use by writing thousands of papers, manuals, and articles for both online and print. His byline has appeared in Wired, PCWorld, Technology Review, Popular Science, EEtimes, and numerous other media outlets.
