
    VMs and Containers: The Newest Enterprise Frenemies

    Containers add another layer of virtualization to an already virtualized data environment, but they are by no means a complete replacement for the virtual machine that first broke the dependency between software and hardware.

    That leads to the question: When is it appropriate to use containers, and when should the enterprise stick with the tried-and-true VM?

    Increasingly, the answer is turning out to be: whenever VM management and overhead start to impede performance. As the latest wave of container optimization tools shows, the focus is shifting to building and deploying containerized environments quickly and moving them across hybrid infrastructure in a highly dynamic fashion. Sometimes that requires help from a VM; sometimes it doesn't.

    Microsoft, for example, recently launched the Azure Container Instances platform that allows organizations to deploy containers in seconds using a set of Role-Based Access Control (RBAC) options that take the place of full VM management tiers and cluster orchestration tools. Users simply choose the number of virtual CPUs, memory and other resources they require and launch the containerized code. The platform provides multiple templates and CLIs to make deployment even easier, with access to Docker Hub or other public or private repositories, as well as the Kubernetes management stack. Oddly, the system only supports Linux containers at the moment, although Windows support is said to be in the works.
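
    To make that resource-selection step concrete, the sketch below shows roughly what such a deployment can look like through Azure's Python management SDK rather than the CLI. It is a minimal sketch only: the subscription, resource group, region and image names are placeholders, and exact class and method names vary somewhat between SDK versions.

        # Rough sketch of launching a Linux container group on Azure Container
        # Instances with the azure-mgmt-containerinstance Python SDK.
        # "my-rg", "aci-demo" and the sample image are illustrative placeholders.
        from azure.identity import DefaultAzureCredential
        from azure.mgmt.containerinstance import ContainerInstanceManagementClient
        from azure.mgmt.containerinstance.models import (
            Container, ContainerGroup, ResourceRequests, ResourceRequirements)

        client = ContainerInstanceManagementClient(
            credential=DefaultAzureCredential(),
            subscription_id="<subscription-id>")

        # Choose the vCPU and memory the container needs -- no VM to size or patch.
        container = Container(
            name="hello",
            image="mcr.microsoft.com/azuredocs/aci-helloworld",
            resources=ResourceRequirements(
                requests=ResourceRequests(cpu=1.0, memory_in_gb=1.5)))

        group = ContainerGroup(
            location="eastus",
            os_type="Linux",  # Windows support was still pending at the time
            containers=[container])

        # Create (or update) the group; the container is typically running in seconds.
        client.container_groups.begin_create_or_update("my-rg", "aci-demo", group)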

    Meanwhile, converged infrastructure developer Datrium is offering a way to run containers within VM-centric infrastructure without losing their agility and scalability. This is particularly crucial for deploying containers on converged and hyperconverged infrastructure, says Silicon Angle's R. Danes, much of which has been built around VM-style virtualization. Rather than mashing compute and storage together on the server, Datrium separated the persistent storage layer from I/O processing to give the stateful storage environments that containers need more room to scale. At the same time, it added new management tools for container orchestration and automated cloning.

    Of course, VMware has a vested interest in keeping traditional virtualized infrastructure alive, but it also sees the potential for containers in key applications, says ServerWatch's Sean Michael Kerner. The company has devised a two-pronged container strategy that targets both legacy virtualization customers and new cloud-facing enterprises. The vSphere Integrated Containers (VIC) platform offers seamless integration of containers into traditional platforms like vSphere and vCenter, while the new Photon platform acts as a host for fully containerized, cloud-native stacks. In this way, the company says it can provide all the advantages of both VM and container functionality without causing undue disruption to existing operations.

    But if containers are so much more portable and agile than VMs, why would anyone want to use both? In many cases, says Tech Republic's Keith Townsend, housing a container within a VM can enhance its portability, particularly when stateful applications are in play. A standalone container can run on multiple distributions of Linux, but wrapping it in a VM, say an Ubuntu guest running on VirtualBox, can extend that portability to public clouds like Amazon and Azure without modification. And when those apps require a stateful environment, the VM can preserve that state even if the underlying hardware fails.
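
    One hypothetical way to picture the stateful half of that argument: inside the guest VM, the container's data can live on a persistent Docker volume that travels with the VM image, so moving the VM moves the state along with it. The sketch below assumes the Docker SDK for Python is available in the guest; the volume, container and database image names are illustrative choices, not anything Townsend prescribes.

        # Minimal sketch: run a stateful container inside the guest VM with its
        # data on a named Docker volume. The volume (and the VM around it) can
        # be carried to another host or cloud without changing the application.
        import docker

        client = docker.from_env()

        # A named volume persists independently of any single container instance.
        client.volumes.create(name="pgdata")

        db = client.containers.run(
            "postgres:15",
            name="orders-db",
            detach=True,
            environment={"POSTGRES_PASSWORD": "example"},
            volumes={"pgdata": {"bind": "/var/lib/postgresql/data", "mode": "rw"}})

        print(db.status)  # e.g. "created" or "running"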

    Enterprise workloads are growing larger and more diverse in their content and infrastructure requirements, so it seems the days of one technology simply supplanting another are over. Containers may be a newer form of virtualization, but all indications are that they will work within and alongside virtual machines even as they run directly on bare metal in key situations.

    Like an auto mechanic, the enterprise needs a lot of different tools to keep the data engine humming.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.

