Hyperconverged infrastructure (HCI) may be the latest form of IT hardware, but in one respect it is a lot like the old: It presents a choice between an all-software deployment on commodity hardware and a pre-integrated, vendor-specific solution.
And just as with traditional infrastructure, both approaches have their merits; otherwise, the entire data center industry would have gravitated toward one of them by now.
According to Network World’s John Edwards, the choice comes down to priorities: a software solution may be best when initial costs must be kept as low as possible, while an integrated system appeals when deployment and operational efficiency matter most, since many integrated approaches are closing in on plug-and-play functionality. A key advantage of the all-software approach is the ability to upgrade hardware under the same operating license, along with greater flexibility in configuring physical infrastructure, since you can choose from a wider range of compute, storage and networking options. This requires a certain level of skill, however: first to configure the software for a given box, and then to ensure that the entire environment is fully integrated across all the disparate pieces of hardware.
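To make that tradeoff concrete, here is a minimal sketch of a three-year cost comparison. All of the figures are hypothetical placeholders rather than vendor pricing or numbers from Edwards’ analysis; the point is simply that lower hardware and licensing costs on the software side can be offset by heavier deployment and administration labor.

```python
# Illustrative only: rough three-year cost comparison of a software HCI stack on
# commodity servers versus a pre-integrated appliance. All figures below are
# hypothetical placeholders, not actual vendor pricing.

def three_year_cost(hardware, software_license, deploy_hours,
                    admin_hours_per_year, hourly_rate=150):
    """Hardware + licensing + labor for deployment and three years of operation."""
    labor = (deploy_hours + 3 * admin_hours_per_year) * hourly_rate
    return hardware + software_license + labor

# Hypothetical inputs: the software route saves on hardware and licensing but
# demands more integration work up front and more ongoing administration.
software_route = three_year_cost(hardware=60_000, software_license=45_000,
                                 deploy_hours=120, admin_hours_per_year=200)
appliance_route = three_year_cost(hardware=140_000, software_license=0,
                                  deploy_hours=24, admin_hours_per_year=80)

print(f"All-software on commodity hardware: ${software_route:,.0f}")
print(f"Integrated appliance:               ${appliance_route:,.0f}")
```

Run with different assumptions, the same arithmetic can just as easily favor the appliance, which is exactly why the decision hinges on an organization’s own cost of labor and tolerance for integration work.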
There is also a tendency to equate software solutions with open source, but this is not necessarily the case. Red Hat, for instance, offers a community-based approach to HCI, which relies heavily on existing Linux products like the RHEL operating system and the Open Virtual Network (OVN) stack. But the platform still requires a validated server configuration, such as the HPE ProLiant DL360, to achieve optimal performance. In this way, it is not far removed from VMware’s vSAN software-defined storage solution, which requires a qualified vSAN ReadyNode, or the NSX virtual networking solution, which is also based on OVN.
In some cases, however, it might make sense to go with an integrated appliance for safety’s sake. Rugged environments that need on-site infrastructure for IoT or other purposes, for instance, might call for a hardened solution like Pivot3’s Intelligent Edge Command and Control appliance. The device consists of four servers with NVMe SSDs inside ECS’s Loadmaster rackmount case. The system was designed for military and civilian law enforcement applications, but it is essentially a data center in a box, so it can support virtually any workload.
Some vendors, meanwhile, claim to take HCI to a new level with solutions that bring both local and cloud-based resources under a single management plane. Datrium’s DVX platform, which is available as software only or as an integrated server solution, stresses a two-layer infrastructure for hybrid environments that merges VM-level support with real-time analytics. With this approach, the company claims a 10-fold improvement in both speed and scale over existing HCI environments. Datrium recently drew a round of funding from Samsung and Icon Ventures, which the company says will augment efforts to integrate functions like backup and disaster recovery into the platform’s automated management system.
As a practical matter, IT executives should not get too caught up in the hardware vs. software debate this time. The better strategy these days is to focus on end results and then reverse-engineer the solution from there. For the most part, the final decision will come down to whether cost and flexibility trump speed of deployment and ease of use. It’s likely that most organizations will eventually find themselves with a mix of solutions tailored to specific use cases.
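One way to picture that “start from end results” approach is a simple weighted scoring exercise: rank the outcomes that matter, then score each deployment model against them. The weights and scores below are hypothetical, chosen only to show the mechanics; they are not drawn from the article or any vendor benchmark.

```python
# Illustrative sketch: score HCI deployment options against weighted requirements.
# Weights and scores are hypothetical and exist only to show the comparison mechanics.

requirements = {            # weight reflects how much each outcome matters (sums to 1.0)
    "initial_cost": 0.35,
    "hardware_flexibility": 0.25,
    "deployment_speed": 0.25,
    "ease_of_operation": 0.15,
}

options = {                 # 1 (poor) to 5 (strong) against each requirement
    "all-software on commodity hardware": {
        "initial_cost": 5, "hardware_flexibility": 5,
        "deployment_speed": 2, "ease_of_operation": 2,
    },
    "integrated appliance": {
        "initial_cost": 2, "hardware_flexibility": 2,
        "deployment_speed": 5, "ease_of_operation": 5,
    },
}

for name, scores in options.items():
    total = sum(requirements[req] * scores[req] for req in requirements)
    print(f"{name}: {total:.2f}")
```

Different workloads will assign different weights, which is why the mixed estate described above is the most likely end state for most organizations.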
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.