HPE Synergy Shows Promise And Progress With Early Adopters

By Patrick Moorhead - August 30, 2016
Hewlett Packard Enterprise has placed a big bet with HPE Synergy: the company is a pioneer in the composable infrastructure market and is the furthest along in customer enablement. In the rapidly changing world of IT, composable infrastructure could be the next big thing in enterprise infrastructure. Designed to treat hardware like software (what is often referred to as "infrastructure as code"), it can allocate the optimal resources for each application, with the goal of lowering infrastructure costs, providing resource flexibility, and accelerating time-to-market for customers. HPE Synergy was launched in December 2015 and touted as the first platform in the market purposefully built for composability (read more here).

Hewlett Packard Enterprise is making progress with building out the composable infrastructure ecosystem, and though it is still too early to say definitively, the company is seeing some success with Synergy's early beta customers.

HudsonAlpha Institute for Biotechnology, a nonprofit specializing in genomics research, education, and medical treatment, was HPE Synergy's first customer (you can read our full case study here). Genomics is a highly data-intensive field (the institute generates more than a petabyte of data a month), and in order to handle the intense workload demands, HudsonAlpha had to rethink its infrastructure. HPE Synergy promised the flexibility and compute power it needed to get the job done.

The solution is well aligned with HudsonAlpha's existing strategy; the institute already manages its infrastructure via resource pools. HudsonAlpha says Synergy's Direct Attach Storage (DAS) simplifies storage for maximum efficiency, a must when dealing with such large volumes of data. Hewlett Packard Enterprise's partnership with Docker is also a selling point, as HudsonAlpha views containers as critical for the delivery of microservices. In addition, HudsonAlpha says HPE Synergy delivers the agility needed for collaboration between thousands of researchers worldwide; the platform gets users up and running quickly with new applications.

As it currently stands, HudsonAlpha is in the beta stage of deployment, but it has started running production-level workloads on HPE Synergy. In a testament to the platform's ease of installation, HudsonAlpha was able to set up the hardware and complete the install process in-house before the HPE support team even arrived. The institute is currently using Docker Swarm, Docker Machine, and DevOps tools like Vagrant on top of Synergy. It has built its own templates for the platform to allow smoother transitions between tenants, and developers have begun to deploy their own workloads to the hardware without requiring the assistance of operations.

According to Jim Hudson (co-founder and chairman of the institute), an analysis of the human genome that used to take about two days to complete can now be accomplished in 45 minutes with HPE Synergy, an impressive jump. As HudsonAlpha's existing infrastructure is swapped out for Synergy, the institute says it will continue to measure gains in efficiency and capability by comparing the two. I think we're going to continue to see good results.

Other early testimonials of HPE's new solutions have also been positive. Rich Lawson, Senior IT Architect at Dish Network (one of the first 100 Synergy customers), praised Synergy's flexibility and its ability to unlock the full potential of the public cloud. Greg Peterson (VP of HPE Solutions at Avnet, Inc.) lauded HPE Hyper Converged 380's ease of deployment and management, saying that "the solution works as advertised." We'll continue to monitor as more early adopters report back on their experiences with HPE Synergy, but so far it's looking pretty good.

The other, very important piece of the puzzle is the work that HPE is doing to expand the composable infrastructure ecosystem. I've said it before, and I'll say it again: I think HPE "gets strategic partnering," even though the company-wide approach is new. They've spent the first half of 2016 integrating HPE OneView with tools from their partners: Docker, Chef, nLyte, Eaton, SaltStack, Ansible, and VMTurbo, just to name a few. The crux of the entire composable movement is to make it easier for customers to drive automation with whatever tools they already have (a rough sketch of what that kind of automation can look like appears at the end of this piece). Expanding the composable ecosystem is going to be an ongoing task for years to come, but HPE appears to be making good strides through its collaborations with many partners.

In conclusion, I'm not quite ready to call HPE Synergy a composable slam dunk yet; signs are looking positive, but it's still too early in the testing period to say. I do feel comfortable saying that these proof points are an indicator that HPE can deliver on its promise of composable infrastructure. It's not just a nice buzz-phrase anymore; it's a viable way of doing things, and it's only going to get more viable as HPE continues to build out the composable ecosystem. If HPE Synergy's beta customers continue to report positive results, I think we could be looking at a big shift in enterprise infrastructure.
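To make the "infrastructure as code" idea a bit more concrete, here is a minimal sketch of what composing resources from a script could look like: it creates a server profile from a template over HPE OneView's REST API. This is illustrative only; the appliance address, credentials, template and profile names, and the exact endpoint paths and payload fields are assumptions based on publicly documented OneView behavior and may differ by version.

```python
# Illustrative sketch of "infrastructure as code" against HPE OneView's REST API.
# Appliance address, credentials, template name, endpoint paths, and payload fields
# are assumptions for illustration; consult the OneView API docs for your version.
import requests

APPLIANCE = "https://oneview.example.com"  # hypothetical appliance address
HEADERS = {"X-Api-Version": "300", "Content-Type": "application/json"}

# 1. Authenticate and attach the session token to subsequent requests.
login = requests.post(APPLIANCE + "/rest/login-sessions",
                      json={"userName": "administrator", "password": "secret"},
                      headers=HEADERS, verify=False)
HEADERS["Auth"] = login.json()["sessionID"]

# 2. Find an existing server profile template by name (e.g., one describing the
#    compute, storage, and network personality of a Docker Swarm node).
templates = requests.get(APPLIANCE + "/rest/server-profile-templates",
                         headers=HEADERS, verify=False).json()["members"]
template = next(t for t in templates if t["name"] == "docker-swarm-node")

# 3. "Compose" a new server profile from that template, which the appliance can
#    then apply to available hardware: hardware treated like software.
new_profile = {
    "name": "swarm-node-01",
    "serverProfileTemplateUri": template["uri"],
}
resp = requests.post(APPLIANCE + "/rest/server-profiles",
                     json=new_profile, headers=HEADERS, verify=False)
resp.raise_for_status()
```

Partner integrations of the kind mentioned above (Ansible, Chef, and the rest) effectively wrap calls like these, which is why plugging OneView into the tools customers already use is so central to the composable pitch.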

Patrick founded the firm based on his real-world technology experience and an understanding of what he wasn't getting from analysts and consultants. Ten years later, Patrick is ranked #1 among technology industry analysts in terms of "power" (ARInsights) and "press citations" (Apollo Research). Moorhead is a contributor at Forbes and frequently appears on CNBC. He is a broad-based analyst covering a wide variety of topics including the cloud, enterprise SaaS, collaboration, client computing, and semiconductors. He has 30 years of experience, including 15 years of executive experience at high tech companies (NCR, AT&T, Compaq, now HP, and AMD) leading strategy, product management, product marketing, and corporate marketing, including three industry board appointments.