HPE Discover Conference Day Two: Top Industry Analyst Takeaways

Late last week, I published my Day One overview analysis of the Hewlett Packard Enterprise Discover event, which I attended June 7-9, and today I’ll continue with my overview analysis of Day Two. Day One was dominated by customer testimonials and product announcements enveloped in thought-leadership discussions related to the “Idea Economy” and Digital Transformation. Event “day twos” are usually anticlimactic, but HPE had some interesting things to say. Day Two’s session contained several more product announcements and testimonials, but towards the end it took a more theoretical slant than the prior day’s session.


Here are my top 5 takeaways from Day Two, in order of their occurrence in the session, plus a note on what wasn’t discussed.

1/ Digital Disruption theme: If the theme of Day One’s general session was digital transformation, the focus of Day Two was all about the outcome of that transformation—digital disruption.

While disruption has been happening throughout human history, Hewlett Packard Enterprise CEO Meg Whitman emphasized that the difference today is the speed of these disruptions. Thanks to new, affordable, and accessible technology (such as cloud computing, IoT, and big data analytics), new competitors have the ability to spring up virtually overnight, according to Whitman “at a pace and scale we haven’t seen.” She further emphasized that the companies that succeed will be the ones with both the vision and the technological capabilities to respond to threats and turn their ideas into reality.

I tend to agree with this, but explain it differently. Without getting too “Singularity” on you, think of it like the compound value of money: as today’s technology creates tomorrow’s technology, innovation accelerates at an ever-faster pace. Ecosystems and open source are democratizing the way innovations are brought to the table and then recombined, all of which increases the potential for disruption.
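To make the compounding analogy concrete, here is a tiny toy calculation (my own illustration, not anything HPE presented): a capability that improves a modest 20% per cycle roughly doubles in four cycles and grows about six-fold in ten.

```python
# Toy illustration of compounding growth, analogous to technology
# building on technology. Rates and periods are invented for the example.
def compound(principal, rate, periods):
    """Return the value of `principal` growing at `rate` per period."""
    return principal * (1 + rate) ** periods

# 20% improvement per cycle roughly doubles in 4 cycles...
print(round(compound(1.0, 0.20, 4), 2))   # 2.07
# ...and grows about six-fold in 10 cycles.
print(round(compound(1.0, 0.20, 10), 2))  # 6.19
```

Small per-step gains compound quickly, which is why even modest accelerations in the innovation cycle can feel like sudden disruption.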

2/ The Machine: The audience got a brief video update on The Machine, originally unveiled at Discover two years ago, during the Day Two keynote.

For those unfamiliar, The Machine is Hewlett Packard Enterprise’s foray into what they’re now calling “memory-driven computing”. It’s a computer architecture built around a large memory pool served by workload-specific SoCs rather than a homogeneous processor, connected by photonic fabric instead of traditional copper wire. We’ve been following The Machine for years now. It’s a big and bold effort for sure, and it’s undergone many twists and turns to say the least, including removing memristors and replacing them with DDR4 memory.


The Machine SoC Module and Fabric interface controller (Photo credit: Patrick Moorhead)

The video shown at Day Two’s session was fairly devoid of specifics, but served the purpose of reminding the audience that something big was coming, just not here quite yet. I would have liked to have seen a more concrete on-stage demonstration of HPE’s progress on the project, but the video was a good teaser. Whitman promised we would be hearing more about The Machine in the coming months.

I went offline with the HPE Labs folks and got many specifics that filled in the gaps. First of all, HPE CTO Martin Fink wasn’t on-stage because (I was told) he was testifying in the HPE-Oracle lawsuit, not because he was backing away from The Machine, as some of Hewlett Packard Enterprise’s competitors were claiming.

HPE Labs showed off some impressive demos of the speed and reliability of its Gen1 silicon photonics chips, and promised that the Gen2 silicon, which had just come in, was meeting its speed expectations but wasn’t far enough along to show at Discover. Silicon photonics is really hard and, quite frankly, no one is getting it exactly right. HPE appears to be in the hunt, though. HPE declined to tell me exactly what SoC was being used, but I’m guessing it’s some kind of ARM Holdings-based variant plus an accelerator, or an unreleased Intel Xeon with an integrated Altera FPGA accelerator. Finally, HPE Labs folks told me The Machine would be booting an operating system by the end of the year. They must hit this milestone. I can’t wait.


HPE Labs Rev A silicon reliability demo (Photo credit: Patrick Moorhead)

I believe The Machine’s detractors don’t fully understand the benefits to HPE. Worst case, even if The Machine with its custom HPE SoCs, HPE silicon photonics, HPE fabric, and HPE memristors (now DDR4) doesn’t come together all at once in the form everyone in the peanut gallery expects, HPE still benefits. I believe HPE, as it crafts The Machine, is building an IP war chest it can monetize and use as leverage with its suppliers, even if it ends up buying technology from them down the line. Additionally, if you believe as I do that the future of the datacenter combines workload-optimized accelerators, east-west fabric, composable compute, memory and storage, and some flavor of a massive memory footprint, think of the IP and experience HPE is gaining from piecing together any variant of this. Think of the potential time-to-market advantage, too. So if you buy into my thesis, all you can question is the benefit versus the expense of licensing the technology and IP. I’ll leave it at that.

3/ GE IIoT partnership: After Hewlett Packard Enterprise CEO Meg Whitman completed The Machine discussion, she related it to IoT. According to Whitman, the expected pervasiveness of The Machine, that is, its expected usage across platforms from datacenters and the cloud to personal devices, would make it an important leap forward in the growth of the IoT. She cited a report by Cisco Systems and DHL estimating that by 2020 there will be 50 billion connected devices, communicating with each other and generating large amounts of data as part of the Internet of Things. Whitman reiterated that dealing with these massive amounts of data is a primary concern of HPE and its partners, and with that point, she pivoted to a testimonial from longtime customer and partner General Electric.

GE is of course a huge and relevant player in the IIoT. GE maintained on-stage that while the last ten years were all about the consumer internet, the next ten years will really be about the industrial IoT (you can read more about IoT segmentation in the research paper Moor Insights & Strategy published on the topic back in 2014). While I tend to believe this is a bit of a self-serving view, I agree that the IIoT will be huge, bigger than the HIoT (Human IoT). Bill Ruh, CEO of GE Digital, challenged the audience to “imagine a world where nothing ever breaks.” Lofty and aspirational, to say the least, but directionally spot-on. To that end, GE has built what it calls the world’s first industrial IoT operating system for building industrial internet applications, Predix, and has built it atop HPE ProLiant hardware. By leveraging edge-to-cloud performance and big data analytics, Predix could help industrial companies become more efficient and productive, and reduce their downtime.

This announcement looks on the surface like a very positive thing for Hewlett Packard Enterprise and it will be important that HPE keeps striking deals with more vertically-focused IoT platform and channel partners. The Alliance team will be very busy. GE has struck many deals with other large IT companies and our IoT analysts will be sifting through the details to see if and how this is differentiated.

4/ Edge Computing and “Converged IoT”: Whitman then moved on to edge computing, emphasizing that one of the big challenges of the IoT is securely capturing and analyzing data at the edge, closest to the source of the data. I agree wholeheartedly with her on this point.

Whitman brought Dr. James Truchard, President and CEO of NI, to the stage to recap the two companies’ collaboration. On June 1st, a week before Discover started, National Instruments and Hewlett Packard Enterprise announced a collaboration on Big Analog Data solutions, based on NI’s DataFinder Server Edition software and HPE Moonshot Systems, described in the press release as a “complete, pre-validated, tested solution to manage and analyze the complexities of file-based sensor data.”


National Instruments PXI card (Photo credit: Patrick Moorhead)

Whitman next brought Antonio Neri, leader of the Hewlett Packard Enterprise enterprise group, back to the stage to announce the launch of HPE’s “Converged IoT Systems” product category. These systems, designed specifically for computing on the edge, possess, in Whitman’s words, “three absolutely crucial capabilities”: deep open x86 compute, precision data capture and control, and enterprise-class systems and device management. Whitman went on to say that NI’s technology is a key component in the new Edgeline 1000 and 4000 systems. The Converged IoT systems will be capable of deep edge analytics with HPE Vertica, and quick and secure automated IoT access with Aruba ClearPass.


HPE Edgeline EL 1000 Converged IoT (Photo credit: Patrick Moorhead)

In a recent article, MI&S IIoT analyst Mike Krell laid out the three main reasons HPE and the industry are heading in this direction: timeframe of relevance, bandwidth (reducing the amount of data sent upstream), and cost. This is a good move for the company, and I am struck by how quickly Hewlett Packard Enterprise is rolling out edge computing capabilities, given they were later than their competitors to launch purpose-built IoT solutions. Also a bit surprising is how they integrated Moonshot technology, Aruba networking and security, and NI FPGA capabilities. They’ve come out like a lion. HPE has definitely moved into our “Top 3” IoT edge compute leadership vendors.
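The bandwidth argument is easy to see in miniature. The sketch below is my own toy example (the function and thresholds are invented, not part of any HPE or NI product): readings are filtered at the edge so only out-of-band values travel upstream.

```python
# Toy edge-filtering sketch: keep normal readings local, send only
# anomalies upstream to reduce bandwidth. Thresholds are invented.
def filter_at_edge(readings, lo=10.0, hi=90.0):
    """Return only the readings outside the normal operating band."""
    return [r for r in readings if r < lo or r > hi]

readings = [42.0, 43.1, 95.6, 41.8, 7.2, 44.0]
upstream = filter_at_edge(readings)
print(upstream)  # [95.6, 7.2]
print(f"{len(upstream)} of {len(readings)} readings sent upstream")
```

In this toy run, two of six readings go upstream; at real industrial sensor rates, that kind of reduction is exactly the economics Krell describes.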

5/ Mesosphere, Transparent, Kiva panel: At this point, the session shifted gears away from the technological and back towards the theoretical. Whitman introduced Luke Williams (marketing professor at NYU’s Stern School of Business, author, and globally recognized authority on innovation strategy and leadership) to talk about trends in innovation and disruption, and to moderate a panel of industry disruptors.

Williams railed against complacency in the business world, and advised companies to “take the ingredients” they have available and rearrange them in a way that makes them more valuable. It was an engaging and empowering segment. He followed his lecture with a panel of disruptors—Neel Mehta (founder of Transparent software, designed to visualize government PDFs), Tobias Knaup (co-founder and CTO of Mesosphere, a company building a data center operating system), and Premal Shah (founder and president of Kiva, a nonprofit working to alleviate poverty through microfinance). The panel discussed their various motivations, mindsets, and methodologies in their roles as disruptors.

So what was this panel all about? It was about HPE associating itself with famous people and innovative organizations talking about the premise of everything HPE is doing, that is, Digital Transformation in the Idea Economy. Aren’t keynotes grand?

6/ What wasn’t discussed much at HPE Discover: I always come in with some preconceived notions of what I expect to hear about at IT industry events. I also know from experience that no one can cram everything into every event, and if you say too much, you risk saying nothing at all. In other words, HPE couldn’t say something about everything.

As I said in the day one analysis, networking wasn’t discussed on-stage at all. Not a peep, except if you count Aruba’s inclusion in the Converged IoT launch. I met 1:1 with networking executive management and I’m not concerned. In fact, I left the meetings with a positive impression, particularly on H3C and what HPE is looking to do in the future. That’s all I can say.

I was also surprised Synergy wasn’t discussed much, as I believe HPE has an industry head-start in composability. I met with executive management, and it was really a matter of company communication priorities. HPE Discover London, which I also attended, was all about Synergy, so this Discover didn’t need to be.

Event Wrap up: Whitman wrapped up the Day Two general session and the entire event by stressing that it wasn’t a question of whether your company would be transformed and/or disrupted, but how soon and by whom.

Through testimonials from many of their strategic partnerships and customers (Dropbox, Boeing, New York Genome Center, Home Depot from Day 1, GE and National Instruments from Day 2), HPE demonstrated their product and service lines’ relevance across many industries. Testimonials like these should be very important for HPE.

HPE’s stated goal is to help companies “take control” of the current digital transformation and disruption with the help of their new product offerings—additions to the Helion cloud portfolio, machine learning with Haven OnDemand, Converged IoT systems, just to name a few. Anticipation continues to build for The Machine—it’ll be fascinating to see where the project goes in the coming years.

Time will tell if HPE will still be relevant in 250 years, as the company imagined in its Star Trek trailer, but if HPE Discover 2016 was any indicator, the company certainly seems to be investing in the future while doing what it needs to do to pay the bills today. HPE showed change, progress, and stability where it needed to, and I consider that an event win.
