Last week Arvind Krishna, IBM’s chief executive and chairman for the past eighteen months, told a virtual roomful of financial and industry analysts that IBM’s strategy is laser-focused on helping enterprises navigate the next level of digital transformation. This next level of change is fueled by a rich mix of hybrid cloud and artificial intelligence. Business value exists in an organization’s data. Getting to that value is IBM’s mission.
IBM is not undertaking an easy mission. Nearly every enterprise today exists in a hybrid multi-cloud world. This presents challenges far beyond simply transiting bits through a fast storage array. Data must traverse cloud boundaries. Increasingly, data must also find its way into an enterprise’s processes from the edge. Feeding that data into an AI engine opens new challenges focused on throughput, latency, and even data gravity. The list continues.
The job of IBM’s storage teams is not an easy one. Nearly anyone can build a fast storage array from today’s flash-driven technology. Still, very few technology companies can deliver a cohesive data infrastructure that maps to the shifting needs of data-driven digital transformation. It’s hard. IBM has chosen to build the tools that help make it easier for enterprise IT architects.
This week IBM announced a modest series of updates to multiple elements of its storage portfolio. These announcements collectively demonstrate that IBM is playing to its strategy. There are tweaks to its public cloud integration, updates to help its new Turbonomic team execute better, as well as a few that keep its offerings current and competitive. Let’s dive in.
Boosting Turbonomic with IBM Storage
IBM acquired Turbonomic earlier this year. The acquisition was targeted at bolstering IBM’s portfolio of AIOps tools and services. Turbonomic was a pioneer in both Application Resource Management and Network Performance Management, delivering AI-enabled tools that help organizations better manage a vast and complex set of resources.
Turbonomic (IBM has kept the branding) brings full-stack observability and management to enterprise operations. Its use of AI to assure performance, optimize resources, and control costs across a range of IT infrastructure makes it a valuable addition to any IT administrator’s AIOps tool chest.
IBM this week delivered its first integration of Turbonomic with IBM’s storage offerings. Turbonomic will collect a range of data from IBM’s FlashSystem storage and use it to optimize operations, protect against unnecessary over-provisioning, and guide intelligent storage-density decisions. IBM said that this integration will allow customers to increase storage density by 30% without any application performance impact.
IBM also announced that Turbonomic will provide observability across Instana, Red Hat OpenShift, and any “major hypervisor” paired with IBM FlashSystem, allowing operations teams to visualize issues and quickly automate corrective actions.
The integration of Turbonomic with the FlashSystem is a fundamental step for IBM as it begins to use the Turbonomic technology to help manage an enterprise’s overall architecture. IBM hasn’t announced more than this, but I expect to see deeper integration of Turbonomic into the broader set of storage offerings enabled by IBM’s Spectrum Storage Software Suite.
IBM Spectrum Virtualization for Azure
IBM has always embraced the idea of a hybrid multi-cloud world. There was a time when that vision was centered on IBM’s own private cloud ambitions. In recent years, the company has aligned that view with the reality that nearly every enterprise includes a major public cloud provider as part of its virtual infrastructure.
IBM delivered a public cloud version of its Spectrum Virtualize software, Spectrum Virtualize for Public Cloud, back in early 2019. It has, until this week, only supported Amazon’s AWS public cloud. This changes with new support for Microsoft Azure.
Teased earlier this year, IBM now formally supports Microsoft’s Azure cloud with its IBM Spectrum Virtualize for Public Cloud on Azure. This brings data migration, disaster recovery, snapshots, and even IBM’s immutable snapshots feature, IBM Safeguarded Copy, to Azure.
This support delivers a solid win for enterprise IT. While Amazon Web Services remains the dominant public cloud offering, Azure is growing faster than the overall public cloud market. Microsoft Corporation indicated in its most recent earnings that Azure revenue grew 51% year-over-year. AWS grew 37% during that same period.
A Flurry of Features
Beyond the Azure and Turbonomic news, IBM also released updates across its portfolio of storage offerings. These all focus on the core attributes of cyber-resilience, cloud-native support, and scalable performance.
IBM continues to build out what it sees as foundational support for cloud-native technologies. For example, the company announced that IBM Spectrum Protect Plus will protect Red Hat OpenShift and Kubernetes data in workloads deployed on Microsoft Azure. The software will also now allow direct backup to S3 object storage.
There’s more object support, as IBM Spectrum Scale implements a new high-performance object interface. On the performance front, new support for Nvidia’s GPUDirect Storage enables applications that use the technology to run up to 100% faster with IBM Spectrum Scale.
Finally, IBM announced one piece of new hardware. The IBM Elastic Storage System 3200 now includes a 38TB IBM FlashCore Module, doubling the capacity of the previously largest module.
The overall impact on density is impressive. IBM reports that the new FlashCore Module brings the overall capacity of an ESS 3200 to 912TB in just two rack units.
The Analyst Perspective
It’s no secret that I’m a fan of IBM’s storage technology. The company continues to deliver some of the best technology in the industry. The unfortunate reality is that the company’s brilliance in engineering and building storage solutions often gets muffled within IBM’s broader narrative. IBM isn’t interested in telling you an infrastructure story. That’s almost an afterthought.
As IBM reiterated during its recent investor day, the company relentlessly focuses on helping enterprises turn data into business value. A critical part of that solution stack is data management, one requiring an underlying storage architecture that can protect and deliver data to where it needs to be.
It’s very telling that IBM has spent nearly two decades shedding its hardware-centric businesses, yet keeps storage in its portfolio. Storage is critical to addressing the challenges that IBM is trying to help enterprises solve. At the same time, IBM isn’t telling a hardware story. It’s telling a much more valuable story about how IBM can help you move your business to the next level of data-driven transformation.
Data is critical to the enterprise. Storage is fundamental to deriving fast and valuable insights from an enterprise’s data across clouds and to the edge. This week’s announcements show IBM continuing to make the investments and deliver the storage infrastructure to build these solutions.
Note: Moor Insights & Strategy writers and editors may have contributed to this article.