Disruptive technologies often serve as a great equalizer. By definition, they change the conversation, punishing incumbents while opening the door to new opportunities for those daring enough to challenge the status quo.
The rise of containers as an application architecture is one of these disruptive technologies. Containers have become, in a remarkably short period, a cornerstone of modern application delivery.
There are countless surveys and data points that detail the rise of containers in the enterprise, and they all deliver the same message: the preponderance of new applications is being developed in containers, and a rapidly growing number is being deployed into production using that same technology. Containers have become the lingua franca of the cloud-native world.
Disruptive technologies, being what they are, are fraught with friction. As enterprises deploy containers into the data center, they expose a tension between traditional infrastructure and the new technology’s needs. Nowhere is this more apparent than in enterprise storage.
Our firm’s analysis makes clear that Pure Storage is not a company afraid of disruptive technologies. Pure, after all, arguably did more to enable the all-flash revolution of the past decade than anyone else, pushing its larger competitors to deliver all-flash solutions by showing the market the full potential of flash storage. This shift has brought untold benefits to nearly every IT organization.
Given that legacy, it’s no surprise to see Pure Storage acquire Portworx, a container data management leader. There’s friction between traditional enterprise storage and the storage needs of containers. Much as storage was rearchitected to exploit all-flash technology, it must now be rearchitected to deliver container technology’s full potential. This is what Pure Storage, with the help of its new crew from Portworx, intends to do.
Challenges of container storage
Containers began as a mechanism to safely and efficiently deliver short-lived, ephemeral workloads into fast-paced DevOps environments. The original architects didn’t design containers with persistent storage as a first-class concern. As container technologies such as Docker gave rise to orchestration solutions like Kubernetes, the technology became attractive to the enterprise data center.
Workloads began to shift, and storage suddenly became a primary concern. To address this, the storage and container communities worked together to deliver a specification, the Container Storage Interface (CSI), designed to marry traditional enterprise storage with container-based workloads. Every major storage vendor today supports some subset of CSI.
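To make the CSI model concrete, here is a minimal sketch of how a containerized workload consumes vendor storage in Kubernetes: an administrator defines a StorageClass pointing at the vendor's CSI driver, and an application simply claims a volume against that class. The driver name and sizes below are placeholders, not any specific vendor's values.

```yaml
# A StorageClass that delegates volume provisioning to a vendor's CSI driver.
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: fast-block
provisioner: block.csi.example.com   # hypothetical CSI driver name
parameters:
  fsType: ext4
reclaimPolicy: Delete
---
# A workload requests storage with a PersistentVolumeClaim; the CSI driver
# creates and attaches the backing volume on demand.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: app-data
spec:
  accessModes:
    - ReadWriteOnce
  storageClassName: fast-block
  resources:
    requests:
      storage: 10Gi
```

The indirection is the point: the application never names the array or cloud volume it is using, only the class of storage it needs.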
While CSI is a satisfactory solution for many use cases, the nature of containers exposes weaknesses in the model. Traditional enterprise storage is designed for relatively static workloads, with a manageable number of stable storage volumes, all delivered by conventional server architectures.
The problem is that containers do not map well to traditional server workloads. A well-balanced container implementation can deliver up to 10x higher application density than more traditional virtualized architectures. Containers are also highly dynamic, created and destroyed at an often-dizzying pace. As containers come and go, so do their connections to the underlying storage volumes. It’s a model that stresses traditional enterprise storage. A different answer is needed.
The value of Portworx to the market
Founded in 2014, during a period of rapid innovation in the container world, Portworx set out to address the persistent-storage needs of cloud-native workloads. Its approach resonated with enterprises dabbling in container deployments, and Portworx found rapid success, becoming a leader in container data management.
What Portworx delivers, very simply stated, is a software-defined storage architecture that integrates the storage control plane with the application control plane. Portworx utilizes block storage, whether served from an on-prem storage array or a cloud block-storage solution, to provide containerized applications with enterprise-class data services.
Portworx gives containers efficient access to the kinds of data services that traditional enterprise workloads take for granted: high availability, data security, backup and replication, thin provisioning, and other storage-related functions. It’s a long list of features, and the interested reader should spend some time on Portworx’s website to understand them all.
Running on every Kubernetes worker node, Portworx can deliver these data services with much higher efficiency than is possible with more traditional approaches. Portworx, for example, virtualizes the underlying attached block storage, allowing a reasonable number of stable, provisioned storage volumes to serve an entire Kubernetes cluster.
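As an illustration of how those data services surface to an application team, a Portworx-backed StorageClass can request replication and I/O tuning directly in its parameters. The provisioner and parameter names below follow Portworx's published documentation at the time of writing; treat this as an illustrative sketch rather than a definitive configuration.

```yaml
# Sketch of a StorageClass backed by Portworx's CSI driver.
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: px-replicated
provisioner: pxd.portworx.com   # Portworx CSI driver (per Portworx docs)
parameters:
  repl: "3"           # keep three synchronous replicas across worker nodes
  io_profile: "auto"  # let Portworx tune I/O behavior for the workload
allowVolumeExpansion: true
```

Because replication happens in the Portworx layer, a pod rescheduled to another node can reattach to a local replica rather than renegotiating a connection to a backing array.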
Pure Storage’s data strategy
It must be challenging to be one of the last successful stand-alone enterprise storage companies in the industry. Competitors like Dell Technologies and Hewlett Packard Enterprise can bring ruthless amounts of leverage to IT purchasing decisions with full-stack bundles that include servers, storage, and even networking. Full-stack solutions are attractive to IT buyers.
Technology companies like Pure Storage and NetApp must continuously out-innovate their larger and broader competitors to continue to grow and thrive. Pure Storage does this with a strategy that, at its foundation, says that data is a utility.
Data powers the enterprise. This is a power that needs to be delivered cheaply, reliably, and with attributes that map to any given workload’s needs at any given time. It’s important to remember that data is not storage. Storage is simply a persistent instance of data somewhere in the infrastructure.
This strategy differs from how traditional block storage vendors think about data. Storage arrays provide an endpoint to a pool of data. Pure Storage’s vision, by contrast, treats persistent data much as a data network treats its traffic: it must adapt to the environment.
This strategy has driven Pure Storage to deliver cloud data services that aren’t necessarily tied to its on-prem physical arrays. It’s this same strategy that makes the Portworx acquisition such a natural play. Data is increasingly consumed by containerized applications. It makes far more sense to adapt storage to that world than to limit the value of containers by forcing data to fit a traditional block or file view of the world.
Pure Storage is far from alone in thinking about containers and storage. Every top-tier storage and compute vendor has some sort of play in the space.
Most directly competitive with Pure are NetApp’s efforts around containers. NetApp has been working in the container space for a few years, suffering a few false starts. Earlier this year, it killed its NKS container offering after bringing it to market. Most recently, NetApp has rallied around its Project Astra initiative, which is not yet a product. Project Astra appears to have different goals from Portworx, focusing more on container workload management than data services.
Over the next two weeks, we’ll hear more from NetApp and Dell Technologies as each of those companies holds its respective annual conference. I expect that containers will be front and center as they tell their stories. It’s a hot topic across the industry.
It’s the early days of container adoption. We’ll see continued rapid evolution for the next few years, and then things will settle out as it just becomes part of the infrastructure. Those next few years, though, are going to be exciting. There hasn’t been this much innovation happening in the storage world in decades, and containers are just a small part of it.
I like Pure Storage’s strategy. It tells me, very simply, that my data will be available wherever I need it. That could be on an on-prem array, delivered on-demand, in the cloud, or to a container someplace. The myriad of data services that Pure pulls together to make that happen, while important, are secondary to just delivering data. Data, after all, is what keeps the enterprise moving forward.
Note: Moor Insights & Strategy writers and editors may have contributed to this article.