Some in technology say there are "lies, damned lies, and then there are benchmarks." I typically hear this from those who either don't understand benchmarks or haven't worked with them for long. Don't get me wrong, there is a lot of "benchmarketing" that goes on, but I'd hate to throw out the entire category of benchmarks, as they serve an important purpose: estimating the real-world performance of a system on a certain workload. If a benchmark reflects real-world workloads, is repeatable, and is transparent about what is being tested and how it's being tested and weighted, I support it. As an aside, I have participated in many benchmark groups over 25 years and have used benchmarks to communicate my company's products.
Server differentiation attributes vary widely in the industry
Server companies lean into different attributes to differentiate their wares. These attributes include best performance, TCO, scalability, simplification, flexibility, security, price, workload optimization, reliability, and serviceability. Lenovo, for one, has spent a lot of time on performance over the past 25 years, dating back to the IBM years. Last week Lenovo announced that the company had scored an impressive collection of performance benchmarks on Intel's Xeon Scalable processors, aka Purley, with its ThinkSystem line of servers. These benchmarks are a nice feather in Lenovo's cap, giving the company the ability to boast best-in-class performance in a number of different areas. I wanted to go into a little more detail about the achievements and offer my take on them.
Lenovo achieved 46 new world record benchmarks (while a previous 42 benchmarks were maintained), bringing Lenovo's total of #1 world record claims up to an impressive 88. These benchmarks run the gamut, from 1S to 8S servers, with workloads ranging from big data and web tier to analytics and virtualization.
These benchmarks were all set by ThinkSystem servers based on Intel's Xeon Scalable family of processors—the ThinkSystem SR950 and SR650 rack servers, the ThinkSystem SD530 dense server, the SR630 and SR850 rack servers, and the SN550 blade server. The Intel Xeon Scalable processor boasts a new core that the company says is custom built for the diverse workload needs of the modern datacenter. It's no wonder that Lenovo's hardware is a good fit with Intel's chips—the two companies have a long history of collaboration.
I'm not going to go into detail about every single one of the 88 records (for obvious reasons). For in-depth specifics, you can see Lenovo's press release here. These are also reinforced on Intel's benchmark site. However, I would like to provide a brief overview.
Lenovo ThinkSystem servers now hold:
- 5 world records in business processing apps
- 8 world records in infrastructure virtualization apps
- 16 world records in general computing apps
- 3 world records in technical computing apps
- 19 world records in server-side Java apps
- 37 world records in big data analytics apps
While big data analytics is Lenovo's bread and butter, it's good to see the success spread out across different areas as well. All in all, Lenovo took home more than double the number of world record benchmarks of its closest competitor—not too shabby at all. The gap is indicative of the overall lead Lenovo has achieved in datacenter performance, an achievement the company is understandably proud of.
The magic blue crystals: XClarity Provisioning Manager and optimal hardware configurations
So how is it that Lenovo, which buys components from the same suppliers as its competitors, does so well in performance benchmarks? What are its magic blue crystals? First and foremost, Lenovo invests an inordinate amount of time optimizing firmware, operating system settings, and configuration settings to win the benchmarks. The settings portion is then simplified to "one touch" with Lenovo XClarity Provisioning Manager, where IT can choose the operating mode they want to run. I don't consider that "rigging to win"; I see it as knowing how to max out the hardware and platform software. Lenovo wouldn't share the specifics with me, as it's "their IP." I might be skeptical if the combined IBM and Lenovo hadn't been doing so well on these benchmarks for 25 years. Let me know on Twitter or in the comments section if you think otherwise.
These performance accomplishments are certainly impressive in and of themselves, but at the end of the day, benchmarks are benchmarks. The practical, real-world value lies in what they mean for current and potential customers and their end customers.
The workloads Lenovo predominantly focuses on fit into five categories: applications, data management, application development, IT infrastructure, and web infrastructure. Reducing complexity for customers and increasing performance in all these areas could potentially translate to big enterprise savings (needing less hardware), enterprises making more money (quicker time to information), and increased satisfaction for Lenovo customers' customers through quicker or better results.
Benchmarks aren't everything, but they are one very important attribute that goes into an enterprise considering one vendor over another. I think leaning into performance is smart for Lenovo, but it will take an effort to educate customers and channel partners on how benchmarks relate to workloads and how that improved performance translates to real and tangible business results.