Evident In Its Recent Earnings That Synopsys Is Reaping The Rewards Of Its Long-Term AI Investments

By Patrick Moorhead - June 14, 2023

As I’ve written about recently, for years Synopsys has been using AI to push the envelope on the electronic design automation (EDA) software that makes today’s semiconductors possible. Now its latest quarterly numbers are out, and if anything I’m even more impressed by what the company is doing and the impact it’s having in the industry.

At a moment when lots of companies are shaky because of broader economic conditions—and when the chip industry is facing lower demand—Synopsys’ numbers are moving up and to the right in every important area. Sure, some of this has to do with the offset between designing chips and building and inventorying them, but it’s still impressive.

The latest results blew past previous guidance, the company hit another record for quarterly revenue, operating margin is up, EPS is up and the company raised its guidance for annual revenue. It’s all a testament to strong execution of its “Smart Everything” strategy that Synopsys launched more than a decade ago.

After the earnings call, I had a chance to talk with CEO Aart de Geus and CFO Shelagh Glaser to get the Synopsys perspective on what’s happening at an interesting time for AI and the tech landscape as a whole. In this piece, I’ll share some of the highlights of that conversation and where they see things heading in the months and years to come. Let’s dig in.

Synopsys beats guidance for revenue, EPS and margins

First, the numbers. Quarterly revenue of $1.395 billion was above the high end of guidance, as was non-GAAP EPS of $2.54. The company generated $703 million in quarterly operating cash flow, and its non-cancelable backlog rose to $7.3 billion.

Given how it’s firing on all cylinders, the company now expects revenue for the full year to be about $5.8 billion, or 14 to 15% higher than its $5.08 billion of revenue in fiscal 2022. In line with that, it expects non-GAAP EPS to grow 21 or 22% year over year, and it projects an even bigger improvement in non-GAAP operating margin—150 basis points total—than its previous guidance.
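As a quick sanity check on that guidance (an illustrative sketch using the reported figures, not the company’s own model), the implied growth rate works out as follows:

```python
# Rough sanity check on Synopsys' full-year revenue guidance.
# Inputs are the figures cited above; the calculation is illustrative.
fy2022_revenue = 5.08   # billions of dollars, fiscal 2022 actual
fy2023_guidance = 5.8   # billions of dollars, projected full year

growth = (fy2023_guidance - fy2022_revenue) / fy2022_revenue
print(f"Implied YoY revenue growth: {growth:.1%}")  # lands in the guided 14-15% range
```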

That’s what I’d call a great quarter: top-line growth, bottom-line growth, higher guidance for the year. And from talking with Glaser, it’s very clear that Synopsys is serious about continuing to improve margins, both short- and long-term. By the way, the company also has plenty of cash and almost no debt.

The numbers are even more impressive when you consider the volatility of the market that Synopsys operates in. Demand for semiconductors has been down, especially for the chips that go into consumer products. But Synopsys says that its customers continue to invest in R&D for new chips so they can take advantage of opportunities when demand does pick up again. In de Geus’ words, for the chipmakers, “The worst is to miss an upturn, because that’s where most of the money is made.”

Leadership in AI helps Synopsys navigate a turbulent market

Synopsys is able to deliver such strong results because it’s in the right market space with the right technology—and AI is at the heart of that. “I think there has been great sensitization in the last few months around the whole notion of AI,” de Geus told me. “And you know, this is not new to us, because we had predicted the world was going to go to ‘smart everything’ about 12 years ago and built in that direction.”

Right now, generative AI like ChatGPT is getting most of the attention in the press, but Synopsys has already spent many years using machine learning (ML), big data and different types of AI to help develop and debug chips. As covered in my earlier post, its recent announcement of the Synopsys.ai platform means that the company is now applying AI to the entire EDA stack: design, verification, test and manufacturing—even for analog chips.

How’s that approach working? In a word, great. Nine of the top 10 semiconductor companies are already using AI-driven tools from Synopsys, and uptake is only picking up pace. It started in mid-2021, when Samsung announced the world’s first AI-driven commercial tape-out, achieved using Synopsys’ DSO.ai. (DSO stands for “Design Space Optimization.”) By the end of 2022, Synopsys customers had already achieved 100 AI-driven tape-outs, and when I spoke with de Geus recently, he told me that the count is now “well over 200 tape-outs.”

To pick one example out of the many design wins Synopsys achieved in the past quarter, the company is now collaborating with TSMC to deliver EDA flows for the foundry giant’s most advanced 2nm process node. In another, Renesas achieved up to a 10x improvement in reducing functional coverage holes and up to a 30% increase in verification productivity with Synopsys’ AI-driven verification tools.

Just think about the demands of that in terms of design complexity and zero tolerance for error.

The role of hyperscalers and generative AI

While he didn’t name specific companies, de Geus did verify that the list of his customers using AI-driven tools includes not only pure-play chipmakers, but also hyperscalers. This makes sense when you consider that about 45% of Synopsys’ business comes from what de Geus calls “systems companies,” meaning companies that intersect both hardware and software. This includes big cloud providers, car makers and others that are massive users of computing power and that have particular needs for operating at lower power consumption.

For example, think about how much onboard computing power will be needed by the end of this decade for fully autonomous cars—and yet all that computing power must still be delivered at the lowest possible level of energy consumption. As de Geus said, “If [chips] go into a car, they also better be as low-power as possible, because every 100 watts that you use in compute essentially takes away 10 to 15 miles of distance that you can drive on the battery.”

To address these needs, generative AI of the type used in ChatGPT isn’t yet ready to contribute to EDA processes. “Generative AI has fantastic results” in other contexts, de Geus told me, “but its results can have imperfections. In what we do, there’s zero room for that. No, we are a company that has to deliver 99.99999—you know the number of nines can never be enough.” This is so because even the slightest error in a chip design can have huge consequences in terms of efficiency, yield, time-to-market and so on. “So therein lies a very profound distinction between the level of precision that our AI has to have for these designs versus these more search-type AI capabilities.”

Mind you, de Geus’ enthusiasm for generative AI came through loud and clear. “I think it’s fantastic, what’s happening” with generative AI, he said, “and this will impact the world massively.” Looking further out, Synopsys president and COO Sassine Ghazi believes that although these are early days of research and development, generative AI is at an inflection point: a game-changing technology that will offer significant opportunities for EDA applications.

Meanwhile, Synopsys engineers are exploring how cutting-edge large language models (LLMs) of the type used by ChatGPT can help streamline internal processes and augment existing solutions. According to Ghazi, “Synopsys pioneered AI-driven chip design, and this is only the beginning of our AI journey to deliver productivity breakthroughs for customers.”

If you want to know more about the AI journey Synopsys has been on for the past several years, I recommend a fascinating post the company published after its recent SNUG conference, where Synopsys.ai took center stage. At the conference, Ghazi celebrated the crucial work done by Synopsys engineers who were inspired by the 2016 and 2017 victories of Google’s AI-driven AlphaGo over masters of the ancient game of Go. Those engineers began to dig into the ways that AlphaGo’s reinforcement learning (RL) techniques might be applied to EDA. What started as merely an idea then quickly became an MVP, and is now being used to design some of the world’s most complex multi-die architectures.

While generative AI will definitely have a role in the future of EDA—and Synopsys is bullish about taking a leadership role there as well—the company’s existing AI tools are providing big improvements in productivity and speed for chipmakers in the here-and-now.

The future of EDA is “Smart, Secure and Safe”

For de Geus and his team, the mission is broader than just using more AI to speed up and smooth out EDA. In his earnings report and again in his call with me, the CEO reiterated the need for Synopsys to help deliver technology that is smart, secure and safe.

The “smart” part gets harder all the time as companies demand more from chips: greater speed, more sophistication, higher performance-per-watt and so on. Besides the automotive example mentioned above, this certainly applies to things like AI-driven search, which requires several times as much computing power as traditional text search. Other chips used for AI functions are similarly compute-hungry.

De Geus made the point that Synopsys itself is adding to this demand. “We are using AI, very advanced AI, now ourselves to help our customers design super advanced chips,” he said. He compared it to the famous M. C. Escher picture of two hands drawing each other, adding that “I’ve always loved that representation because we are in the midst of that.”

He also believes that safety and security are “increasingly, jointly important” alongside smarts. This is so, he told me, because “the very complex system, if it’s not secure—there are enormous dangers in that.” He added that “these dangers get aggravated if they touch human life in any form.” It’s not hard to come up with contexts—cars, medical devices, navigation systems and so on—where any of us would insist on super-high safety and security.

For customers looking to bolster security and safety, de Geus said that Synopsys helps “by virtue of putting mechanism in chips that help encrypt the data, provide root of trust, a unique identification, et cetera. And that’s built into the IP and the design flows.”

For de Geus, all these considerations are “interwoven and looped, and the really fantastic thing for us is we’re literally in the middle of many of these loops” of design, AI utilization and so on. In his view, that gives Synopsys lots of room to run. As he put it, “The roadmap of optimizing, automating and generative AI use cases is wide open to deliver productivity breakthroughs for years to come.”

The future of AI in EDA

As he answered a question on the earnings call, de Geus summed up his business philosophy: “Whenever the world changes, there’s opportunity.” I found that telling, and a bit ironic, given that change is where technology industry analysts find their opportunities, too.

Synopsys has certainly been following that path since it launched its Smart Everything strategy a dozen years ago, and especially since it started going deep into RL and other AI techniques around 2017. What once sounded like science fiction is now penetrating deeper and deeper into every part of the creation process for semiconductors, from the early stages of design to the final steps of manufacturing.

I agree with de Geus that Synopsys’ opportunities in this space are diverse and open-ended. Better still, the company has shown year after year that it can take good ideas and quickly turn them into real-world engineering that addresses customers’ challenges. That combination of strategy and execution has really paid off, and I expect it to continue.

Patrick Moorhead

Patrick founded the firm based on his real-world technology experience and an understanding of what he wasn’t getting from analysts and consultants. Ten years later, Patrick is ranked #1 among technology industry analysts in terms of “power” (ARInsights) and “press citations” (Apollo Research). Moorhead is a contributor at Forbes and frequently appears on CNBC. He is a broad-based analyst covering a wide variety of topics including the cloud, enterprise SaaS, collaboration, client computing, and semiconductors. He has 30 years of experience, including 15 years of executive experience at high-tech companies (NCR, AT&T, Compaq [now HP] and AMD) leading strategy, product management, product marketing, and corporate marketing, including three industry board appointments.