Apple’s Plan To Dominate Silicon

By Patrick Moorhead - June 14, 2017
Before creating my industry analyst and research company, I was fortunate enough to spend over a decade as a semiconductor executive at Advanced Micro Devices and nearly another decade at PC and server OEMs: Compaq (purchased by HP), AT&T GIS and NCR (purchased then spun out by AT&T). I’ve met some very colorful people along the way, including Tim Cook at Compaq, Mark Hurd at NCR and Jerry Sanders at AMD. It’s funny how OEMs view semi companies and how semis view OEMs. There is tension on both sides: OEMs think the semis try to take credit for their insights and customer proximity and take all the profit, and the semis think the OEMs lack strategic clarity and don’t invest enough in IP. All this time spent at OEMs and chip companies hopefully gives me some unique perspective on how Apple is disrupting semiconductors. I’d like to talk about why Apple is doing it, how they’re doing it, and its impact on the semi space.

Mini-computers: the last bastion of broad, custom OEM silicon

Mini-computers were the last big wave of OEMs that designed a lot of their own silicon. NCR made SCSI chips. DEC had Alpha. PCs, aside from the very beginning, were dominated by Intel, Motorola, and VLSI. Historically, consumer OEMs like Apple were never willing to make the big investments, nor did they have the people or IP it took to field leading-edge silicon. Sure, there are some exceptions, but they’re just that, exceptions. Toshiba and Sony ultimately failed at doing both. Samsung today really is an anomaly as a company that can do both successfully, as they are in smartphones and in flash memory in their own fabs. Apple currently buys more silicon than any other vendor but also designs more silicon than any other vendor. And their silicon, especially their Fusion SoCs and CPU architectures, is best in mobile class in CPU performance and very competitive in GPU performance.
Why Apple wants to rule silicon

So why would Apple take on so much risk by designing its own silicon for its most profitable products, iPhone and iPad? I mean, if they made one bad move, they could lose out on months of sales. It’s simple: Apple takes this risk because it needs differentiation and cost reduction via vertical integration. Apple believes that by owning iOS, the iOS ecosystem, and now the silicon, it can deliver a better user experience. And knowing that Apple can’t really play at the lowest price points, vertically integrating lowers their cost, enabling them to more profitably hit lower price points.
Has it worked for differentiation? I’d say it has worked so far. Apple has consistently cranked out unimaginable improvements in CPU and GPU, 30 to 40% with each new product. That’s unheard of, particularly for CPUs at the same power use. Competitively, Apple dominates in single-threaded CPU performance, is competitive in multi-core CPU performance and has competitive GPU performance. Connect the silicon to iOS and the ecosystem and you now have a very good mobile experience. Apple can’t claim dominance in GPUs or even participation in LTE silicon design, but we’ll get to that later.
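To see why 30 to 40% per generation is such a big deal, it helps to compound it. A quick back-of-envelope sketch (my own arithmetic for illustration, not Apple’s figures; the seven-generation count is an assumption):

```python
# Back-of-envelope: what does a sustained 30-40% per-generation
# CPU gain compound to over several annual A-series releases?

def compounded_gain(per_gen_gain: float, generations: int) -> float:
    """Total speedup after `generations` releases at `per_gen_gain` each."""
    return (1.0 + per_gen_gain) ** generations

# Seven annual generations at the low and high end of the claimed range:
low = compounded_gain(0.30, 7)   # ~6.3x cumulative
high = compounded_gain(0.40, 7)  # ~10.5x cumulative
print(f"~{low:.1f}x to ~{high:.1f}x total CPU gain over 7 generations")
```

In other words, even the low end of that range compounds to more than a 6x speedup in seven years, which is why no competitor has matched it at the same power.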
It started with CPU and memory
Apple didn’t just wake up one day and become a leading designer of mobile SoCs and CPUs. They bought their way in and then invested billions to nearly perfect it on the mobile side. In 2008, Apple purchased P.A. Semi (Palo Alto Semiconductor) for $278M, and with it came a 150-person engineering team. At that year’s WWDC, Steve Jobs announced the team would develop custom chips for the iPod, iPhone, and other future mobile devices (iPad). P.A. Semi brought low-power compute and SoC capability to the table, plus the secrecy Apple needed to limit leaks. As I look back at P.A. Semi, they had some real rock stars, like Jim Keller, who designed the AMD Hammer architecture for the first Opteron and the Zen core used in AMD’s Ryzen.

Apple's latest A10X Fusion, a beast of a mobile chip

In 2010, Apple acquired Intrinsity, a processor design company from my hometown, Austin, TX. Austin is a processor design and architecture hotspot, and many companies do major projects there, including Apple, AMD, ARM Holdings, IBM (POWER), Intel, NXP, and Samsung. Intrinsity reportedly had about 100 engineers with low-power design experience using dynamic logic, a circuit design technique for fast, efficient CPUs. A year later, Apple bought over 200 patents from Freescale Semiconductor, which was recently purchased by NXP. The deal helped Freescale pay down its debt but also helped Apple in its low-power efforts.

And then on to flash storage controllers…

Just so you’re following along, I first talked about Apple’s investments in CPUs, memory controllers and SoCs. People thought they might be done there, but they weren’t. In 2012, Apple purchased Anobit, an Israeli semiconductor startup, reportedly for $400-$500M. Apple was one of the biggest, if not the biggest, buyers of flash, so why not speed it up? That’s what Anobit did. Anobit’s flash memory controllers were a key component of all of Apple’s leading products (iPads, iPhones, MacBook Airs). Anobit reportedly added 160 chip engineers to Apple’s existing team of a reported 1,000 chip engineers.

And then on to low-power Bluetooth…

In 2014, Apple bought Passif Semiconductor, a company that was working on low-power communication chips. The reasoning behind the acquisition was rumored to be that one of the components of Passif’s chip technology was Bluetooth 4.0 LE, which could work perfectly with the impending Apple Watch. These low-power radios and chips could help Apple improve battery life for iOS devices and maybe even Macs. I believe we are seeing this technology in Apple Watch and also in the W1 Bluetooth chip, but I need to do a bit more digging on it.

It starts to get messy with graphics and GPUs

If my column reads like an iFixit tear-down, then good, as Apple’s playbook reads like one.
While Apple’s own pursuit of CPUs, SoCs, memory controllers and storage controllers seemed to go well, with GPUs things start to get messy, particularly with Imagination Technologies. As background, Apple’s GPU is based on Imagination Technologies IP and is used in iPhone and iPad. In April of this year, Imagination said that Apple will no longer be licensing its technology in less than two years’ time and that Apple is “working on a separate, independent graphics design to control its products and will be reducing its future reliance on Imagination’s technology.” About half of Imagination’s revenue comes from Apple license fees and royalties on iPhone and iPad, so you can imagine how quickly Imagination’s stock tanked. Imagination’s lawyers came out just as quickly, asserting there’d be no way Apple could do its own GPU without Imagination IP. I, too, would find it incredible that Apple could develop its own GPU without a base of GPU patents from somewhere like Imagination, NVIDIA or AMD. It’s also worth noting that many of Imagination’s GPU engineers have left to work for Apple in the same city as Imagination.

Apple's impressive 500X graphics performance gain since the A4
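That 500X figure also implies a startling per-generation rate. A quick sanity check (my own arithmetic for illustration; the seven-step count from the A4 to the A10X is an assumption):

```python
# Back-of-envelope: if GPU performance rose ~500x from the A4 (2010)
# to the A10X (2017), what average per-generation multiplier does
# that imply over roughly seven major steps?
total_gain = 500.0
steps = 7  # A4 -> A10X, roughly one major generation per year
per_step = total_gain ** (1.0 / steps)
print(f"~{per_step:.2f}x GPU gain per generation")  # ~2.43x
```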

One of the upsides to being a key Apple supplier is that when the stock market sees you as a key component supplier, you get rewarded; but when the bottom falls out, you fall even farther than Apple does. And there’s a chance Apple may recruit your engineers. I’m not saying Apple did this here, but it’s something all Apple suppliers need to be on watch for. It’s apparent Apple wants 100% control over its GPU future, likely because of the importance of VR, AR and even machine learning training, but how it will get that base IP is unknown.

And it gets ugliest with LTE modems

Without modems, an iPhone would essentially be a large iPod. Modems are important. In iPhones, Apple previously sourced its 3G modems from Infineon, now owned by Intel, but then switched to Qualcomm as Infineon was very late to market with 4G LTE. Unless you have been living under a rock, you have seen that Apple is suing Qualcomm, Qualcomm is suing Apple and Apple’s ODMs, and global regulators are investigating Qualcomm, helped by complaints, I believe, from Intel, Apple, and Samsung. Nokia, another big 4G IP creator like Qualcomm, was suing Apple but just recently settled for undisclosed terms. Apple was suing Ericsson and Ericsson was suing Apple, but they settled at the end of 2015. Isn’t life grand in modem IP land? I can tell you that the lawyers are making a ton of money. I’ll save my opinions about these lawsuits for future columns, but I do think modems represent an area that is vitally important to the usefulness and utility of current and future devices. I believe Apple wants ultimate control over the LTE modem, as it has over its mobile CPU, memory controller, storage controller and SoC, and as it wants with GPUs. Apple needs that capability mid-term to add integrated modems into Apple Watch SoCs in particular, and into future Apple AR headsets. Discrete modems won’t work nearly as well; Apple needs them integrated, but not at the terms Qualcomm is offering.
Until the courts settle this, we will likely see Apple dual-source modems between Intel and Qualcomm, and there are rumors of a single source with Intel once its modem supports CDMA. The iPhone 8 is rumored not to support “gigabit-class” modems, and if that is true, we could very well see an in-market test of the value of modem speeds and efficiencies. While I don’t think we’ll see full gigabit speeds by the end of the year, phones with “gigabit-class” modems could very well double or triple the performance of an iPhone 8 without one, on the right network. Get out the popcorn.

What happens in the future?

Looking back at Apple’s impressive, 10-year silicon history and what they’ve accomplished, I believe Apple would like to, and has the confidence to, extend control of its future destiny to mobile 4G LTE and 5G modems, flash memory (Toshiba?), WiFi, GPUs, DSPs, and ASICs. Also consider what happens when Apple decides to give the iPad a full keyboard and trackpad option, maybe three years down the line. iOS 11 certainly is adding more and beefier features every day to make it look like macOS, so why not start diving into PC-performance components, challenging AMD, Broadcom, Intel, and NVIDIA? Sound crazy? Well, I thought Apple would crash disastrously at least once in the last 10 years based on its silicon effort, and I was wrong, so I don’t think anyone should write off this possibility. With Apple’s cash balance, it could easily snatch up AMD, drop x86, and re-target all those resources to ARM-based processors and GPUs to support those larger and more powerful iPad Pro “Plus” models. If you’re still a doubter, then consider what Apple could do with its $250B in cash. NVIDIA is worth $90B, Qualcomm $86B, and AMD $11B.

Wrapping up

Apple has had the most successful ten-year silicon run I have ever seen from a company in CPUs, GPUs and SoCs… and they’re an OEM! That’s extremely hard, and Apple deserves a tremendous amount of credit.
Silicon is vital to Apple to enable differentiation and to lower its costs to attack lower price points. Make no mistake, Apple is now running a silicon power play, and I believe it will try to bulldoze anything that gets in its way, sometimes successfully, sometimes not. Apple has shown it can crank out industry-meeting or industry-beating silicon, and it has the confidence, processes, and balance sheet to make it happen, or to make some big mistakes trying. In GPU and LTE modem land, we are witnessing brutal IP wars against much smaller companies. Apple is over 8X larger than Qualcomm and 22X larger than Imagination Technologies. Size isn’t everything, but it is something, and I even wonder if government entities would ever intervene. If you follow silicon and the OEMs who rely heavily on it as closely as I do, get out the popcorn!
Patrick Moorhead

Patrick founded the firm based on his real-world technology experiences and an understanding of what he wasn’t getting from analysts and consultants. Ten years later, Patrick is ranked #1 among technology industry analysts in terms of “power” (ARInsights) and “press citations” (Apollo Research). Moorhead is a contributor at Forbes and frequently appears on CNBC. He is a broad-based analyst covering a wide variety of topics including the cloud, enterprise SaaS, collaboration, client computing, and semiconductors. He has 30 years of experience, including 15 years of executive experience at high-tech companies (NCR, AT&T, Compaq, now HP, and AMD) leading strategy, product management, product marketing, and corporate marketing, including three industry board appointments.