Andy Grove’s legacy – a (slightly) dissenting view

Andy Grove – Legendary former CEO of Intel

With the recent passing of former Intel CEO Andy Grove, there have been many tributes to his remarkable abilities and achievements,[1] not least his ability to admit that he was wrong.[2]

This article does not attempt to detract from the great man he was, or from his incredible achievements. But in the harsh glare of history, there was one key mistake he made that is often overlooked. This article will examine that mistake with the benefit of ‘20/20 hindsight’.

A Great Legacy: Avoiding Disruption Pt 1

First, though, we should put Grove’s achievements – which were truly world-transforming – into context. Grove is credited with being the man who executed upon his predecessor Gordon Moore’s famous ‘Moore’s Law’.[3] It was under Grove’s reign that much of this exponential scaling was achieved.

Tributes extend even further, to Grove’s epitomizing and propagating Silicon Valley’s culture of continual, relentless improvement. And when faced in the 1980s with the existential threat of Japanese competitors ‘dumping’ dynamic random access memory (DRAM) chips – Intel’s core market at the time – it was Grove who suggested leaving the DRAM market to refocus upon the fledgling microprocessor business. One disruption event avoided!

The Celeron Chip

And again, in 1997, Grove famously invited Clayton Christensen – the author of the now-seminal book ‘The Innovator’s Dilemma’ and the man credited with coining the term ‘disruption’ in the sense we know it today – to speak to his employees. As this story from the New Yorker recounts:

‘Grove had sensed that something was moving around at the bottom of his industry, and he knew that this something was threatening to him, but he didn’t have the language to explain it precisely to himself, or to communicate to his people why they should worry about it. He asked Christensen to come out to Intel, and Christensen told him about the integrated mills and the mini mills, and right away Grove knew this was the story he’d been looking for.’[4]

From this meeting, it is said Grove famously decided to produce the Celeron chip – a cheaper, lower-powered chip than Intel’s core offering at the time.

The Orthodox View: Grove’s successor, Paul Otellini, made the big miss for Intel

Consequently, Intel’s big ‘miss’ – failing to capture the mobile chip market – is seen as the fault of Grove’s successor, Paul Otellini. A typical account is that given by one of my favourite analysts, Ben Thompson, on his Stratechery website, in this case relating a story told by Alexis Madrigal at The Atlantic:[5]

‘There is a sense, though, that the company’s strategic position is much less secure than its financials indicate, thanks to Intel’s having missed mobile.

The critical decision came in 2005; Apple had just switched its Mac lineup to Intel x86 processors, but Steve Jobs was interested in another Intel product: the XScale ARM-based processor.

The device it would be used for would be the iPhone. Then-CEO Paul Otellini told Alexis Madrigal at The Atlantic what happened:

“We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we’d done it,” Otellini told me in a two-hour conversation during his last month at Intel. “The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do…At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.”’

Since that time, ARM Holdings has gone on to become ‘market dominant in the field of processors for mobile phones (smartphones or otherwise) and tablet computers’.[6]

My dissenting view: Grove made the big miss for Intel

In contrast to this mainstream view, I argue that it was actually on Grove’s watch that the mistake was made. In my opinion, at that fateful 1997 meeting between Christensen and the people at Intel, a proper understanding of disruption theory as we have now come to know it[7] would have pointed to the likely disruptor of Intel’s core business.

It appears that all Grove and his people took away was that the disruption was going to ‘come from below’, i.e. from a cheaper competitor. Intel responded with the cheaper Celeron offering.

However, this was not the paradigm shift in thinking that Disruption Theory truly requires. Disruption Theory[8] goes further, suggesting that the competitor is likely to be so ‘asymmetric’ that the incumbent does not even think of the disrupting force as a threat.

Disruption: Personal Digital Assistants (PDAs) morph into Smartphones

In 1997 the eventual disruptor was already beginning to take shape, in the form of personal digital assistant (PDA) handheld computers such as the ‘PalmPilot’.[9]

One of the original Personal Digital Assistants (PDAs) – the PalmPilot

With their puny processing power, limited functionality and gray-scale LCD screens, they were clearly no threat to the mighty Pentium processors for which Intel is still famous.[10] But in time, these PDAs would become the basis for the first smartphones, such as the Handspring Treo 180,[11] which used the PalmOS operating system.

The Handspring Treo ran on the PalmOS operating system

Disruption: About the business model, not just the technology

What is more, ‘disruption’ in the Christensen sense also tends to come with a new business model. In other words, it is not just the technology that disrupts, but the business models that the technology enables. Think of Dell’s business model (selling personal computers via online sales) compared with the conventional retail model that prevailed before it.

ARM Holdings’ business model is a classic case of this. Rather than investing hundreds of millions in chip fabrication plants, ARM focused upon licensing its chip designs for others to fabricate.

To be fair to Grove, it is impossible to be omniscient – especially having already steered Intel around one major disruption. Instead, I look at the contribution (or failure?) of Christensen, who in his account[12] of the meeting professed to his clients at Intel that he didn’t know anything about the chip industry. But even a rudimentary understanding of that industry would have suggested its Achilles heel lay in the expense of the chip fabrication process. This barrier to market entry, or ‘moat’, would be flipped on its head by a business model such as ARM Holdings’.

These two clues – the easily dismissed processors in meager hand-held devices, and the inversion of the industry’s business model – would be apparent to anybody studying disruption theory today. However, we cannot blame Andy Grove for not being able to better articulate the ‘gut feeling’ he had in the late 1990s that disruption was about to befall Intel, when the father of Disruption Theory himself was still decades away from being disrupted on this point. Grove and Christensen: both great men, but not infallible.



[3] “Moore’s law” is the observation that, over the history of computing hardware, the number of transistors in a dense integrated circuit has doubled approximately every two years.
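The doubling rule in footnote [3] amounts to simple exponential growth. A minimal sketch of that arithmetic (the Intel 4004 starting figure and the exact two-year period are illustrative assumptions, not precise historical claims):

```python
def projected_transistors(initial_count, years, doubling_period=2.0):
    """Project a transistor count, assuming it doubles every `doubling_period` years."""
    return initial_count * 2 ** (years / doubling_period)

# Illustrative only: the Intel 4004 (1971) is often cited at roughly 2,300 transistors.
# Twenty years of doubling every two years gives a 2^10 = 1024x increase.
print(round(projected_transistors(2300, 20)))  # 2355200, i.e. ~2.4 million
```

Ten doublings in two decades turn thousands of transistors into millions, which is the scaling Grove’s Intel delivered in practice.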




[7] Arguably one more sophisticated than even Christensen himself understands – see my earlier post citing the TechCrunch article that points this out.



[10] Grove is also credited with the ‘Intel Inside’ and Pentium promotion that made ordinary consumers stop and consider the CPU in their machines.

[11] Nerd that I am, I owned one of these when they first came out.