The Verge article ‘This tiny electric car could be the future …’

The Nissan New Mobility Concept – apparently based on the Renault Twizy

The Verge article, entitled ‘This tiny electric car could be the future of urban transportation’, echoes my own articles here and here from a couple of weeks ago. (You heard it here first. ;-) )

With a vehicle modelled on the Renault Twizy (also mentioned in that piece), Nissan is conducting research and trials around the very questions I raised, for example:
– Business model: subscription?
– [From the article] “Is this a real trend? What would make a better product [for Nissan], if we need a better product? Is there interest? What are the demographic breakdowns? How do younger people use it, how do older people use it? How do females use it? How do males use it? How do those that are mobility challenged use it?” (Check out this for one example of the ‘Cambrian explosion’ of electric vehicles that has emerged in the past few years.)
Note also the writer’s criticisms of the vehicle, and how they echo the troubleshooting I said would need to be done to provide a user experience comparable to a car. From the article:
‘One of the models comes with a rear seat, but good luck comfortably fitting a full-grown adult back there for more than a few miles. And there are no side windows, so you’re probably going to want to avoid driving one in anything other than the best weather.’
As I argued in those earlier posts, these are precisely the types of issues Apple or some other prospective disruptor would need to solve: make the new personal transport vehicle as easy and comfortable to use as a car (or easier), and make it more convenient to park, drive, maintain and own.

Andy Grove’s legacy – a (slightly) dissenting view

Andy Grove – Legendary former CEO of Intel

With the recent passing of former Intel CEO, Andy Grove, there have been many tributes to his remarkable abilities and achievements,[1] not least of all, his ability to admit that he was wrong.[2]

This article does not attempt to detract from the great man he was and his incredible achievements. But in the harsh glare of history, there is one key mistake of his that is often overlooked. This article examines that mistake with the benefit of ‘20/20 hindsight’.

A Great Legacy: Avoiding Disruption Pt 1

First, though, we should put into context Grove’s achievements, which were truly world-transforming. Grove is credited with executing on his predecessor Gordon Moore’s famous ‘Moore’s Law’[3]; it was under Grove’s reign that much of that exponential progress was achieved.

Tributes extend even further, to Grove’s epitomizing and propagating Silicon Valley’s culture of continual, relentless improvement. Also, when faced in the 1980s with the existential threat of Japanese competitors ‘dumping’ dynamic random access memory (DRAM) chips – Intel’s core market at the time – it was Grove who pushed to leave the DRAM market and refocus upon the fledgling microprocessor business. One disruption event avoided!

The Celeron Chip

And again in 1997, Grove famously invited Clayton Christensen – the author of the now seminal book ‘The Innovator’s Dilemma’ and the man credited with coining the term ‘disruption’ in the sense we know it today – to speak to his employees. As this story from the New Yorker recounts:

‘Grove had sensed that something was moving around at the bottom of his industry, and he knew that this something was threatening to him, but he didn’t have the language to explain it precisely to himself, or to communicate to his people why they should worry about it. He asked Christensen to come out to Intel, and Christensen told him about the integrated mills and the mini mills, and right away Grove knew this was the story he’d been looking for.’[4]

From this meeting, Grove is said to have famously decided to produce the Celeron chip – a cheaper, lower-powered chip than Intel’s core offering at the time.

The Orthodox View: Grove’s successor, Paul Otellini, made the big miss for Intel

Consequently, Intel’s big ‘miss’ of sitting out the mobile chip market is seen as the fault of Grove’s successor, Paul Otellini. A typical account is that given by one of my favourite analysts, Ben Thompson, on his Stratechery website, in this case relating a story told to Alexis Madrigal at The Atlantic:[5]

‘There is a sense, though, that the company’s strategic position is much less secure than its financials indicate, thanks to Intel’s having missed mobile.

The critical decision came in 2005; Apple had just switched its Mac lineup to Intel x86 processors, but Steve Jobs was interested in another Intel product: the XScale ARM-based processor.

The device it would be used for would be the iPhone. Then-CEO Paul Otellini told Alexis Madrigal at The Atlantic what happened:

“We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we’d done it,” Otellini told me in a two-hour conversation during his last month at Intel. “The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do…At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.”’

Since that time, ARM Holdings has gone on to become ‘market dominant in the field of processors for mobile phones (smartphones or otherwise) and tablet computers’.[6]

My dissenting view: Grove made the big miss for Intel

In contrast to this mainstream view, I argue that the mistake was actually made on Grove’s watch. In my opinion, it was at that fateful 1997 meeting between Christensen and the people at Intel that a proper understanding of disruption theory, as we have since come to know it,[7] would have pointed to the likely disruptor of Intel’s core business.

It appears that all Grove and his people took away was that the disruption was going to ‘come from below’, i.e. from a cheaper competitor. Intel responded with the cheaper Celeron offering.

However, this was not the paradigmatic shift in thinking that Disruption Theory truly requires. Disruption Theory[8] goes further, suggesting that the competitor is likely to be so ‘asymmetric’ that the incumbent will not even think of the disrupting force as a threat.

Disruption: Personal Digital Assistants (PDAs) morph into Smartphones

In 1997 the eventual disruptor was already beginning to take shape in the form of personal digital assistant (PDA) handheld computers such as the PalmPilot.[9]

One of the original Personal Digital Assistants (PDAs) – the PalmPilot

With their puny processing power, limited functionality and gray-scale LCD screens, they were clearly no threat to the mighty Pentium processors for which Intel is still famous.[10] But in time, these PDAs would become the basis for the first smartphones, such as the Handspring Treo 180,[11] which used the PalmOS operating system.

The Handspring Treo ran the PalmOS operating system

Disruption: About the business model, not just the technology

What is more, ‘disruption’ in the Christensen sense also tends to come with a new business model. In other words, it is not just the technology that disrupts, but the business models that the technology enables. Think of Dell’s business model of selling personal computers directly online, versus the conventional retail model adopted up to that point.

ARM Holdings’ business model is a classic case of this. Rather than investing hundreds of millions in chip fabrication plants, ARM focused upon licensing its chip designs for others to fabricate.

To be fair to Grove, it is impossible to be omniscient – especially when he had already steered Intel around one major disruption. Instead, I look at the contribution (or failure?) of Christensen, who in his account[12] of the meeting professed to his clients at Intel that he didn’t know anything about the chip industry. But even a rudimentary understanding would have suggested that the industry’s Achilles heel lay in the expense of the chip fabrication process. This barrier to market entry, or ‘moat’, would be flipped on its head by a business model such as ARM Holdings’.

These two clues – the easily dismissed processors in those meagre handheld devices, and the inversion of the industry’s business model – should be apparent to anybody studying disruption theory today. However, we cannot blame Andy Grove for not being able to better articulate the ‘gut feeling’ he had in the late 1990s that disruption was about to befall Intel, when the father of Disruption Theory himself was still decades away from being ‘disrupted’ on this point. Grove and Christensen: both great men, but not infallible.



[3] “Moore’s law” is the observation that, over the history of computing hardware, the number of transistors in a dense integrated circuit has doubled approximately every two years. Source:
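The doubling described in this footnote can be sketched numerically. A toy projection, assuming (my assumption, not the footnote’s) the commonly cited baseline of roughly 2,300 transistors in Intel’s 1971 4004 chip:

```python
# Moore's law as stated above: transistor counts double roughly every
# two years. Illustrative figures only; the 1971 baseline of ~2,300
# transistors (Intel 4004) is an assumed starting point.

def transistors(year, base_year=1971, base_count=2300):
    """Project a transistor count assuming one doubling every two years."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011):
    print(year, round(transistors(year)))
```

Twenty years is ten doublings (a factor of 1,024), so even this crude sketch shows why the trend compounds from thousands of transistors to billions within a few decades.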




[7] Arguably a version more sophisticated than even Christensen himself understands – see my earlier post citing the TechCrunch article that points this out.



[10] Grove is also credited with the ‘Intel Inside’ and Pentium promotion that made ordinary consumers stop and consider the CPU in their machines.

[11] Nerd that I am, I owned one of these when they first came out.


How Government Investment in the Culture Economy Led to a Billion Dollar Industry

What burgeoning team sport phenomenon awarded over AU$20 million in prize money[1] this past August to a team of five players, where the youngest broke onto the international competitive scene last year at the tender age of 15 and the oldest is nicknamed ‘Old Man’[2] at a mere 27 years of age?

Here are some clues: its players bear nicknames like ‘Piglet’, ‘Faker’ and ‘Amazing’; its 119-pound (54kg) stars can mysteriously burn out[3] at the age of 21;[4] and its audience is already measured at 137 million people around the world.[5] Team names include Evil Geniuses, Cloud9 and fNatic.

I’ll give you just one more clue: The team is made up of what would typically be considered the least athletic people alive – geeks.

By now, most male readers under 30 will know what I’m talking about. The rest of you are probably scratching your heads at this perversely inverted world where pimply nerds are sports heroes worshipped by legions of female fans.[6]

The phenomenon in question is eSports, in which computer gamers play against each other, frequently online and, at the elite level, in the flesh at stadiums including Wembley Arena.[7] The game that awarded over $20 million in prize money is DOTA 2,[8] a computer game that allows multiple players to compete online in a virtual battle arena, or MOBA (Multiplayer Online Battle Arena) for short. DOTA itself is an acronym for ‘Defence of the Ancients’, in turn a fan-made spin-off of the extraordinarily popular ‘real-time strategy’ game[9] Warcraft III, published by Blizzard Entertainment. The ‘2’ in DOTA 2 marks it as the official successor to that mod, produced and distributed by Valve Corporation.[10]

But DOTA 2 is just the tip of the iceberg when it comes to eSports. Other games commanding multi-million dollar prize pools include League of Legends, Call of Duty and Smite[11] – just a few of the many video games now played competitively.

People from all corners of the ‘connected’ Earth play eSports against each other, making it, in some ways, even more international than soccer/football, where players are constrained by travel and passports. Of course, in the interests of fairness, and to make their competitions a compelling live event, most competitions at the elite level require players to compete at the same venue on the same equipment, live before an audience of screaming fans. Nevertheless, the purely online competitive component has its own sophisticated leaderboards, through which some child star players have emerged as ‘overnight’ sensations.[12]

Its nerdy star players look so much like you would expect professional video gamers to look that it makes any parent wonder about the future health of their boy – or their girl.[13] They hail from all the ‘nerd’ classes: pimply, deathly pale, skinny or overweight (but never physically well-developed), bespectacled, greasy-haired, often Asian (even two of the Canadian DOTA 2 world champions ‘Evil Geniuses’ are of Asian descent). Perhaps that is not surprising, considering professional teams have coaches and rigorous training regimes,[14] big brand-name sponsors,[15] as well as billionaire owners and backers,[16] just like ‘real’ sports teams.

What has this all got to do with the title of this article?

Here’s a hint: the ‘Super Nation’ of eSports is South Korea,[17] where the micro-momentary expression of a pro-gamer losing to the upstart wunderkind Faker has its own meme page.[18]

In an impressive display of government intervention triumphing over the free market, the Korean government made a conscious decision nearly 20 years ago to promote its ‘soft power’.[19] Frequently the historical whipping boy of its nearby Asian neighbours, China and Japan, and with a mere fraction of the population of those giants, Korea’s government felt it needed to somehow compete with its historical ‘big brothers’. During this time, not only did it provide universal superfast broadband, but it also sponsored the development of its key cultural industries, including film, television, popular music and, of course, gaming. The rest, as they say, is history.










[10] The history of how DOTA 2 came to be is itself an interesting illustration of the power of the crowd-sourcing phenomenon: a fan of the game, known only by the ‘handle’ (alias) of Eul, kicked off a chain of successive iterations by other fans adept at programming. Ultimately, Valve commissioned the last in this line of fans, ‘IceFrog’, to help build their official version of DOTA 2.



[13] All-female eSports teams exist (e.g. Girls HK, Team Siren) and, at present, a select few earn respectable prize money. However, they are still a minority in eSports.




[17]

[18] The original expression can be seen at around 13 seconds in at:


The Inventor of ‘Disruption’, disrupted?

In my post detailing my thinking on the Apple Car, I touched upon how Clayton Christensen – the person credited with coining the term ‘disruption’ in the post-Internet age – himself does not believe Apple is ‘disruptive’ according to his own definition.

The following article from TechCrunch details issues with Christensen’s definition, suggesting the definition itself has been ‘disrupted’.

Measuring What Matters – Evaluating Arts Impact

I will be presenting at the Creative Victoria Expert Arts Panel ‘Measuring What Matters – Evaluating Arts Impact’ on Wednesday 28 October, 3.25-5.00 pm at Treasury Theatre (Lower Plaza), 1 Macarthur St, East Melbourne.
The event is free but I think places might be limited, so for more information and to book, click here. 
I will be talking about evaluation methods in the field of arts, culture and social inclusion projects. Hilary Glow and Anne Kershaw from Deakin University will talk about evaluating the impact of arts and culture on wellbeing, and Mark Hogan from Regional Development Victoria will talk about evaluation methods in social impact for regional centres and hubs.
Please feel free to let your peers and friends know if you think this might be useful for them.