Steve Jobs is still retired, but earlier today I heard an interesting commentary on the role he played not only for Apple but for the industry.
The source was Leo Laporte, who hosts a technology-oriented radio show on weekends here. For almost 30 years, I’ve had a love-hate relationship with computer journalists (typically over their bias, lack of technical knowledge and opinionated self-importance) but I’d have to rank Laporte as one of the best — and perhaps the best in the electronic media.
What caught my attention was when Laporte listed all the things that Apple didn’t invent but perfected:
- Apple II (1977) was not the first PC (1975 or perhaps 1974)
- Mac (1984) was 3 years after the IBM PC (1981)
- iPod (2001) was not the first MP3 player
- iTunes was not the first music software
- iPad (2010) trailed Chairman Bill’s failed efforts at making a tablet by a decade
- iPhone (2007) was not the first smartphone
Reading between the lines, it’s notable that none of these Apple breakthroughs happened during the Jobs interregnum (1985-1997). Apple did do a few important things during this period — QuickTime and the first PowerBook — but it’s easy to argue that none rise to the level of the Mac, the iPhone or the iPad.
As Laporte noted, Steve Jobs was not about making wads of money but about changing the world. He once exhorted his workers to be “insanely great,” which is how Apple transformed existing product categories into something new. (Management scholars might say Apple created the Dominant Design.)
Also as Laporte noted, the world that Apple created is probably different from the world that would have existed without Apple.
I’ve read every major book published about Apple in the 20th century, and a few of the later ones. The early histories make two things clear. First, Steve and Steve and the early employees were about changing the world. Second, their vision was creating computing for the masses, i.e., making computing personal. (And arguments about whether the Mac stole from PARC miss the point that Xerox was never going to change the world with a $15,000 workstation targeted at the Fortune 500.)
If you look at Atari and Commodore and some of the other early consumer PC pioneers, they had a completely different DNA. They were about selling lots of products, but they didn’t really care whether they got used or not. (The Commodore 64 was a powerful machine for its day, but without decent software I suspect most got stored in a closet as ours eventually did.)
Instead of just peddling boxes, Apple created the K-12 education market, a market it owned until (post-Steve) fears about its future (and the abandonment of the Apple II) created an opening for Dell to sell commodity PeeCees. Of course, much of this success eventually accrued to Apple’s benefit, with half of US college students wanting a Mac over a PC.
Meanwhile, the decentralized third-party software ecosystem pioneered by the Apple II has been replicated over and over — in the IBM PC, Macintosh, Windows, Xbox/PS/Game Boy, the iPhone, iPad and Android, to name a few. The idea that you make a cool box, publish APIs and wait for third parties to provide the missing pieces is now the norm for any software-enabled device.
As someone who now teaches bioscience students — many of whom will go into drug discovery and medical devices — I want to see how we can translate and infuse this passion for making a difference into their careers. People tend to hold up Steve Jobs as a brilliant product designer — and industry strategist — but they tend to forget his focus on changing the world.