The Rise of Google, Part I: A history lesson

by Dirk Knemeyer

This is part one of a three-part series that will detail Google’s rise to becoming the dominant company in the computing industry. Part one will review the history of IBM and Microsoft, Google’s predecessors in this position; part two will take a close look at the last decade in computing, and particularly at Google’s ascent within it; and part three will look into the future and help you understand what’s to come for Google and the rest of the industry.

Depending on which gushing analyst you listen to, Google’s release last week of the Nexus One “superphone” is going to change the computing industry. Some are pointing to the phone itself and the fact that Google is now officially a hardware company. Others are pointing to Google’s ecommerce store and direct approach to selling the phone, calling that the true harbinger of future dominance. Wrapped up in much of this excitement is a sense of surprise, as if these moves weren’t, at the very least, a predictable result of Google’s expanding influence on the industry over the past decade. This short-sighted breathlessness makes me wonder whether the people telling us what to think really know what they are talking about.

I foresaw Google’s emergence into all of these spaces five years ago, even suggesting in an article that their path into this diversification would be an acquisition of Apple. While that idiosyncratic prediction did not come true (yet!), I’ve had a fairly clear view of where Google was headed since 2003, and I want to share my take on what the Nexus One means and where Google is headed next.

The bottom line is that Google is in the process of taking over as the “Evil Empire of Computing,” supplanting the decade-in-decline Microsoft. To understand this, consider the historical context for companies rising and falling at the top of this industry:

IBM was the first iconic industry leader, riding their near-ubiquity as the business computing platform during the 1960s and 70s. Emerging as the pioneer of mainframe computing for large corporations, IBM was the unquestioned computing leader during those decades. They delivered an end-to-end service, providing the mainframes, terminals, operating systems, software, training, service … everything that businesses needed to leverage computers at the then-leading edge of technology. IBM was Big Blue, and Big Blue was big business.

In the 1980s, IBM remained the clear industry leader, but they spent the decade in a decline that can be traced to two specific things. First, Apple Computer emerged as the company creating the beachhead into home and personal computing thanks to their more humane computing environment and user interface. This movement is iconically represented by the legendary 1984 Super Bowl commercial, where Apple positioned IBM as the soulless and domineering Big Brother in contrast to their own fun, hip, approachable brand. The other, and ultimately more devastating, event came when Microsoft, which had been supplying IBM with its operating system, decided not to bend to IBM’s demands and instead broke off to develop its own OS independently, a path that eventually led to Windows. Make no mistake: it was the rise of Windows as the preferred platform for non-Apple personal computers that spelled the fall of IBM as the giant of computing, in spite of the latter’s continued strong presence in the mainframe and supercomputing worlds.

So rose Microsoft. By the 1990s the coronation was official: Microsoft had won. More than just operating systems, Microsoft was producing software of every possible kind. Microsoft was so dominant that the company was in the news for anti-trust and intellectual property lawsuits more than for their actual products. Microsoft WAS computing, and they probably would still remain incontrovertibly so if not for one little thing: the rise of the Internet.

As with most things, the impact and the degree of Microsoft’s eventual decline were not immediately clear. The early Internet was a clumsy and confusing thing. People knew that it mattered and that it was important, but nobody really knew how to take advantage of it. The early leaders were walled gardens, companies like Prodigy, CompuServe, and AOL that tried to offer “everything” to users in a closed and confined community, for a long time without access to the larger Internet. Microsoft’s response was an all-too-typical display of strong-arm business tactics: they completely crushed Netscape, the innovative early leader in web browsers, while attempting to control users’ interactions with the Internet by increasingly tying online functionality into the operating system itself. This “Internet thing” wasn’t going to stop Microsoft: Netscape was vanquished, and things were the same as they ever were in Microsoft’s Redmond, Washington stronghold.

Only, they weren’t. Microsoft succumbed to the classic error of clinging to an outdated business model, too fearful of losing a massive, proven market and revenue stream to embrace innovation. By ignoring the larger, underlying changes to computing that the Internet necessarily represented, Microsoft entered a decade of slow decline and, as importantly, opened the door for Apple’s very survival (not long ago, that survival was seen as tenuous), for the rapid and somewhat unprecedented growth of the open source movement, and, most ominously for Microsoft’s future, for the emergence of Google as the true giant, if not yet bully, of the industry playground.

In Part 2, The Rise of Google, we will look at the process Google took to claim the title of the most important company in computing. Part 3 will look ahead towards Google’s role in the new decade.
