Why Android isn’t the new Windows: the server-client system wasn’t around in the 1990s

Six months ago venture capitalist Fred Wilson sparked a slight controversy on his blog by stating that Android was going to be the dominant mobile phone operating system. He's recently returned to the subject, triumphantly boasting that the figures for February confirm his prediction that the mobile phone market is going to develop the way the PC market did in the 1990s, with Apple replaying its failed bid to offer an alternative to the PC and Android in the role of Windows:

It looks like the Verizon iPhone launch is helping iOS hold its own with 25% of the market. I expect (and hope) that iOS will remain a strong competitor to Android. But as I've been saying for several years now, I believe the mobile OS market will play out very similarly to Windows and Macintosh, with Android in the role of Windows. And so if you want to be in front of the largest number of users, you need to be on Android.

Last night Apple fanboi supremo John Gruber came up with what I thought was an unjustifiably moderate response to Mr Wilson, effectively saying the game was still Apple's to win, but without giving any real arguments to back up his position:

That’s the question of the decade. Is mobile going to work out like the console market, with a handful of competing and roughly equal major platforms? Or is it going to work out like the PC, where a lower-cost inferior licensed OS grows to an overwhelmingly dominant monopoly position?

It's unusual for me to be absolutely certain about markets, but there's never been the slightest doubt in my mind that this was not going to happen.

The factor Messrs Wilson and Gruber are failing to see is the one that has been driving every trend in the tech sector over the last fifteen years: the server/client model that emerged from the technological revolution unleashed by the Internet.

Windows' emergence as the dominant operating system for computers started in about 1990 (I can still remember buying my first copy of Word for Windows 1.0 in a Chicago software store in April of that year and being blown away by how user-friendly it was) and continued relentlessly until the mid-1990s, by which time it was essentially complete.

Why did this happen, when Apple had for some years already been offering an arguably superior operating system, one that Windows possibly plagiarised?

The reason's simple: the Internet wasn't yet in widespread use in the early 1990s.

This meant that when you wanted to use your computer (which, in those days, was still likely to be a desktop computer), you had to rely on software you installed from a floppy disk, and the applications you ran stored your information on your computer's hard drive.

For people who needed to work together, this meant sharing not only the same information, but also the same applications: in practice, if you wanted to share information with someone, you had to use the same operating system. This didn't apply to hermits who used their computers to work on their own and never shared the results of their labours with anyone, but even before the rise of social networking there weren't many pure hermits around. People used their computers to work, to a greater or lesser extent, with other people.

Sure, when someone told me they had a Mac and wanted my essay in Word for Mac format, there were ways I could get it to them: I could save the essay in Mac format using Word's built-in conversion tool; invest in a special utility to format a 3.5" diskette (a different size from the 5.25" floppies that many PCs still used) so that a Mac could read it; transfer my essay to it; and have it physically delivered or sent to its intended reader.

Most people didn't bother and just defaulted to the operating system everyone else used: Windows. As a result, developers also didn't often bother offering Mac versions of their applications. Not only were there fewer people using Macs, there was less software for them to use, and it was darned difficult building bridges between the two worlds. A classic chicken-and-egg situation.

Consider the same situation today. People no longer store the stuff they want to share on their own computers; most of it lives on an online server. When I send you an email, I use whatever server configuration I prefer (in my case, Google Apps), send the email using a client (in my case, Sparrow App), and the email gets stored on Google's servers. End of story. I don't even think about, let alone worry about, whether you're on a Mac, a PC or an iPhone, nor am I interested in whether you're also using Gmail/Google Apps, or whether you prefer Hotmail, AOL or Yahoo Mail (although I'd pity you if you told me you were). What combination of server and client you're using is immaterial; all that matters is that we're using the same protocol.
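To put that in concrete terms, here's a rough sketch in Python (using only the standard library's smtplib and email modules) of a client handing a message to a mail server over SMTP. The addresses, hostname and credentials are placeholders I've made up; the point is simply that any client, on any operating system, that speaks the protocol can talk to the same server.

```python
import smtplib
from email.message import EmailMessage

# Compose the message. Nothing here depends on the sender's operating
# system or choice of mail client; it's just the standard message format.
msg = EmailMessage()
msg["Subject"] = "Essay draft"
msg["From"] = "me@example.com"   # placeholder sender
msg["To"] = "you@example.com"    # placeholder recipient
msg.set_content("Here's the latest draft. Comments welcome.")

# Hand the message to the server over SMTP. Hostname, port and
# credentials are illustrative placeholders, not real account details.
with smtplib.SMTP_SSL("smtp.example.com", 465) as server:
    server.login("me@example.com", "app-password")
    server.send_message(msg)
```

The recipient could be reading this in Outlook on Windows, Mail on a Mac or whatever app their phone ships with; as long as their server speaks the same protocols, my choice of tools is invisible to them.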

So what Wilson is crucially failing to see is that his comparison with the pre-Internet Windows-Mac war is based on factors that no longer apply: today's overall tech ecosystem is based on the Internet—which it wasn't then—and the dichotomy between servers and clients means that information can be shared by anyone, irrespective of what operating system they're using.

The implications of this are huge: whereas in the 1990s, even if you were convinced that the Mac was the best operating system (which I'm not sure it was before OS X anyway, but that's another story), there wasn't much point in using it; today, it doesn't matter if the tools you use (operating systems and client apps) are a tiny niche, so long as you're accessing the same servers, using the same protocols, as the people with whom you're sharing your work.

You aren't really tied to an application, except in the very rare cases where using it requires you to store your data in a specific way that isn't easy to port elsewhere (yes, I'm looking at you, Facebook). If you want to switch to a different web host, email provider or text-editing app, you just buy the corresponding software with one click, usually for a fraction of what you had to pay for software that had to be physically delivered to you fifteen years ago. This standardisation and technical simplification has given developers both the means and the incentive to offer applications for most environments, so long as demand was forthcoming from customers willing to pay. Which brings me to my final point.

Despite the fact that it's considerably easier, for the above reasons, to develop client applications across platforms than it was fifteen years ago, Android is way behind the iPhone in the number of applications it actually sells to its users, and those applications are consistently of much lower quality. Developers just aren't interested in developing for Android. In considering whether Android is going to win, Fred Wilson doesn't so much as mention the fact that Android sucks. Not many of my friends use anything other than an iPhone, but the couple that do use Android hate it for its ugliness, a good part of which is the result of Google's fatally flawed policy of not reviewing apps before they become available. Compounding this strategic error, which makes finding a good Android app well-nigh impossible, Google chose to allow the phone operators that use the software to install proprietary versions of it on the phones they sell. Once they've pocketed the price of the phone, those operators have no incentive to maintain that software, which they invariably pack with useless proprietary bloat in an attempt to lock their subscribers into their own ecosystem of junk.

In contrast, each of Apple's iOS software updates is eagerly awaited by phone owners and developers alike. Because those updates are out of the control of the phone companies, the investment you made in your phone isn't rapidly depreciated. If I wanted to, I could still use any of the five iPhones I've owned since 2007 and have a meaningful user experience on them, ensuring that the iPhone will remain both more relevant and vastly more profitable than the Android, irrespective of its market share.

Being 'the new Windows' was always going to be a dubious distinction. I've absolutely no doubt as to which is the superior system: developers will continue to prefer the iPhone, because it's so much better. The Android won't disappear, of course: it'll remain, I'm afraid, a second-best, substandard—and, crucially, lower-margin—alternative for the undiscerning hoi polloi.