Matt Nicholson uncovers five quotes that define 30 years of computing
This article originally appeared in the Summer 2013 issue of HardCopy magazine.
Thirty years is a long time in any industry, let alone one as fast-moving as this. By 1983, when Grey Matter first set up shop, the IBM PC had been around for a couple of years, but there were only a few early clones about, and Apple had yet to launch the Macintosh. The Commodore 64 was more common than the IBM PC, and the Apple II was still going strong, as were the Sinclair Spectrum and the BBC Micro. In the business world, CP/M was more popular than MS-DOS.
But the computer industry is as much about personalities as it is about technology. We now regard people such as Bill Gates and Steve Jobs as being the movers and shakers, but at the time they were no more aware of the way the industry would grow and change than anyone else. And in hindsight, some of the things they said in the heat of the moment are worth revisiting.
“Well, Steve… I think it’s more like we both had this rich neighbour named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it.” Bill Gates, Microsoft, 1984
Contrary to common belief, the graphical user interfaces (GUIs) that we all ‘know and love’ were not invented by Apple or Microsoft. In fact, most of the relevant ingredients came out of work carried out for Xerox at its Palo Alto Research Center (PARC) much earlier. By the mid-1970s, researchers at Xerox PARC were sitting in front of GUIs running on networked workstations, sending each other emails, and printing on laser printers. However these machines were built before cheap microprocessors, which made the technology prohibitively expensive.
Fascinated by the new microcomputers that were beginning to appear, and realising that the company they worked for was too big and cumbersome to compete, a small group of Xerox executives came to an agreement with Steve Jobs that would allow Xerox to purchase a $1m stake in Apple in exchange for a demonstration of this technology. The result was first the Apple Lisa and then the Apple Macintosh in 1984, both developed with the help of a number of engineers who left Xerox to join Apple.
Steve Jobs realised early on that his new machines would not sell unless they came with useful applications, so in 1981 he demonstrated a prototype to Bill Gates. Microsoft agreed to write versions of its new spreadsheet and word processor applications for the Macintosh. These were developed under the guidance of Charles Simonyi who had also worked at Xerox PARC, and had also given Gates a demonstration of Xerox PARC technology.
Then, in 1982, Gates came face to face with Visi On, a GUI that was being developed by VisiCorp, which was already well known for VisiCalc. Not wanting to lose out, Gates instigated a new project that would eventually see the light of day as Microsoft Windows.
Gates had agreed not to release any kind of GUI until at least a year after January 1983, which was when the Macintosh was scheduled to launch. However Apple was running late, so when VisiCorp actually released Visi On that October, Gates decided he could wait no longer. Understandably upset, Jobs summoned Gates to Apple headquarters and promptly lost his temper. Gates’ reply hides a more complicated story, but he did have a point.
“What was often difficult for people to understand about the design was that there was nothing else beyond URIs, HTTP and HTML. There was no central computer ‘controlling’ the Web, no single network on which these protocols worked, not even an organisation anywhere that ‘ran’ the Web.” Tim Berners-Lee, CERN, 1990
By the end of the 1980s, the Internet was well established. However it was still largely controlled by the US government. This had ensured the widespread adoption of TCP/IP – at least within the US – but the network still contained many independent systems that had problems communicating with each other. Access for those who did not work in government or academic institutions was limited, although ordinary users could dial in to services such as CompuServe or CIX and participate in discussion groups and the like.
The situation was particularly difficult for an organisation such as the European Organisation for Nuclear Research (more commonly known as CERN) which had branches across Europe and links to the US. One person who found it annoying enough to do something about it was Tim Berners-Lee, a British contractor working at the CERN laboratory in Geneva. Berners-Lee wanted to create a system that would make it easier to bring together documents held in disparate systems, and he realised that such a system would have to be extremely simple if it was to have any chance of being widely adopted.
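The simplicity Berners-Lee was aiming for can be sketched in a few lines. The following Python standard-library snippet (the page content and loopback address are invented for illustration) serves two HTML documents over HTTP and then fetches one by its URI – nothing beyond URIs, HTTP and HTML, and no central computer in control:

```python
# A minimal sketch of the Web's three ingredients: a URI names a
# document, HTTP transfers it, and HTML links it to others.
# Standard library only; the pages themselves are invented.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGES = {
    "/": b"<html><body>Hello. See <a href='/next'>another page</a>.</body></html>",
    "/next": b"<html><body>A second document, on the same simple terms.</body></html>",
}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = PAGES.get(self.path)
        if body is None:
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Port 0 asks the OS for any free port; the server runs in a thread.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client side: resolve a URI with HTTP, receive HTML.
url = f"http://127.0.0.1:{server.server_port}/"
with urllib.request.urlopen(url) as resp:
    html = resp.read().decode()
print(html)
server.shutdown()
```

Any machine that speaks these protocols can join in, which is precisely why there was no need for a single network or an organisation that ‘ran’ the Web.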
The first Web site gained its first user on Christmas Day 1990, but growth was initially slow. By the start of 1993 there were around 50 Web sites running, and the figure grew to some 1,500 over the next 18 months. However this was just the start – there are now countless millions of Web sites across the globe.
“If I were running Apple, I would milk the Macintosh for all it’s worth – and get busy on the next great thing. The PC wars are over. Done. Microsoft won a long time ago.” Steve Jobs, Pixar, 1996
Steve Jobs was a genius, but he could also be very difficult to work with. By 1985, Apple could take no more and side-lined Jobs into a figurehead role. Jobs resigned in disgust and sold all but one of his shares in the company, netting him some $100 million. Over the years he got involved in various projects, including Pixar which he eventually owned outright. By 1996 he was worth well over a billion dollars.
Initially Apple did very well with the Macintosh range. However by the early 1990s it was heading for disaster, tied up in its own bureaucracy and unable to compete with low-cost PC clones and the increasingly powerful Microsoft Windows.
Jobs’ feelings towards Apple were complicated, perhaps best summed up by his friend Larry Ellison who likened it to the relationship between a married man and an old girlfriend who has fallen in with a bad crowd – wanting to help but wary of getting too involved. This off-hand comment to FORTUNE magazine describes exactly what Jobs did when he eventually returned to Apple in 1997, re-inventing the Macintosh with the iMac, and then launching the iPod in 2001 and the iPhone in 2007.
“There’s no magic line between an application and an operating system that some bureaucrat in Washington should draw. It’s like saying that as of 1932, cars didn’t have radios in them, so they should never have radios in them.” Bill Gates, Microsoft, 1997
The World Wide Web took Microsoft by surprise, and Gates was particularly disturbed by talk of an ‘Internet Operating System’ which would revolve around the Netscape browser and relegate Windows to little more than a “poorly debugged set of device drivers” (as Netscape founder Marc Andreessen tactfully put it). Microsoft’s response was to quickly bundle Internet Explorer into the forthcoming Windows 95, so ensuring that its icon would be prominently displayed on the screen of every new PC compatible. Unimpressed, Netscape and a number of other companies lobbied the US Department of Justice (DoJ) to file a petition holding Microsoft in contempt of court.
Microsoft had already been taken to task by the DoJ for the way in which it licensed PC manufacturers to distribute Windows. This had led to Microsoft agreeing, amongst other things, not to make the licensing of one product conditional on the licensing of another. However the agreement did add that this “shall not be construed to prohibit Microsoft from developing integrated products.” The case therefore hinged on whether Internet Explorer was an ‘integrated product’ or not.
Microsoft won the case on appeal, at which point the DoJ sued the company for violations of the Sherman Antitrust Act. The case dragged on for several years, at one point coming close to the company being ordered to split into two – one responsible for the operating system and the other for applications. In the event the government balked at breaking up one of the country’s most successful companies, and much of the case was thrown out following agreement that Microsoft would operate in a more open manner in future.
But that was not the end of it. Between 2003 and 2005, Microsoft paid out well over $5 billion in settlement of numerous cases arising from the DoJ’s findings, including over $600m to the European Commission. Despite that, Microsoft’s profit more than doubled over the same period, reaching $12.6 billion in 2006, by which time Internet Explorer accounted for more than 90 per cent of the browsers in use.
“The free lunch is over.” Herb Sutter, Microsoft, 2005
For most of the previous three decades, the speed at which microprocessors operated, as measured by clock frequency, had increased by a factor of ten every ten years, reaching 10MHz in 1980, then 100MHz by the mid-1990s, and on to 1.5GHz by 2000. However this trend faltered in 2003, hitting a wall at around 3.4GHz beyond which quantum effects and heat build-up made it uneconomic to go any faster. This was not the end of Moore’s Law – the number of transistors that can be fitted on a chip continues to grow exponentially – so Intel’s solution was to increase processing power not by raising clock frequency, but by putting more than one processing core on the same chip.
However, this posed problems for the programmer. Until that point, software developers had benefited from a ‘free lunch’: their software ran faster every time a faster chip came along, without them having to do anything. But as Herb Sutter warned in his article for Dr Dobb’s Journal, that was no longer the case. If programmers wanted to keep up with the times, they would need to rewrite their software to take advantage of multi-core processors.