This is my editorial for the Spring 2014 issue of HardCopy magazine:
Back in 1978, the BBC’s Horizon broadcast an episode called ‘Now the Chips Are Down’, which predicted mass unemployment as a result of the microprocessor. It was followed a year later by a six-part series on ITV which made similar predictions. So seriously were the issues taken that the government launched the Microelectronics Education Programme (MEP), which aimed to “help schools prepare children for a life in a society in which devices and systems based on microelectronics are commonplace and pervasive.” It called for revisions to the curriculum, specialist teacher training and the provision of microcomputers in secondary schools throughout the land. It also spawned the BBC Computer Literacy Project and the much-loved BBC Microcomputer. Read more…
This is my editorial for the Winter 2013 issue of HardCopy magazine:
There are two distinct sides to the computer industry. On one sit the hardware manufacturers. For them, each unit produced costs money to make and money to ship, and the industry operates in much the same way as that of the car or the TV. For the past decade, particularly since Apple adopted the Intel architecture, there has been little to distinguish one manufacturer’s product from another, which means a greater reliance on brand awareness. However competition is fierce and it is difficult for one brand to dominate the market for long. Apple only succeeded in carving a niche by firmly establishing itself early on as a supplier of luxury goods at premium prices.
On the other side are two industries that have benefited from hitherto unprecedented economies of scale, namely those involved in software and the silicon chip. Developing something like an Intel i7 Core processor or a modern operating system is extremely costly. Against that, the cost of creating a copy, or even millions of copies, is insignificant – and the more copies you create, the more thinly that initial investment is spread. Read more…
Matt Nicholson uncovers five quotes that define 30 years of computing
This article originally appeared in the Summer 2013 issue of HardCopy magazine.
Thirty years is a long time in any industry, let alone one as fast-moving as this. By 1983, when Grey Matter first set up shop, the IBM PC had been around for a couple of years, but there were only a few early clones about, and Apple had yet to launch the Macintosh. The Commodore 64 was more common than the IBM PC, and the Apple II was still going strong, as were the Sinclair Spectrum and the BBC Micro. In the business world, CP/M was more popular than MS-DOS.
But the computer industry is as much about personalities as it is about technology. We now regard people such as Bill Gates and Steve Jobs as being the movers and shakers, but at the time they were no more aware of the way the industry would grow and change than anyone else. And in hindsight, some of the things they said in the heat of the moment are worth revisiting.
“Well, Steve… I think it’s more like we both had this rich neighbour named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it.” Bill Gates, Microsoft, 1984
Contrary to common belief, the graphical user interfaces (GUIs) that we all ‘know and love’ were not invented by Apple or Microsoft. In fact, most of the relevant ingredients came out of work carried out for Xerox at its Palo Alto Research Center (PARC) much earlier. By the mid-1970s, researchers at Xerox PARC were sitting in front of GUIs running on networked workstations, sending each other emails, and printing on laser printers. However without microprocessors, the technology was prohibitively expensive.
Fascinated by the new microcomputers that were beginning to appear, and realising that the company they worked for was too big and cumbersome to compete, a small group of Xerox executives came to an agreement with Steve Jobs that would allow Xerox to purchase a $1m stake in Apple in exchange for a demonstration of this technology. The result was first the Apple Lisa and then the Apple Macintosh in 1984, both developed with the help of a number of engineers who left Xerox to join Apple. Read more…
This article originally appeared in the Summer 2013 issue of HardCopy magazine.
The annual Intel Software Conference is always worth attending, and not just for its location (this year, a converted chateau just outside Chantilly). Intel has been the driving force behind the personal computer industry from the beginning, and introduced this series of events in 2006 as a way of alerting developers to the need to master the parallel programming techniques that will allow them to take advantage of multi-core processors.
The industry has changed considerably in the intervening years, and so has this conference. Initially the focus was on Intel Core processors on the desktop and Intel Xeon on the server. Intel’s main weapon here has been Parallel Studio. Introduced in 2009, this collection of tools and libraries is aimed primarily at optimising C/C++ code for multi-core processing. Read more…
Identity theft can be devastating for both individuals and companies. Matt Nicholson finds out how you can combat it.
This article originally appeared in the Spring 2013 issue of HardCopy magazine.
1958 saw the publication of a short novel by science fiction writer Algis Budrys called Who? in which a Cold War scientist by the name of Dr Lucas Martino is caught in a devastating explosion at a secret research centre. He is ‘rescued’ by the Soviets who, in response to increasing diplomatic pressure, return him to the Americans several months later. However the man they return has undergone not only lengthy interrogation but also extensive surgery, to the extent that he is now unrecognisable. The rest of the book is devoted to the efforts taken by intelligence agent Shawn Rogers to determine whether this is actually Martino, who is vital to the Allied war effort, or a Soviet spy impersonating Martino, in which case he needs to be kept well away from Martino’s work. The task proves extremely difficult.
Although written over 50 years ago, the novel goes to the heart of an increasingly important problem: how we establish and protect our electronic identity. For most of us the solutions we adopt are laughably insecure, but the effects of identity theft can be absolutely devastating. Read more…
This is my editorial from the Spring 2013 issue of HardCopy magazine:
Over the past decade or so, those involved in the distribution of intellectual property (IP) have had their worlds turned inside out by the Internet. First it was the music business getting to grips with the illegal download of music through services such as Napster. More recently the film industry has had to watch DVD sales plummet as online streaming sites such as Netflix steal business away from high-street shops. Meanwhile there’s panic in the bookshops as they attempt to compete with Amazon and the downloadable eBook. About the only IP industry not affected is that of invention. As far as I know, patents continue to be licensed and sold in much the same way as they’ve always been.
The reason for this revolution lies in the fact that the Internet has all but eliminated the cost of distribution. Prior to the Internet, distributing IP such as music, film or books cost serious money. Vinyl records, CDs and DVDs had to be manufactured; paperbacks and hardbacks had to be printed and transported. But underneath it all was just raw data, and with the Internet, the incremental cost of distributing data is virtually nothing.
Of course one of the first industries to get to grips with the Internet was the software industry. Even before the Internet, distribution costs were fairly low, and it quickly became obvious that the value lay in the intellectual property – in other words the source code – rather than in the medium by which it was distributed, which is why you buy a licence to use software, rather than the software itself. This is of course also true of films and music and books, but here it is readily accepted that customers can sell the DVDs and CDs and paperbacks they have bought to their friends, because this is a clear transfer of use from one person to another.
Which makes the recent case of Usedsoft vs. Oracle particularly interesting. Like most software companies, what Oracle sells is a perpetual licence to use its software, rather than the software itself. German company Usedsoft, on the other hand, is in the business of buying and selling unused software licences. Despite vigorous argument from Oracle, the European Court of Justice has ruled that anyone in possession of a perpetual licence to use a computer program has the right to sell the licence on, even if they had originally downloaded the software from the author’s Web site. The ruling effectively puts software on the same footing as paperbacks and DVDs, thereby going against the generally accepted view that a perpetual licence cannot be transferred.
For the industry, volume licences which are renewed every year or so provide a measure of defence against the decision. However software distribution is already learning lessons from elsewhere. Services such as Adobe Creative Cloud or Microsoft Office 365, where software is sold on a monthly or annual subscription service, bear comparison with Netflix, which serves much the same purpose.
This is my editorial from the Winter 2012 issue of HardCopy magazine:
Cast your mind back to 1995. Microsoft was well established, with either MS-DOS or Windows running on over 90 percent of desktop computers. However Bill Gates was beginning to worry about a new phenomenon: the World Wide Web, which had been invented a few years previously.
In particular, he was worried by what a young man named Marc Andreessen was saying about his operating system. Andreessen was responsible for Netscape Navigator, the Web browser that was spreading like wildfire across his desktops, and Andreessen was starting to talk about something called the Internet Operating System: an environment in which Netscape Navigator became the user interface, giving access to applications that ran remotely across the Internet. Navigator was available not only for Windows but also for the Apple Macintosh and even for UNIX. In such a scenario, Windows became little more than a “poorly debugged set of device drivers.”
Other companies that had a bone to pick with Microsoft rallied to the call. Sun Microsystems had developed Java, which Netscape licensed so that its browser could run programs written in Java. Oracle supported the idea because its server technologies could help deliver the applications, and started talking about a Network Computer that would give users all they needed to access applications across the Internet – and that did not include Windows.
A few implementations did appear, including the Acorn Network Computer and the Sun JavaStation. However the initiative died a fairly swift death. The Internet was simply not fast enough, and the hardware too expensive, for it to succeed.
But fast-forward to the present day and the concept makes more sense, and indeed is in the ascendant. Its most faithful manifestations are to be found in the cloud-hosted applications that we interact with through the browser, such as Office Web Apps or Google Docs. Then there are the ‘apps’ that we find in iOS, Android, Windows Phone and now Windows 8. These have an architecture similar to that of Java, although I imagine that Andreessen would have been dismayed by their platform dependence and mutual incompatibility.
An integral part of this new Internet Operating System is the cloud architecture, and the emerging realisation that ‘cloud’ defines an architecture rather than a location. Its strength lies in its ability to operate out in the ‘public cloud’, but it also makes sense when applied in-house, and even more so in a hybrid architecture that allows any component to move seamlessly from location to location.
An important strength of such an architecture lies in its implications for privacy and security. Company data and bespoke applications are core business assets, and many companies are unhappy at the prospect of hosting them outside the firewall. Adopting a cloud-based architecture means that discussions about their location can revolve around business considerations, rather than technical issues.