This is my editorial for the Summer 2014 issue of HardCopy magazine:
When we look back, a couple of decades from now, at the first few decades of the 21st century, what will be our take on the Internet? Like most people, I have always assumed that it will continue to grow, getting faster and more ubiquitous as technologies develop, and burgeoning with endpoints as the Internet of Things comes online, but remaining essentially the same as now. However, recently I have begun to wonder whether the view might be somewhat different: that instead we will look back with fondness at an era when the Internet blossomed before falling apart, an inevitable victim of the machinations of governments and corporations.
The Internet was recently described by Vladimir Putin as a “CIA project”, and he does have a point. It did indeed originate in a US government-funded project to link organisations involved in the Cold War and the Space Race. However, those organisations included the Stanford Research Institute, the University of Utah, MIT and Harvard, where the students who went on to create many of the technologies we now take for granted were given unprecedented levels of funding to research almost anything they wanted.
Military communications moved to MILNET in 1983, and then in the late 1980s, once what remained had developed into something capable of handling the traffic, the process of “commercialising and privatising” began. What we now call the Internet was officially opened for “private and business use” in 1992, and the first websites appeared shortly after. Thanks to the original investment of US taxpayers’ money, and the relatively enlightened manner in which it was handed over to the private sector, we now have a network that spans the globe and has in general been driven by a desire to create a level and secure playing field for everyone.
However, that network is now under threat. Snowden’s revelations show that not only the National Security Agency but intelligence agencies around the world have been ‘hacking the Internet’ with gay abandon, often with the cooperation of the companies that run it. As The Economist stated in its article ‘The Snowden effect’ (24 Jan 2014), “the big consequence … will be that countries and companies will erect borders of sorts in cyberspace.” Then there is the Federal Communications Commission, which is looking to allow broadband providers to charge content companies for higher-speed connections, so creating a multi-tier Internet that gives priority to big business. And finally there’s the shadowy Trans-Pacific Partnership, which has designs on our freedom of speech and right to privacy.
These are complex issues, which makes it difficult to raise awareness; but unless we do, we won’t know what we stood to lose until it’s already gone.
This is my editorial for the Spring 2014 issue of HardCopy magazine:
Back in 1978, the BBC’s Horizon broadcast an episode called ‘Now The Chips are Down’, which predicted mass unemployment as a result of the microprocessor. It was followed a year later by a six-part series on ITV which made similar predictions. So seriously were the issues taken that the government launched the Microelectronics Education Programme (MEP) which aimed to “help schools prepare children for a life in a society in which devices and systems based on microelectronics are commonplace and pervasive.” It called for revisions to the curriculum, specialist teacher training and the provision of microcomputers in secondary schools throughout the land. It also spawned the BBC Computer Literacy Project and the much-loved BBC Microcomputer.
This is my editorial for the Winter 2013 issue of HardCopy magazine:
There are two distinct sides to the computer industry. On one sit the hardware manufacturers. For them, each unit produced costs money to make and money to ship, and the industry operates in much the same way as that of the car or the TV. For the past decade, particularly since Apple adopted the Intel architecture, there has been little to distinguish one manufacturer’s product from another, which means a greater reliance on brand awareness. However, competition is fierce and it is difficult for one brand to dominate the market for long. Apple only succeeded in carving a niche by firmly establishing itself early on as a supplier of luxury goods at premium prices.
On the other side are two industries that have benefited from hitherto unprecedented economies of scale, namely those involved in software and the silicon chip. Developing something like an Intel Core i7 processor or a modern operating system is extremely costly. Against that, the cost of creating a copy, or even millions of copies, is insignificant – and the more copies you create, the thinner that initial investment gets spread.
Matt Nicholson uncovers five quotes that define 30 years of computing
This article originally appeared in the Summer 2013 issue of HardCopy magazine.
Thirty years is a long time in any industry, let alone one as fast-moving as this. By 1983, when Grey Matter first set up shop, the IBM PC had been around for a couple of years, but there were only a few early clones about, and Apple had yet to launch the Macintosh. The Commodore 64 was more common than the IBM PC, and the Apple II was still going strong, as were the Sinclair Spectrum and the BBC Micro. In the business world, CP/M was more popular than MS-DOS.
But the computer industry is as much about personalities as it is about technology. We now regard people such as Bill Gates and Steve Jobs as being the movers and shakers, but at the time they were no more aware of the way the industry would grow and change than anyone else. And in hindsight, some of the things they said in the heat of the moment are worth revisiting.
“Well, Steve… I think it’s more like we both had this rich neighbour named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it.” Bill Gates, Microsoft, 1984
Contrary to common belief, the graphical user interfaces (GUIs) that we all ‘know and love’ were not invented by Apple or Microsoft. In fact, most of the relevant ingredients came out of work carried out for Xerox at its Palo Alto Research Center (PARC) much earlier. By the mid-1970s, researchers at Xerox PARC were sitting in front of GUIs running on networked workstations, sending each other emails, and printing on laser printers. However, without microprocessors, the technology was prohibitively expensive.
Fascinated by the new microcomputers that were beginning to appear, and realising that the company they worked for was too big and cumbersome to compete, a small group of Xerox executives came to an agreement with Steve Jobs that would allow Xerox to purchase a $1m stake in Apple in exchange for a demonstration of this technology. The result was first the Apple Lisa and then the Apple Macintosh in 1984, both developed with the help of a number of engineers who left Xerox to join Apple.
This article originally appeared in the Summer 2013 issue of HardCopy magazine.
The annual Intel Software Conference is always worth attending, and not just for its location (this year, a converted chateau just outside Chantilly). Intel has been the driving force behind the personal computer industry from the beginning, and introduced this series of events in 2006 as a way of alerting developers to the need to master the parallel programming techniques that will allow them to take advantage of multi-core processors.
The industry has changed considerably in the intervening years, and so has this conference. Initially the focus was on Intel Core processors on the desktop and Intel Xeon processors in servers. Intel’s main weapon here has been Parallel Studio. Introduced in 2009, this collection of tools and libraries is aimed primarily at optimising C/C++ code for multi-core processing.
Identity theft can be devastating for both individuals and companies. Matt Nicholson finds out how you can combat it.
This article originally appeared in the Spring 2013 issue of HardCopy magazine.
1958 saw the publication of a short novel by science fiction writer Algis Budrys called Who?, in which a Cold War scientist by the name of Dr Lucas Martino is caught in a devastating explosion at a secret research centre. He is ‘rescued’ by the Soviets who, in response to increasing diplomatic pressure, return him to the Americans several months later. However, the man they return has undergone not only lengthy interrogation but also extensive surgery, to the extent that he is now unrecognisable. The rest of the book is devoted to the efforts taken by intelligence agent Shawn Rogers to determine whether this is actually Martino, who is vital to the Allied war effort, or a Soviet spy impersonating Martino, in which case he needs to be kept well away from Martino’s work. The task proves extremely difficult.
Although written over 50 years ago, the novel goes to the heart of an increasingly important problem: how we establish and protect our electronic identity. For most of us the solutions we adopt are laughably insecure, but the effects of identity theft can be absolutely devastating.
This is my editorial from the Spring 2013 issue of HardCopy magazine:
Over the past decade or so, those involved in the distribution of intellectual property (IP) have had their worlds turned inside out by the Internet. First it was the music business getting to grips with the illegal download of music through services such as Napster. More recently the film industry has had to watch DVD sales plummet as online streaming sites such as Netflix steal business away from high-street shops. Meanwhile there’s panic in the bookshops as they attempt to compete with Amazon and the downloadable eBook. About the only IP industry not affected is that of invention. As far as I know, patents continue to be licensed and sold in much the same way as they’ve always been.
The reason for this revolution lies in the fact that the Internet has all but eliminated the cost of distribution. Prior to the Internet, distributing IP such as music, film or books cost serious money. Vinyl records, CDs and DVDs had to be manufactured; paperbacks and hardbacks had to be printed and transported. But underneath it all was just raw data, and with the Internet, the incremental cost of distributing data is virtually nothing.
Of course one of the first industries to get to grips with the Internet was the software industry. Even before the Internet, distribution costs were fairly low, and it quickly became obvious that the value lay in the intellectual property – in other words the source code – rather than in the medium by which it was distributed, which is why you buy a licence to use software, rather than the software itself. This is of course also true of films and music and books, but here it is readily accepted that customers can sell the DVDs and CDs and paperbacks they have bought to their friends, because this is a clear transfer of use from one person to another.
Which makes the recent case of Usedsoft vs. Oracle particularly interesting. Like most software companies, what Oracle sells is a perpetual licence to use its software, rather than the software itself. German company Usedsoft, on the other hand, is in the business of buying and selling unused software licences. Despite vigorous argument from Oracle, the European Court of Justice has ruled that anyone in possession of a perpetual licence to use a computer program has the right to sell that licence on, even if the software was originally downloaded from the author’s Web site. The ruling effectively puts software on the same footing as paperbacks and DVDs, so going against the generally accepted view that a perpetual licence cannot be transferred.
For the industry, volume licences, which are renewed every year or so, provide a measure of defence against the decision. However, software distribution is already learning lessons from elsewhere. Services such as Adobe Creative Cloud or Microsoft Office 365, where software is sold on a monthly or annual subscription, bear comparison with Netflix, which serves much the same purpose.