This is my editorial for the Spring 2015 issue of HardCopy magazine:
Recently I powered up my Windows Phone to discover that it had downloaded and installed a new app – not something I’d chosen for myself, you understand, but something that Microsoft obviously thought I needed. I am talking about Cortana, which (I assume) has appeared on countless other Windows Phones as well.
For those not blessed with a Windows Phone, Cortana is Microsoft’s answer to Apple’s Siri, Google Now or Amazon Echo, in that it’s an Intelligent Personal Assistant designed to feed you information tailored to your personal needs and desires. Such information can range from a reminder that you’ve got an appointment with your boss, to a notification that your favourite band is playing in Barcelona at the same time as you’re planning a short break, complete with links to the ticket office, the airline, and a bijou hotel close to the venue that it thinks you might like.
This is my editorial for the Winter 2014 issue of HardCopy magazine:
I recently succumbed to my desires, and the repeated pleading of my wife (who was getting pretty fed up with me borrowing her iPad), and bought myself a tablet. I didn’t intend to. What I really wanted was a notebook PC; something with a decent keyboard so I could actually do some proper work in a café or on a train, but small and light enough to fit in a shoulder bag. A MacBook Air would have done nicely, but I’m getting too old to learn yet another operating system, and they are pretty expensive. I toyed with the idea of a ‘convertible’, and even considered a Microsoft Surface.
But then I came across the Lenovo Miix 2, and I was hooked. It uses one of the latest quad-core Intel Atom processors with 32GB of solid state storage, which means it’s pretty fast and has a decent battery life. It runs the full version of Windows 8.1, and even comes with Microsoft Word, Excel, PowerPoint and OneNote 2013 pre-installed, all at a third of the price of a MacBook Air. I took some convincing that I could live with an 8-inch screen, but it is incredibly light and really very usable. I haven’t touched my wife’s iPad since.
This is my editorial for the Summer 2014 issue of HardCopy magazine:
When we look back at the first few decades of the 21st century, a couple of decades from now, what will be our take on the Internet? Like most people, I have always assumed that it will continue to grow, getting faster and more ubiquitous as technologies develop, and burgeoning with endpoints as the Internet of Things comes online, but remaining essentially the same as now. Recently, however, I have begun wondering whether the view might be somewhat different: that instead we will look back with fondness at an era when the Internet blossomed, before falling apart, an inevitable victim of the machinations of governments and corporations.
The Internet was recently described by Vladimir Putin as a “CIA project”, and he does have a point. It did indeed originate in a US government-funded project to link organisations involved in the Cold War and the Space Race. However, those organisations included the Stanford Research Institute, the University of Utah, MIT and Harvard, where the students who went on to create many of the technologies we now take for granted were given unprecedented levels of funding to research almost anything they wanted.
Military communications moved to MILNET in 1983, and then in the late 1980s, once what remained had developed into something capable of handling the traffic, the process of “commercialising and privatising” began. What we now call the Internet was officially opened for “private and business use” in 1992, and the first websites appeared shortly after. Thanks to the original investment of US taxpayers’ money, and the relatively enlightened manner in which it was handed over to the private sector, we now have a network that spans the globe and has in general been driven by a desire to create a level and secure playing field for everyone.
However, that network is now under threat. Snowden’s revelations show that not only the National Security Agency but intelligence agencies around the world have been ‘hacking the Internet’ with gay abandon, often with the cooperation of the companies that run it. As The Economist stated in its article ‘The Snowden effect’ (24 Jan 2014), “the big consequence … will be that countries and companies will erect borders of sorts in cyberspace.” Then there is the Federal Communications Commission, which is looking to allow broadband companies to charge for higher speed connections, so creating a multi-tier Internet that gives priority to big business. And finally there’s the shadowy Trans-Pacific Partnership, which has designs on our freedom of speech and right to privacy.
The complexity of these issues makes it difficult to raise awareness, but unless we do, we won’t know what we stood to lose until it’s already gone.
This is my editorial for the Spring 2014 issue of HardCopy magazine:
Back in 1978, the BBC’s Horizon broadcast an episode called ‘Now the Chips Are Down’, which predicted mass unemployment as a result of the microprocessor. It was followed a year later by a six-part series on ITV which made similar predictions. So seriously were the issues taken that the government launched the Microelectronics Education Programme (MEP), which aimed to “help schools prepare children for a life in a society in which devices and systems based on microelectronics are commonplace and pervasive.” It called for revisions to the curriculum, specialist teacher training and the provision of microcomputers in secondary schools throughout the land. It also spawned the BBC Computer Literacy Project and the much-loved BBC Microcomputer.
This is my editorial for the Winter 2013 issue of HardCopy magazine:
There are two distinct sides to the computer industry. On one side sit the hardware manufacturers. For them, each unit produced costs money to make and money to ship, and the industry operates in much the same way as the car or television industry. For the past decade, particularly since Apple adopted the Intel architecture, there has been little to distinguish one manufacturer’s product from another, which means a greater reliance on brand awareness. However, competition is fierce and it is difficult for one brand to dominate the market for long. Apple only succeeded in carving a niche by firmly establishing itself early on as a supplier of luxury goods at premium prices.
On the other side are two industries that have benefited from hitherto unprecedented economies of scale, namely those involved in software and the silicon chip. Developing something like an Intel Core i7 processor or a modern operating system is extremely costly. Against that, the cost of creating a copy, or even millions of copies, is insignificant – and the more copies you create, the thinner that initial investment gets spread.
Matt Nicholson uncovers five quotes that define 30 years of computing
This article originally appeared in the Summer 2013 issue of HardCopy magazine.
Thirty years is a long time in any industry, let alone one as fast-moving as this. By 1983, when Grey Matter first set up shop, the IBM PC had been around for a couple of years, but there were only a few early clones about, and Apple had yet to launch the Macintosh. The Commodore 64 was more common than the IBM PC, and the Apple II was still going strong, as were the Sinclair Spectrum and the BBC Micro. In the business world, CP/M was more popular than MS-DOS.
But the computer industry is as much about personalities as it is about technology. We now regard people such as Bill Gates and Steve Jobs as being the movers and shakers, but at the time they were no more aware of the way the industry would grow and change than anyone else. And in hindsight, some of the things they said in the heat of the moment are worth revisiting.
“Well, Steve… I think it’s more like we both had this rich neighbour named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it.” Bill Gates, Microsoft, 1984
Contrary to common belief, the graphical user interfaces (GUIs) that we all ‘know and love’ were not invented by Apple or Microsoft. In fact, most of the relevant ingredients came out of work carried out for Xerox at its Palo Alto Research Center (PARC) much earlier. By the mid-1970s, researchers at Xerox PARC were sitting in front of GUIs running on networked workstations, sending each other emails, and printing on laser printers. However without microprocessors, the technology was prohibitively expensive.
Fascinated by the new microcomputers that were beginning to appear, and realising that the company they worked for was too big and cumbersome to compete, a small group of Xerox executives came to an agreement with Steve Jobs that would allow Xerox to purchase a $1m stake in Apple in exchange for a demonstration of this technology. The result was first the Apple Lisa and then the Apple Macintosh in 1984, both developed with the help of a number of engineers who left Xerox to join Apple.
This article originally appeared in the Summer 2013 issue of HardCopy magazine.
The annual Intel Software Conference is always worth attending, and not just for its location (this year, a converted chateau just outside Chantilly). Intel has been the driving force behind the personal computer industry from the beginning, and introduced this series of events in 2006 as a way of alerting developers to the need to master the parallel programming techniques that will allow them to take advantage of multi-core processors.
The industry has changed considerably in the intervening years, and so has this conference. Initially the focus was on Intel Core processors on the desktop and Intel Xeon on the server. Intel’s main weapon here has been Parallel Studio. Introduced in 2009, this collection of tools and libraries is aimed primarily at optimising C/C++ code for multi-core processing.