I recently succumbed to my desires, and the repeated pleading of my wife (who was getting pretty fed up with me borrowing her iPad), and bought myself a tablet. I didn’t intend to. What I really wanted was a notebook PC; something that had a decent keyboard so I could actually do some proper work in a café or on a train, but small and light enough to fit in a shoulder bag. A MacBook Air would have done nicely, but I’m getting too old to learn yet another operating system, and they are pretty expensive. I toyed with the idea of a ‘convertible’, and even considered a Microsoft Surface.
But then I came across the Lenovo Miix 2, and I was hooked. It uses one of the latest quad-core Intel Atom processors and 32GB of solid state memory, which means it’s pretty fast and has a decent battery life. It runs the full version of Windows 8.1, and even comes with Microsoft Word, Excel, PowerPoint and OneNote 2013 pre-installed, at a third of the price of a MacBook Air. I took some convincing that I could live with an 8-inch screen, but it is incredibly light and really very usable. I haven’t touched my wife’s iPad since.
That said, there was one point during the set-up procedure which I found decidedly unsettling, and that was when I logged in to the device using my Microsoft account. It didn’t let me do this until I had typed in the two-step verification code that Microsoft texted to my phone, but no sooner had I done that than the Facebook app came live; the People app became populated with contacts pulled in from my Exchange account and the likes of LinkedIn and Twitter (including some I didn’t know I had); and I was logged into my OneDrive account which, as a Dropbox user, I’d long forgotten existed.
Of course this only happened because, over the course of the last few years, I’ve allowed various apps, particularly on my Windows Phone, to access various services. When a website asks if you want to log in using your Facebook account, rather than dream up and have to remember yet another password, it’s awfully tempting to say yes. However, this is the first time the full extent of the cross-authentication that I have authorised has been brought home to me, and with it the realisation that my only protection is the single password that gives me access to the device itself: once someone’s got through that, they’ve got access to a considerable chunk of my personal data.
It is all very convenient for the end user, but the convenience hides risks that the industry has little interest in bringing to our attention. Microsoft does provide some facilities for managing linked accounts, but the full implications are not obvious. I’m certainly not going to touch a PIN-less mobile payment system, for example, until I’m much more confident of what’s going on behind the scenes.
This is my editorial for the Summer 2014 issue of HardCopy magazine:
When, a couple of decades from now, we look back at the first few decades of the 21st century, what will be our take on the Internet? Like most people, I have always assumed that it will continue to grow, getting faster and more ubiquitous as technologies develop, and burgeoning with endpoints as the Internet of Things comes online, but remaining essentially the same as now. However, recently I have begun wondering whether the view might be somewhat different: that instead we will look back with fondness at an era when the Internet blossomed, before falling apart, an inevitable victim of the machinations of governments and corporations.
The Internet was recently described by Vladimir Putin as a “CIA project”, and he does have a point. It did indeed originate in a US government funded project to link organisations involved in the Cold War and the Space Race. However those organisations included the Stanford Research Institute, the University of Utah, MIT and Harvard, where the students who went on to create many of the technologies we now take for granted were given unprecedented levels of funding to research almost anything they wanted.
Military communications moved to MILNET in 1983, and then in the late 1980s, once what remained had developed into something capable of handling the traffic, the process of “commercialising and privatising” began. What we now call the Internet was officially opened for “private and business use” in 1992, and the first websites appeared shortly after. Thanks to the original investment of US taxpayers’ money, and the relatively enlightened manner in which it was handed over to the private sector, we now have a network that spans the globe and has in general been driven by a desire to create a level and secure playing field for everyone.
However that network is now under threat. Snowden’s revelations show that not only the National Security Agency but intelligence agencies around the world have been ‘hacking the Internet’ with gay abandon, often with the cooperation of the companies that run it. As The Economist stated in its article ‘The Snowden effect’ (24 Jan 2014), “the big consequence … will be that countries and companies will erect borders of sorts in cyberspace.” Then there is the Federal Communications Commission, which is looking to allow broadband providers to charge content companies for higher-speed connections, so creating a multi-tier Internet that gives priority to big business. And finally there’s the shadowy Trans-Pacific Partnership which has designs on our freedom of speech and right to privacy.
These are complex issues which make it difficult to raise awareness, but unless we do, we won’t know what we stood to lose until it’s already gone.
This is my editorial for the Spring 2014 issue of HardCopy magazine:
Back in 1978, the BBC’s Horizon broadcast an episode called ‘Now The Chips are Down’, which predicted mass unemployment as a result of the microprocessor. It was followed a year later by a six-part series on ITV which made similar predictions. So seriously were the issues taken that the government launched the Microelectronics Education Programme (MEP) which aimed to “help schools prepare children for a life in a society in which devices and systems based on microelectronics are commonplace and pervasive.” It called for revisions to the curriculum, specialist teacher training and the provision of microcomputers in secondary schools throughout the land. It also spawned the BBC Computer Literacy Project and the much-loved BBC Microcomputer. Read more…
This is my editorial for the Winter 2013 issue of HardCopy magazine:
There are two distinct sides to the computer industry. On one sit the hardware manufacturers. For them, each unit produced costs money to make and money to ship, and the industry operates in much the same way as that of the car or the TV. For the past decade, particularly since Apple adopted the Intel architecture, there has been little to distinguish one manufacturer’s product from another, which means a greater reliance on brand awareness. However competition is fierce and it is difficult for one brand to dominate the market for long. Apple only succeeded in carving a niche by firmly establishing itself early on as a supplier of luxury goods at premium prices.
On the other side are two industries that have benefited from hitherto unprecedented economies of scale, namely those involved in software and the silicon chip. Developing something like an Intel i7 Core processor or a modern operating system is extremely costly. Against that the cost of creating a copy, or even millions of copies, is insignificant – and the more copies you create, the thinner that initial investment gets spread. Read more…
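The arithmetic behind those economies of scale is easy to sketch. The figures below are invented purely for illustration, but they show how quickly a huge one-off development cost shrinks to almost nothing per copy once volumes climb:

```python
def average_cost(fixed_cost, marginal_cost, copies):
    """Average cost per copy when development is a one-off expense
    and each additional copy costs almost nothing to produce."""
    return fixed_cost / copies + marginal_cost

# Suppose an operating system costs $1bn to develop and $1 per copy
# to duplicate and distribute (hypothetical numbers):
for n in (1_000_000, 10_000_000, 100_000_000):
    print(f"{n:>11,} copies: ${average_cost(1_000_000_000, 1.0, n):,.2f} each")
```

At a million copies the development cost still dominates at over $1,000 per copy; at a hundred million it has fallen to around $11 — the dynamic that sets software and silicon apart from cars and TVs.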
Matt Nicholson uncovers five quotes that define 30 years of computing
This article originally appeared in the Summer 2013 issue of HardCopy magazine.
Thirty years is a long time in any industry, let alone one as fast-moving as this. By 1983, when Grey Matter first set up shop, the IBM PC had been around for a couple of years, but there were only a few early clones about, and Apple had yet to launch the Macintosh. The Commodore 64 was more common than the IBM PC, and the Apple II was still going strong, as were the Sinclair Spectrum and the BBC Micro. In the business world, CP/M was more popular than MS-DOS.
But the computer industry is as much about personalities as it is about technology. We now regard people such as Bill Gates and Steve Jobs as being the movers and shakers, but at the time they were no more aware of the way the industry would grow and change than anyone else. And in hindsight, some of the things they said in the heat of the moment are worth revisiting.
“Well, Steve… I think it’s more like we both had this rich neighbour named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it.” Bill Gates, Microsoft, 1984
Contrary to common belief, the graphical user interfaces (GUIs) that we all ‘know and love’ were not invented by Apple or Microsoft. In fact, most of the relevant ingredients came out of work carried out for Xerox at its Palo Alto Research Center (PARC) much earlier. By the mid-1970s, researchers at Xerox PARC were sitting in front of GUIs running on networked workstations, sending each other emails, and printing on laser printers. However without microprocessors, the technology was prohibitively expensive.
Fascinated by the new microcomputers that were beginning to appear, and realising that the company they worked for was too big and cumbersome to compete, a small group of Xerox executives came to an agreement with Steve Jobs that would allow Xerox to purchase a $1m stake in Apple in exchange for a demonstration of this technology. The result was first the Apple Lisa and then the Apple Macintosh in 1984, both developed with the help of a number of engineers who left Xerox to join Apple. Read more…
This article originally appeared in the Summer 2013 issue of HardCopy magazine.
The annual Intel Software Conference is always worth attending, and not just for its location (this year, a converted chateau just outside Chantilly). Intel has been the driving force behind the personal computer industry from the beginning, and introduced this series of events in 2006 as a way of alerting developers to the need to master the parallel programming techniques that will allow them to take advantage of multi-core processors.
The industry has changed considerably in the intervening years, and so has this conference. Initially the focus was on Intel Core processors on the desktop and Intel Xeon in the server room. Intel’s main weapon here has been Parallel Studio. Introduced in 2009, this collection of tools and libraries is aimed primarily at optimising C/C++ code for multi-core processing. Read more…
Identity theft can be devastating for both individuals and companies. Matt Nicholson finds out how you can combat it.
This article originally appeared in the Spring 2013 issue of HardCopy magazine.
1958 saw the publication of a short novel by science fiction writer Algis Budrys called Who? in which a Cold War scientist by the name of Dr Lucas Martino is caught in a devastating explosion at a secret research centre. He is ‘rescued’ by the Soviets who, in response to increasing diplomatic pressure, return him to the Americans several months later. However the man they return has undergone not only lengthy interrogation but also extensive surgery, to the extent that he is now unrecognisable. The rest of the book is devoted to the efforts taken by intelligence agent Shawn Rogers to determine whether this is actually Martino, who is vital to the Allied war effort, or a Soviet spy impersonating Martino, in which case he needs to be kept well away from Martino’s work. The task proves extremely difficult.
Although written over 50 years ago, the novel goes to the heart of an increasingly important problem: how we establish and protect our electronic identity. For most of us the solutions we adopt are laughably insecure, but the effects of identity theft can be absolutely devastating. Read more…