
The future of computing

17 Feb 2011

What follows is a cut-down version of the cover feature of the 50th issue of HardCopy magazine, originally published Nov 2010:

Any attempt to foretell the future is fraught with difficulties, to say nothing of the potential for the sort of embarrassment embodied by Thomas Watson’s apocryphal 1943 statement, “I think there is a world market for about five computers.” Nevertheless, let’s give it a go.

Perhaps the safest place to start is with Moore’s Law. This was formulated by Gordon Moore, co-founder of Intel, and states that the number of components that it is economic to put on a single chip doubles every two years. It has remained true from the Intel 4004, introduced in 1971 with 2,300 transistors on the chip, up to Intel’s latest Quad-Core Itanium processor which crams on over two billion.
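
To put that doubling into numbers, here is a minimal C# sketch – a back-of-the-envelope projection rather than anything Moore himself published – that starts from the 4004’s 2,300 transistors in 1971 and doubles the count every two years. By 2010 it arrives at roughly 1.7 billion, the same ballpark as today’s two-billion-transistor parts.

```csharp
// Back-of-the-envelope projection of Moore's Law: start from the Intel 4004's
// 2,300 transistors in 1971 and double the count every two years.
using System;

class MooresLawSketch
{
    static void Main()
    {
        const double baseCount = 2300;  // Intel 4004, 1971
        const int baseYear = 1971;

        foreach (int year in new[] { 1971, 1981, 1991, 2001, 2010 })
        {
            double doublings = (year - baseYear) / 2.0;
            double projected = baseCount * Math.Pow(2, doublings);
            Console.WriteLine("{0}: ~{1:N0} transistors", year, projected);
        }
    }
}
```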

Up until 2004, Moore’s Law had been synonymous with processing power, although this had more to do with increasing clock speed than transistor density. However, in 2004 Intel realised that it could not increase clock speed any further with existing technology, and that the best way forward was through multi-core processors able to execute more than one stream of instructions at a time. The Intel Core Duo was introduced in 2006, and the Intel Core 2 Quad in 2008. At the time of writing, Intel’s Core i7 range includes two models boasting six cores, while Intel has talked about introducing processors with 64 cores sometime next year.

Of course programs won’t run any faster unless they are written to take advantage of parallel processors. Writing such code requires new disciplines, and although Intel will continue to introduce libraries and tools to simplify the process, it is the solution architect who needs to ‘think parallel’. It seems likely that such skills are going to be in high demand over the next few years – and in short supply if cuts in the education budget continue to bite.
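
To give a flavour of what ‘thinking parallel’ involves, here is a minimal C# sketch using the Parallel.For loop from the Task Parallel Library that ships with .NET 4 (the workload itself is invented for illustration). The interesting part is not the arithmetic but the pattern: each thread accumulates its own subtotal, and the subtotals are only combined at the end, rather than every core fighting over a single shared variable.

```csharp
// Minimal sketch: summing squares with the Task Parallel Library (.NET 4).
// Each thread keeps its own subtotal, and the subtotals are combined at the
// end, avoiding contention on a single shared variable.
using System;
using System.Threading.Tasks;

class ThinkParallel
{
    static void Main()
    {
        double[] data = new double[10000000];   // invented workload
        for (int i = 0; i < data.Length; i++) data[i] = i;

        double total = 0;
        object gate = new object();

        Parallel.For(0, data.Length,
            () => 0.0,                                           // per-thread subtotal
            (i, loop, subtotal) => subtotal + data[i] * data[i], // runs across all cores
            subtotal => { lock (gate) total += subtotal; });     // combine once per thread

        Console.WriteLine("Sum of squares: {0:E3}", total);
    }
}
```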

Looking further ahead, Intel has predicted that Moore’s Law will break down by around 2015, with quantum effects preventing components from getting any smaller. Whether or not this is the case, it does seem likely that new technologies will replace the silicon chip, which has been around for some 50 years. What these will be is open to question, but likely candidates include carbon nanotubes, which could allow the construction of semiconductor components at an atomic scale.

Then of course there’s quantum computing – computers that make use of quantum phenomena such as superposition and entanglement. Back in 2003, Grey Matter CTO Sean Wilson was already talking about such devices in this magazine, and here we are, seven years later, not much closer to fruition. Forced to make a prediction, I’d say that quantum computers will probably exist, perhaps towards the end of this decade, but that they will be highly specialist and very expensive.

Something that will improve our lives sooner rather than later is the demise of the magnetic disk drive. Solid State Drive (SSD) technology has been with us for a few years now, and will move to the mainstream within a year or two. Toshiba launched a laptop with a 512GB SSD over a year ago, and you can buy a 256GB SSD as a drop-in replacement for a hard disk for less than £400 right now. These drives are lightning fast in comparison to conventional hard disks, particularly when it comes to booting Microsoft Windows or launching Office. They should quickly lead to a new generation of notebook-style devices that are lighter and faster and have longer battery life than anything we’ve seen so far. Waiting for boot-up could at last be relegated to history.

Clients and clouds

Not only will our computers get faster and faster, but so will our networks. Nowadays, in the developed world at least, most of us use internal networks that operate at gigabits a second, and hardly notice whether we are accessing a hard disk on our own machine or on a server in the next building. The distinction only becomes apparent when we venture beyond our firewall. Even here we’re used to broadband speeds of many megabits per second, while large parts of China, Mexico and South Africa achieve only a couple of megabits a second. Much of the Third World has no broadband access at all, and O3b Networks is looking to use satellites to bring high speed Internet access to the ‘Other 3 billion’, starting in 2012.

All of which serves to blur the boundaries between one computer and the next, and make it less relevant where processing actually takes place. Virtualisation serves to disassociate the application from the physical hardware, and we can expect to see it becoming ubiquitous in future operating systems. Paul Stephens predicts the death of the Personal Computer by 2018, replaced by a credit-card sized module, or perhaps your mobile phone, which contains your personal desktop and communicates wirelessly with whatever user interface you happen to be near, whether it’s a TV, a touch pad or a screen and keyboard.

Cloud computing is also likely to become ubiquitous – so much so that, once teething problems over reliability and (more problematic) privacy are overcome, server applications are likely to be dished up in much the same way as electricity and telephone services are now. By the end of this decade, companies will no more think of maintaining their own data centres than they think of running their own power stations.

One thing that does seem to have doggedly stuck with us, despite many attempts to dislodge it, is the combination of keyboard, mouse and screen, together with the traditional desktop metaphor that (for most of us at least) arrived with the Apple Macintosh some 26 years ago. Accurate speech recognition has been with us for many years, and is very affordable with products like Dragon NaturallySpeaking, and yet very few of us actually talk to our computers. Once the stuff of science fiction, virtual reality headsets have also been available for some time, but are rarely seen outside the arcades. Indeed it is only in the gaming world that the more innovative interactive devices have gained a foothold.

The one exception is the touch screen which has come of age with portable devices such as the Apple iPad and phones running Google Android or Windows Phone 7. One development that we can expect to see more of over the next few years is ‘electronic paper’ of the sort found in devices such as the Amazon Kindle. Currently it’s an expensive technology and only practical in monochrome, but it does have the huge advantage of being reflective, and so readable in direct sunlight. We can expect to see colour versions becoming common on portable devices by the middle of the decade.

Looking further ahead, it is possible that we will see user interfaces that interact more directly with the brain. At the very least we can hope for devices that are more attentive to what we are doing, and don’t pop up intrusive dialogs just as we’re trying to finish a difficult paragraph.

Survival of the fittest

Microsoft, Apple, Amazon and Google: big names now, but will they still be around in 2023? This is perhaps the hardest area to predict. Microsoft and Apple have already been around for some 35 years, so perhaps their time is up – but then IBM was founded in 1911 and, although drastically downsized in the early 1990s, it is still here. Amazon and Google are rather younger, dating from 1994 and 1998 respectively, but still older than HardCopy.

Perhaps rather more fruitful is to look at the companies that have disappeared or been acquired since our first issue. The most active acquirers have been Adobe, which absorbed Macromedia (and with it Allaire) to become a major force in graphics and Web development; and Oracle, whose acquisitions have included BEA Systems, PeopleSoft, Siebel Systems, MySQL and of course Sun Microsystems. We’ve also seen HP take over Compaq; Micro Focus take over Borland and NuMega; Business Objects take over Crystal Reports, only to be taken over itself by SAP; and Symantec consolidate its position in the important security sector by absorbing Veritas, PGP and VeriSign’s security business.

If cloud computing does become prevalent over the next decade, then we can expect Microsoft, Amazon and Google to remain major players (although large-scale acquisitions can’t be ruled out). As for Apple, it’s built its reputation and its business as a supplier of stylish and desirable clients, and seems likely to do so for many years yet – although hopefully with healthy competition from the likes of HTC, Samsung and Sony.

Application development

The shift to the cloud is already having an impact on application development, and will continue to do so. Indeed Tim Anderson suggests that we could soon see Visual Studio Cloud Edition as a hosted service that you can connect to from any device. Once logged in you will be able to assemble a drag-and-drop HTML6 visual interface, connect it to standard service components, write a little C# code to stitch them together, and deploy it to the cloud.
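
As a purely hypothetical illustration of the ‘little C# code to stitch them together’, the sketch below wires two imaginary standard service components into a simple page that a visual front end could call – the interface and class names are invented for the example, not part of any real product.

```csharp
// Hypothetical 'standard service components' of the kind such a tool might offer.
using System;

interface ICustomerService { string LookupName(int customerId); }
interface IOrderService    { decimal OutstandingBalance(int customerId); }

// The 'little C# code' stitching the components together behind a visual front end.
class AccountSummaryPage
{
    private readonly ICustomerService customers;
    private readonly IOrderService orders;

    public AccountSummaryPage(ICustomerService customers, IOrderService orders)
    {
        this.customers = customers;
        this.orders = orders;
    }

    public string Render(int customerId)
    {
        return string.Format("{0} owes {1:C}",
            customers.LookupName(customerId),
            orders.OutstandingBalance(customerId));
    }
}

// Stub implementations so the sketch runs end to end.
class FakeCustomers : ICustomerService
{
    public string LookupName(int id) { return "Customer " + id; }
}

class FakeOrders : IOrderService
{
    public decimal OutstandingBalance(int id) { return 42.50m; }
}

class Demo
{
    static void Main()
    {
        var page = new AccountSummaryPage(new FakeCustomers(), new FakeOrders());
        Console.WriteLine(page.Render(7));
    }
}
```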

That said, the fragmentation of mobile platforms and the demise of the PC will complicate your deployment options. Tim also suggests that the security and usability benefits of locked-down devices mean that developers will increasingly have to go through an approval process and pay a commission to an app store to get their applications deployed to the general public.

On a less optimistic (but probably realistic) note, Kay Ewbank foresees the continued appearance of tools that claim to make database applications easier to develop, but in practice simply hide the complexities beneath another layer and generate applications that you could probably translate back to IBM COBOL without much effort, or much difference in user satisfaction. Indeed when it comes down to it, many of the innovations claimed today are simply reworked versions of solutions devised decades ago.

Of course the big question is whether HardCopy will still be in print in 2023, or exist solely as a multimedia experience that can be served up to the personal interface of your choice. Whichever it turns out to be, I’m looking forward to the ride.

2 Comments
  1. A Aer
    19 May 2011 13:28

    You’re too hard on Mr Watson: he was talking about the market AT THE TIME, the time being one in which computers were as big as rooms and as expensive as all hell. There is NO evidence he was saying what you seem to be implying, i.e. that ‘there will NEVER EVER be a market for more than about 5 computers’.


  2. Matt Nicholson (author)
    19 May 2011 15:33

    There doesn’t actually seem to be a great deal of evidence that he said it at all (hence ‘apocryphal’).

