How Windows is shooting itself in the foot
What follows is my editorial from the February 2011 issue of HardCopy magazine:
I recently bought myself a brand new notebook PC. It’s not something I’ve done for a while, but the battery life and performance of my previous model, a dinky little JVC number that seemed so sweet when I bought it some eight years back, were no longer tolerable. I wanted Windows 7, I wanted dual-core, I wanted light weight and I wanted a battery that would last at least a return trip to London, if not a transatlantic flight. I dabbled momentarily with the idea of an iPad, but I needed something that could integrate seamlessly into my Windows-centric world, and I baulked at the lack of a keyboard. Netbooks seemed tempting but underpowered, so in the end I plumped for a very natty-looking notebook from a very respected manufacturer who shall, for reasons that will become apparent, remain nameless. Read more…
The future of computing
What follows is a cut-down version of the cover feature of the 50th issue of HardCopy magazine, originally published Nov 2010:
Any attempt to foretell the future is fraught with difficulties, to say nothing of the potential for the sort of embarrassment embodied by Thomas Watson’s apocryphal 1943 statement, “I think there is a world market for about five computers.” Nevertheless, let’s give it a go.
Perhaps the safest place to start is with Moore’s Law. This was formulated by Gordon Moore, co-founder of Intel, and states that the number of components that it is economic to put on a single chip doubles every two years. It has remained true from the Intel 4004, introduced in 1971 with 2,300 transistors on the chip, up to Intel’s latest Quad-Core Itanium processor, which crams in over two billion.
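As a rough back-of-the-envelope check of those figures (my own, not from the article), doubling from 2,300 transistors in 1971 every two years lands in the right order of magnitude for 2010. A few lines of C# do the sum:

    using System;

    class MooresLaw
    {
        static void Main()
        {
            double transistors1971 = 2300;   // Intel 4004, 1971
            int years = 2010 - 1971;         // doubling every two years
            double projected = transistors1971 * Math.Pow(2, years / 2.0);
            Console.WriteLine("Projected for 2010: {0:N0} transistors", projected);
            // Prints roughly 1.7 billion - the same order of magnitude as
            // Intel's two-billion-transistor quad-core Itanium.
        }
    }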
Up until 2004, Moore’s Law had been synonymous with processing power, although this had more to do with increasing clock-speed than transistor density. However in 2004, Intel realised that it could not increase clock speed any further with existing technology, and that the best way forward was through multi-core processors able to execute more than one instruction at a time. The Intel Core Duo was introduced in 2006, and the Intel Core 2 Quad in 2008. At the time of writing, Intel’s Core i7 range includes two models boasting six cores, while Intel has talked about introducing processors with 64 cores sometime next year. Read more…
Accessing 32-bit DLLs from 64-bit code
Migrating your 32-bit Windows application to a 64-bit machine can be problematic if you have 32-bit DLLs that you cannot re-write. Mike Becker shows you how you can access 32-bit DLLs from 64-bit code using built-in IPC mechanisms.
Originally published on DNJ Online, June 2007
Microsoft’s 64-bit technology first appeared with Windows Server 2003 for Itanium 2 (also known as IA64 Architecture) and for eXtended technology CPUs (also known as x64 Architecture). It offers many advantages but also raises new issues for the software developer. For example, you may still need to access existing 32-bit DLLs from a 64-bit process.
A key advantage of 64-bit technology is its ability to address up to 8TB of memory, against a maximum of 2GB for 32-bit processes. As a result, 64-bit technology allows most data processing to take place in memory, without any need for temporary disk storage. This can considerably increase performance and open up new data processing scenarios. There are therefore good arguments for migrating current 32-bit software products to a 64-bit platform.
Many C or C++ applications are easy to migrate to a 64-bit platform, particularly if they are written in a monolithic fashion. Sometimes they just need to be rebuilt with an x64/IA64 compiler to run as native 64-bit applications. However, distributed or module-based software can cause more problems.
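This excerpt doesn’t show which IPC mechanism Mike Becker settles on, but the general shape is a 32-bit surrogate process that loads the legacy DLL and answers requests from the 64-bit application. The C# sketch below is my own illustration of the 64-bit side talking to such a surrogate over a named pipe; the pipe name, message format and surrogate itself are hypothetical.

    using System;
    using System.IO;
    using System.IO.Pipes;

    class Client64
    {
        static void Main()
        {
            // A separately built 32-bit surrogate EXE is assumed to be running,
            // listening on this pipe and P/Invoking the legacy 32-bit DLL.
            using (var pipe = new NamedPipeClientStream(".", "LegacyDllPipe",
                                                        PipeDirection.InOut))
            {
                pipe.Connect();
                var writer = new StreamWriter(pipe) { AutoFlush = true };
                var reader = new StreamReader(pipe);

                writer.WriteLine("ComputeChecksum C:\\data\\orders.dat"); // request
                Console.WriteLine(reader.ReadLine());                     // reply
            }
        }
    }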
Inside Open XML
Open XML is not just a new file format for the latest version of Microsoft Office, but an open standard capable of expressing any Word, Excel or PowerPoint document.
Originally published on DNJ Online, Jun 2007
The 2007 Microsoft Office system brings many changes but the most significant, as far as organisations of any size are concerned, is one that few end-users will ever see. This is the new Office Open XML file format used by Word, Excel and PowerPoint 2007 to store documents. For Office 2007, Microsoft has adopted a native file format that, while able to support all the features of Office documents back to Office 2000, is also an open standard capable of being read, manipulated and extended by third-parties.
The Office Open XML standard was approved by Ecma International (previously the European Computer Manufacturers Association) in December 2006 and represents the collaborative effort of some 20 members including vendors such as Apple, Intel and Novell, and organisations such as BP, Barclays Capital, The British Library and the US Library of Congress. It has now been submitted to the ISO/IEC Joint Technical Committee for ratification. The full specification runs to over 6,000 pages but a useful overview can also be found at www.ecma-international.org/publications/standards/Ecma-376.htm. Read more…
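Under the covers an Open XML document is an ordinary ZIP package of XML parts, which is why third parties can read it without Office installed. The sketch below is my own illustration, using the System.IO.Compression classes that arrived in later versions of .NET; ‘report.docx’ is a hypothetical file.

    using System;
    using System.IO.Compression;

    class InspectDocx
    {
        static void Main()
        {
            using (ZipArchive package = ZipFile.OpenRead("report.docx"))
            {
                foreach (ZipArchiveEntry part in package.Entries)
                {
                    // word/document.xml holds the main body text; other parts
                    // hold styles, relationships, images and so on.
                    Console.WriteLine(part.FullName);
                }
            }
        }
    }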
Think Parallel
Limitations in processor design mean the ‘free lunch’ is over for software developers, as we come to terms with the fact that chips simply aren’t going to get much faster. Matt Nicholson went to Think Parallel, Intel’s conference held in Lisbon in April 2007, to find out more.
Originally published on DNJ Online, Jun 2007
Up until now, programmers have had it easy. Ever since the first 16-bit processors appeared in the late 1970s, processor speeds have been increasing at an exponential rate, doubling every two years or, more recently, every 18 months. The modern Intel Pentium 4 offers 10,000 times the processing speed of the 8088 found in the original IBM PC. However, the essential architecture has remained unchanged, which means that programs which ran on last year’s processors continue to run on today’s, only faster. They may need to be recompiled to take advantage of the latest technology, but in the end it has always been hard disk or network access that caused the bottleneck, and that wasn’t a coder’s problem.
However, as we found out at Think Parallel, Intel’s EMEA Channel Conference 2.0 held in Lisbon in April 2007, that’s about to change. In the opening keynote, Herb Sutter, Microsoft Software Architect and chair of the C++ Standards Committee, told us: “Your free lunch is over.” Up until recently, clock speed had increased along with processing speed, from the 4.77MHz of the 8088 to the 3.2GHz of the Pentium 4. However, Intel abandoned plans for a 4GHz processor in 2004, and since then maximum clock speeds have risen to only around 3.8GHz. Read more…
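To give a flavour of what ‘thinking parallel’ means in practice, here is a sketch of my own, using the Task Parallel Library that shipped after this article in .NET 4: a simple summation rewritten so the work spreads across whatever cores are available.

    using System;
    using System.Threading.Tasks;

    class SumOfSquares
    {
        static void Main()
        {
            double[] data = new double[10000000];
            for (int i = 0; i < data.Length; i++) data[i] = i;

            double total = 0;
            object gate = new object();

            // Each core accumulates a private partial sum; the partials are
            // combined under a lock to avoid a data race on 'total'.
            Parallel.For(0, data.Length,
                () => 0.0,
                (i, loop, partial) => partial + data[i] * data[i],
                partial => { lock (gate) total += partial; });

            Console.WriteLine(total);
        }
    }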
Writing secure code
Security can seem a daunting subject but there are a few basic concepts and simple techniques that can help you build more secure applications. As Matt Nicholson explains, you need to think like a hacker and adopt a mind-set that makes you suspicious of every item of data that can come into your system.
Originally published on DNJ Online, May 2006
Every hour of the day, every day of the year, someone is trying to break into your system. Most of these attacks are automated – spiders tirelessly scanning your ports, looking for a way in. It doesn’t matter whether you’re an international bank or a one-man band, these programs are looking for weaknesses that they can report back to their owners for evaluation – and possibly a more sophisticated follow-up attack.
Until recently, such attacks have concentrated on your operating systems and network infrastructure. However, as companies like Microsoft put more resources into plugging the security holes in their software, attackers have realised there is an easier way: through the applications that you write to run on these systems. Few companies have the resources or expertise of Microsoft when it comes to resolving security issues, and if the application is on the Internet then the attacker can access it in a fairly anonymous fashion from almost anywhere in the world. Read more…
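The article goes into far more detail, but the single habit that best illustrates the mind-set is treating incoming data as data rather than as code. The hypothetical ADO.NET fragment below (connection string and schema are made up) shows the idea: the user’s input travels as a query parameter, so a string such as ' OR 1=1 -- cannot rewrite the SQL.

    using System;
    using System.Data.SqlClient;

    class LookupCustomer
    {
        static void Main(string[] args)
        {
            string userInput = args.Length > 0 ? args[0] : "";

            using (var conn = new SqlConnection("Server=.;Database=Shop;Integrated Security=true"))
            using (var cmd = new SqlCommand(
                "SELECT Name FROM Customers WHERE Email = @email", conn))
            {
                // The value is passed as a parameter, never spliced into the
                // SQL text, so it cannot change the shape of the query.
                cmd.Parameters.AddWithValue("@email", userInput);
                conn.Open();
                Console.WriteLine(cmd.ExecuteScalar());
            }
        }
    }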
Understanding security
Authentication and confidentiality are issues that have fascinated scientists and mathematicians for centuries. Matt Nicholson looks at some of the techniques in use today.
Originally published DNJ Online, May 2006
A man walks into a bank and presents a cheque to the cashier. The cheque is made out in the man’s name, but the cashier refuses to cash the cheque. Why?
There could be many reasons. The cashier could suspect the cheque has been tampered with so as to pay out a larger sum than the payer intended. The cashier could suspect that the customer is not who he claims to be, or that the driving licence he presents as identification is a forgery.
Whatever the scenario, secure communication essentially comes down to authentication and confidentiality. In the real world, authentication is achieved through a passport or a signature, and confidentiality through a sealed envelope or locked safe. In the digital world authentication is achieved through knowledge of a secret code, such as a password or a PIN (Personal Identification Number), and confidentiality through encryption. Read more…
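As a concrete illustration of the confidentiality half, the sketch below (mine, not from the article) encrypts a short message with AES from the .NET cryptography classes. How the key is agreed and protected is the genuinely hard part, and is not shown here.

    using System;
    using System.IO;
    using System.Security.Cryptography;
    using System.Text;

    class Confidential
    {
        static void Main()
        {
            byte[] plaintext = Encoding.UTF8.GetBytes("Pay J. Smith fifty pounds");

            using (Aes aes = Aes.Create())      // generates a random key and IV
            using (var ms = new MemoryStream())
            {
                using (var cs = new CryptoStream(ms, aes.CreateEncryptor(),
                                                 CryptoStreamMode.Write))
                {
                    cs.Write(plaintext, 0, plaintext.Length);
                }
                // Only someone holding the same key and IV can reverse this.
                Console.WriteLine(Convert.ToBase64String(ms.ToArray()));
            }
        }
    }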
Security glossary
From anonymous proxies to zombies through boot viruses, botnets, cross-site scripting, decoy scanning, denial of service, Google hacking and honeypots, keystroke logging, pharming and phishing, rogue diallers, rootkits, spiders, trojan horses and worms – a glossary of the tools and techniques used by those who want to break into your computer system.
Originally published on DNJ Online, May 2006
Anonymous proxy: A proxy server that hides the attacker’s machine by stripping its IP address from the request. All you see is the IP address of the proxy server.
Auto-encryption: Where a program encrypts part or all of itself, making it more difficult to analyse.
Back door: Where a developer deliberately includes undocumented commands that give him or her unauthorised access to the system.
Banner-grabbing: Most operating systems display some sort of ‘banner’ detailing operating system type and version at some point during the log-on process. Getting a system to display its banner can provide the hacker with useful information.
Boot virus: A virus that infects the boot sector of a hard or floppy disk so that it gets executed when the machine is powered up or rebooted. Particularly popular when floppy disks were used to transfer data from machine to machine as users often forgot to remove them before shutting down, with the result that the boot virus got executed next time the machine was powered up. Read more…
Is BizTalk Server an ESB?
Is BizTalk Server Microsoft’s version of the Enterprise Service Bus? Matt Nicholson discusses this together with human workflow and long-running transactions with Scott Woodgate from Microsoft’s Business Process and Integration team.
Originally published on DNJ Online, Dec 2005
Matt: Some people have said that BizTalk Server is Microsoft’s Enterprise Service Bus (ESB). Would you agree with that?
Scott: We can definitely address that issue! But first, let’s rewind. With a Service Oriented Architecture (SOA), creating the services is important, and on our platform, using ASP.NET and Windows Communication Foundation (formerly ‘Indigo’) to create those services is a fine thing. The capabilities of such services, and how you talk to them in a point-to-point manner, will continue to improve through additional layers supporting security, transactions, reliability and so on. WCF is all about creating services and then accessing them point-to-point in a secure, reliable and transactional manner.
However, in any SOA the notion of composing several of these services into a broader application, rather than just talking point-to-point, is important. This is where the business process support of BizTalk Server comes in.
The third aspect of SOA is the notion that one service shouldn’t know about another service. All a service needs to know is that it’s sending a message such as, say, a purchase order. Something else ‘in the middle’ can figure out where to deliver that message. It may be that nobody is interested in the purchase order, but more than likely there’s another service, or multiple services, that will receive it. However, as far as the source application or service is concerned, it doesn’t care. That’s important because if I need to go from having one target to two, or replace one with another, then I want to be able to do so without affecting the source. In BizTalk parlance that ‘thing in the middle’ is the publish and subscribe messaging infrastructure. Read more…
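The publish and subscribe idea Scott describes can be sketched in a few lines of code. The fragment below is purely illustrative and is not a BizTalk API: the publisher hands a purchase order to the ‘thing in the middle’ and neither knows nor cares how many subscribers, if any, receive it. Typical use would be bus.Subscribe("PurchaseOrder", handler) in each receiving service, then bus.Publish("PurchaseOrder", xml) from the source.

    using System;
    using System.Collections.Generic;

    class MessageBus
    {
        private readonly Dictionary<string, List<Action<string>>> subscribers =
            new Dictionary<string, List<Action<string>>>();

        public void Subscribe(string messageType, Action<string> handler)
        {
            if (!subscribers.ContainsKey(messageType))
                subscribers[messageType] = new List<Action<string>>();
            subscribers[messageType].Add(handler);
        }

        public void Publish(string messageType, string body)
        {
            // Zero, one or many services may be subscribed; adding or removing
            // them never touches the publisher.
            List<Action<string>> handlers;
            if (subscribers.TryGetValue(messageType, out handlers))
                foreach (var handle in handlers) handle(body);
        }
    }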
Choosing between C# and VB.NET
The .NET Framework supports a variety of programming languages, including Microsoft’s much heralded C#. Huw Collingbourne considers whether Visual Basic is still as sharp as the competition.
Originally published on DNJ Online, March 2005
In the past, different programming languages tended to do things in their own way. Programmers using C++ would, in all probability, make use of the types and routines provided by the Microsoft Foundation Classes (MFC) libraries; Visual Basic had its own built-in types and routines; non-Microsoft languages such as Delphi and Java used yet other class libraries, each incompatible with the others.
With the advent of .NET that has changed. No matter which programming language you use, you will have access to the same rich collection of classes and functions provided by the .NET Framework. Indeed, it is even possible to write classes in one language and derive descendant classes from them in another language. Your source code is not compiled directly into a machine code executable. Instead, it is translated into Microsoft Intermediate Language (MSIL). This intermediate language is only converted into machine code when the program is run by the .NET Common Language Runtime (CLR).
In effect, the .NET CLR understands only one language: MSIL. This means that it really doesn’t matter which language your programs are written in. Whether they were written in C#, J#, VB, Delphi or some other .NET language, they will all end up as MSIL. Given the fact that all .NET programming languages have access to the same class library, are translated to the same intermediate language and are executed by the same runtime system, you may wonder what, if anything, there is to choose between them. In this article, I shall be taking a close look at Microsoft’s two principal .NET languages, C# and VB.NET, in an attempt to answer this question. Read more…
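A quick illustration of the cross-language point: because every .NET language compiles to the same MSIL, a class written in C# can be subclassed from VB.NET, or any other .NET language. The sketch below is hypothetical, with the VB.NET side shown only as a comment.

    using System;

    // Compiled into a class library, say Greetings.dll.
    public class Greeter
    {
        public virtual string Greet(string name)
        {
            return "Hello, " + name;
        }
    }

    // In a separate VB.NET project that references Greetings.dll:
    //
    //   Public Class PoliteGreeter
    //       Inherits Greeter
    //       Public Overrides Function Greet(name As String) As String
    //           Return MyBase.Greet(name) & ", how do you do?"
    //       End Function
    //   End Class
    //
    // Running ildasm over either assembly shows much the same MSIL.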