
Tales In Tech History: The Y2K Bug

Tom Jowitt is a leading British tech freelancer and long-standing contributor to TechWeek Europe

Computing armageddon was forecast in the lead-up to the new century, yet for many the Y2K Millennium bug was a damp squib

The Y2K bug (also known as the Millennium Bug) was considered an extremely serious threat to global computing infrastructure in the late 1990s.

Indeed there were countless predictions of a computing Armageddon once the world left the 1900s behind and the date changed to 01 January 2000.

Some countries spent hundreds of billions of dollars on the threat. Indeed, it is estimated that the Y2K bug cost over $300bn (£232bn) globally, or $417bn (£322bn) in today's money once inflation is factored in.

Climate Of Fear

Looking back, it is perhaps hard to appreciate the concern the Y2K bug caused governments and private enterprises in the final years of the 1990s.

There were countless predictions and warnings of the havoc it would cause for computers and computer networks around the world at the beginning of the year 2000.

For example John Hamre, the US Deputy Secretary of Defence from 1997 to March 2000, was quoted as saying that “the Y2K problem is the electronic equivalent of the El Niño and there will be nasty surprises around the globe.”

Indeed, in the 1990s, countless firms around the world began offering Y2K services to mitigate the threat, and some countries even formed special committees and governmental taskforces to tackle the issue, whilst other countries did virtually no preparation at all.

Essentially, the Y2K or Millennium bug was a problem to do with dates beyond 31 December 1999.

When programmers began writing code back in the 1960s, they used only two digits to signify the year (remember, computer storage back then was expensive). This meant simply leaving off the “19” when coding the year. So instead of a date of 1969, for example, the code would just read 69.

But in the 1990s, concerns began to arise as to what would happen if computers were unable to interpret 00 as 2000. What if they thought it was 1 January 1900 instead? Would the computer system even work?
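A minimal Python sketch can illustrate the worry. The function name and the prepend-19 logic are illustrative assumptions, not any specific system's code, but they capture how a legacy two-digit field turns the rollover into a century jump backwards:

```python
def parse_two_digit_year(yy: str) -> int:
    # Legacy-style interpretation: the century "19" was assumed
    # and never stored, so it is simply prepended on read.
    return 1900 + int(yy)

print(parse_two_digit_year("99"))  # 1999 - fine throughout the 1900s
print(parse_two_digit_year("00"))  # 1900 - not 2000, a 100-year error
```

The fix deployed across countless systems was either to widen the field to four digits or to apply a "windowing" rule (for instance, treating values below 50 as 20xx).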

Matters were made worse by the fact that the year 2000 was also a leap year: under the Gregorian calendar, century years are normally not leap years, but years divisible by 400 (such as 2000) are, an exception some software omitted.
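The leap-year wrinkle can be sketched in a few lines. The two functions below are illustrative: one implements only the "centuries are not leap years" exception, as some software reportedly did, while the other applies the full Gregorian rule:

```python
def is_leap_incomplete(year: int) -> bool:
    # Incomplete rule: treats every century year as a non-leap year,
    # omitting the divisible-by-400 exception.
    return year % 4 == 0 and year % 100 != 0

def is_leap_gregorian(year: int) -> bool:
    # Full Gregorian rule: divisible by 4, except centuries,
    # unless also divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_incomplete(2000))  # False - wrongly skips 29 February 2000
print(is_leap_gregorian(2000))   # True
```

A system using the incomplete rule would have mishandled 29 February 2000 even if its two-digit date fields had been fixed.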

The Y2K bug was therefore predicted to cause severe problems for banks, for example, where interest is calculated on a daily basis. Once the new millennium kicked in, a computer could suddenly have applied a century's worth of interest charges. Entire global trading systems could have collapsed overnight.

Was it the end of the world? Find out on page 2…