"Warning: This year might bite! Armed and considered extremely dangerous to automated systems!" Perhaps we should put up wanted posters in all public places or put a warning label on every calendar. Some call it the year-2000 problem, others the Millennium Bug, and still others the Y2K problem; but whatever you call it, call it a serious problem. Of course, there are many people who, for whatever reason, don't believe it, are still hoping for a "silver bullet," or are just plain mad at the whole thing.
What is the year-2000 computer glitch? Simply stated, programmers in the past used two digits, instead of four, to represent the year. Thus 1997 became 97, and 2001 would become 01. If someone says to you, "I was born in 45," you quickly assume they were born in 1945. Your mind draws a logical conclusion: the person could not still be alive if born in 1845 and obviously could not have been born in 2045, so they must have been born in 1945. Your mind used deductive reasoning -- one of the many admirable traits of human beings.
Unless a programmer has instructed a computer, through a program, to make the same sort of deduction, the computer will make no assumption at all. For instance, assume for a moment that the year is 2000 and a computer program is to calculate a person's age but is given only the last two digits of the person's birth year (say 45). The computer will perform as instructed, i.e., it will subtract the birth year from the current year: current year - birth year = computed age, e.g., 00 - 45 = -45. The correct answer, of course, is 55. In this simple but realistic example, the computer did not "blow up," stop working, or do anything obvious -- it just gave the wrong answer.
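The arithmetic is easy to reproduce. Here is a minimal sketch in C (an illustration of the failure described above, not code from any actual system), putting the two-digit calculation next to the four-digit one:

    #include <stdio.h>

    int main(void) {
        int birth_two_digit   = 45;  /* "45" as stored in many legacy records */
        int current_two_digit = 0;   /* the year 2000, truncated to "00" */

        int birth_full   = 1945;
        int current_full = 2000;

        /* Two-digit arithmetic: 00 - 45 = -45, the wrong answer */
        printf("Two-digit age:  %d\n", current_two_digit - birth_two_digit);

        /* Four-digit arithmetic: 2000 - 1945 = 55, the right answer */
        printf("Four-digit age: %d\n", current_full - birth_full);
        return 0;
    }

Nothing crashes; the program simply reports an age of -45 and moves on, which is exactly what makes the bug so easy to miss.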
Problem Occurs at Three Levels
This problem can occur at three levels. The first is the most obvious: in traditional "mainframe" computers. These systems are dominated by older computer languages such as COBOL, which encouraged a great deal of custom, in-house programming. While these programs have served their organizations extremely well and have paid for themselves many times over, they generally were not written to handle years beyond 1999.
The second place to look for this problem is in embedded process controllers. "What's that?" you ask. It is any device or automated process controlled by a computer chip: personal computers, security elevators and doors, telephone switches, traffic lights and electric utility substations. This problem is truly a "Year 2000 Bug." When the microchip was designed, the BIOS (Basic Input/Output System) of these chips was given only a two-digit year. A personal computer containing a faulty chip might change its internal date to Jan. 1, 1980, when restarted after Dec. 31, 1999. By the way, Jan. 1, 2000, is a Saturday, but Jan. 1, 1980, was a Tuesday -- a situation that may cause business doors to be unlocked when they should be automatically closed.
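The weekday mismatch is easy to verify. A short C sketch (mine, for illustration) using the standard library's mktime, which fills in the day of the week for any date it normalizes:

    #include <stdio.h>
    #include <time.h>

    /* Returns the name of the weekday for a given calendar date. */
    static const char *weekday(int year, int month, int day) {
        static const char *names[] = { "Sunday", "Monday", "Tuesday",
            "Wednesday", "Thursday", "Friday", "Saturday" };
        struct tm t = {0};
        t.tm_year = year - 1900;  /* struct tm counts years from 1900 */
        t.tm_mon  = month - 1;    /* and months from 0 */
        t.tm_mday = day;
        mktime(&t);               /* normalizes t and fills in tm_wday */
        return names[t.tm_wday];
    }

    int main(void) {
        /* A BIOS that rolls back to 1980 lands on the wrong weekday. */
        printf("Jan. 1, 2000 falls on a %s\n", weekday(2000, 1, 1));
        printf("Jan. 1, 1980 fell on a %s\n", weekday(1980, 1, 1));
        return 0;
    }

A time lock that believes Saturday is really Tuesday will run its Tuesday schedule -- hence the unlocked doors.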
The third place to find the year-2000 problem is, unfortunately, largely overlooked: it hides in data. For instance, how will spreadsheets be interpreted after Jan. 1, 2000, when staff have entered only the last two digits of the year for the last several years? The answer depends on the spreadsheet being used. If Excel or Lotus 1-2-3 is being used, it also depends on programmers in Massachusetts (where Lotus 1-2-3 is made) or Washington (where Excel is made). These two popular spreadsheets give different answers and provide very little documentation. As long as users of these systems enter dates in abbreviated two-digit form, there will be a year-2000 problem whenever those dates are compared to dates after the year 2000.
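The comparison failure is easy to sketch. A hypothetical C fragment (not the spreadsheets' actual logic) showing how two-digit years make January 2000 sort before December 1999:

    #include <stdio.h>

    /* Compares two dates stored with two-digit years, as many data
     * files did: negative means a is earlier, positive means later. */
    static int compare_dates(int year_a, int month_a,
                             int year_b, int month_b) {
        if (year_a != year_b) return year_a - year_b;
        return month_a - month_b;
    }

    int main(void) {
        /* December 1999 stored as 99/12; January 2000 stored as 00/01 */
        int result = compare_dates(99, 12, 0, 1);
        printf("%s\n", result < 0
            ? "99/12 sorts first (correct)"
            : "00/01 sorts first (wrong)");
        return 0;
    }

Every report, interest calculation or expiration check built on such a comparison quietly inherits the same mistake.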
How is the problem fixed? Technically, the year-2000 problem is not very challenging. Even novice COBOL programmers can make the necessary corrections.
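One common repair, often called "windowing," shows why the fix itself is simple: expand every two-digit year around a pivot before doing any arithmetic. A sketch in C (the pivot value of 30 is my assumption; each system chooses its own):

    #include <stdio.h>

    /* "Windowing": years at or above the pivot are taken as 19xx,
     * years below it as 20xx. The pivot of 30 is illustrative only. */
    static int expand_year(int two_digit, int pivot) {
        return (two_digit >= pivot) ? 1900 + two_digit
                                    : 2000 + two_digit;
    }

    int main(void) {
        printf("45 -> %d\n", expand_year(45, 30));  /* 1945 */
        printf("01 -> %d\n", expand_year( 1, 30));  /* 2001 */
        return 0;
    }

The hard part is not the logic but the sheer number of programs and data files in which two-digit years must first be found.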