
When Our Systems Make Us Stupid

We love our systems, even while they steal our common sense, refuse to solve the problems for which they were created, add new problems of their own and delude us into thinking that we're doing fine. You think you've goofed? Take heart, you are not alone. Here's a look at what not to do.

This article is edited from a speech given by Gopal Kapur. It first appeared in Government Technology's Visions Magazine in Sept. 1997.

Year after year, the River Nile flooded, carrying rich sediments down the river. Farmers used this free fertilizer, making the Nile River Valley one of the most fertile in the world. The Aswan Dam -- built at enormous expense -- stopped that flooding.

So the farmers were left with no natural fertilizer. Gigantic factories were built to produce a chemical replacement. Unfortunately, those factories now consume nearly all the electricity the dam generates, and the chemical fertilizer is too expensive for most of the farmers, who were used to getting the real thing for free. Meanwhile, the dam's reservoir is filling up with sediment.

Why We Fail

The first reason we fail is the belief that any system we set up will solve our problems. The second is the belief that complex systems can be made to function well enough to achieve their objectives. Both beliefs are faulty, because when a system is set up, a new entity comes into being whether or not it solves the problem. And no matter what the system's purpose, embedded in it is what we call "system behavior," and the first rule of system behavior is that complex systems tend to oppose their own proper function.

Another example of a complex system opposing its own proper function is a multi-million-dollar project at the Motor Vehicle Department in the state of New Jersey a few years ago. Within days of going live, the department found the new system was too slow: there was not enough time at night to process all the tickets issued by the Highway Patrol.

Within a few weeks, 1.4 million transactions had backed up. So how did they solve the problem? The governor of New Jersey was forced to issue an Executive Order directing the Highway Patrol not to cite people for traffic violations -- including driving at high speeds.

The moment this Executive Order came down, drivers from neighboring states were quick to discover the gift. They began to drive through New Jersey at high speeds just to irritate the Highway Patrol.

Another example: at 2 a.m. on Sunday, October 27, 1985, Amtrak trains came to a halt all over the country and remained motionless for one hour. Passengers were notified that the nation was switching back to standard time from daylight saving time and the trains were waiting for the clocks to catch up. Aren't we glad the same people did not program the airline flight control systems? Now, that was in 1985. You could say, "Well, that was the first time. We'll learn from it; we won't do it again."
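How does a schedule end up "waiting for the clocks to catch up"? When clocks fall back, one full hour of local wall-clock time repeats, and any scheduler that reasons purely in local time has to cope with that ambiguity. Here is a minimal sketch in modern Python (purely illustrative -- the time zone, the date and the zoneinfo library are my choices, not anything from Amtrak's 1985 system) showing that the same wall-clock reading names two real instants an hour apart:

```python
# Illustrative only: on the night the clocks fall back, 1:30 a.m. local time
# happens twice. A scheduler that looks only at the local wall clock cannot
# tell the two instants apart.
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

eastern = ZoneInfo("America/New_York")

first = datetime(2024, 11, 3, 1, 30, fold=0, tzinfo=eastern)   # first pass, still EDT
second = datetime(2024, 11, 3, 1, 30, fold=1, tzinfo=eastern)  # second pass, now EST

print(first.tzname(), second.tzname())   # EDT EST -- same wall time, different offsets
elapsed_hours = (second.timestamp() - first.timestamp()) / 3600
print(elapsed_hours)                     # 1.0 -- a full real hour separates the two readings
```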

Now consider an example from San Francisco's Bay Area Rapid Transit (BART) system. In an attempt to run more trains per hour, BART upgraded its hardware, wrote new software and redid its computer system.

During the first few days after the switch to the state-of-the-art system, if a train was behind schedule, it would run right past stations without stopping. That is exactly how they had programmed their computer.
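The rule as described fits in a few lines. Here is a hypothetical sketch (invented names and threshold, not BART's actual software) of how a scheduler ends up with exactly this behavior once "trains on time" is the only thing it optimizes:

```python
# Hypothetical sketch of the skip-stop rule described above -- not BART's
# actual code. The train stays "on time" by running past the riders who
# wanted that station.
def should_stop_at_station(scheduled_arrival_min: float,
                           actual_arrival_min: float,
                           tolerance_min: float = 2.0) -> bool:
    minutes_behind = actual_arrival_min - scheduled_arrival_min
    # If the train is behind schedule, skip the stop to make up time.
    return minutes_behind <= tolerance_min

# A train due at 8:30 a.m. (510 minutes past midnight):
print(should_stop_at_station(510, 511))  # True  -- roughly on time, so it stops
print(should_stop_at_station(510, 516))  # False -- running late, so it skips the station
```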

This feature was not appreciated by riders, who missed their stations and had to get off and go back. A new term was coined: station deja vu, as in "I have seen this station before." However, once you boarded a train to get back to the station behind you, there was no guarantee it would stop there either. As a result, most trains ran on time -- however, most customers were late getting to work. And here is the clincher: BART management made this change on one of the busiest days of the year.

I call this premeditated stupidity. Spontaneous stupidity is allowed to everybody once in a while -- we all do it. We run into walls, we look around, we straighten ourselves up and walk away.

The next system theorem is: "Systems are like babies. Once you get one, you have it. They won't go away." On the contrary, they display the most remarkable persistence. They not only persist, they grow. Now we see this phenomenon alive and well in information technology every day. Operating systems have grown bigger, more complex and more important. For example, Windows NT. Great system.

But the new systems have become bigger and more complex than any legacy system anybody ever had. As a result, office workers demand higher wages -- not because they can write better letters, but because they know the operating system. Meanwhile, the quality of the letter is going down and its inflation-adjusted cost is going up. However, the same low-quality letter can now be e-mailed instantly to 300 people.

The next system theorem is: "Complex systems can exhibit unexpected behaviors. The larger the system the more unexpected the behavior." For example, the largest building in the world, the space shuttle preparation building at Cape Canaveral, generates its own bad weather, including clouds, rain and the occasional electrical storm. It was built to protect the rockets from the elements but it generates its own storms. So on a clear day, you may have to park the rocket outside, because it's raining inside. On a rainy day, it doesn't matter where you park them.

The next system theorem is: "If you are lost, speeding up is not the best thing." A major West Coast bank, a few years ago, spent $69 million before they abandoned a failing system. A failed system is no problem. It would be silly to say that every system we plan will be successful. There would have to be failure in which we learn and we continue. But $69 million before they figure out it doesn't work? There needs to be some milestone where people check their progress and find out what is going on.

A project manager actually asked me what my action on the project would have been. I told him, "$1 million into the project, I would have given up." If it had been my own money -- $95.

Then there is the example of a $1.8 billion loss on a failed FAA project. They bought all the hardware before the software was written. Then they spent a year and a half writing the software. Then they discovered that the hardware had become obsolete. Does it take much brainpower to know that, a year and a half later, the hardware is going to be obsolete?

The information technology industry has significantly changed the way software defects are viewed. Far too many computer professionals and consumers have come to accept defects as the norm. Let's look at three different professions: engineering, food service and software.

Imagine if an engineering company had a sign outside its building that said: "400,000 square feet of buildings built -- with no implosions." Would that be a good PR sign? Well, of course not. Or if a food service company advertised: "We served 400,000 meals and no botulism."

Now a software company says: "400,000 lines of code written, no bugs." Nobody wants to believe that. OK, 400,000 lines of code. How about 4,000, or maybe 400? The thing is, everybody expects software to be defective. If your PC went on the blink due to electrical failures as often as it does due to software failures, you would be livid with your electric utility.

But try being livid with your software company. Here is how it goes. First, the call is on you. Then you are put on hold -- a long hold. To serve you better these days, they tell you, "You have been on hold for 11 minutes." As if I didn't know.

Then you are forced to listen to boring music. They should at least offer a selection: "Would you like Tchaikovsky? Would you like country/western? Would you like silence?" But everybody accepts this, even though nobody would accept it in their refrigerators, their appliances or their cars. The software people have made everyone believe that defects are the norm.

Sensory Deprivation

Recently, an IT department in a Midwestern state installed a system on 750 workstations. Every time the system ran, it deleted a large number of records. They called the service people, but the blame was put on the customer and the software group refused to take any action. Within six weeks, the system had deleted practically every record in the database -- 1.2 million records in all. The old saying "On a clear disk you can seek forever" came true.

Another example of sensory deprivation is the case of a senior member of Congress who was worried about boarding an airplane with a bomb on it. So he asked the computer department to figure out the probability of getting on a plane with a bomb aboard. A senior analyst and a programmer were assigned to the analysis. They collected a lot of statistical data, ran some simulations and went back to the Congressman with their answer: "Sir, your probability of getting on a plane with a bomb is 1:114,247." He thought about it and said, "That's not good enough. Can you do something to improve my probability?" They saw this as a good chance to order more hardware and software, ran more simulations and, about three weeks later, reported: "Yes, we have been able to improve your probability to 1:1,156,549." He liked that. He asked, "What do I have to do?" They said, "Sir, carry a bomb on with you. The probability of two bombs on the same plane is smaller."
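The punch line trades on a genuine probability fallacy: the analysts' second figure describes the joint probability of two independent bombs, but conditioning on the bomb you carry yourself leaves the chance of someone else's bomb exactly where it was. A quick illustrative check, using only the anecdote's first figure (the numbers themselves are part of the joke, not real data):

```python
# Illustrative arithmetic only -- the anecdote's figures are a joke, not data.
p_other = 1 / 114_247            # chance some other passenger has a bomb
p_two_bombs = p_other ** 2       # joint probability of two bombs, assuming independence
p_other_given_yours = p_other    # independence: your own bomb changes nothing

print(f"{p_two_bombs:.1e}")            # ~7.7e-11 -- two bombs really are rarer
print(p_other_given_yours == p_other)  # True -- but your actual risk is unchanged
```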

Another example of sensory deprivation comes from a child welfare agency in a Midwestern state. It installed a client/server system with fuzzy logic and artificial intelligence. That alone tells me there is trouble ahead.

The system was designed to help social workers make better decisions about child adoption. On its very first day, it rejected a couple's application, and the official reason was: "The couple had too happy a home life." The child would not grow up under "normal" conditions.

The system's logic: these people don't fight, they have been married a long time, the neighbors like them, they have no bad credit -- something must be wrong. The couple protested, but the department stood by its decision. "Our system could not be wrong," they said. So the couple was not allowed to adopt.

The only artificial intelligence system I personally trust is a thermos bottle. It keeps hot food hot and cold food cold. The question is -- how does it know?

Eventually, system people become so committed to and so involved with their system that whatever mediocre product it delivers, they are convinced it is what they wanted all along.

Gopal Kapur is founder of the Center for Project Management. The system theorems discussed in this material are adapted from the book "Systemantics" by John Gall.