Large Project Software Scare

When California's huge Department of Motor Vehicles database project skidded to a halt in 1994, much of the blame for the fiasco was attributed to poor project management. But a key reason the project failed was the DMV's inability to develop software capable of managing a database of 38 million vehicle files and 31 million license and identification files.

In Florida, criminal charges were filed (and later dropped) against the welfare agency's MIS director for management problems stemming from an automation project that cost the state in excess of $200 million over five years, more than double what was originally budgeted for the system.

According to several people familiar with the project, the underlying cause of the failure was the extreme difficulty the agency had modifying the system's software to meet the state's unique needs. The welfare determination system was part of a federally funded transfer of technology. By all accounts the system is huge, designed to handle more than 5 million transactions every day.

States, of course, aren't suffering alone. The Federal Aviation Administration (FAA) has been working long and hard at replacing its aging air-traffic control system with state-of-the-art technology. But the software, described as "bug-infested," hasn't been deployed yet, and the entire project is five years late and more than $1 billion over budget. Meanwhile, the existing system crashed several times over the past summer, causing significant slowdowns in the nation's air traffic.

Denver's new international airport probably had the most visible example of a software development debacle when its much-touted $193 million baggage handling system failed to work, holding up the airport's opening for months. The problem was numerous errors in the software that controls the baggage system.

By now, an unmistakable pattern has emerged. Large-scale software development projects are prone to failure. In his article "Software's Chronic Crisis" (Scientific American, Sept. 1994), writer W. Wayt Gibbs wrote, "Studies have shown that for every six new large-scale software systems that are put into operation, two others are canceled. The average software development project overshoots its schedule by half; larger projects generally do worse. And some three quarters of all large systems are 'operating failures' that either do not function as intended or are not used at all."

While large software development projects present risks of cost-overruns and failure in all businesses, government seems particularly vulnerable. Whether it's the budgeting process that restricts redeployment of money for projects in trouble, or a hostile state legislature that discourages the phased growth of software systems, states and localities are often fenced into building mega-sized projects.

As Florida, California, the FAA and untold other government agencies have found out, getting it right the first time is difficult. When the system is very complex, as was the case with Florida's welfare system -- which is supposed to determine welfare eligibility, manage refugee aid, enforce child-support payments and handle Medicaid programs -- then the chances of building an error-free software system are minuscule at best.

Unfortunately, the trend in government is toward bigger systems handling more complicated tasks. Having spent years automating a number of peripheral operations, government is now attempting to automate core tasks, according to Steve Kolodney, chief information officer for the state of Washington. "We are starting to use technology to drive the way we deliver service," he said. "We are getting at the very heart of the operation. Instead of building around the core legacy process, we are now dealing with the process itself."


HARDENED SOFTWARE
Software development has been going on for nearly 50 years, but for the past 25 years, computer scientists and engineers have been urging that engineering methods be applied to develop and maintain software. The hope is to make software something that can be easily reused, changed or modified, much like a manufactured part rather than some one-of-a-kind creation. However, as Gibbs put it, "the vast majority of computer code is still hand-crafted from raw programming languages by artisans using techniques they neither measure nor are able to repeat consistently."

To make matters worse, software is a lot less flexible than it seems. Even though it's written on computer keyboards, not forged from iron, the code for a particular program can be so long and intricate that any kind of modification or repair is a serious and expensive undertaking.

Unfortunately, change and modification of software is almost a given in government. Developers design software by conducting interviews with key users, then go away and write in code what they thought was said. More often than not, the results do not fully capture what the system is supposed to do.

As a result, application software is unlikely to capture a government program's business rules correctly the first time, raising the likelihood that the system will need costly modification later on. Then there are the rules and regulations that are constantly being added or changed by new legislation or mandates. Trying to work these changes into the software of a large-scale system can be a nightmare.
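
To see why hard-wired rules are so brittle, consider a simplified sketch. Everything in it -- the thresholds, the field names, the rule itself -- is invented for illustration and not drawn from any actual eligibility program. When a rule is written directly into the program's logic, a legislative change means editing, retesting and redeploying the code itself; when the same rule is expressed as data, the agency can amend a table instead.

    # Hypothetical sketch: two ways to encode a welfare-eligibility rule.
    # Thresholds and field names are invented for illustration.

    # Hard-coded: a legislative change to the income limit means
    # editing, retesting and redeploying the program itself.
    def eligible_hardcoded(monthly_income, household_size):
        return monthly_income <= 650 + 150 * (household_size - 1)

    # Table-driven: the same rule expressed as data the agency can
    # amend without touching the program logic.
    RULES = {"base_limit": 650, "per_member": 150}

    def eligible_table_driven(monthly_income, household_size, rules=RULES):
        limit = rules["base_limit"] + rules["per_member"] * (household_size - 1)
        return monthly_income <= limit

    print(eligible_hardcoded(900, 3))     # True
    print(eligible_table_driven(900, 3))  # True, but now the limit is data

Neither version is harder to write the first time; the difference only shows up when the legislature changes the numbers.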


BIGGER IS NOT BETTER
To mitigate the problems and risk of software development, experts recommend that software should "grow" so that developers can continuously improve it as the system gets larger. Experience shows that when software is built all at once, the risks of failure increase.

In the private sector, developers take a graduated approach toward large-scale systems, developing and implementing them in slices. "The idea," said Larry Singer, national director for state and local government at Texas Instruments, "is not to proceed with the second phase until the first phase has succeeded."

But in government, the tendency is to build large-scale systems as one monolithic project. One reason for this is the procurement process, which does not encourage growing a system through a continuous series of implementations. As a result, it takes just as long to get authorization for a small project as it does for a large-scale one. To avoid procurement hassles, agencies focus on building a big system that will last a long time.

Agencies also dislike having to go back to legislatures or councils to seek approval for more funding. Rather than make repeated requests to fund a series of small projects, agencies ask for the whole thing up front, ending up with a monster project.

Even if problems occur while growing software into a large-scale project, private sector organizations have the alternative of redeploying their financial resources, explained Singer. "But in the government budgeting process, if you are given so many millions of dollars for a project, you have to use it. You don't have the option of making a left turn and using the remaining money to deliver the end-benefit in another way."

The inflexibility of traditional software development has become particularly acute in the area of transfer systems. The federal government's well-intentioned effort to reimburse states that implement an information system transferred from another state was meant to standardize welfare-eligibility systems while reducing the overall cost of system development.

But as Florida found out when it transferred Ohio's eligibility system, changing and modifying the software code so that the system fits the state's particular business needs can cost more in work and funds than building a system from scratch would have. The underlying problem with transfer systems is that while the feds and the states know a welfare-eligibility system has commonalities that can be shared, they don't know how to capture those commonalities in the software while still leaving room for change and modification.
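
One way to picture what a workable transfer system would need is a shared core that delegates the rules that vary to a well-defined state-specific "hook." The sketch below is purely illustrative -- the class names and the rule are hypothetical -- but it shows the principle: common logic is reused unchanged, and each state supplies only the pieces that differ.

    # Illustrative only: a shared eligibility core with a state-specific
    # "hook" for the rules that vary. Class and rule names are hypothetical.
    from abc import ABC, abstractmethod

    class StateRules(ABC):
        """The part each state must supply; everything else is shared."""
        @abstractmethod
        def income_limit(self, household_size: int) -> float: ...

    class SharedEligibilityCore:
        """Common logic every state reuses unchanged."""
        def __init__(self, rules: StateRules):
            self.rules = rules

        def is_eligible(self, monthly_income: float, household_size: int) -> bool:
            return monthly_income <= self.rules.income_limit(household_size)

    class ExampleStateRules(StateRules):
        """A hypothetical state's variation on the common rule."""
        def income_limit(self, household_size: int) -> float:
            return 700 + 125 * (household_size - 1)

    core = SharedEligibilityCore(ExampleStateRules())
    print(core.is_eligible(900.0, 3))  # True: 900 <= 950

The hard part, as the transfer-system experience shows, is drawing that line correctly in advance: deciding which parts of a welfare system really are common and which must stay behind the hook.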


THE STRUGGLE FOR SOLUTIONS
Computer science has been working for years on ways to capture a program's commonalities and reuse the software in different hardware environments. According to Gibbs, "Programmers have for decades used libraries of subroutines to avoid rewriting the same code over and over. But these components break down when they are moved to a different programming language, computer platform or hardware environment."

In recent years, the academic community and the computing industry have taken steps to improve the quality of software and to standardize how it is built. For instance, the Software Engineering Institute, a research institution, has developed the Capability Maturity Model (CMM), a framework for measuring and improving the software production process. With such measurements, developers have a better way of telling whether their software is meeting the project's business needs.

Improved software development tools are also helping developers with their work. Computer-aided software engineering (CASE) tools have been around for a while as a way to analyze, design and program information systems. With CASE tools now able to develop client/server as well as mainframe systems, software developers are using them to develop models of systems and to generate the actual computer code.

With software modeling, developers can involve users in all phases of software development, from planning to actual layout of what appears on the computer monitor. Users can sit with the developers, who display on computers how the software will present and process information based on the agency's business rules. The developers can make immediate changes and modifications based on users' reactions. Once the model is perfected, CASE tools -- not the developer -- automatically generate the computer code for the software. More importantly, software developed by CASE can be reused on other computing platforms.
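
The idea of generating code from a model can be illustrated with a toy example. The sketch below invents a tiny declarative model and generates both a database table definition and a matching record class from it. Real CASE tools are vastly more elaborate, and this model format is made up for illustration, but the principle is the same: the model, not hand-written code, is the artifact the developers maintain.

    # Toy sketch of the model-driven idea behind CASE tools: developers
    # maintain a declarative model, and code is generated from it rather
    # than written by hand. The model format here is invented.

    CASE_MODEL = {
        "entity": "Applicant",
        "fields": [("name", "TEXT"), ("monthly_income", "REAL"),
                   ("household_size", "INTEGER")],
    }

    def generate_sql(model):
        """Generate a CREATE TABLE statement from the model."""
        cols = ", ".join(f"{name} {sqltype}" for name, sqltype in model["fields"])
        return f"CREATE TABLE {model['entity']} ({cols});"

    def generate_class(model):
        """Generate a simple record class from the same model."""
        args = ", ".join(name for name, _ in model["fields"])
        body = "\n".join(f"        self.{n} = {n}" for n, _ in model["fields"])
        return (f"class {model['entity']}:\n"
                f"    def __init__(self, {args}):\n{body}\n")

    print(generate_sql(CASE_MODEL))
    print(generate_class(CASE_MODEL))

Because the generators, not the model, carry the platform-specific details, regenerating the same model for a different target is how CASE-built software gets its portability.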

State governments are beginning to pay heed to the problems of large-scale software systems, and are attempting to cobble together solutions that will mitigate risk. One approach involves consortiums of state agencies coming together early in the life of a project and sharing the analysis and design of a software system. Each state is then responsible for actual development and implementation of its own system.

Several consortiums have formed around the country, in different areas of government operations. Maryland and two private sector firms, with funding from the U.S. Department of Labor, have started a consortium of state unemployment insurance agencies with the goal of designing a variety of software prototypes that can be easily replicated by other states.

A consortium of seven Southern states has begun work on designing a regional solution to electronic benefits transfer. The group plans to set standards as well as share costs and resources for handling electronic payments for food stamps and other program benefits throughout the region.

As for public assistance systems, the federal government responded to states' problems with software transfers by agreeing to provide enhanced funding and guidelines to help states collaborate on software development for child welfare systems. In response, a user group of 10 states formed in 1993 and, working with a nonprofit organization, produced a prototype of an automated caseworker system.

In 1994, Mike Hale, then chief information officer for Florida and now CIO for Georgia, proposed the development of a national consortium for public sector software development (see "The Mission-Critical Software Crisis," Government Technology, Nov. 1994). The consortium would use software engineering concepts, such as CMM, and tools, such as CASE, to build a library of "templates of logical and physical design for designated application areas." These templates would, in fact, be reusable models of core components for large-scale, mission-critical systems.

Frank Reilly, director of Human Resource Information Systems for the General Accounting Office, has examined the problems that have plagued past transfer systems. Knowing the difficulty that software development presents, he has supported Hale's consortium proposal, which has been adopted by the National Association of State Information Resource Executives (NASIRE). "The proposal addresses the true problem involved in large system development, which is setting standards," he said.

Since Hale introduced the proposal last year, NASIRE has been working with The Software Productivity Consortium to get the project off the ground. The Productivity Consortium describes itself as the "leading provider of processes and methods needed to successfully develop software-intensive systems." The consortium's members include leading high-tech firms, federal agencies and research laboratories.

Whether the consortium approach to software development works remains to be seen. The consortium of unemployment insurance agencies is off to a good start, but it's still too early to measure any concrete results. Unfortunately, the child welfare consortium has unraveled as problems and disagreements on direction led some states to strike out on their own.

Singer is not surprised. With a consortium there has to be unanimity on where the work will be conducted and who will be contracted to help with the project. "Inevitably, consortiums always end up with one state that takes the lead, making the whole project less collaborative than it began," he said.

In fact, Singer is concerned about how states are progressing with the latest federally funded automation project -- child welfare. He believes nearly 90 percent of the systems under way are at risk of missing the completion deadlines set down by the feds.

Worried that funding will stop if troubles persist (and a big market could dry up), he said that Texas Instruments has begun giving away its CASE toolset along with a model of the core components for a Statewide Automated Child Welfare Information System (SACWIS). "We strongly believe that vendors can no longer allow these projects to continue to fail, otherwise funding will eventually stop," he said. "Our whole marketplace is dependent on it, so we want to promote success."

Giving away what amounts to almost $1 million worth of software development might seem a bit extreme, but the situation appears serious, according to a number of information executives in state government. In setting forth the reasons for establishment of a multi-state software development consortium, Hale pointed out that "states are becoming increasingly anxious and uncertain about the risks of [software] development and yet they realize they must move ahead in order to meet the exploding demands of these complex programs. In addition, the trends toward greater use of new technology, such as distributed systems and client/server architecture, are causing this anxiety to grow."


*


With more than 20 years of experience covering state and local government, Tod previously was the editor of Public CIO, e.Republic’s award-winning publication for information technology executives in the public sector. He is now a senior editor for Government Technology and a columnist at Governing magazine.