Large, mainframe-based systems, goes conventional wisdom, are obsolete, more an Industrial Age artifact than a tool of the rapidly changing Information Age. Public and private enterprises have been moving steadily toward distributed systems relying on networks and computing power at workers' desks, making less and less use of mainframes.

But contrary to what seems to be a broad consensus, said Chong Ha, head of California's Stephen P. Teale Data Center, mainframe-based systems are still needed for some functions because the newer, distributed systems aren't mature enough to handle mission-critical functions. Mainframe technology took about 20 years to fully develop recovery, security and other important features, he said, and client/server and similar architectures haven't reached that point yet.

"We're getting there," Ha said. "In three-to-five years, maybe we'll be there. Then the question will be which system is easiest to develop, and if it is cheaper to use than a mainframe."

Ha points out that organizations still do much of their crucial computing on mainframes. "If you look at Fortune 100 companies, you'll see their mission-critical functions are on mainframes," he said.

The reason for this is that most of the critical data and applications in use today were created a decade or more ago, he explained. Converting these from a mainframe to a distributed or other system can mean a hefty investment in rewriting applications, converting files and training users.

Another key reason there is no major rush to convert to client/server or similar systems, Ha said, is that there is no widespread, common agreement on platforms. "We haven't settled on one system yet for client/server."

Another unresolved issue is that it takes a number of pieces to put together a client/server system, and the pieces haven't been standardized yet. "To make something happen with client/server, you need so many products," he said. "Then it takes a few talented people to get those pieces together. But we don't even have those pieces. So everybody's in a waiting game."

Because of this, businesses aren't making wholesale conversions to other platforms, Ha said. "For new applications, they will tend to use client/server. But for old ones, they won't even attempt to do it. So mainframes will be around for a long time."

DATA CENTER ROLE

Even after large-scale conversions to distributed systems occur, there will still be uses for mainframes. "It could be used in client/server as a server," Ha said. "But the mainframe will play an important role in the future."

Data centers such as Teale, whose primary purpose has been to do large-scale computing for customer agencies, will have to evolve as the role of the mainframe changes. Ha said Teale will have a purpose even after mainframe importance shrinks. "There will still be a need for physical facilities like this."

Ha added that there is more to data centers than just a residence for big computers. "When people look at data centers, they just think of mainframes," he explained. "But our role is to build infrastructure or the network. The future is the networked computer. Someone has to provide that technology. To me, we will be needed more than ever.

"Our role is to provide the infrastructure so our customers, state agencies and departments, can move to that new platform," Ha said. "We build infrastucture so they can move to the new platform, or PC-based system."

PRIVATIZED ALREADY

Some of these jobs could arguably be done by the private sector. Consultants and other companies could help agencies develop and implement systems and networks, possibly for less money. Privatization is a trend that has been sweeping the public sector for the past few years, from garbage collection to tax collection.

Information technology is not immune. Indiana briefly floated the idea of privatizing data