
Intranets Add Life to Legacy

A small but growing number of government agencies are beginning to use intranets to integrate legacy databases.

The Texas Comptroller of Public Accounts has several fiscal responsibilities, one of which is managing the state's property and assets. Whenever the comptroller's office prepared a debt report for the Texas Senate on what the state still owed on its property, the agency's Information Services staff took the query over the phone and then cranked out a report on the agency's IBM mainframe.

But that day-long, roundabout way of answering a query has come to an end. Agency staff now use a Netscape browser on their PCs, Macs, UNIX workstations or AS/400 terminals, pull up the Web site for state property assets on an intranet and run their own query to the mainframe database. According to Ralph Hutchins, systems analyst for the comptroller's office, it took about two weeks to write the Web script for the queries.

Compared to building a client/server solution for the same problem, integrating an intranet with the legacy database wins hands down. "Using a browser is simpler, thinner [than a client], cheaper and has a lower learning curve," said Hutchins. "You just give them the URL and bam, they are running their own query."
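The comptroller's staff built their query scripts with IBM tooling against DB2, but the pattern Hutchins describes -- a URL that triggers a pre-written query and returns a page -- can be sketched in a few lines. In this illustration, Python's sqlite3 module stands in for the mainframe database, and the table and column names are hypothetical:

```python
import sqlite3

# Hypothetical stand-in for the state property database (DB2 in the original).
def setup_demo_db():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE property_debt (agency TEXT, owed REAL)")
    conn.executemany("INSERT INTO property_debt VALUES (?, ?)",
                     [("Parks", 120000.0), ("Transport", 450000.0)])
    return conn

def debt_report(conn):
    """Run the canned 'what does the state still owe' query and
    return an HTML fragment a browser can render."""
    rows = conn.execute(
        "SELECT agency, owed FROM property_debt ORDER BY agency").fetchall()
    body = "".join(f"<tr><td>{agency}</td><td>{owed:,.2f}</td></tr>"
                   for agency, owed in rows)
    return f"<table>{body}</table>"

conn = setup_demo_db()
html = debt_report(conn)
```

The point of the design is that the user never writes SQL: the script behind the URL does, which is why, as Hutchins puts it, "you just give them the URL and bam, they are running their own query."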

A FEW HARDY AGENCIES
Government use of the Internet has been dominated by static document publishing: Web pages for public consumption that contain information dutifully collected by a government agency. That's beginning to change, thanks to intranets.

A few hardy agencies are beginning to deploy intranets -- internal networks built on Internet standards -- to run actual applications. More importantly, they are using intranets as a gateway to legacy databases. For government IS managers who shudder at the thought of migrating legacy systems and databases over to client/server technology, intranets present a welcome alternative.

Intranets are attractive because they are quick to assemble and deploy, and because they use standard interfaces, protocols and programming features to build applications and move information around an agency. For government IS managers, they also provide a way to leapfrog over the arcane science of client/server into a newer but simpler means of doing business with computers.

"Our skills are based on programming mainframe-based systems," said Stuart Greenfield, an analyst for the comptroller's office. "We don't have many people who understand C or C++ programming. But with the intranet we can provide users with nice front ends, which they expect from client/server, and we can open up access to legacy data."

According to a report published by Dynamic Information Systems (DIS), legacy databases are a major issue confronting organizations moving to client/server systems. To bypass this and other troublesome aspects of client/server -- such as the so-called "fat client" dilemma and laborious application upgrades -- IS managers can develop solutions using the World Wide Web.

"Correctly implemented, the Web offers a powerful environment for applications that need to access information from a variety of data sources," the report stated. "Few environments outside the Web offer an architecture that allows data from disparate sources to be blended and accessed within the same application."

According to DIS, the Web's browser interface is ideal for search and retrieval services in document management. It is conducive to group scheduling, project management, help-desk, customer service and other types of team and group applications and simple data-entry operations. The Web also is well suited as an interface for querying and reporting. "It's possible to deliver Web solutions that accept a request and return tabular, cross-tabulated or even graphical results based on a query," reported DIS.
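DIS's claim that the Web can "return tabular, cross-tabulated or even graphical results based on a query" amounts to a small amount of server-side glue: pivot the rows a query returns, then render them as an HTML table. A minimal sketch, with entirely hypothetical data (fiscal year by asset class):

```python
from collections import defaultdict

def cross_tab(rows):
    """Pivot (row key, column key, value) query results into a nested
    dict -- the shape a Web handler would render as an HTML table."""
    table = defaultdict(dict)
    cols = []
    for r, c, v in rows:
        table[r][c] = v
        if c not in cols:
            cols.append(c)
    return table, cols

def render_html(table, cols):
    """Render the pivoted results; missing cells default to 0."""
    header = "".join(f"<th>{c}</th>" for c in cols)
    body = ""
    for r in sorted(table):
        cells = "".join(f"<td>{table[r].get(c, 0)}</td>" for c in cols)
        body += f"<tr><th>{r}</th>{cells}</tr>"
    return f"<table><tr><th></th>{header}</tr>{body}</table>"

# Hypothetical query output: (fiscal year, asset class, amount owed)
rows = [("1995", "Land", 10), ("1995", "Buildings", 20),
        ("1996", "Land", 15)]
table, cols = cross_tab(rows)
html = render_html(table, cols)
```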

On the other hand, Web and intranet applications fare poorly at intensive online transaction processing, in applications where safety and security are critical, and where concurrency control -- managing simultaneous access to databases -- is an issue.

FAST PRODUCT GROWTH
So how does an agency link its old data to an intranet? David Strom, an intranet consultant, writes about accessing legacy databases with an intranet in his white paper, Finding the Right Intranet Technologies to Buy. He mentions several products available now that can put data residing in host-based systems into an HTML (Hypertext Markup Language) format.

For agencies that already run an SQL (Structured Query Language) database server, a class of software tools known as gateways connects those servers to Web servers so that workers using Web browsers can query the data. Other tools provide more sophisticated interaction with the legacy data. But Strom cautions that the "Web/database junction is still fairly new. Things that have long been possible in client/server systems are more difficult to implement with HTML forms."

Writing in Computerworld, Kim Nash pointed out that earlier this year, "few choices were available for linking Web applications to databases. Users could buy or download free translators such as GSQL, a gateway from the University of Illinois that transforms SQL into a format readable by Web browsers." Or they could write "connector code" with the Internet's Common Gateway Interface (CGI).
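The "connector code" Nash mentions is conceptually simple: a CGI program reads the fields a browser submitted, turns them into a database query, and writes HTML back. A rough sketch of that junction in Python, with sqlite3 standing in for the back-end database and all names hypothetical -- note that form values are bound as SQL parameters rather than spliced into the query text:

```python
import sqlite3
from urllib.parse import parse_qs

def handle_request(query_string, conn):
    """Toy CGI-style connector: read form fields from the query
    string and bind them as SQL parameters (never splice user
    input directly into the SQL text)."""
    fields = parse_qs(query_string)
    agency = fields.get("agency", [""])[0]
    return conn.execute(
        "SELECT name, owed FROM debts WHERE agency = ?", (agency,)
    ).fetchall()

# Hypothetical demo database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE debts (agency TEXT, name TEXT, owed REAL)")
conn.execute("INSERT INTO debts VALUES ('Parks', 'Trail upgrade', 5000)")
rows = handle_request("agency=Parks&format=html", conn)
```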

Now, choices abound, according to Nash, including:

Third-party utilities that link host system users to Web applications.
Point-and-click tools that help build Web applications with database back ends.
Middleware that translates SQL queries to Web-readable formats.
Special translators from the leading database firms, including Sybase, Oracle and IBM.

TWO INTRANET TALES
Government agencies first need to ask themselves the purpose of their intranet before they seek answers to questions about tools that link legacy databases to the Web. They should also determine the complexity of the application: Will workers in one department or several departments share the data? "The more complex the application, the more you'll need advanced features and more than just a single technology to do the job," said Strom in his white paper.

For the Office of the Texas Comptroller of Public Accounts, several issues seemed to dovetail into an intranet solution. Two years ago, the office began linking different information resources into an integrated tax system. The office has a large mix of Macintoshes, PCs and UNIX workstations, as well as AS/400 and HP 9000 systems. But problems with integrating the Macs into the new system led analysts Greenfield and Hutchins to ask whether the newly emerging Web might work as a unifier.

"In addition to unifying different information resources, we wanted to provide a common interface to our tax and accounting systems, use more graphics and make the worker's interface more intuitive," said Greenfield.

In 1995, Greenfield received approval from management to develop a proof of concept to see whether the Web architecture would work. "The Web sounded like a good idea, but we wanted to know if there were any tools that we could use to do this." What sounded most appealing was using the Web browser with its graphical interface to access legacy data, without the agency having to reengineer its back-end databases.

The comptroller's office chose for its prototype Web application the state property database, which resided in DB2 on an IBM 9000 mainframe. Working closely with IBM, the agency used off-the-shelf software tools from IBM and third-party developer CEL Corp. to develop an intranet. A key component was the installation of an IBM RS/6000 computer, which runs a UNIX version of DB2 that can talk with DB2 databases on the mainframe. The RS/6000 acts as an intermediary database server, pulling property asset data off the mainframe so that workers can access the information faster.
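The intermediary-server arrangement is essentially periodic replication: copy the tables browser users need from the source system into a server built for query traffic, so Web queries never touch the mainframe directly. A minimal sketch of that refresh step, with two sqlite3 databases standing in for the mainframe DB2 and the RS/6000's UNIX DB2, and hypothetical table names:

```python
import sqlite3

# Stand-ins: 'mainframe' for the source DB2, 'intermediary' for the
# RS/6000 database that browser queries actually hit.
mainframe = sqlite3.connect(":memory:")
mainframe.execute("CREATE TABLE assets (id INTEGER, descr TEXT)")
mainframe.executemany("INSERT INTO assets VALUES (?, ?)",
                      [(1, "Office building"), (2, "Vehicle fleet")])

intermediary = sqlite3.connect(":memory:")
intermediary.execute("CREATE TABLE assets (id INTEGER, descr TEXT)")

def refresh(src, dst):
    """Pull the asset table off the source system into the
    intermediary server, replacing the previous copy."""
    rows = src.execute("SELECT id, descr FROM assets").fetchall()
    dst.execute("DELETE FROM assets")
    dst.executemany("INSERT INTO assets VALUES (?, ?)", rows)

refresh(mainframe, intermediary)
count = intermediary.execute("SELECT COUNT(*) FROM assets").fetchone()[0]
```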

Hutchins worked with agency staff to figure out the 10 most common SQL queries used with the property database and, using a DB2 World Wide Web software tool, developed 10 canned queries that could be run in HTML from a Netscape browser. Now, instead of calling IS to produce a report in a day, agency staff can use their browsers to generate the most common kinds of property reports in just seconds.
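Hutchins' ten canned queries follow a dispatch-table pattern: the HTML form submits a report name, and the server looks up the pre-written SQL for that name rather than accepting arbitrary query text. A sketch of the idea, with hypothetical query names and sqlite3 in place of DB2:

```python
import sqlite3

# Named, pre-written queries -- the form submits a name, never SQL.
CANNED_QUERIES = {
    "debt_by_agency": "SELECT agency, SUM(owed) FROM debts GROUP BY agency",
    "total_debt": "SELECT SUM(owed) FROM debts",
}

def run_canned(name, conn):
    """Look the query up by the name submitted from the HTML form;
    unknown names are rejected rather than executed."""
    sql = CANNED_QUERIES.get(name)
    if sql is None:
        raise ValueError(f"unknown report: {name}")
    return conn.execute(sql).fetchall()

# Hypothetical demo data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE debts (agency TEXT, owed REAL)")
conn.executemany("INSERT INTO debts VALUES (?, ?)",
                 [("Parks", 100.0), ("Parks", 50.0), ("Transport", 25.0)])
result = run_canned("total_debt", conn)
```

Because only named queries can run, the browser user gets self-service reporting without the agency exposing ad hoc SQL access to the database.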

Having spent approximately $50,000 on the prototype, Greenfield and Hutchins are hoping to expand the intranet/Internet to include some viable applications involving DB2 and CICS transaction processing. Currently, the comptroller's office receives 12,000 calls a month from credit reporting agencies, such as Dun & Bradstreet, that want to know if a business owes the state any money. "We're hoping to put this public information on the Web so that credit firms can run their own queries without having us look it up," said Greenfield.

Hutchins and Greenfield see two big benefits from using the Web to access data. First, it allows IS to leverage its talent, which is primarily in the mainframe field. COBOL and CICS programmers don't have to become C++ programmers. Second, IS doesn't have to reengineer its legacy applications, which would have been a virtual certainty if they were to migrate to a client/server model. That's a big cost savings for cash-strapped government agencies.

Another low-cost intranet that accesses legacy databases can be found in the California Environmental Protection Agency's Department of Pesticide Regulation. There, staff use their browsers to run queries on information that originates from the U.S. EPA's databases in Research Triangle Park, N.C., or to find the location of any of the 150,000 pesticide studies maintained by the state agency.

"Once we had our infrastructure in place -- the network backbone and Ethernet for the department -- we tried to develop this intranet very cheaply," said John Stutz, a systems analyst. Other than the purchase of an Oracle database and an Internet server, Stutz and his colleague Steve Kishaba developed the agency's intranet using mainly freeware tools available over the Internet.

For the EPA databases, Kishaba wrote an FTP script that automatically grabs the data off the mainframe in Research Triangle Park, imports data into the Department of Pesticide Regulation's server, loads it into the Oracle database and then makes the information available over the intranet. Staff can also access and query internal databases, which are imported into the intranet server using the same scripting method.

According to Stutz, the staff can perform complex queries using their browsers. "We designed the queries so that staff can enter as many as five variables on the Web form," he said. "The queries are designed to meet 90 to 95 percent of the questions that people normally ask."
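A form with up to five optional variables maps naturally onto a WHERE clause assembled from whichever fields the user filled in. A sketch of that query builder, with hypothetical field names standing in for the pesticide-study form and blanks simply left out of the query:

```python
import sqlite3

def build_query(form):
    """Build a WHERE clause from up to five optional form fields;
    blank fields are omitted, and values are bound as parameters."""
    fields = ["chemical", "crop", "year", "study_type", "author"]
    clauses, params = [], []
    for f in fields:
        value = form.get(f, "").strip()
        if value:
            clauses.append(f"{f} = ?")
            params.append(value)
    sql = "SELECT title FROM studies"
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return sql, params

# Hypothetical demo data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE studies (title TEXT, chemical TEXT, crop TEXT,"
             " year TEXT, study_type TEXT, author TEXT)")
conn.execute("INSERT INTO studies VALUES ('Residue study', 'atrazine',"
             " 'corn', '1995', 'field', 'Smith')")
sql, params = build_query({"chemical": "atrazine", "year": "1995"})
rows = conn.execute(sql, params).fetchall()
```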

Stutz, who has no formal computing expertise -- "my background is agriculture" -- but has 12 years' experience working with computers, offered several key points for any government agency that wants to develop an intranet application for databases:

Make sure the data is useful. "There's a lot of garbage out there [on the Web]. Make sure the data is what the people need and can use."
Have a good technical person who can turn your vision into reality.
Remember that integrating legacy databases and an intranet is not rocket science. "Data is data," said Stutz. "You can export data from any database and load it into any other database. When you load it into a database that has an Internet interface, then you have made the data available on the Internet."
Use the Internet to locate tools and find out what other people have done.
Finally, it can be a hard sell to convince management that the intranet can work with legacy databases. One way to win approval is to set up a prototype. "We took all these complex EPA databases," said Stutz, "and made them available [over the Internet] in a week or two."