Intranets -- internal networks based on Internet technology -- have become the rage in the corporate world. Few state and local governments, however, have utilized the potential of intranets, with one notable exception -- the California Environmental Protection Agency's Department of Pesticide Regulation (DPR).
This department is responsible for regulating pesticides to protect public health and the environment. It is charged with evaluating and mitigating the impact of pesticide use, maintaining the safety of the pesticide workplace, ensuring product effectiveness, and encouraging the development and use of reduced-risk pest-control practices. To aid in accomplishing this mission, DPR links data and users through an external Internet site and an internal intranet.
In a scant three years the department has progressed from having no computer LANs, to having independent LANs, to a fully networked department with multiple LANs, and finally, to a full-blown intranet that spans all eight DPR LANs.
The department's legacy databases are being incorporated into the intranet so that the information is far more easily accessible, not only to DPR staff, but to outside scientists and the public. Navigation through complex databases is now just a matter of point and click from a Netscape Web browser.
Finally, the DPR intranet -- which serves an organization of nearly 400 employees -- is largely being installed and maintained by just two staff members. In fact, one of the two, John Stutz, the systems analyst responsible for planning and overseeing the project, works on it only part-time.
Moreover, this intranet is being built using "as much free software as possible, because our budget limitations were such that it had to be inexpensive," explained Stutz. "We found that not only can many things be done very cheaply using the same free software that operates most of the Internet, but also that numerous resources exist on the Internet which also tell you exactly how to do it. We have done everything ourselves. We have not hired a soul to advise us or do any programming or anything else. It is completely an internal deal."
A trip to the DPR's public Web site on the Internet gives an idea of the sophistication of the intranet. For example, one can point and click through a series of maps to find a section-level map (one square mile) anywhere in California and discover whether the use of a certain pesticide will threaten any resident endangered species. Use restrictions and mitigation instructions are generated from the Oracle database, along with pictures of the species in question. In other words, it is a high-tech Web site that might make even the most dedicated of Net surfers sit up and take notice.
The external Web site serves to illustrate what is being accomplished internally because -- unlike many public Web sites that are created almost independently of internal network operations -- the DPR Web site is actually driven, in a very real sense, by its internal intranet. "We set out, right from the start, to create an intranet that would be the foundation," said Stutz. "That was our initial goal -- use an intranet to disseminate information to every single staff member and branch, thereby saving resources, time, paper, production costs and distribution costs."
Much of the technical pizzazz, so apparent from the external Internet site, stems from a simple idea adopted at the outset of the planning stages. Instead of using expensive and complex tool sets to access the information in databases -- tool sets that are often difficult to learn and that eat up system computing resources when running -- why not make the easy-to-use Netscape Web browser the interface to all these databases? That way, all staff members needed to learn was the Web browser.
"Everyone in the department has Netscape," said Stutz. "So automatically, they have the capability of accessing the databases from their desktop and potentially even managing data through Netscape. In some instances, we are already using hypertext front ends to manage some of the simpler Oracle databases -- to actually input, edit and delete information from the database."
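The idea of a hypertext front end that can input, edit and delete database records can be sketched in miniature. This is not DPR's code -- their system used CGI scripts against Oracle -- so sqlite3 stands in for Oracle here, and the table and field names are invented for illustration.

```python
import sqlite3

# Minimal sketch of a browser-driven database front end. In the real
# system, a Web form would call handlers like these via CGI; here the
# "pesticide" table and its fields are hypothetical.
def make_db():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE pesticide (id INTEGER PRIMARY KEY, name TEXT)")
    return conn

def add_record(conn, name):
    cur = conn.execute("INSERT INTO pesticide (name) VALUES (?)", (name,))
    conn.commit()
    return cur.lastrowid

def edit_record(conn, rid, name):
    conn.execute("UPDATE pesticide SET name = ? WHERE id = ?", (name, rid))
    conn.commit()

def delete_record(conn, rid):
    conn.execute("DELETE FROM pesticide WHERE id = ?", (rid,))
    conn.commit()

def list_records(conn):
    return conn.execute("SELECT id, name FROM pesticide ORDER BY id").fetchall()
```

The point of the design is that the browser form is the only interface staff have to learn; each button simply maps to one of these handlers.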
Many of the DPR databases are not small. For instance, pesticide use reports now total over two million new records a year. And while this is largely static data reported only yearly, there are other, more dynamic transaction databases that receive input on a regular basis -- several times a day or even continuously.
The Web-style access to these legacy databases -- the product/label database dates back more than a decade and a half and is now in its fourth iteration of software and hardware platforms -- is being made possible because they are being migrated from mainframe and minicomputer environments to a client/server environment with Oracle as the standard database software. "You might say that we have taken to the client/server environment like ducks to water," said Stutz.
"We've found that for some of these larger databases, the tools that we're using to access them over the Internet or over our internal intranet might not be quite as fast as some other things, but the capabilities of relating the data and being able to navigate through it using Netscape as a front end are fantastic. It is something you really can't do with other kinds of front ends.
"For example, when you do a database query on our system, this creates a report that may also link to other databases," Stutz explained. "The end result is that the users can seamlessly navigate across databases. They don't even have to know exactly what databases they are accessing internally or externally anywhere in the country. We have a query, for instance, that does a distributed join across the Internet to a table on a machine at the federal EPA in Washington, D.C.
"They have a database called the master chemical integrator and it just happens to have a field that is common to our database. However, for the users, they simply click for the data they want and they get the answer, whether this is coming from one database or several different ones."
Such achievements are the result of utilizing the technological possibilities of Internet software and of innovative planning which, Stutz emphasized, has been key to the success of their intranet. In their case, planning began in Feb. 1995 with a full internal white paper that laid out how the whole system would work.
"Oftentimes, implementations of new technologies come from small workgroups that aren't responsible for legacy systems and network support like the IS branch," observed Stutz. "They have the time to explore the newer technologies. I think that is what happened with us here.
"A little over two years ago, we got our Internet and our backbone in place, and we had a router with our T1 line to the outside, which we needed to access our mainframe and support our legacy systems. So then it was just a matter of setting up a machine running the HTTP daemon and we started experimenting within our branch, Pesticide Registration."
Stutz said that they quickly realized the value of using the Internet to disseminate their Oracle databases using the interfaces available, such as Perl scripts. They also started experimenting with WAIS indexing, because the software was free.
Stutz then developed the white paper that outlined a departmentwide three-tiered solution. The first tier would be the external public Internet site. The second would be an administrative, department-level home page that would disseminate departmentwide resources -- such things as policy guidelines. And the third tier, which he considered the most important, would be the intranet.
"Our initial design was that the bottom tier, what we call the branch level, would provide branch-specific resources to staff within one of the eight branches," said Stutz. "But because everything which ends up on our external Internet page springs from within, I like to say that the external Web page is a 'subset' of the intranet.
"The design of the system is like a pyramid. At the top is the external Web site. Below that is the administrative internal home page. And below all that is the intranet that forms the foundation of it all."
The plan also enabled employees to put content online. "You need to develop policies on this," explained Stutz. "For instance, the external Web site represents the department, so you have to have policy in place for approval of content that is going out on the Internet. In our case, external Web site content requires approval of our assistant director. It is not quite so critical to have the same kind of approval for internal material, so branch-level home pages require only branch-level approval."
"You come up with something that you need to put online to make available to everybody," said Stutz. "So what you do is develop a process for putting that online. You develop a security arrangement where the person who is responsible is the only one who has access to write material to that portion of that page. You get the approval of the concept and the format, and so forth, and then it's just like hard copy. They develop it, it gets approved, then they hypertext it and put it online."
To keep things under control, the DPR intranet uses the security features built into UNIX. Each user's IP address grants that user certain privileges: it determines which directories the user is authorized to write material to and which directories the user can read. On the Novell LANs, access is controlled very specifically by IP address and user name.
For added security, they have installed a hypertext login and password system. "It's a dual security thing," said Stutz. "Of course this is all internal, so it is independent of what's outside on the external Web site."
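The first half of that dual scheme, deciding read and write rights from the client's address, can be sketched with the standard ipaddress module. The networks, directory paths and permission table below are all invented placeholders, not DPR's actual configuration.

```python
from ipaddress import ip_address, ip_network

# Hypothetical access-control table: each branch subnet maps to the
# directories its users may read or write.
ACL = {
    ip_network("10.1.8.0/24"): {"read": ["/intranet/", "/registration/"],
                                "write": ["/registration/"]},
    ip_network("10.1.9.0/24"): {"read": ["/intranet/"], "write": []},
}

def allowed(client_ip, directory, mode):
    # mode is "read" or "write"; unknown addresses get nothing.
    addr = ip_address(client_ip)
    for net, perms in ACL.items():
        if addr in net:
            return any(directory.startswith(p) for p in perms[mode])
    return False
```

The hypertext login and password layer Stutz mentions would then sit on top of this check, so a request must pass both before a write is accepted.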
Another vital concept which Stutz's plan emphasized was integrity of data. "Internet technology is very good for this because everybody is looking at one file in one place," he explained. "That, to my mind, is the key to maintaining data integrity. Apart from mirrors or backups, the file is only pointed to at one location on the internal or external site."
This concept, while simple, has been difficult to sell. For example, some data and databases which are accessed from the intranet and Internet actually reside on the external Web page server. "It was hard to get some people to understand why they would be going outside the intranet proper to access certain data. However, the principle was simply that if two distinct and separate files existed, then it was possible that one would get updated while the other wouldn't. At that point you have lost data integrity. In addition, there were data needed by both internal staff and the public."
HANDLE DATA ONCE
Another key factor in the intranet's success is automating everything possible. "The programmer who is the real wizard here, Steve Kishaba, sets everything up to be automatic," said Stutz. "He does not like to handle data more than once. Once he's developed a program to process the data properly, he sets up an automated routine that just does it by the clock. It works like a champ. I think since last October, we have only had it fail three times. And once was a hardware problem."
An example of this automation is the way that the external Web site server is updated. The department has set up what they call their internal/external site -- a production site on which staff work on material that is going to reside on the external Web page. This is an internal site where staff might be adding content, reports, or might be changing a database query or adding a new one. This is all done internally.
Meanwhile, the external Web site continues blasting along out there unchanged during the day. Then starting at about 6:30 p.m., this internal/external site is completely WAIS-indexed so that everything is fully searchable. Then, at about 11:30 p.m., the entire external Web site is refreshed. Everything on the external site, including the databases, is rewritten when the internal/external site is mirrored to the external Web site.
WAIS indexing is a vital process which is automatically performed on portions of the internal pages as well. "We've put in some simple standards for hypertexting which we consider very valuable," said Stutz. "Like we put a real title within title bars of a document so that when you do a search, instead of just getting a file name, you get the full title. We've set up the WAIS index so that it picks up that title. So whenever you do a search, you always get back more than a file name."
This enables users to get to the information they seek more quickly, and at the same time, allows the system to automatically generate a meaningful hypertext "what's new" file. This picks up any file which has been edited or added to the site in the last 30 days. So users of the external home page simply have to click on a button to pull up a list of what has been added to or changed. And, in Netscape, if they have accessed it already, it's highlighted and if they haven't, it's still blue. This makes it very simple for staff and the public to keep up with changing information.
Another use of WAIS indexing is now being implemented to handle internal "weekly issues" circulars. "I think there are 16 different entities within the department which produce what we call 'weekly issues,' so that's quite a bit of material," said Stutz. "What we did was set up a process whereby we empowered these 16 people, by login and password, to author directly onto the department home page, to a certain area in a certain directory. Then we set up a macro which when run, automatically converts the Word Perfect files, in which these weekly issues were being written, to an RTF format which is automatically converted to HTML, which goes in the appropriate Web directory. The hypertext menu for the directory is also automatically regenerated. The whole process takes about 20 seconds. This is just a plain, standard vanilla hypertext document for internal use, but when all these are WAIS-indexed they are fully searchable.
"This means that over the years -- say perhaps five years down the line -- someone might come up against a problem which was addressed in a weekly issues circular years ago. How do you find that if you are dealing with hard copy? Now you can simply search the intranet for weekly issues that dealt with the specific problem you are interested in. I think that is an example of a very valuable internal tool, it demonstrates the value and power of electronic over hard copy."
THE BIG SELL
The hardest part of setting up an intranet, according to Stutz, is selling the idea and getting the support of the staff. "It's hard because it involves changing long-standing business processes. People look at it and say, hey, this is extra work. I don't have time for that.
"What you have to sell them and management on is the changing of business processes. You produce this, somebody copies it, somebody addresses it, somebody goes out and sticks it in mail boxes. So the author's part of the process is one portion, but there are four other people down the road that are putting in their little bit too. So the change involves adding one step to the front end -- posting the material with suitable title and all to a home page, for instance. But this then cuts out the other four steps down the road. For some reason, some people have a hard time conceptualizing this."
To launch a successful intranet, said Stutz, you should start with a plan that details not just what content you will put on both internal and external pages, but how this will be controlled as a routine, day-to-day activity. "The plan can expand later. A year and a half down the road, we've thought of a hundred things that would be great to put on. Our implementation would have gone a lot smoother if we had committed more time to our plan. But I think for most of the people I've talked to in other departments, the biggest problem is resources -- committing the resources to this kind of an implementation. This is both a new technology and a new drain on scarce resources not traditionally addressed by current budgets. Committed resources, number one -- and number two, having the management commitment and support needed to be able to change the business processes of the organization.
"This last point is critical. If management isn't behind the project 100 percent and isn't telling people that from now on, this is what they are going to do, it's awfully tough to make it work. Management buy-in right from the very top of the organization is critical."
For intranet builders, the Internet provides a wealth of information, reference material and software. Here are a few good jumping-off points to locate what's out there:
* Web-Building Virtual Library
This provides a comprehensive set of links and material for much of what you need to build the server side of an intranet. This is a very good place to start.
* Web Reference
A comprehensive reference site for Web building.
* Winsock Software (TuCows)
Billed as the "ultimate collection of Winsock software," and so popular that it is now mirrored to dozens of sites around the world.
* The California State University Windows Shareware Archive
This contains public domain software as well as shareware.
* Shareware Index
This allows you to search for shareware and freeware for various platforms and links to most of the university FTP archives of software around the world. A very useful tool to find software.
* FreeBSD UNIX
One of the many mirrors of the FreeBSD UNIX operating system, which many public and private Internet providers use because of its robust reliability.
* Berkeley Software Design
More UNIX software.
* Searching For References
If you need further references, try searching using Digital's Alta Vista.