Hidden Computer Bottleneck Can Cost Agencies Millions

Organizations are losing money by not handling fragmentation of their systems.

By Drew Robb | Contributing Writer

Within just a few months, 64 percent of desktops and 81 percent of servers will be running either NT 4.0 or Windows 2000, according to a survey by San Jose-based Survey.com. This is cause for celebration in Redmond, but it may be bad news for government users if their systems managers fail to deal with a hidden bottleneck that is responsible for gradual performance degradation on Windows-based systems: disk fragmentation.

Analysts from International Data Corp. (IDC) have identified fragmentation as a little-known but costly fact of life in the NT/2000 enterprise environment. The resulting performance slow-downs are causing state and local agencies to lose thousands, if not millions, of dollars each year. An IDC report, "Disk Defragmentation for Windows NT/2000: Hidden Gold for the Enterprise," provides a detailed analysis of total cost of ownership (TCO) and hard-money losses due to curtailed productivity, increases in calls to the help desk, unnecessary hardware upgrades and significantly higher IT staff costs as a result of scattered files on disks.

IDC estimates indicate that corporations and government agencies are losing "... as much as $50 billion per year as a result of not defragmenting every server and workstation on the network," said Steve Widen, director of IDC's Storage Software Research.

Approximately $6 billion of that total is tied to unnecessary hardware upgrades purchased in an attempt to mask the effects of fragmentation.

The Gradual Decline

Many have experienced logging onto a brand new NT or Windows 2000 desktop and being dazzled by its speed and responsiveness. A few weeks or months down the road, however, it doesn't seem quite so powerful. Many users shrug it off, reasoning that they grew accustomed to the speed and now take it for granted. The more likely reason, however, is that fragmentation has made the system sluggish. Over time, productivity suffers significantly as accessing files and programs takes longer and longer.

"After two or three months, our NT workstation boot time and file access performance deteriorated by about 25 to 35 percent," said Henry Tint of the Department of Defenses Department of Test and Evaluation, which is responsible for a network of 120 NT workstations and six NT servers. "Two to three weeks of backup and data transfer on a server created enough fragmentation to slow it by 30 to 40 percent."

Fragmentation means that files are broken into multiple fragments rather than existing in a contiguous state. As a result, file access takes longer because the pieces of a document must be collected from around the disk, requiring several head movements instead of just one. Backups and reboots can also be delayed by this condition.
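
To put that in rough perspective, the back-of-the-envelope Python sketch below models one head movement per fragment. The seek, rotation and transfer figures are illustrative assumptions for a disk of that era, not numbers taken from the article or the IDC report.

# Back-of-the-envelope model of how fragment count inflates read time.
# All drive figures below are illustrative assumptions, not measurements.

AVG_SEEK_MS = 9.0           # assumed average seek time per head movement
AVG_ROTATION_MS = 4.2       # assumed average rotational latency
TRANSFER_MB_PER_SEC = 20.0  # assumed sustained transfer rate

def read_time_ms(file_mb, fragments):
    """Estimate the time to read a file split into `fragments` pieces."""
    positioning = fragments * (AVG_SEEK_MS + AVG_ROTATION_MS)  # one seek per fragment
    transfer = file_mb / TRANSFER_MB_PER_SEC * 1000.0          # payload transfer time
    return positioning + transfer

for frags in (1, 50, 1000):
    print(f"10 MB file in {frags:>5} fragment(s): {read_time_ms(10, frags):8.1f} ms")

Under these assumptions, a contiguous 10 MB file reads in about half a second, while the same file splintered into a thousand pieces takes more than 13 seconds, which is the kind of multiplication users describe as a machine slowing to a crawl.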

"One of our servers took 20 minutes just to shut down during reboot," said Kevin Beaulieu, senior automation manager at the U.S. District Court in Portland, Maine. "System deterioration over time became the reality of the NT world as far as we were concerned."

In a short time, fragmentation can reach such an advanced state that files end up in thousands of pieces. According to a recent survey of U.S. businesses by American Business Research of Irvine, Calif., it is far from uncommon for server and workstation files to be splintered into tens of thousands of fragments.

"Some of the partitions on our NT file servers were 70 percent fragmented when we first installed the operating system," said Marsha Perrott, network analyst of Pittsburgh. When users try to access such files, they face long delays while the disk thrashes around during document compilation.

Some system managers seek to minimize fragmentation by keeping a large percentage of free space available, reasoning that the condition will only take hold once disks fill up. Unfortunately, this theory doesn't pan out in practice. My NT workstation has 60 percent free space, yet requires defragmentation at least once a week. And any time I load another application, directories are strewn throughout the disk and another defragmentation run has to be scheduled immediately.

This phenomenon of fragmentation resulting from application installation is best observed on a new hard drive. Load NT or Windows 2000 and measure the amount of fragmentation that exists prior to actually using the computer. Even though the disk is almost empty, some users report as much as 50 percent fragmentation. Thus, some machines may never have performed to their true capacity.

The Remedy

The remedy for fragmentation in the enterprise environment is to load and run networkable defragmentation software. Such programs reorganize disk space by consolidating both files and free space. If performed regularly, defragmentation takes a relatively short time and can be set to run in the background, consuming no noticeable overhead on either a workstation or a server. If a machine has been neglected and severe levels of fragmentation exist, however, it may be best to run the program at a higher priority in order to handle it thoroughly. It might also be necessary to free up enough disk space for the utility to function properly and to initiate the boot-time defragmentation option to consolidate system files and directories.
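
As a rough sketch of what scheduling such passes from a single console might look like, the Python dry run below prints a low-priority command for routine machines and a high-priority, boot-time command for a neglected one. The host names, the defrag_tool command and its flags are hypothetical stand-ins, not the interface of Diskeeper or any other product mentioned in this article.

# Dry-run sketch only: "defrag_tool", its flags and the host names are
# hypothetical. A real deployment would hand these command lines to the
# site's own remote-execution or scheduling mechanism.

HOSTS = ["fileserver01", "ws-planning-01", "ws-planning-02"]  # assumed inventory

def build_defrag_command(host, neglected=False):
    """Return the command line for one machine's defragmentation pass."""
    priority = "high" if neglected else "low"  # low priority = no noticeable overhead
    cmd = ["defrag_tool", "--host", host, "--priority", priority]
    if neglected:
        cmd.append("--boot-time")              # also consolidate system files and directories
    return cmd

for host in HOSTS:
    print(" ".join(build_defrag_command(host)))                        # routine background passes
print(" ".join(build_defrag_command("ws-neglected", neglected=True)))  # one-off cleanup pass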

"Boot-time defrag for Windows NT is of particular importance as this often clears up many unexplained [operating system] behaviors," said Jim Mittl of the National Renewable Energy Lab in Golden, Colo. "As well as significant time savings, the use of Executive Softwares Diskeeper utility has prevented several workstations from being completely done over."

This last point concerning hardware upgrades is covered at length in the IDC report. Some jurisdictions fail to recognize fragmentation as an important factor in the performance equation, and unnecessarily acquire new machines in an effort to solve a gradual degradation in system performance.

Manual Versus Network Defrag

With the release of Windows 2000, Microsoft built a "lite" manual defragmenter into the operating system known as "Disk Defragmenter." This utility is similar in functionality to the defragmenter that currently ships with Windows 98. Since Windows 2000's arrival earlier this year, some have been tempted to use this tool to handle defragmentation across their networks. "When I heard that Windows 2000 was going to incorporate a defragmenter, I initially planned to use that utility rather than buy more defrag licenses," said Kathy Thomas-Smallshire, a network engineer for the city of San Jose's Planning, Building and Code Enforcement Department.

But according to Microsoft Knowledge Base Article No. 254564, "Disk Defragmenter is not intended to be a tool for administrators to maintain networked workstations. This version is not designed to be run remotely and cannot be scheduled to automatically defragment a volume without interaction from a logged-on user."

IDC investigated this issue further, carrying out a cost analysis of using a manual versus network defragmentation utility in the enterprise. Based on an estimate of one hour to manually defragment each server and workstation, analysts found that it could cost millions of dollars annually to have qualified system managers go from place to place handling each desktop and server machine in their networks. Networkable defragmenters, on the other hand, can schedule, monitor and control defragmentation from a single console, cutting the demands on IT staff to two hours a month, regardless of the size of the network.

While a networkable utility is the obvious choice for a large government operation, IDC found that even in small jurisdictions it is cheaper to buy a full-featured utility. For a system consisting of only one server and 10 workstations, for instance, IT staff would spend as many as 572 hours a year conducting weekly, manual defragmentation at a cost of over $20,000. Conversely, remote defragmentation worked out to $960 a year in labor costs and $648 in licensing fees.
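
The arithmetic behind those figures is easy to reconstruct, as the short Python sketch below shows. The $35 hourly labor rate is an assumption used only to illustrate the calculation; the article's own totals are roughly $20,000 versus $960 a year.

# Rough reconstruction of the small-site comparison: one server plus ten
# workstations, defragmented weekly. The hourly labor rate is an assumption.

MACHINES = 11              # one server plus 10 workstations
HOURS_PER_MANUAL_PASS = 1  # IDC's estimate for one manual defragmentation
WEEKS_PER_YEAR = 52
HOURLY_RATE = 35.0         # assumed IT labor cost in dollars

manual_hours = MACHINES * HOURS_PER_MANUAL_PASS * WEEKS_PER_YEAR  # 572 hours
network_hours = 2 * 12     # two hours per month from a single console

print(f"Manual defrag:  {manual_hours} hours, about ${manual_hours * HOURLY_RATE:,.0f} a year")
print(f"Network defrag: {network_hours} hours, about ${network_hours * HOURLY_RATE:,.0f} a year plus licensing")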

What does this mean to state and local government agencies? "The decision to defragment the enterprise automatically versus manually will save thousands if not millions of dollars per site," said IDC's Widen. "Even though the actual numbers may vary from network to network, when considering the significant impact on TCO, it is difficult to find any argument to position manual defragmentation over network defragmentation."

Another factor in the manual versus network debate is the poor performance users sometimes experience from manual defragmenters. When the state of Virginia employed them, for instance, the utilities built into Windows 98 and 2000 were found to be deficient in several areas. "Sometimes when trying to defrag the system, the manual versions hang up," said Bob Fraser, IT manager for Virginia's Enterprise Solutions Division. "I've never successfully used one to defragment large hard drives, particularly those in the 20GB range."

Security is another issue to consider when using the disk defragmenter built into Windows 2000. In order for this utility to function on a desktop, the user must be granted administrative privileges. Few system managers are likely to allow this due to possible security repercussions.

Impressive Gains

How much real benefit can be experienced through the use of defragmentation software? Testing by the National Software Testing Labs (NSTL) revealed that defragmented systems performed anywhere from five to 200 percent faster than fragmented systems, depending on the application being run. The lab report can be viewed at www.nstl.com.

"Several of our NT workstations were slowing to a crawl," said Thomas-Smallshire. In addition to Microsoft Office and Internet Explorer, San Joses planning department also uses MapInfo Corp.s (mapping) and FileNET Corp.s (imaging) software. "Defragmenting these machines has greatly improved their performance."

Users at the state level, too, report gains of up to 40 percent after defragmentation. "Access times are not the only thing affected by fragmentation; it can also cause data corruption if left unchecked," said Frederick Meisiek, a computer programmer with Florida's Department of Environmental Protection. "I don't run any kind of server or high-end workstation without using a defragmenter."

Similarly, Fraser of Virginia noted a significant improvement in application loading and overall system performance after consolidating files on the hard drive.

With this much gain potentially available, look for more government offices to introduce networkable defragmentation technology throughout their enterprises. "Regularly scheduled defragmentation keeps a network performing at its best," said Paul Mason, IDC's vice president for System Infrastructure Software Research. "Just as you need virus protection to protect from file damage, every machine needs defragmentation software to protect it from performance degradation."

The full text of the IDC report, "Disk Defragmentation for Windows NT/2000: Hidden Gold for the Enterprise," can be purchased at www.idc.com or accessed at www.execsoft.com.



Drew Robb is a Los Angeles-based writer specializing in technology issues.