Just a little while ago, big iron seemed lost.
Mainframes didn't appear to fit the new world of distributed computing and decentralized networks, and took on the role of quaint, nostalgic reminders of the days of computing past. Everybody was ready to forget the mainframe, ready to repeat Sun's memorable "The network is the computer" mantra.
Suddenly, however, the mainframe is becoming relevant again.
In an earnings report issued earlier this year, IBM announced mainframe revenue growth of 44 percent, year over year. The company also said the performance turned the mainframe into its largest hardware growth segment.
The zSeries mainframes are making a lot of money for the company -- and it's a good thing, given IBM's decision to invest four years, $1 billion and the efforts of 1,200 developers in the zSeries' top-shelf z990 mainframe.
As often happens in the IT world, this particular trend owes its emergence to other, related happenings. Mainframes' sudden cachet can be traced to several concrete factors: A new 64-bit processor architecture lets mainframes perform at blazing speeds; open source operating systems and applications have spread through enterprises of all sorts; customers are moving to off-the-shelf software instead of proprietary solutions; and enterprise computing is trending toward consolidation and centralization.
The mainframe didn't regain stardom overnight. Like any other "new" sensation, plenty of years went into big iron's rebirth as technology that matters.
Back From the Brink
Back in the day, the term "mainframe" meant a large computer.
Everybody used mainframes because that's all that was available. A mainframe consisted of a big CPU cabinet connected to banks of tape and disk drives, and a front-end processor. The mainframe connected to a series of terminals (the famous "green screens"), which staff used to manipulate and extract data stored on those drives.
Mainframes had their strong points -- since the data, and the applications that accessed and manipulated the data, were controlled by the mainframe, big iron worked fast. IT departments liked mainframes because they could centrally manage them. Staff didn't necessarily like using terminals, because of their limitations, but they didn't have much of a choice -- terminals were what everyone had to use.
Big iron's downside, though, eventually cost mainframes their starring role in enterprise computing. If you bought a mainframe, you also bought a proprietary software environment that often didn't play well with other software, and it took significant programming effort to get around that exclusivity. It wasn't easy to buy off-the-shelf software and load it onto a mainframe. The economics of mainframes favored their manufacturers, not buyers.
With the explosion of the World Wide Web and TCP/IP in the early to mid-1990s, mainframes started looking old and slow. Who needed an old clunker when new, smaller servers were capable of zipping lots of data to client devices? The client devices themselves weren't green screens anymore -- they had turned into powerful PCs that could perform a range of operations on that data.
Mainframes seemed fated to join the scrap heap of obsolete technologies.
Instead they pulled off a surprising renaissance, and enterprises have taken interest in what these "new" mainframes can do and how different they are from their predecessors. The emerging 64-bit architecture gives mainframes even more processing power than before. Pairing this new horsepower with the open source Linux operating system allows enterprises to be flexible and move away from proprietary software environments. New virtualization capabilities, especially with Linux, allow enterprises to make hundreds of copies of the operating system on one mainframe, creating the software equivalent of a rack of server blades on a single machine.
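As a rough illustration of that virtualization model, carving a mainframe into Linux guests under IBM's z/VM hypervisor is largely a matter of directory definitions. The entry below is a simplified, hypothetical sketch -- the user name, password, memory sizes, device numbers and volume label are all invented for illustration:

```
* Hypothetical z/VM user directory entry for one Linux guest.
* Cloning this stanza hundreds of times yields hundreds of
* virtual Linux servers on a single physical machine.
USER LINUX01 SECRET 512M 2G G
    IPL 0200
    MDISK 0200 3390 0001 3338 LNX001 MR
```

Each such entry defines one virtual machine: its memory allotment, the virtual disk (MDISK) it boots from, and the device it IPLs at startup.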
Going Big Iron
The Oklahoma Department of Human Services (DHS) opted to buy a new zSeries mainframe in 2002 to handle a range of applications, including its mission-critical eKIDS program. eKIDS is a Web-based application giving authorized users access to selected child welfare information.
"We had an IBM mainframe that needed upgrading, but we also had an HP midrange server that needed upgrading," said Marq Youngblood, CIO of the DHS, adding that both environments ran specific applications. "We looked at it, and we wanted to maximize the taxpayers' money. Rather than just spending money in both of these environments, incurring not only the initial cost of upgrade but also incurring the maintenance costs on both, how do we move some things to better leverage the money we have available?"
Youngblood said the DHS moved a total of 25 software packages from the midrange server to the new zSeries server running Linux to avoid the additional cost of HP's UNIX operating system. The 25 packages consisted of database management tools, system maintenance tools, schedulers for jobs from various DHS divisions, as well as the software package that maintains source code, among others.
The DHS also moved the Oracle database for the eKIDS application off the midrange server this April, he said, but the eKIDS application itself is still being run on servers in the field -- something that's targeted for change.
"That's something we plan to do in the future with the Java capability, to actually run code on the mainframe," he said. "That will benefit us significantly. Rather than having so much of that code decentralized, we'll have a scenario that will allow us to leverage the strength of the past. The code runs centrally, but the presentation element of the application is decentralized. That's where you really get a significant bang for your buck."
When dealing with particularly complex code that crunches a lot of numbers or handles big transaction volumes, the mainframe is clearly the best place to run that type of business logic, he said.
"We want to use the z server where it makes the most sense, but we'll use other midrange servers where they make sense," he said. "We're going to look at all that we have available to us -- and for this particular effort or that particular application, what makes the most sense?"
There's still plenty of use for the client/server architecture, he said, citing numerous file and print servers set up in DHS offices in the state's 77 counties, and the mainframe can clearly complement what already exists. That new flexibility made the DHS very interested in buying a mainframe, despite big iron's historic unwillingness to play nice with others.
"The mainframe was caught by not being up to date with technology," Youngblood said. "The flexibility was not there -- although you have the reliability -- but the functionality was not there. The intuitiveness was not there."
Mainframes' new affinity for open standards and the Linux operating system means big iron is now sufficiently nimble that DHS software developers can use Java to whip up a new application for a department or set of users needing to retrieve data stored in a mainframe.
Perhaps the biggest compliment for the new mainframe is that users don't even realize they're working on mainframe-based applications, Youngblood said. In the past, mainframe systems were painfully obvious to end users -- one worked on a screen that displayed a fixed number of characters across and lines down, with no graphics or drop-down menus.
"We can build the same or very similar solutions that use the mainframe as the processing environment we have been building on the smaller servers," he said.
Appreciation for the mainframe's prowess didn't develop overnight, though the headaches of maintaining a far-flung web of clients and servers had much to do with the mainframe's appeal.
"If you were looking at the IT market 20 years ago, and you were building serious systems -- whether for Social Security or for Bank of America, it doesn't matter -- mainframes were all over that," said Jonathan Eunice, principal analyst for Illuminata, an IT research and advisory firm. "In the late '80s and through the '90s, Windows and PCs and UNIX, and then Linux, came of age, and people often forgot about mainframes. There was so much attention paid to these other platforms."
Mainframe vendors fell prey to the lure of resting on their laurels, then woke up and realized the market was leaving them behind, he said, prompting a scramble in the mid-1990s through 2000 to get back up to speed.
Companies struggled to regain parity -- to have a reasonable speed of updates to their products; to accommodate the fact that UNIX, TCP/IP networking and Web servers changed the IT landscape, and mainframes hadn't changed with it, he said. With that realization, vendors started making sure their hardware could handle TCP/IP networking, gigabit Ethernet and run the Linux operating system.
"There were a couple of years, especially in the second half of the '90s, where the majority of activity was really in accommodating those changes in the world," Eunice said. "Over the last couple of years -- 2000 to 2004 -- the mainframe has been on more of a positive course, as opposed to a corrective course.
"Mainframes now have stuff people want, but once you've gone through this phase where everyone has kind of forgotten you, it takes a while for you to get credibility again," he said. "It's a slow process to change people's opinions, especially when their opinion has migrated away from you."
Though it's tempting to make the connection, big iron doesn't necessarily owe its resurgence to the current trend toward centralization.
"It is about working together," said IBM's Jim Porell, chief strategist of zSeries software. "There is no argument now about centralized versus decentralized. The world is decentralized. The world is distributed. The mainframe is a blind and deaf computer. It can't work without a PC anymore. It can't work without some kind of pervasive computing device as a front end."
Given the strength of the distributed web of PCs and midrange servers in the government enterprise, mainframes can't hope to unravel those threads. That web exists because agencies got used to running applications on PCs and midrange servers, Porell said, and it's up to big iron to cozy up to those Intel and RISC architectures.
"There's a separation of decision-making between the 'legacy mainframe' and the new, cool HR application on Intel RISC -- the new, cool food stamp application on Intel RISC," he said. "But the reality is mainframes are writing the checks, and the electronic funds transfers are still being done on traditional mainframe transaction processing programs. Rather than keeping them independent, how can we glue them together?"
Cool applications can only go to one place for the data they need -- the mainframe. Though the PC or midrange server hosting the application does whatever work is needed on the data to generate a report or create a database for a user, that data must be pulled in from somewhere.
The way to get around that problem is using Web services on the RISC platform to create a mechanism for data exchange between the legacy mainframe and end-user applications. It works, but creates another layer between the application and the data. The next phase is enterprises running
Web services directly on the mainframe, Porell said, something approximately 15 percent to 20 percent of IBM's customers are doing.
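The pattern Porell describes -- a service layer mediating between end-user applications and mainframe data -- can be sketched in miniature. The sketch below is a hypothetical stand-in, not any vendor's actual API: the "mainframe transaction" is mocked with a lookup table, and the field names are invented.

```python
import json

def run_mainframe_transaction(txn_id: str) -> dict:
    """Mock of a legacy transaction program. A real deployment would
    invoke CICS/IMS or a mainframe database here; this table is a
    stand-in so the example is self-contained."""
    ledger = {"EFT1001": {"payee": "ACME DAYCARE", "amount_cents": 125000}}
    return ledger[txn_id]

def web_service_handler(request_body: str) -> str:
    """The intermediate Web-services layer: accept a JSON request from
    a PC or midrange application, run the mainframe transaction, and
    hand the result back as JSON."""
    request = json.loads(request_body)
    record = run_mainframe_transaction(request["transaction_id"])
    return json.dumps({"status": "ok", "data": record})

response = web_service_handler('{"transaction_id": "EFT1001"}')
```

The extra translation hop is exactly the "another layer" the article mentions; moving this handler onto the mainframe itself eliminates the hop between the service and the data.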
"The majority of that is in the finance industry, where the bulk of our business is, but the principles we're generating there apply directly to other industries," he said. "You have call centers, food stamp programs, Medicaid -- that's a lot of infrastructure across some of these larger states and a tremendous transaction processing volume."
Mix and Match
Washington state is another government taking advantage of mainframes' ability to do the heavy lifting while fitting into modern IT infrastructures.
"There are some dinosaurs that are quick and nimble," said Kay Metsker, IBM technical services manager in the Department of Information Services' computer services division. "Dinosaurs died out. But in this case, mainframes are evolving. Vendors had a choice of either trying to continue with their proprietary ways, or they could try to evolve."
As a result of that evolution, vendors now characterize mainframes as giant enterprise servers, Metsker said, to play up big iron's strengths -- high throughput, high transaction counts, scalability, reliability and security -- and its new ability to handle workloads typically run on smaller, RISC-based servers.
Washington's Computer Services Division bought one new IBM z900 mainframe this spring to replace an older mainframe, and might buy the bigger z990 mainframe in June 2005 to replace another old mainframe, she said. The division also bought two new Unisys CS7802 mainframes to replace a test and development mainframe, and a production mainframe.
New software for the mainframe also helps convince customers that the mainframe can support their needs, said Carlyle Ludwig, Unisys tech support manager in the computer services division.
"On the Unisys mainframe, you can run a Windows environment right next to an old 2200 Unisys operating system in the same machine," Ludwig said. "Unisys has positioned themselves [so] that if you do want to migrate, you can have one machine and do the migration gracefully."
One perhaps unintended consequence of mainframe evolution is that big iron's move to open standards has actually sparked employee interest. Metsker said her IT staff is excited about bringing in open standards-based software and protocols, such as TCP/IP, and making it all work on mainframes.
"TCP/IP is just fast, and it also connects to the Web very easily," Ludwig said. "What you have is a mainframe backbone, but you're now interfacing with the brave new world of the Internet -- access to everything. Mainframes have evolved to work in conjunction with the server world."