Just a little while ago, big iron seemed lost.

Mainframes didn't appear to fit the new world of distributed computing and decentralized networks, and took on the role of quaint, nostalgic reminders of the days of computing past. Everybody was ready to forget the mainframe, ready to repeat Sun's memorable mantra, "The network is the computer."

Suddenly, however, the mainframe is becoming relevant again.

In an earnings report issued earlier this year, IBM announced mainframe revenue growth of 44 percent, year over year. The company also said the performance turned the mainframe into its largest hardware growth segment.

The zSeries mainframes are making a lot of money for the company -- and it's a good thing, given IBM's decision to invest four years, $1 billion and the time of 1,200 developers in the zSeries' top-shelf z990 mainframe.

As often happens in the IT world, this particular trend owes its emergence to other, related developments. Mainframes' sudden cachet can be traced to several concrete factors: the new 64-bit processor architecture lets mainframes perform at blazing speeds; open source operating systems and applications have spread through enterprises of all sorts; customers are moving to off-the-shelf software instead of proprietary solutions; and enterprise computing is trending toward consolidation and centralization.

The mainframe didn't regain stardom overnight. Like any other "new" sensation, big iron's rebirth as technology that matters was years in the making.

Back From the Brink

Back in the day, the term "mainframe" meant a large computer.

Everybody used mainframes because that's all that was available. A mainframe consisted of a big CPU cabinet connected to banks of tape and disk drives, and a front-end processor. The mainframe connected to a series of terminals (the famous "green screens"), which staff used to manipulate and extract data stored on those drives.

Mainframes had their strong points -- since the data, and the applications that accessed and manipulated the data, were controlled by the mainframe, big iron worked fast. IT departments liked mainframes because they could manage them centrally. Staff didn't necessarily like using terminals, because of their limitations, but they didn't have much of a choice -- terminals were what was available.

Big iron's downside, though, eventually cost mainframes their starring role in enterprise computing. If you bought a mainframe, you also bought a proprietary software environment that often didn't play well with other software, and it took significant programming effort to get around that exclusivity. It wasn't easy to buy off-the-shelf software and load it onto a mainframe. The economics of mainframes favored their manufacturers, not buyers.

With the explosion of the World Wide Web and TCP/IP in the early to mid-1990s, mainframes started looking old and slow. Who needed an old clunker when new, smaller servers were capable of zipping lots of data to client devices? The client devices themselves weren't green screens anymore -- they turned into powerful PCs that could perform a range of operations on that data.

Mainframes seemed fated to join the scrap heap of obsolete technologies.

Instead, they pulled off a surprising renaissance, and enterprises have taken an interest in what these "new" mainframes can do and how different they are from their predecessors. The emerging 64-bit architecture gives mainframes even more processing power than before. Pairing that horsepower with the open source Linux operating system lets enterprises stay flexible and move away from proprietary software environments. New virtualization capabilities, especially with Linux, allow enterprises to run hundreds of instances of the operating system on one mainframe, creating the software equivalent of a rack of server blades for a significant number of workloads.

Shane Peterson  |  Associate Editor