Slideshow: CES 2008
March 12, 2008
"The network is the computer" - at least, it is according to Sun Microsystems. Sun's John Gage coined that phrase two decades ago, which was, in hindsight, about 20 years ahead of its time. Today, breakthrough innovations, such as pooling unused computing power to create a virtual supercomputer, are steadily transforming Gage's vision into reality.
There's a lot to look forward to on the horizon. Cloud computing might be the next step in the Internet's evolution. Advances in fields like nanotechnology are enabling robots to become truly ubiquitous; they may even be surprisingly helpful to government agencies confronting the baby boomer retirement wave. And at last, the keyboard and mouse may finally be on their way out - if Microsoft's new hands-on interface is the next big thing.
Technology is always on the march. Here's a look at where some of it is headed.
Intuitive Interface: The Power of Touch
It's pretty ridiculous that we still use keyboards. It's kind of like trying to fly an F-22 fighter jet with the controls used by the Red Baron. Keyboards are unfriendly and unintuitive. But for more than a century, nobody has come up with a seriously viable alternative - until now.
When Microsoft Surface debuted last year, it represented the first significant move toward a more immersive style of interface. Gone are keyboards and mice; a touch-sensitive screen replaces them. Commands are executed by touching, objects are moved by dragging and art is made by digital finger-painting.
Surface's guts aren't all that impressive - a PC running Windows Vista, a projector and some cameras - packaged inside a table. What's impressive is how Microsoft organized these ordinary elements into something extraordinary.
"Surface uses a series of cameras underneath the tabletop to see objects," said Kyle Warnick, group marketing manager for Microsoft Surface. "Hand gestures and touch - these user inputs are then processed with a standard Vista PC inside, and using rear projection, the input is displayed on the surface of the device."
The cool part happens when the inputs are displayed. Surface completely changes the way a user interacts with a computer because it can recognize more than four dozen simultaneous, unique touches.
At the 2008 Consumer Electronics Show (CES) in Las Vegas, Microsoft, known more for force-feeding products down consumers' throats than for beauty and innovation, showcased the elegance of Surface. Transferring digital photos from a camera to computer, for example, becomes as easy as dragging your finger across the surface. Photo editing is equally simple: Want the photo larger? "Grab" the corners and pull.
Music files work the same way. If you have a Zune digital music player, you can organize your music as easily as you would a stack of CDs.
But Surface is more than just an elaborate media center; the range of potential applications is a pleasure to imagine. Microsoft initially hopes to deploy the technology in hospitality and leisure spaces; hotels and restaurants are likely candidates. As shown in Microsoft's CES demonstration, diners could eat their meals on the Surface tabletop, and along the way, the PC would recognize the specially tagged dishware and inform customers about the origins of their food and wine. Afterward, the bill would be paid on Surface by simply placing a credit card on-screen.
"Right now we're focusing with our current partners - T-Mobile, Harrah's, Starwood, IGT - in the retail, leisure and entertainment industries," Warnick said. "Since announcing Surface, we've received more than 2,000 inquiries from 50 countries around the world across 25 different industries. The possibilities are endless, and we believe that over time, surface computing will be pervasive in many industries and even the public sector."
How the public sector would utilize Surface remains to be seen. However, it's easy to imagine Surface in DMVs or social services offices, where customers might handle transactions through the touch interface. Other applications might be GIS-related, or even document management software not yet imagined.
Cloud Computing: Is Software the New Hardware?
Surface is all about making the user computer experience more personal and tangible. Cloud computing, on the other hand, seeks to do the opposite by taking what we do further into the digital ether.
You've probably heard all the terms - grid computing, distributed computing, utility computing, cluster computing and on-demand computing. Although they don't mean the same thing, fundamentally the terms describe something similar: the concept of using another entity's infrastructure to enhance your own capability.
In June 2005, Government Technology published a story on utility and grid computing titled Witnessing an Evolution. The grid is a theoretical network of devices, most of which use only a fraction of their computing power at any time. The idea - one already put into practice many times - is to concentrate that excess processing power and focus it on a large problem.
Put another way, a major problem is "distributed" across a network of capacity. Stanford University's Folding@home project is one of the finest examples of distributed computing. Windows PC, Linux and Mac users, as well as Sony PlayStation 3 owners, can participate in the initiative by letting the client run when their Internet-connected machines are otherwise idle. Folding@home co-opts the machines' collective computing muscle to help solve the protein-folding riddles that stymie efforts to cure diseases.
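The pattern behind projects like Folding@home can be sketched in miniature: a coordinator splits one large job into independent work units and farms them out to whatever workers are available, then combines the partial answers. The Python sketch below is purely illustrative - a thread pool stands in for a network of volunteer machines, and the work-unit function is a hypothetical toy computation, not Folding@home's actual protocol.

```python
from concurrent.futures import ThreadPoolExecutor

def process_work_unit(unit):
    """Hypothetical stand-in for one chunk of a large scientific job."""
    start, end = unit
    return sum(i * i for i in range(start, end))

def run_distributed(total, chunk_size):
    # Coordinator: split the big job into independent work units...
    units = [(start, min(start + chunk_size, total))
             for start in range(0, total, chunk_size)]
    # ...farm them out to a pool of "volunteer" workers...
    with ThreadPoolExecutor() as pool:
        partials = pool.map(process_work_unit, units)
    # ...and combine the partial results into the final answer.
    return sum(partials)

# The distributed answer matches what a single machine would compute.
print(run_distributed(1000, 100))
```

The key property, and the reason the model scales, is that each work unit depends on no other, so any number of machines can chew on the problem at once.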
Utility computing is similar in some ways and dissimilar in others. In the utility computing model, rather than randomly dispersed machines working on a single problem, randomly dispersed people access computer farms to solve their own problems. It's called utility computing because it operates like an everyday utility, such as electricity, gas, water, etc.
Regardless of the exact strategy or definition in play, it all comes down to the cloud concept, which is the transformation of infrastructure to software. The machines themselves become less about performing a task and more about accessing computational power. If there was ever a philosophical goal underlying the creation of the Internet, cloud computing may be it - an infinite number of machines using an infinite number of resources to perform a task.
As the Information Age rushes onward, more data is continually created. IT professionals in the public sector routinely confront the challenges associated with maintaining this data onslaught. What if, instead of routinely investing in new infrastructure, an agency could instead access a global cloud of machines to process data? Google and other industry heavyweights are already preparing for the cloud-computing era.
Like Microsoft and IBM, Google has tens of thousands of machines distributed around the world. Tapping these machines' unused computing power would be like plugging into an enormous supercomputer capable of crunching the biggest numbers.
Christophe Bisciglia, a Google software engineer, recently launched the Academic Cluster Computing Initiative. Through a partnership with IBM and the National Science Foundation, Bisciglia connects universities worldwide to Google's cloud, and along the way teaches students to think and program on a massive scale.
"We started with the University of Washington, and we brought in a cluster of 40 machines, and we taught the first cluster-computing course for undergraduates," Bisciglia explained. "We used an open-source software system called Hadoop. It's an open-source distributed computing platform inspired by Google's published computing technology. It's a software system that gives you the ability to turn a cluster of hardware into a dynamic software system that allows you to manage and process large amounts of data."
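The core abstraction Hadoop teaches - and the one Google's published computing work inspired - is MapReduce: a mapper turns each input record into key-value pairs, the framework groups values by key, and a reducer collapses each group. The following is a toy, single-process Python version meant only to show the shape of the idea; real Hadoop runs this across a cluster and its API looks nothing like this.

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Toy, in-process MapReduce: map each record to (key, value) pairs,
    group the values by key, then reduce each group to one result."""
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            groups[key].append(value)
    return {key: reducer(key, values) for key, values in groups.items()}

# The classic example is word count: the mapper emits (word, 1) for
# every word it sees, and the reducer sums the ones for each word.
lines = ["the network is the computer", "the cloud is the network"]
counts = map_reduce(
    lines,
    mapper=lambda line: [(word, 1) for word in line.split()],
    reducer=lambda word, ones: sum(ones),
)
print(counts["the"])  # 4
```

Because the mapper touches each record independently and the reducer touches each key independently, both stages parallelize naturally across a cluster - which is exactly why the model suits organizations drowning in data.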
What's the use of clusters?
As Bisciglia explained, organizations are being inundated with more and more data. Single machines become incapable of processing these vast amounts of information and will eventually fall behind. Buying ever-bigger machines becomes unfeasible - particularly for public-sector organizations limited by budgets.
"Networks are getting faster and faster. Two computers connected to each other via network are much more like a dual-processor machine than they were five years ago," said Bisciglia. "So basically you need to scale out horizontally now. When you want more computational power, you can't just wait for computers to get faster; you can't just buy a faster processor. You need to add more computers in a networked configuration and interact with them as a cluster, rather than as a single machine."
Cloud computing isn't as far off as it might initially seem. In fact, it's already happening in some respects, but it goes by yet another name: software as a service (SaaS).
SaaS has been around in one form (application service provider, or ASP) or another for a while. It functions via the same principles as cloud computing. Instead of users investing in more computing infrastructure to complete tasks, they can instead access someone else's cloud to do the work.
Salesforce.com has been a leader in the SaaS industry for years by hosting customer relationship management (CRM) solutions for organizations that can't or won't invest in the infrastructure to do it themselves. The company is now heavily involved in applications that extend beyond CRM, opening its cloud to anyone who wants access.
Salesforce.com also offers users a platform service that lets them create their own unique applications in the cloud - and users can keep the applications for themselves or share them with others.
"Platform as a service really allows customers to have computing power delivered completely as a utility in the cloud," said Dan Burton, senior vice president of global public policy for Salesforce.com, "so customers can then use the cloud computing architecture to build, test, deploy and run applications in the cloud. What that really means for customers and developers is, instead of going to a preconfigured application, they can really go into the cloud, and using our programming language, APEX, they can custom build any application they want to."
It may not be the stuff cloud computing dreams are made of, but it represents real inroads into cloud computing - inroads available to an IT crowd desperate to produce more with less.
One obstacle to life in the clouds is security. Public-sector organizations trade heavily in sensitive data; the thought of letting that data loose in some ethereal cluster of random machines is likely to send shivers up CIOs' spines. It makes sense that early cloud activity takes place in an environment mediated by a large, established company like Salesforce.com, which is why several public-sector organizations are already taking their first steps into the cloud using Salesforce.com's tools.
Mike Goodrich, the director of administration at Arlington Economic Development (AED) in Virginia, said his foray into the cloud isn't about grand ideas of having a virtual supercomputer to do his bidding. Rather, it allows his agency to do business better.
AED creates economic opportunities for Arlington, generally by attracting tourists and businesses to the county. By putting some of its processes, such as event registration, into Salesforce's cloud, AED frees its IT staff to concentrate on providing better service instead of maintaining equipment.
"Our IT staff has not had to invest their time, effort and money into maintaining servers," he said. "They've been able to simply know Salesforce is maintaining our data. So there's very little involvement from our infrastructure support. It's not really money saved. What it does is improve our business."
It's not just Salesforce.com and Google that are investing in clouds. Amazon offers its Web Services to small businesses that need some IT muscle but can't afford to put it in-house. Amazon customers basically can run any or all of their business processes on the retailer's array of servers, using only the processing power that's needed to do the job.
Nicholas Carr, former executive editor of the Harvard Business Review; author of The Big Switch; and recent keynote speaker at Government Technology's California CIO Academy in Sacramento, Calif., likens cloud computing to Alan Turing's theoretical "universal computing machine."
"With enough memory and enough speed, Turing's work implies a single computer could be programmed, with software code, to do all the work that is today done by all the other physical computers in the world," Carr wrote in IT in 2018: From Turing's Machine to the Computing Cloud. "Turing's discovery that 'software can always be substituted for hardware' lies at the heart of 'virtualization,' which is the technology underpinning the great consolidation wave now reshaping big-company IT."
From running day-to-day processes on far-flung corporate machines, to a global network of load-sharing clusters, the network is becoming the computer - and the clouds are on the horizon.
Robotics: Nerds' Revenge?
The booming nanotechnology industry is paving the way for advances in fields as diverse as cancer research and space exploration. The big science of creating such tiny things also exposes a glaring problem for industry, including the public sector: the severe shortage of new workers trained and skilled in math, science and engineering.
Fortunately there is a ray of hope in the form of something else nanotechnology is revolutionizing - robots.
There is plenty of conjecture about what robots will be like in five or 10 years, and plenty of guesses - educated and wild - about what capabilities they will possess. What's underreported is a role robots were never designed for.
"Because of our shortage of people entering into engineering, we've got a crisis in this country," warned Glenn Allen, professor of mechatronics engineering at Southern Polytechnic State University. "The importance of getting and recruiting our future researchers - that's where we're going to fall short."
It's a familiar problem. What are organizations going to do when their knowledge base retires? Furthermore, how can businesses and government encourage the Millennial Generation to pursue careers in science and engineering, especially when all the evidence points to stagnating interest in scientific studies?
The answer may be robotics. Allen is the director of the Georgia BEST Robotics program. BEST (Boosting Engineering, Science, and Technology) and FIRST (For Inspiration and Recognition of Science and Technology) are two programs designed to foster student and community interest in engineering careers.
The programs hold regional competitions nationwide that bring together teams of students from all grade levels, challenging them to build robots that perform specific tasks. The goal is to move robotics away from a geeky subculture to something more akin to the local high-school football team - a lofty goal.
"In middle schools and high schools, as students start getting exposed to math and the sciences, they don't see the application, and they get bored with it and don't engage," Allen said. "When the kids get involved in these robotics competitions, they realize that if they want to continue to pursue this - stuff they love, stuff that's fun, and they want to make a career out of it ... they realize math and science do have applications."
It's long been known that kids love math and science - to a point. Somewhere around the 11th grade, there is a precipitous decline in the number of students participating in technical pursuits. The numbers are a bad omen for companies and organizations looking for the future work force. Allen said that despite technology's massive expansion, the number of graduates with science and engineering degrees hasn't changed significantly since the 1970s.
Randy Schaeffer, regional director of New York/New Jersey FIRST, argues that a big part of the problem is the culture, as anyone familiar with IT projects can attest.
"On any fall afternoon, you don't have to go too far to find 22 kids out on a big, grassy field with hundreds and hundreds of community members, cheerleaders, pep bands, coaches and a lot of hoopla," said Schaeffer. "The local papers devote pages and pages to what those kids are doing. As a result, they come away with the feeling that what they're doing is pretty cool and pretty important."
Allen echoed the same concern: students need to be celebrated like star athletes to stay with these pursuits, and so do the volunteers who mentor them in science and engineering.
"Think about football games," he said. "The coach, he gets paid to be there after school coaching those students in athletics. The robotics coaches that I know of in Georgia do it out of devotion. These guys are working every evening; they're working weekends - zero compensation in most cases. Think about the booster club for the football team, basketball team and soccer team. Where is the booster club [for areas like robotics], and where are the parents? There's not a mechanism to give these coaches the resources they need. We need to make robotics a lettering sport. We need to make it culturally acceptable."
The BEST and FIRST programs are making headway. The regional FIRST competition made the front page of several California newspapers. The numbers show progress too: Kids involved in FIRST and BEST are more likely than their peers to attend college. They're more likely to attain a post-graduate degree and major in science or engineering.
Changing the culturally accepted notion that athletes are cool and kids who like science are not isn't going to be easy. The roots of these perceptions reach into many facets of life. But there are signs of a shift. Pay attention to social networking sites and Web forums - the "nerdier" among us often rule the roost online. The onus to embrace math, science and engineering is as much on the community as it is on children.
Hopefully it isn't too little, too late. Regardless, people like Allen and Schaeffer are doing what they can to make geek chic.
And you thought all robots did was vacuum.