"Why is it that we hear so much ballyhoo about e-government but see so little real accomplishment?" This question came up at a recent discussion hosted by the Center for Digital Government. Everybody present agreed with the premise of the question, but nobody had a good answer.

Not long after that discussion, I came across a book that offers some insights into this enigma: The Social Life of Information by John Seely Brown and Paul Duguid (Harvard Business School Press, 2000). The authors write about the "tunnel thinking" that seems to be endemic to most technologists and many futurists as well -- in particular, the unquestioned assumption that Moore's Law will overcome all obstacles to the inexorable advance of information technology and that any institutions that might stand in the way are inevitably destined for oblivion. The logic, such as it is, goes something like this: If the technology isn't powerful enough (or cheap enough) today, all we have to do is wait a few years and the performance (or the cost) will improve to the point that the problem goes away.

"Not so fast," say Brown and Duguid. Institutions that have served society over some period of time don't just up and die the instant that technology becomes available to perform the same functions faster, better or cheaper. The same applies to human behaviors -- in spades. Change just doesn't happen that way.

Daryl Conner, author of Managing at the Speed of Change (Villard Books, 1993), explains why. If a person or institution is in Situation A and is presented with the opportunity to move to Situation B, that move will not happen unless Situation B is perceived to be "better" than Situation A. For change to occur, the new must be perceived as better than the old.

However, that perception alone is not sufficient for an individual, group or institution to change. The reason is that, with any change, there is some pain involved. It may result from the effort or risk in getting from Situation A to Situation B and/or from uncertainty or doubt that Situation B will be as good as it was cracked up to be. In any event, if the pain of change is perceived to be greater than the pain associated with the status quo -- that is, staying in Situation A -- no change will occur. Ever.

Ignorance of, or disregard for, the pain of change is a primary source of technologists' "tunnel thinking." For "techies," technology is rarely painful. When it is, it isn't very. New technology is welcomed, and the newer and more revolutionary it is, the better. But the rest of the world doesn't necessarily see things that way. Instead, it sees the pain of change as large, usually much larger than the discomfort of stasis. (How often are would-be reformers confronted with the objection, "But we've always done it this way"?)

Even when presented with a well thought-out plan for implementing change, along with graphic predictions of impending doom if things stay as they are, many people will respond by citing Murphy's Law: "If anything can go wrong, it will." All technologists know of this law, but their enthusiasm tends to cloud their awareness of it, and that contributes to their tunnel thinking.

The Alternative -- S'More's Law

Beyond what Brown, Duguid and Conner have pointed out, there is, I think, a third element in technologists' limited perception. I call it "S'More's Law" (after "s'mores," those gooey concoctions of graham crackers, chocolate and marshmallows that tasted so good when we gathered around the campfire in our younger days). It goes like this: A little technology is never enough. The primary goal of any technology is not to solve your problem but to get you to use -- or at least buy --