The need to develop data systems so intelligence agencies can access, analyze and share information has produced notable -- if not promising -- models, but the government is finding that developing viable surveillance tools while protecting the public's privacy is tricky business.

Projects such as the late Total Information Awareness (TIA), Multistate Anti-Terrorism Information Exchange (MATRIX) and Computer Assisted Passenger Prescreening System (CAPPS II) have made headlines since 9/11, but mostly for the wrong reasons. Few dispute the need for shared information systems, but public watchdogs say these programs are ineffective and lack built-in privacy protections.

Some have derided the programs as ill-conceived. Congress cut funding to TIA, effectively strangling the controversial project, and MATRIX may be on its deathbed. The Transportation Security Administration's (TSA) development of CAPPS II has been repeatedly delayed by privacy and reliability issues.

The Markle Foundation's Task Force on National Security in the Information Age recently published its second report on information sharing and homeland security. The task force stressed the need for a decentralized network of information sharing and analysis to address the homeland security challenge. Though it didn't completely dismiss the aforementioned programs, the task force suggested there is much work to be done in developing systems that work in the public's best interests.

"The task force has clearly said there are appropriate uses of technology and there are more problematic uses of technology," said Jim Dempsey, task force representative and executive director for the Center for Democracy and Technology. Some forms of data mining fall into the "more troublesome" category, according to Dempsey.

Definitions of data mining differ depending on whom you talk to, but critics distinguish several types. In evidence extraction and link discovery, investigators search for links between a known suspect and a crime or other criminals; social networking is the investigation of a subject's social contacts. Both are considered subject-based data mining because the starting point is a known subject.
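To make the distinction concrete, here is a minimal sketch of subject-based link discovery: starting from a known subject, a program walks a graph of recorded associations to surface entities within a few hops. The graph, entity names and hop limit are invented for illustration and are not drawn from any actual system.

```python
from collections import deque

# Toy association graph: entity -> set of directly linked entities
# (shared phone calls, accounts, addresses, etc.). Purely illustrative.
links = {
    "subject_a": {"person_b", "account_1"},
    "person_b": {"person_c"},
    "account_1": {"person_d"},
    "person_c": set(),
    "person_d": set(),
}

def discover_links(graph, subject, max_hops=2):
    """Breadth-first search from a known subject, returning every
    entity reachable within max_hops along with its hop count."""
    found = {subject: 0}
    queue = deque([subject])
    while queue:
        entity = queue.popleft()
        if found[entity] == max_hops:
            continue  # do not expand beyond the hop limit
        for neighbor in graph.get(entity, ()):
            if neighbor not in found:
                found[neighbor] = found[entity] + 1
                queue.append(neighbor)
    return found

# Entities within two hops of the subject, with hop counts
# (iteration order within a hop may vary).
print(discover_links(links, "subject_a"))
```

The defining feature is that the search is anchored to a named subject; the program only follows links outward from that starting point.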

More troubling to critics is predictive, or pattern-based, data mining, in which computer programs flag behavior patterns that may indicate criminal activity even when no concrete evidence exists. Intelligence officials run a person, place or subject through a data-mining program to see what connections it makes. Fear of this kind of data mining led to TIA's demise.
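The contrast with the subject-based approach can be shown in a few lines: instead of starting from a known suspect, a pattern-based program scans all records and flags anyone matching a predefined behavior profile. The records and the pattern below are made up for illustration; no real screening rule is implied.

```python
# Invented travel records -- illustrative only.
records = [
    {"name": "traveler_1", "paid_cash": True,  "one_way": True,  "days_notice": 1},
    {"name": "traveler_2", "paid_cash": False, "one_way": False, "days_notice": 30},
    {"name": "traveler_3", "paid_cash": True,  "one_way": False, "days_notice": 14},
]

def matches_pattern(record):
    # A hypothetical pattern: cash-paid, one-way ticket bought on short notice.
    return record["paid_cash"] and record["one_way"] and record["days_notice"] <= 3

# Everyone in the dataset is scanned; a match requires no other evidence.
flagged = [r["name"] for r in records if matches_pattern(r)]
print(flagged)  # ['traveler_1']
```

This is what worries critics: the flag rests entirely on resemblance to a pattern, not on any link to a crime or suspect.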

CAPPS II, an offshoot of the ineffective CAPPS I system -- which was in place on 9/11 -- seems to fall into this category as well, said Lee Tien, senior staff attorney for the Electronic Frontier Foundation. CAPPS II will generate a "risk score" for each airline passenger and sort passengers into one of three color codes: green for nonthreatening passengers, yellow for passengers lacking sufficient data to classify them as green, and red for at-risk passengers.
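A minimal sketch of that three-color triage is below. The actual CAPPS II scoring model was never made public, so the numeric score, threshold and data-sufficiency check are invented placeholders; only the green/yellow/red categories come from the program's published description.

```python
def color_code(risk_score, enough_data):
    """Map a passenger's (hypothetical) risk score to a
    CAPPS II-style color code."""
    if not enough_data:
        return "yellow"  # insufficient data to classify as green
    if risk_score >= 0.8:  # hypothetical threshold
        return "red"       # at-risk passenger
    return "green"         # nonthreatening passenger

print(color_code(0.1, enough_data=True))   # green
print(color_code(0.9, enough_data=True))   # red
print(color_code(0.5, enough_data=False))  # yellow
```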

Tien said CAPPS II would produce information that's no better than an anonymous tip, which for law enforcement purposes must be corroborated by evidence. "CAPPS II is based on the idea that you can score people for how dangerous they are relative to terrorism. We have not seen a single study -- any evidence -- that they've got terrorism risk programs that are any good."

CAPPS II a Necessity

But Dempsey believes CAPPS II has potential -- just not in its current form. CAPPS II breaks down into three parts, each of which raises some red flags, he said. First is the process of authenticating or verifying passengers' identities. Problems exist because of rampant identity theft, and better solutions for assuring identities are needed, Dempsey said.

Second, having watch lists for potential terrorists is a good thing, he said, but problems arise when one discusses who is on the list and how they got there. And once on the list, how does one get off if not involved in terrorism?

Jim McKay, Justice and Public Safety Editor