
Tricky Business

Data surveillance tools may have promise, but presently fall short.

The need to develop data systems so intelligence agencies can access, analyze and share information has produced notable -- if not promising -- models, but government is finding that developing viable surveillance tools while protecting the public's privacy is tricky business.

Projects such as the late Total Information Awareness (TIA), Multistate Anti-Terrorism Information Exchange (MATRIX) and Computer Assisted Passenger Prescreening System (CAPPS II) have made headlines since 9/11, but mostly for the wrong reasons. Few dispute the need for shared information systems, but public watchdogs say these programs are ineffective and lack built-in privacy protections.

Some have derided the programs as ill-conceived. Congress cut funding to TIA, effectively strangling the controversial project, and MATRIX may be on its deathbed. The Transportation Security Administration's (TSA) development of CAPPS II has been repeatedly delayed due to privacy and reliability issues.

The Markle Foundation's Task Force on National Security in the Information Age recently published its second report on information sharing and homeland security. The task force stressed the need for a decentralized network of information sharing and analysis to address the homeland security challenge. Though it didn't completely dismiss the aforementioned programs, the task force suggested there is much work to be done in developing systems that work in the public's best interests.

"The task force has clearly said there are appropriate uses of technology and there are more problematic uses of technology," said Jim Dempsey, task force representative and executive director for the Center for Democracy and Technology. Some forms of data mining fall into the "more troublesome" category, according to Dempsey.

There are different types of data mining, and different definitions of it, depending on whom you talk to. Evidence extraction and link discovery involve searching for links between a suspect and a crime or other criminals, while social network analysis examines a subject's social contacts. Both are considered subject-based data mining.
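To make the distinction concrete, the sketch below shows what subject-based link discovery might look like in its simplest form: a breadth-first search over a contact graph, looking for a chain of associates connecting a subject to a known suspect. Everything here is hypothetical -- the names, the graph and the find_link function are illustrative only and are not drawn from any actual investigative system.

```python
# Illustrative sketch only: toy "link discovery" over a hypothetical contact graph.
from collections import deque

# Hypothetical contact graph: person -> set of known associates
contacts = {
    "subject_a": {"person_b", "person_c"},
    "person_b": {"subject_a", "person_d"},
    "person_c": {"subject_a"},
    "person_d": {"person_b", "known_suspect"},
    "known_suspect": {"person_d"},
}

def find_link(graph, start, target):
    """Breadth-first search for a chain of contacts linking two people."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in graph.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Link discovery: is the subject connected to a known suspect?
print(find_link(contacts, "subject_a", "known_suspect"))
# Social network analysis: who are the subject's direct contacts?
print(contacts["subject_a"])
```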

More troubling to critics is predictive or pattern-based data mining, in which computer programs suggest behavior patterns that may indicate criminal activity and concrete evidence is not a factor. In predictive or pattern-based data mining, intelligence officials run a person, place or subject through a data-mining program to see what connections it makes. Fear of this kind of data mining led to TIA's demise.

CAPPS II, an offshoot of the ineffective CAPPS I system -- which was in place on 9/11 -- seems to fall into this category as well, said Lee Tien, senior staff attorney for the Electronic Frontier Foundation. CAPPS II will generate a "risk score" for each airline passenger and categorize passengers in one of three color codes: green for nonthreatening passengers, yellow for passengers lacking sufficient data to classify them as green, and red for at-risk passengers.
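A minimal sketch of how such a scoring scheme might map onto the three color codes is shown below. The color_code function, its numeric threshold and the notion of "sufficient data" are assumptions made for illustration; the TSA never published its actual scoring rules.

```python
# Illustrative sketch only: mapping a hypothetical risk score to the three
# CAPPS II color codes described above. Thresholds are invented.

def color_code(risk_score, sufficient_data=True):
    """Map a numeric risk score to green / yellow / red."""
    if not sufficient_data or risk_score is None:
        return "yellow"    # not enough data to classify the passenger as green
    if risk_score >= 0.8:  # hypothetical cutoff for "at-risk"
        return "red"
    return "green"

print(color_code(0.1))                          # green: nonthreatening
print(color_code(None, sufficient_data=False))  # yellow: insufficient data
print(color_code(0.95))                         # red: at-risk
```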

Tien said CAPPS II would produce information that's no better than an anonymous tip, which for law enforcement purposes must be corroborated by evidence. "CAPPS II is based on the idea that you can score people for how dangerous they are relative to terrorism. We have not seen a single study -- any evidence -- that they've got terrorism risk programs that are any good."


CAPPS II a Necessity
But Dempsey believes CAPPS II has potential -- just not in its current form. CAPPS II breaks down into three parts, each of which raises some red flags, he said. First is the process of authenticating or verifying passengers' identities. Problems exist because of rampant identity theft, and better solutions for assuring identities are needed, Dempsey said.

Second, having watch lists for potential terrorists is a good thing, he said, but problems arise when one discusses who is on the list and how they got there. And once on the list, how does one get off if not involved in terrorism?

Accountability and oversight on how the lists are constructed and used are lacking, said Dempsey. "Privacy is not only about secrecy. Privacy is about how information is collected and used. The issue of privacy and the watch list is really a due process issue."

The Markle Foundation Task Force report called for clear policies and guidelines for acquiring, using and retaining data. The Office of Management and Budget told a congressional subcommittee that it will withhold funding for CAPPS II until the TSA develops a business case and a risk-based approach rather than just another watch list.

The TSA did not respond to calls for comment, but the agency's Web site characterized CAPPS II as prescreening, not data mining. The site said the system will not collect any more information than air carriers and reservation systems already collect, and the data will be destroyed shortly after the person's travel itinerary is completed. The site also said a redress process will be established for anyone who thinks they've been incorrectly prescreened.

Watch lists should be based on clear evidence that warrants suspicion, Dempsey said. They shouldn't be compiled based on a combination of factors or behaviors that a computer was programmed to locate and link.

A Muslim lawyer from Oregon was suspected of being connected to the recent bombings in Madrid. The FBI initially said his fingerprint was found at the site, but later confirmed the print belonged to someone else, which raises the question, "How did they come up with the identity of a Muslim living in Oregon with a bum fingerprint?"

"It rings hollow the story, 'We just mixed up the fingerprints,'" Tien said. "Clearly there were other factors. That's the danger of data mining. All of this is deciding who is suspicious or justifying the government saying, 'This person is worthy of investigation."'

The FBI said it was dealing with a poor print in the Oregon incident, and insisted no other information led to the suspect being flagged.

The third part of CAPPS II is the predictive or pattern behavior component, which raises major concern for many critics, including Dempsey. "That's the part that I think, at this point, is very speculative, and I think there's growing recognition even within the administration -- and maybe within the Transportation Security Administration -- that it is both controversial and may actually be holding up the more justifiable parts of the program, namely the use of the watch lists."

CAPPS II testing is scheduled for this summer. A fact sheet on the TSA Web site said CAPPS II will be implemented after testing and after congressional requirements are met. Dempsey said the TSA is re-evaluating CAPPS II and that the task force met with TSA officials several times to offer an assessment. "Right now, I think we're in the worst of all worlds: We have a system, CAPPS I, which is both ineffective in a security sense and troublesome in a privacy and due process sense. We just can't leave this behind. Whether it's CAPPS II or CAPPS III, we need an effective screening system."


Tough Road for MATRIX
The task force expressed skepticism about MATRIX, which Dempsey characterized as both an information retrieval system and a profiling or predictive system. Again, it is the latter that sparks worry. Of the 13 states that originally signed on to participate in the project, just four -- Florida, Michigan, Pennsylvania and Ohio -- remain.

The American Civil Liberties Union (ACLU) said it won't rest until that number is zero. "We've learned the hard way that you've got to finish taking your penicillin or the hardiest few will survive," said Jay Stanley, communications director for the ACLU's Technology and Liberty program.

MATRIX has a strong information retrieval component, which is useful for states to share information across jurisdictional lines. But the predictive analysis component of the system -- which Tien said helped produce a list of 120,000 high-risk people in Florida -- is suspect. "You have nothing specific about anyone. You have no evidence the guy is a bad guy except for the characteristics the computer has selected."

Florida officials did not return calls for comment.

Tien said one difficulty in getting these programs to work is that little is known about terrorists. He said predictive technology can work well in situations such as credit card fraud because that's a common crime whose factors are known and consistent. The same can't be said for terrorism.
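A rough sketch of why fraud detection is the easier case appears below: the relevant factors (typical spending, home region) are well defined and observed across millions of transactions. The flag_transaction function, the profile fields and the thresholds are hypothetical, offered only to illustrate the kind of consistent pattern Tien described.

```python
# Illustrative sketch only: rule-based fraud flagging against a cardholder's
# known spending pattern. Factors and thresholds are invented for illustration.

def flag_transaction(amount, country, profile):
    """Return reasons a transaction deviates from the cardholder's pattern."""
    reasons = []
    if amount > 5 * profile["typical_amount"]:
        reasons.append("amount far above typical spend")
    if country != profile["home_country"]:
        reasons.append("charge outside home country")
    return reasons

profile = {"typical_amount": 40.0, "home_country": "US"}
print(flag_transaction(35.0, "US", profile))   # [] -> looks normal
print(flag_transaction(900.0, "RU", profile))  # two reasons -> flag for review
```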

The ACLU's Stanley said privacy concerns were the main reason the nine states withdrew from MATRIX. Specifically, some were concerned about the privacy implications of housing sensitive public data at a private company. But some legislators were just outraged that the program was instituted without legislative approval, he said.


Dual-Benefit Solutions
It's imperative to develop clear privacy policies before implementing and testing predictive technology systems, according to Ruth David, CEO of Analytic Services Inc. (ANSER) and former CIA deputy director for Science and Technology.

"In some cases, we've gotten the discussion of technology ahead of clear articulation and development of the policy that goes behind it," David said.

Complicating that process is that policies can shift drastically based on public perception. "It really is a problem, though, because the policy in terms of what is socially acceptable might change rapidly in case of an attack, and you can't develop technology like that overnight."

One solution is to further educate technology developers on how to include privacy protections in these systems from the beginning, she said. "They need to consider the privacy aspects. We somehow as a nation need to find a way to invest in and develop new tools, new technologies to aid law enforcement and security applications in parallel, and make sure they're proven before going operational."

One way to do that is to develop "dual benefit" technologies, David said. "When I say dual benefit, I'm really talking about how a solution is implemented so it yields benefit to daily operations and simultaneously enhances security."

That's important, she said, because terrorism is a low-probability threat, but one that can't be ignored. So systems that work for daily operations will be proven if and when they're needed for homeland security. David cited border security, where an information management solution that tracks cargo could have benefits beyond security. She said the speed of commerce could increase with a system that tracks cargo and flags unfamiliar or unusual shipments, thereby allowing familiar cargo to get through faster.
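A minimal sketch of that dual-benefit idea, assuming a simple shipment history: combinations of shipper, origin and contents seen many times before are expedited, while unfamiliar combinations are flagged for inspection. The screen function, its threshold and the sample data are hypothetical and not based on any actual border system.

```python
# Illustrative sketch only: cargo screening that speeds familiar shipments
# (commerce benefit) while flagging unfamiliar ones (security benefit).
from collections import Counter

history = Counter()  # (shipper, origin, contents) -> times seen before

def screen(shipper, origin, contents, familiar_after=10):
    key = (shipper, origin, contents)
    seen = history[key]
    history[key] += 1
    if seen >= familiar_after:
        return "expedite"  # familiar cargo moves through faster
    return "inspect"       # unfamiliar or unusual cargo gets a closer look

# A shipper's routine lane becomes familiar over repeated shipments
for _ in range(12):
    decision = screen("Acme Freight", "Rotterdam", "machine parts")
print(decision)                                   # expedite
print(screen("Unknown Co", "Anywhere", "mixed"))  # inspect
```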

David believes systems such as CAPPS II and MATRIX fall short of those goals, but do have potential. CAPPS II could facilitate speedier travel for airline passengers, and MATRIX has day-to-day law enforcement benefits because of its information retrieval component. But the criticism of both programs and the difficulties associated with implementing them will not subside without some sort of consensus on policy.

"We very much need to have an informed debate about the policy issues and understand how to simultaneously build in the protections for personal privacy."