It is time to bring some analog thinking to today’s most interesting and important digital challenge: the intersection of innovation and privacy around big data. Digital is binary — on/off, yes/no. Analog is variable; it attenuates with distance and tolerates a little noise in the signal without failing.
Too often, the discussion of privacy and big data is binary. One is good. The other, bad. That is the starting point for most public interest groups, which point to misuse of big data at the cost of privacy by government (massive NSA surveillance) and by industry (federal regulators have settled privacy complaints against Google, Facebook and some 30 other companies). At the moment, these well-organized and often well-funded groups seem to have the momentum and mindshare on their side in the public debate over big data.
But the debate cannot end there. Big data, which has powerful friends and considerable financial backing of its own, has just begun to show its potential for solving intractable problems and powering innovation and economic growth. Last month, even as Congress considered a bill to restrict NSA data surveillance, the White House released a pair of companion reports — one policy, the other technical — in an attempt to strike a balance between seizing the opportunities of big data and preserving values, including privacy and civil rights.
The policy report spotlights where big data is making a difference — from education and medical research to public safety and the digital economy, stating, “Unprecedented computational power and sophistication make possible unexpected discoveries, innovations and advancements in our quality of life.” Out of the blocks, the report affirms that “properly implemented, big data will become [a] historic driver of progress” while later recognizing that analytics have the potential for misuse of personally identifiable information in ways that could undermine “longstanding civil rights protections.”
Rob Atkinson, president of the nonpartisan Information Technology and Innovation Foundation, thinks about these issues in an analog way, with the inherent tradeoffs in mind. “There is this fragile ecosystem where you have this resource called data and you can figure out really cool ways to add value to our society,” said Atkinson.
He argues that good policy protects privacy while opening data to appropriate and innovative use. “Privacy people don’t see it that way. They choose privacy first, last and always,” Atkinson said. “In making that choice, they are choosing less money, less innovation, less health, less safety. … If we go down that path, we have to make it clear that those are what our choices are and that is what our choices mean.”
In some work I did for Intuit, the maker of TurboTax and other software, I met Laura Fennell, the company’s general counsel. She has been thinking in an analog way about this for some time too — so much so that she worries the bright lights of big data may sometimes blind us to who actually owns the information and whose interests its use should serve.
“Companies should be obligated to think about data as their customers’ data,” said Fennell, and companies’ stewardship responsibility is to use that data to “gain insights into their behavior that can help them change their lives. If they do a good job at that, revenue will come. If they do a bad job at that, it won’t.”
The test in government is somewhat different, but the same caution applies to its relationship with citizens. And there is no binary solution to finding a workable balance — only an analog one.