Privacy, Public Safety Don’t Have to Be Zero-Sum Game

As many still advocate for a national privacy law, experts debate where to set guidelines on how police work with constituent data. The discussion isn’t as simple as personal privacy versus community safety.

Ongoing debates around a potential national privacy law are raising questions about how policies should limit or permit law enforcement’s access to sensitive resident data.

Law enforcement has reasons to want to tap into the wealth of digital data. Officials might ask third parties for geofencing information — which identifies all devices within a certain area during a particular time frame — to determine whom to add to, or remove from, a list of possible suspects, for example. Or officers might look to data from social media companies and other third parties to revive cases that otherwise lack investigative leads, said Teresa Jauregui, chief legal officer for the National Child Protection Task Force (NCPTF), during an R Street Institute panel. The NCPTF is a nonprofit that supports law enforcement on child exploitation, human trafficking and missing persons cases.
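To make that concrete, here is a minimal sketch, in Python, of what a geofence request reduces to: a bounding-box-and-time-window filter over location records. The record structure and field names are hypothetical, for illustration only, not any provider’s actual schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LocationPing:
    """One device-location record of the kind a third party might hold."""
    device_id: str
    lat: float
    lon: float
    seen_at: datetime

def geofence_matches(pings, south, north, west, east, start, end):
    """Return the ID of every device observed inside the bounding box
    during the time window -- the core of a geofence request."""
    return {
        p.device_id
        for p in pings
        if south <= p.lat <= north
        and west <= p.lon <= east
        and start <= p.seen_at <= end
    }
```

Because the filter returns every device in the box, such requests sweep in bystanders’ phones alongside any actual suspect’s, which is part of why they are contested.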

But when organizations collect vast amounts of data, harms can also emerge that put both privacy and safety at risk, other panelists said.

Today’s mass data collection practices increase some safety threats; for example, location data and facial surveillance tools have made it easier to stalk and harass people, said Woodrow Hartzog, privacy expert and professor at Boston University School of Law. Plus, data minimization practices are important because the more sensitive data an organization gathers, the more damage is caused by data breaches — “which unfortunately are inevitable to happen,” he said.

Sensitive information can also be processed in ways that cause harm: police facial recognition algorithms, for example, have made mistakes that led to wrongful arrests, said Emiliano Falcon-Morano, Technology for Liberty policy counsel for the ACLU of Massachusetts.

Debates often frame personal privacy and public safety as directly competing goals, but that doesn’t have to be the case, Hartzog said. Privacy protections can support community safety, and the right safeguards can enable some data to be shared in responsible, trust-preserving ways.

“Privacy and safety don’t have to be this sort of zero-sum game, where when you get a little of privacy, you lose a little safety,” Hartzog said. “It’s [actually] a relatively complex dynamic between safety and privacy ... [and] we can think about meaningful rules that protect privacy within relationships of trust, and how that data flows to other parties, and rules that attach to it when it does.”

Panelists also discussed the kinds of restrictions, legal language and framing that might support both goals.

FEDERAL DATA PRIVACY LAW?


The R Street panel discussion follows recent federal efforts to revisit the American Data Privacy and Protection Act (ADPPA). That act would have set nationwide rules for organizations’ collection, sharing and retention of personal information and specified individuals’ rights around their personal data.

The House bill focused particularly on major social media companies and large data brokers, and included protections for minors. It also contained exceptions and provisions intended to support and protect victimized or missing children and to enable law enforcement activities.

The act cleared neither the House nor the Senate last year. But in March 2023, a House committee met to discuss advancing a “national standard for data privacy,” and calls for a federal data privacy law have continued.

PRIVACY EXPECTATIONS


When crafting privacy policies, it’s important to agree on the baseline assumptions.

One of those: that people do have the right to expect a certain level of privacy even while in public spaces, Hartzog said. Supreme Court rulings like Carpenter v. United States back this up, even if some policies have taken the opposite stance, he said.

Rules written to safeguard against invasive surveillance in public spaces can fall into the trap of using a vague and weak metric for deciding where to draw the line, however: the “reasonable expectation of privacy” test, Hartzog said.

For one, such language is vague (what does “reasonable” actually mean?). For another, basing safeguards on public expectations means that protections weaken as the public becomes more resigned to invasive practices. Policies should force organizations to rise to meet residents’ ideals, not allow them to sink to residents’ ever-lower expectations, he said.

Tethering privacy rules to public expectations “almost ensures that over time, through this small deployment of Amazon Ring cameras here and there, and … [with] Knightscope robots roving down the street and CCTV cameras, that almost without fail, we’re going to become used to being watched … and not really have an opportunity to ask along this path, ‘is this ultimately something that we think is good?’” Hartzog said.

POLICE’S DATA REQUESTS


Police sometimes ask third-party companies to turn over data, or officers purchase it from the companies directly.

These practices have at times sparked controversy, especially when law enforcement agencies buy information from data brokers that officers would have been prohibited from collecting directly without first obtaining a warrant.

Falcon-Morano said companies should be barred from selling or providing data to law enforcement officials who lack warrants for that information. That would bring search-and-seizure practices in the digital world into alignment with those in the physical one, he said. And companies should be blocked from selling location data altogether, because it is too sensitive and revealing.

“There is some information that these companies collect about us that basically should be inaccessible, absent a warrant or exigent circumstance,” Falcon-Morano said. “… There is an issue here with what companies can and can’t do with regards to law enforcement. So, I think, that that’s the relationship that we should be more concerned about.”

More formalized procedures would also prevent bad actors from masquerading as police to trick companies into handing over data to them, Falcon-Morano said.

Cybercrime journalist and blogger Brian Krebs reported last year that cybercriminals were taking over police email accounts and using them to trick social media companies, Internet service providers (ISPs) and phone companies into sharing sensitive customer data. The criminals sent phony emergency data requests (EDRs), which are intended for situations in which any delay in obtaining the data would cause imminent harm. As such, EDRs do not require police to first get or show a warrant or subpoena, and they urge immediate action — but this leaves companies little ability, or time, to vet whether a request is genuine. Apple and Meta both reportedly fell prey to such tricks, per The Verge.

Per Krebs, “[An EDR] largely bypasses any official review and does not require the requestor to supply any court-approved documents. It is now clear that some hackers have figured out there is no quick and easy way for a company that receives one of these EDRs to know whether it is legitimate.”
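As a rough illustration of the gap Krebs describes, consider a hypothetical sketch of a provider’s disclosure check: the ordinary path requires court paperwork, while the EDR path rests on little more than where the request appears to come from. The names and the domain check below are illustrative assumptions, not any company’s real process.

```python
from dataclasses import dataclass

@dataclass
class DataRequest:
    """Hypothetical shape of an incoming law enforcement data request."""
    requester_email: str   # e.g., an address at a police department
    has_court_order: bool  # warrant or subpoena attached?
    is_emergency: bool     # claimed emergency data request (EDR)

# Illustrative allowlist of known agency email domains.
POLICE_DOMAINS = {"pd.example.gov"}

def should_disclose(req: DataRequest) -> bool:
    """Decide whether to hand over data under this simplified policy."""
    # Ordinary path: court-approved paperwork is required.
    if req.has_court_order:
        return True
    # EDR path: no warrant or subpoena, so the only signal is where the
    # request appears to originate. A compromised police email account
    # passes this check by definition.
    domain = req.requester_email.split("@")[-1]
    return req.is_emergency and domain in POLICE_DOMAINS
```

A hijacked police account clears the domain check automatically, which is the weakness the phony EDRs exploited.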

Emiliano Falcon-Morano (left) and Teresa Jauregui (right) participate in R Street’s panel “The Intersection of Privacy and Law Enforcement.” (Screenshot)

Jauregui agreed that law enforcement should need warrants, but rejected the perception that officers usually have an easy time obtaining third-party data.

During her time as a prosecutor in Boston and Pennsylvania, Jauregui ran into challenges in which large companies “push[ed] back” against requests, or in which the employees answering data requests did not know whether, or where, the desired data was stored.

Jillian Snider is a former New York City police officer and current policy director of Criminal Justice and Civil Liberties for R Street. She said that even with a warrant in hand, she faced hurdles to obtaining companies’ data that sometimes “would delay investigations for weeks.”

Still, Snider said establishing clear rules governing such data requests would be important to fostering public trust.

“If we had some kind of framework where there were specific rules in place — specific thresholds that had to be met in order for law enforcement to get that data — it would not only make it standardized, but it would also improve people’s comfort level with the government and private entities maintaining this information,” Snider said.

Jule Pattison-Gordon is a senior staff writer for Government Technology. She previously wrote for PYMNTS and The Bay State Banner, and holds a B.A. in creative writing from Carnegie Mellon. She’s based outside Boston.