Drawing a clear line between privacy rights and national security has become an increasingly intense debate at a time when technology and the threat of terror have grown in lockstep. Some, favoring broader law enforcement access to smartphones and other technology, have cited increasing international threats as reason enough to give investigators another tool to hunt criminals.
On the other hand, privacy advocates have held their ground, arguing that sacrificing privacy for security does little more than expose the data of otherwise law-abiding citizens, a cost they see as a steep one.
During an Intelligence Squared debate hosted by the National Constitution Center on June 7, experts on both sides of the issue discussed the finer points surrounding backdoors, encryption and what is expected of technology companies when it comes to collaborating with law enforcement.
Stewart Baker, former assistant secretary for policy at the Department of Homeland Security under the George W. Bush administration, argued in favor of company cooperation with law enforcement. He said the matter boils down to little more than an obligation to assist whenever possible, something he said technology companies like Apple have not been doing.
“Everybody is required to help law enforcement in the right circumstances,” Baker said. “If you have a unique ability to help law enforcement, and law enforcement can’t solve the problem on its own, you have an obligation to assist law enforcement. This has been true for hundreds of years, well before the United States was founded.”
Baker compared the situation to that of a landlord presented with a warrant for a tenant's apartment: the landlord has an obligation to use the master key to open the door to the apartment in question. In the 2016 Apple-FBI dispute, in which authorities sought access to the San Bernardino, Calif., shooter's iPhone, Baker argued the company ignored this obligation by refusing to comply with the FBI's requests.
“It’s not different for tech companies — there is no Silicon Valley exceptionalism policy that applies,” he said.
On the other side of the argument, Catherine Crump, acting director of the Samuelson Law, Technology & Public Policy Clinic at the Berkeley School of Law, focused on the danger of giving the government a backdoor into any device and expecting it to be used only by the “good guys.”
She drew a parallel between handing the FBI a second master key to iPhones and the recent worldwide WannaCry ransomware attack, which was launched using leaked National Security Agency (NSA) exploits.
“The problem with that is you cannot build a backdoor that works only for the U.S. government, good guys or other people with good motives," Crump argued. "If you build it for them, encryption will be weakened for everyone."
Where the Constitution is concerned, both sides of the argument held their ground. Proponents of increased access and cooperation cited a need to better define “reasonable” as part of a national legislative discussion, while opponents, like former Secretary of Homeland Security Michael Chertoff, said the Constitution doesn't require citizens to store information in a form that is accessible by law enforcement at some point in time.
“What the government is arguing for, and what this resolution is arguing for, is that the tech companies have to go further, they have to organize themselves so that they have the ability to decrypt, with a duplicate key, all the data that gets transferred so they have the ability to store things that you think you have deleted, so they can turn that over if there is a request,” Chertoff countered. “The fact of the matter is, under the Constitution and the traditions of this country, we do not require people to organize their lives so that they store everything that they say and everything that they write so that it can be available if somebody wants to come along later and investigate them."
But at a time when the threat of terror is arguably at its highest point, Berkeley Law’s John Yoo, a former attorney for the Department of Justice, said technology is increasingly being used by criminals and terrorist networks to communicate and coordinate. “It’s going to get worse, not better,” he said.
While he noted that stronger encryption might be a positive step in the larger national security picture, he said it shouldn’t be left to technology companies to decide when, how and if they cooperate. He argued that legislators should make the final determination as to the balance between privacy and security.
“It’s possible that a consequence of more encryption might actually be more security for our country. I just don’t see why Apple gets to decide that for the United States," Yoo said. "I think if that is really a consequence of increasing encryption, then our government, who we elect and send to Washington, should make that call.”
The turn in the debate toward the terror threat prompted skepticism from the pro-privacy debaters, Chertoff and Crump, who countered that law enforcement agencies already have a host of invaluable tools, including data handed over by companies when legally compelled.
“No one is denying it can be a serious cost to law enforcement not to be able to access the content of someone’s phone,” Crump said. “The question is how do you balance that cost against the cost of not having encryption, and particularly in an era where law enforcement has lots of other information available to you.”
While both sides largely stuck to the familiar arguments around the issue, Baker held to his point that the challenge will only grow as technology evolves and criminals continue to take advantage where they can.
“Technology is transforming crime in the same way it is transforming crime busting," Baker said, "but it’s not clear on balance that law enforcement ends up better.”