
Apple Encryption Battle Points Out Lack of Unified Cybersecurity Ethics Code

There is no single reason cybersecurity ethical standards don’t yet exist, or even a consensus that they should — let alone exactly what they might consist of or what impact they might have.

(TNS) -- Apple’s resistance to providing the FBI with access to a mass killer’s iPhone has laid bare more than a disconnect between cybersecurity and national security. It has also exposed the absence of a unified code of ethics in an increasingly crucial realm of technology.

Unlike older professions — medicine with the Hippocratic oath, the law with its codes of conduct — cybersecurity has no codified ethical standards.

That has made much of the debate over the FBI’s demands and Apple’s resistance turn on more than legal arguments.

There is no single reason such ethical standards don’t yet exist, or even a consensus that they should — let alone exactly what they might consist of or what impact they might have.

Support for Apple is overwhelming inside the industry. Outside it, in law enforcement, in government and even around dinner tables, the lack of clearly communicated ethical lines fuels many of the questions being raised.

“If you could prevent a 9/11-type attack from happening by unlocking an iPhone, would you do it?” wrote one recent commenter on a Chronicle story on the issue.

Another shot back: “If Apple loses this, your kids will be less safe. Their locations will be exposed to hackers and the government.”

This dispute, now playing out in federal court, might lead to a clearer idea of what rules should apply.

On Thursday, the Cupertino company filed a motion to vacate a federal magistrate’s mandate that Apple comply with the FBI’s request.

In fighting the court order, Apple is making both a legal and moral appeal.

If it agrees to break into just one iPhone on behalf of U.S. law enforcement, it argues, it could face an undue burden of demands to do the same elsewhere.

Some experts have made the case that it would leave Apple open not just to demands from other U.S. law enforcement agencies, but possibly also to similar pressure from authoritarian countries with abysmal human rights records.

The company says it should not be compelled to take that risk, or the risk of creating software that it fears could be used to access data securely stored on its other products.

On a practical level, Apple’s argument might be meant to assure individuals that they can entrust their iPhone with financial information (Apple Pay), health information (ResearchKit) and even fingerprints (Touch ID).

But without agreed-upon standards of ethics, says Ryan Kalember, a senior vice president of cybersecurity strategy at Proofpoint, a Sunnyvale cloud-security company, it’s difficult for people to determine whether Apple is on solid ground.

“In what ways should they be a conscientious objector?” he said. “On what principles and morals?”

Mixed emotions

Basically, “Apple is saying we could (help the FBI), but we shouldn’t,” said David Brumley, the director of CyLab, Carnegie Mellon University’s security and privacy institute.

Law enforcement, meanwhile, is saying that while that might be so, it still needs to solve crimes, he said.

Brumley agrees that the lack of ethical standards makes the situation difficult to judge. “I don’t think there’s an industry-wide definition,” he said. “There’s a lot of discussion of ethics, but that really hasn’t involved modern computer security and privacy experts.”

While computer science students learn basic codes of conduct, professional security engineers are often driven more by a personal sense of morality than by formal rules. A majority are committed to universal rights, chief among them a user’s right to privacy. It’s a tenet apparent in their work.

Safeguards, such as those that prevent attackers from guessing an iPhone’s passcode an unlimited number of times, are built with individual security in mind.
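A minimal sketch of that kind of safeguard in Python, with hypothetical thresholds, delays and a stand-in check_passcode() comparison; it illustrates the general rate-limit-and-erase pattern described above, not Apple’s actual implementation:

```python
import time

# Illustrative sketch of passcode attempt limiting: the general technique
# described above, NOT Apple's implementation. The thresholds, delays and
# check_passcode() stand-in are assumptions for illustration.

MAX_ATTEMPTS = 10  # assumed limit before the device erases its data
ESCALATING_DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}  # seconds

def check_passcode(candidate: str, stored: str) -> bool:
    """Stand-in comparison; a real device verifies against hardware-bound keys."""
    return candidate == stored

def guarded_unlock(candidate: str, stored: str, failures: int) -> tuple[bool, int]:
    """Try one unlock attempt; return (unlocked, updated failure count)."""
    if failures >= MAX_ATTEMPTS:
        raise RuntimeError("attempt limit reached: device data would be erased")
    if check_passcode(candidate, stored):
        return True, 0
    failures += 1
    # Force an increasing wait between wrong guesses so that
    # brute-forcing the passcode becomes impractically slow.
    time.sleep(ESCALATING_DELAYS.get(failures, 0))
    return False, failures
```

In effect, the FBI’s request asked Apple to supply software that relaxes exactly these kinds of limits.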

“We are in the largest crime wave the human race has ever seen, and that massive amount of crime (the loss of information through cyberattacks) cannot be ignored,” said security researcher Dan Kaminsky. “There are individual events that are awful, but we have to talk about the universal vulnerability threatening our civilization.

“People are losing their businesses. People are losing their jobs,” he added. “People are losing faith in information technology itself, with good reason.”

That sentiment, shared by many in the cybersecurity industry, is pervasive in Silicon Valley. Kaminsky concedes, however, that the sense of moral urgency he and other insiders feel isn’t being articulated well to the rest of society.

Morality different

But morality, or an individual’s personal belief, is not the same as ethics, or what the community as a whole expects from individuals, said Brumley.

On another level, Apple, which is famously close-mouthed about its products, is essentially the only source on what is and is not possible when it comes to accessing its software. That makes some skeptical of its supposedly principled stand.

“I think it’s good that Apple is framing things in this manner, trying to attend to the consequences to human, democratic rights, but I also think that Apple is a little disingenuous in making these arguments,” said Phillip Rogaway, a professor of computer science at UC Davis.

While he is firmly on Apple’s side in the showdown, he said: “They don’t seem to have acknowledged that it’s really a failure of their own security architecture that they have the ability to comply with this order.”

“Apple has been claiming for quite a long time that even their latest phones are designed in such a way that they can’t unlock the data at rest,” Rogaway added. “I think that’s not really true.”
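Rogaway’s point turns on where the decryption key can be computed. In a design where the data key is derived from the user’s passcode entangled with a secret fused into the device’s hardware, each guess has to be tried on the device itself. Here is a rough Python sketch of that pattern; the device secret, iteration count and salt handling are assumptions for illustration, not the Secure Enclave’s actual derivation:

```python
import hashlib
import hmac
import os

# Sketch of passcode-entangled key derivation: an illustration of the
# general design, not Apple's actual scheme. The device secret,
# iteration count and salt handling are assumptions.

DEVICE_SECRET = os.urandom(32)  # stands in for a key fused into silicon

def derive_data_key(passcode: str, salt: bytes) -> bytes:
    """Derive a data-encryption key from a passcode plus a device-bound secret."""
    # Stretch the passcode so each guess costs real compute time.
    stretched = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)
    # Mix in the hardware secret, so the key cannot be derived off-device
    # even if an attacker copies the encrypted storage elsewhere.
    return hmac.new(DEVICE_SECRET, stretched, hashlib.sha256).digest()

salt = os.urandom(16)
key = derive_data_key("1234", salt)  # 32-byte key usable with, e.g., AES-256
```

Under such a design, whether a vendor “can” unlock a phone comes down to whether it can ship software that weakens the guess limits surrounding this derivation, which is at the heart of the dispute.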

“The challenge, in cases like this, is the definition of welfare, of the public good, and the definition of safety is being defined differently by Apple on the one hand and the (Justice Department) on the other,” said Andrea Matwyshyn, a law professor at Northeastern University.

A cybersecurity ethics code could include exceptions for instances where the normal rules wouldn’t apply, said Nate Cardozo, a staff attorney at the Electronic Frontier Foundation.

“In almost all (written codes of conduct), there are law enforcement exceptions,” he said. “If your client is going to tell you they are going to commit a crime, for instance, attorney-client privilege goes out the window.”

But not everyone thinks that such a code can, or even should, exist.

“The difference with doctors and lawyers is that they are licensed by the states and therefore the state’s law enforces ethical standards,” Cris Thomas, a Tenable Network Security strategist who goes by the name Space Rogue and is a member of the hacker collective L0pht Heavy Industries, said in an email.

“There are no such review boards for car mechanics or plumbers, which are usually also licensed by the state,” he said. “Should there be an ethics review board for (information security) professionals like there are for doctors and lawyers?”

His answer: “Not until the states force infosec people to become licensed like doctors and lawyers.”

©2016 the San Francisco Chronicle. Distributed by Tribune Content Agency, LLC.