The FBI Might Be Apple’s Best Ally In iPhone Encryption Flap

More than 40 tech companies, trade groups, and privacy advocates filed amicus briefs this week in support of Apple in its dispute with the FBI over San Bernardino shooter Syed Farook’s iPhone. The government won a court order in February demanding that Apple create a custom OS for the phone that would allow law enforcement to bypass security features and break into the device. Apple has refused.

But Apple’s best ally in the case, which could have long-term implications for personal data privacy in the U.S. and around the world, may be the Justice Department itself. Federal law enforcement, through its public statements, its politicking, and its technical fumbles, may already have sunk its effort to require tech companies to provide a “backdoor” to encrypted data stored on consumer devices and in apps.

It’s understandable that the Justice Department would want a legal or legislative guarantee that search warrants granted in criminal or national security investigations will yield access to that data. It’s entirely possible that encrypted data on a smartphone could help law enforcement convict bad actors or prevent future calamities.

But, from where I’m sitting, the way in which the government is going about achieving such a guarantee looks highly calculated, a bit cynical, and maybe even extra-legal.

Backdoor Back Story

Close observers say the Justice Department has worked hard over the past few years to advance a law in Congress that would require tech companies to retain encryption keys to user data and to share that data with investigators who have a search warrant. So far it has failed. It has also failed to enlist the White House in the cause; beyond urging dialogue among the stakeholders, the president has largely kept his distance from the issue.

As a third option, the DOJ went to the courts. It has pointed to a 1789 law, the All Writs Act, as the legal basis for court orders directing tech companies to provide backdoors (or, in Apple’s case, to create a piece of software that disables the backdoor’s security alarm).

The government has seen mixed results in the courts. The FBI won an order from the U.S. District Court for the Central District of California that uses the All Writs Act to direct Apple to write custom firmware to help break into Farook’s iPhone 5c. Just days later, however, a federal district court in Brooklyn, New York, ruled that the act can’t be used to compel Apple to break into another iPhone in a separate criminal case.

Choosing Battles

The government has carefully chosen the cases and courts in which to fight for the creation of a backdoor into encrypted iPhone data. Its end goal goes far beyond conscripting the help of one tech company to access one device. It wants to establish legal precedent.

The Justice Department wants to use the emotionally and politically charged San Bernardino investigation as the legal frame within which it can present its demand for a guaranteed encryption backdoor in the best possible light.

Framed that way, Apple’s refusal to help can be made to look arrogant, self-serving, unreasonable, and unpatriotic. And, sure enough, a recent Pew Research poll found that 51% of Americans believe Apple is in the wrong in the matter.

But the DOJ’s tactic didn’t go unnoticed.

Mistrust In Congress

Here’s Rep. John Conyers (D-Michigan) speaking to FBI Director James Comey during Tuesday’s House Judiciary Committee hearing on encryption: “I’m deeply concerned that the federal government is exploiting a national tragedy to bring about a change in the law.”

He brought evidence to be entered into the record: a Washington Post article from last September quoting the intelligence community’s top lawyer saying a terrorist event might be just the thing to swing Congress’s opinion on encryption backdoors in favor of law enforcement.

Conyers also wanted to know why the DOJ went to the courts in the first place. He wondered whether the government was doing what he called an “end run” around Congress to find some legal foundation for backdoors.

But the DOJ is most likely not using the courts as an alternative to Congress, legal experts say, but rather as a forum in which to drum up public support for its cause, and to push Congress to act on the issue sooner rather than later.

Indeed, most court motions and orders in such cases are kept under seal and out of the public eye, experts say, but the FBI decided to keep the filings in the San Bernardino matter unsealed. It knew very well that a court order with Apple’s name on it would quickly command the attention of the media and the public. The matter of Syed Farook’s iPhone was quickly elevated to lead-story status, even at mainstream outlets like NPR and CNN.

A Credibility Gap

The manner in which the FBI turned Apple from an ally into an adversary in the San Bernardino matter says a lot about its real motives. The FBI didn’t tell Apple that it planned to file a motion in court, or that it had done so afterward, much less that the order would be unsealed. Apple had been helping the FBI with its investigation for 75 days before the bureau suddenly ran to the court; Tim Cook said he and his people found out about the court order through the media. Comey gave no clear reason for the move when asked about it in Tuesday’s hearing.

The Justice Department has contradicted itself on what would likely be done with the custom OS it’s ordered Apple to build. Comey and the Justice Department’s attorneys repeatedly said, and wrote in filings, that the custom OS would be used once and only once, on Farook’s phone, calling the matter a “one-off.” But Comey has had to admit that the FBI would likely use the OS to access other iPhones in other cases. The FBI has, in fact, sought court orders to unlock as many as 17 iPhones since last October.

Comey has also had to admit that the FBI made a big mistake when it instructed San Bernardino County authorities to reset the iCloud password on Farook’s account in the first days of the investigation. The bureau had been less than forthcoming about who was behind the reset, but now acknowledges that the decision was its own. Had the password not been reset, authorities could have accessed the encrypted data in a cloud backup after connecting the phone to a familiar Wi-Fi network.

Justice Department officials have claimed, and continue to claim, that without a backdoor to encrypted data on phones, a large part of their investigative vision will “go dark.” That claim has been called into serious question by a recent report from the Berkman Center at Harvard, which argues that law enforcement’s view into user data is actually improving, thanks to a rapidly growing number of data sources, including the millions of new sensors coming online with the Internet of Things.

Good Cop, Bad Cop

In a motion filed in support of the FBI in the San Bernardino case, Department of Justice lawyers were dismissive of the reasoning behind Apple’s refusal to create the custom code for Farook’s iPhone. The lawyers argued repeatedly that Apple had misunderstood the terms of the court order and was misrepresenting the real privacy implications of complying with it. They argued that Apple is resisting the order not out of any moral or ethical position on privacy, but in service of its marketing and public relations goals.

But during Tuesday’s hearing, Comey was respectful and complimentary toward Apple, calling it a “great American company.” He praised Apple’s ability to protect user data, even though the company has failed to do so in the past. Comey talked about the importance Apple places on protecting user data, and didn’t once suggest that the company is being unreasonable in the current dispute over accessing Farook’s phone. At the same time, even as members of the Judiciary Committee thanked him for his candor, Comey didn’t diverge from the Justice Department’s talking points, and on numerous occasions he declined to answer questions because, he claimed, he didn’t understand the technical aspects of the work Apple has been ordered to do on Farook’s phone.

Law enforcement’s posture in its encryption debate with Silicon Valley has betrayed the age-old belief that the government’s authority, especially in national security matters, should always supersede the private sector’s concerns over data privacy. The DOJ’s brief to the court in the San Bernardino case reads as if it were written by the party holding all the cards. One gets the impression that some in the intelligence, law enforcement, and national security communities truly believe that, when it really comes down to it, they can work with the courts to get whatever they want in the name of “fighting terror.”

That attitude fits less easily in the post-Patriot Act, post-Snowden world we live in now. Many Americans’ trust in federal law enforcement has been severely eroded. And, looking further back, many can now see that they gave up too much personal liberty in exchange for the increased government surveillance powers of the Patriot Act. We were, after all, just waking up to a new world in which terror can strike anywhere, including in the U.S., and in which technology helps mobilize and spread terror like never before.

In fact, the U.S. remains more secure than most other developed countries. We still react with outrage to events like the shootings in San Bernardino; had the same attack occurred in Israel, for example, the response would likely have been more measured.

Two Roads Forward

The current imbroglio with Apple is part of the government’s most recent attempt to force a rebalancing of two of the values we hold most dear: our privacy and our security.

Step back a little and it’s easy to see that the stakes are high: The world’s biggest and most powerful tech company is refusing to help in the investigation of what might be the biggest attack on U.S. soil by the biggest and most dangerous terrorist group on earth, ISIS. Considering the players involved, the values at stake, and the limited room for compromise, it’s not hard to imagine this contest ending up in front of the Supreme Court. It could end with a historic, Pentagon Papers-level court decision.

As journalist Dan Carlin points out in a recent Common Sense podcast, the outcome of the current standoff could send us trotting down the road to one of two very different futures.

The first is a future in which there’s no way to feel absolutely sure that one’s personal data is protected. And each step along the road to that future, Carlin says, might be a legal case in which sacrificing another little piece of personal privacy seems like a small price to pay for access to the data needed to convict a drug dealer or a terrorist.

Or the court could take the long view, making a decision rooted in the desire to preserve privacy over the long haul. Law enforcement might then watch in frustration as an obviously guilty party in the immediate case is declared “not guilty” and set free. In the San Bernardino case, that sort of outcome might mean the data on Farook’s phone is never seen, and that an opportunity to establish connections between the shooters and their fellow ISIS operatives in Iraq or Syria would be lost.

Justice Antonin Scalia put it perfectly in a 1987 opinion on a case involving the limitations of search warrants: “There is nothing new in the realization that the Constitution sometimes insulates the criminality of a few in order to protect the privacy of us all.”

Apple has an excellent chance of eventually prevailing in its current standoff with the FBI over Farook’s iPhone, especially if it does a better job of explaining to Congress and the public the real long-term harm that could be done if it’s forced to help the FBI break into the device. But if the DOJ loses this battle, the issue won’t be settled. The department will simply look for a new case in which to make another stand, perhaps one involving an even worse terrorist attack than the one in San Bernardino. It will keep trying until Congress creates a set of rules dictating whether tech companies must keep encryption master keys, and under what circumstances they must hand them over to law enforcement.

