Homeland Security’s big encryption report wasn’t fact-checked
If you watch Marvel’s Agents of S.H.I.E.L.D., The Blacklist, or any other TV show with make-believe espionage, you probably hear the term “going dark” at least once a week.
In the real world, “going dark” has become FBI shorthand for when baddies can’t be spied on, or manage to vanish into the thin air of the internet, thanks to encryption. And it’s at the heart of an often virulent tug-of-war between entities such as the FBI, Apple, civil liberties groups, conspiracy theorists, and lawmakers.
This past week, everyone’s been so focused on Hillary, Trump, police shootings and Dallas that few noticed that the Majority Staff of the House Homeland Security Committee finally released its encryption report — with some pretty big falsehoods in it. “Going Dark, Going Forward: A Primer on the Encryption Debate” is a guide for Congress and stakeholders that makes me wonder if we have a full-blown American hiring crisis for fact-checkers.
The report relied on more than “100 meetings with … experts from the technology industry, federal, state, and local law enforcement, privacy and civil liberties, computer science and cryptology, economics, law and academia, and the Intelligence Community.” And just a little bit of creative license.
The first line of the report is based on flat-out incorrect information.
“Public engagement on encryption issues surged following the 2015 terrorist attacks in Paris and San Bernardino, particularly when it became clear that the attackers used encrypted communications to evade detection — a phenomenon known as ‘going dark.’”
In the Paris attacks, the attackers didn’t use encrypted apps, iPhones or encryption in general; they used burner phones. Worse, the terrorists were known to French authorities before the tragedy. As you may recall, after the devastating attacks, US officials rushed to the press insisting that messaging apps using end-to-end encryption be “backdoored” for surveillance access — until the facts emerged and they were called out for using scare tactics. All of which the “Going Dark” report seems to utterly ignore.
So clearly the problem here isn’t “going dark,” but rather a different kind of failure.
It was much the same in San Bernardino. Attacker Syed Farook didn’t use encrypted communications or apps; what law enforcement screwed up was access to his work iPhone, by fumbling around with an iCloud password reset and locking up the phone themselves. Then authorities spun crazy fantasies for other authorities and the press, suggesting there was a “dormant cyber pathogen” on the phone — a claim they later retracted, admitting it was false.
Clearly, the problems here aren’t about encrypted communications, or “going dark.” Rather, they are about law enforcement who themselves are in the dark about preventing and investigating digital crime scenes.
The same wee problems crop up when the guide attempts to explain how encryption has failed to protect healthcare data. We’re told that “since 2009, the Health Insurance Portability and Accountability Act (HIPAA) Breach Notification Rule has encouraged healthcare providers to secure their data through encryption by requiring those that suffer a data breach to notify their clients within 60 days.”
And this is true: It is encouraged. Just like it’s “encouraged” that people wear a helmet on their bike, but they technically don’t have to. Still, the “Going Dark” guide goes on, saying that, “despite this move, the American health system has fallen victim to a number of high-profile data breaches.”
Except there was no American healthcare “move” to encryption, even though there damn well should be. The HIPAA rule only suggests that healthcare institutions and providers use encryption — it is not required. The big-ass breaches this “ultimate guide to going dark” refers to have been happening at places that did not encrypt systems and files. Remember the Anthem hack? The records of 80 million people were snatched, and that data wasn’t encrypted. Many said that the disaster could have been mitigated had the data been encrypted. And in February, when the Hollywood Presbyterian Medical Center was famously held hostage by ransomware, its files were encrypted by the ransomware, not before.
Maybe what the report meant to say was that if everyone “went dark” with their data, our personal, private, and very sensitive records would be safe from attackers.
The 25-page guide put in a good effort. But thanks to its inaccuracies, I doubt it will do much to unite what have become diametrically opposed camps on the messy knot of encryption, security and public trust.
Right now the FBI, lawmakers and everyone with a horse in the encryption race seem to be wielding the term like a threat, in negative fantasies where all the terrorists (and only the terrorists) are using encrypted communications to hide, or “go dark.”
One side says it’s about preventing terrorism, another says it’s about privacy, and ultimately it’s about a security protocol that doesn’t have a “halfway” setting.
Like I’ve said before, regarding encryption in computer security: You either have it completely or you don’t. On some things, the room for passive-aggressive political maneuvers is effectively zero.
Worryingly, it’s hard to tell what lawmakers actually understand about the issue, especially when they seem to think everything around it is an equally black-or-white matter. On one hand, the ENCRYPT Act of 2016, introduced in February by Rep. Ted Lieu (D-CA), firmly proposed that no authorities should be able to prohibit the use of encryption or force it to be cracked.
The exact opposite was proposed in April, brought to us by the camp that basically thinks encryption is tech’s version of giving the middle finger to law enforcement. The Feinstein-Burr Compliance with Court Orders Act of 2016 would compel encryption to be crackable on demand, user privacy and security be damned.
For a work of historical fiction, the guide is fairly entertaining. But if these writers want to keep working, next time they should workshop the ending a bit before sending it off to the printer. Spoiler: It’s a cliffhanger.
At the end, we find out that the commission recommends … another commission.
“House Homeland Security Chairman Michael McCaul (R-TX) and Senator Mark Warner (D-VA) have proposed the formation of a National Commission on Security and Technology Challenges (hereinafter, ‘Digital Security Commission’) to bring these experts together to engage one another directly and, over the course of a year, develop policy and legislative recommendations to present to Congress.”
At least we have a guide to just how lost in the thicket of encryption and “going dark” our lawmakers really are.