


Let’s Take Rep. King's Wild Nuke-iPhone Hypothetical to Its Logical Conclusion

Everything the government creates, from nukes to crypto backdoors, eventually becomes a thing the government needs to protect itself from.
Screengrab: House Judiciary Committee

Tuesday's House Judiciary Committee hearing on encryption was very, very long, but it was also highly entertaining. We had lawmakers actively challenging FBI Director James Comey in a way we never see them challenge him, we had Comey offhandedly wondering how Apple's engineers would react if they were "kidnapped and forced to code software," and then we had the nuke hypotheticals.

About three hours into the proceedings, Iowa Republican Steve King wondered if the conversation about encryption and the FBI's quest to unlock an iPhone belonging to one of the San Bernardino terrorists might be a little different if the Islamic State had nuclear capabilities.


"I think it's a known or a given that ISIL is seeking a nuclear device," King said. "If we had a high degree of confidence that they were on the cusp of achieving such capability and perhaps had the capability of delivering it—if that became part of the American consciousness, do you think that would change the debate we're having here today?"

A little later, Democratic Rep. Cedric Richmond asked Apple how long it would take to backdoor the iPhone "if there is a terrorist that has put the location of a nuclear bomb on his phone and he dies."

Both are bad, fearmonger-y hypotheticals. But taking King's scenario to its logical extreme is actually kind of instructive in explaining why asking Apple to build vulnerabilities into its own software is almost certain to backfire.

In King's scenario, the Islamic State is on the verge of acquiring or developing a weapon that America itself built. Time and time again, the US government has developed technology that later got used in ways it wasn't expecting. The US Navy developed the fundamental technology behind the anonymous browsing protocol Tor to protect US intelligence community communications; now the Department of Justice says Tor is fostering a "zone of lawlessness" for criminals, pedophiles, and terrorists.

King and the FBI are advocating the creation of a weapon that, once deployed, cannot be put back away. Once a vulnerability for the iPhone is built, it immediately becomes "one of the most valuable targets for foreign intelligence," according to security researcher Dan Guido. If Apple creates the backdoor, the company has said, there's no way it can guarantee that it won't eventually make its way into the wild. The only way to keep the iPhone secure is to never create an iPhone backdoor.


As more and more of our lives and critical infrastructure go online, it's likely that hackers and hacking-enabled terrorists will be able to do real, physical damage with encryption backdoors.


The comparison between nukes and crypto backdoors isn't a perfect one—nukes have a demonstrated ability to wreak mass destruction, while the ability to do violence with hacking remains mostly theoretical at this point. And even if the US hadn't developed a nuclear weapon, there was nothing stopping other countries or groups from making one themselves. Strong encryption, meanwhile, is widely believed to be reliably breakable only by yet-to-be-developed quantum computers, at enormous cost and many years in the future.

Without a backdoor, breaking encryption is less a "hack" than a really, really difficult math problem that must be solved anew for each individual phone or system (a process that, even for weak encryption, can take hundreds of years using current technology). A crypto backdoor, meanwhile, circumvents that math entirely and creates a tangible piece of software that, in this case, could become a target to be stolen from Apple's systems using more traditional hacking techniques. So a terrorist group wouldn't "develop" a crypto backdoor, it would steal one—which is all the more reason we shouldn't ask Apple to make one in the first place.
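That "difficult math problem" claim holds up to a back-of-the-envelope check. A minimal sketch, using illustrative numbers of my own (a 128-bit key and a wildly optimistic guessing rate, neither of which comes from the hearing), of the expected time to brute-force a modern encryption key:

```python
# Illustrative brute-force estimate; the key size and guess rate are
# assumptions for the sake of the arithmetic, not figures from the article.

KEY_BITS = 128                  # a common modern symmetric key size (e.g. AES-128)
GUESSES_PER_SECOND = 1e18       # absurdly generous: a quintillion keys per second
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

keyspace = 2 ** KEY_BITS
# On average, an exhaustive search finds the key halfway through the keyspace.
expected_years = (keyspace / 2) / GUESSES_PER_SECOND / SECONDS_PER_YEAR

print(f"Expected search time: {expected_years:.2e} years")
```

Even under those impossibly favorable assumptions, the expected search time comes out in the trillions of years, which is why stealing a backdoor is so much more attractive than doing the math.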

If Apple loses this legal battle, I wonder if one day down the line some lawmaker questioning a still-unborn company about a brand-new terrorist organization might find themselves asking: "It's a given [terrorist group] is seeking an encryption backdoor. If we had a high degree of confidence that they were on the cusp of achieving such capability and perhaps had the capability of delivering it—if that became part of the American consciousness, do you think that would change the debate we're having here today?"

Something to think about.