The last few turns in the Apple/FBI fiasco have been illuminating and highly disturbing -- it’s becoming a war. Late last week, the Department of Justice filed a motion against Apple that included this incendiary tidbit: “Apple’s rhetoric is not only false, but also corrosive of the very institutions that are best able to safeguard our liberty and our rights.”
Apple’s public response called out the DOJ on a number of issues, not the least of which was in reference to that charge: “Everyone should beware, because it seems like disagreeing with the Department of Justice means you must be evil and anti-American. Nothing could be further from the truth.”
Among all of the attacks on data privacy in the past few years, this case stands to be the most important. It will go a long way toward determining digital privacy rights for the foreseeable future. This is not a case about a single phone or a terrorist or a technology company. This is a case that will likely set a precedent for whether tech firms will be required to provide backdoor access to their products.
To be clear -- this backdoor is not to provide access to the data present on one of Apple’s products when asked, or with a warrant, but instead to ensure that all of its products can be accessed if needed. As much as the DOJ denies it, this will require building a master key. And that isn’t good for anyone.
I find it difficult to convey the gravity of this situation to nontechies. If viewed solely through the prism of the San Bernardino tragedy, it might seem that Apple is being obstinate. However, if viewed as it really is -- that the DOJ is asking Apple to do something that would jeopardize the privacy of everyone using its technology -- it’s a different matter altogether.
The problem comes down to this: A master key, or global method of accessing a protected, encrypted device, is absolutely indistinguishable from a critical vulnerability. It’s an exploit. Technologically, it’s no different than a buffer overflow bug that allows remote root access. To ask a technology company to build a vulnerability into its products is insanity. It’s also a massive liability. Interestingly, the former director of the CIA agrees.
We have seen what happens when sensitive personal data from phones is released following a successful exploit. In August 2014, hundreds of extremely personal and compromising pictures of celebrities were accessed and released on the Internet. Other personal data were accessible as well. Celebrities were targeted, but anyone using iCloud was exposed. That exploit was possible due to a bug in the iCloud API that allowed for unlimited brute-force password attempts.
The ability to perform such brute-force attacks is precisely what the DOJ is demanding -- specifically, that Apple remove the function that destroys the data on an iPhone if more than 10 failed attempts have been made. Call it a bug or a feature, it’s the same exploit. Now the DOJ wants Apple and other companies to willingly implement it in their products.
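Why the attempt limit matters can be shown with a toy model. This is a hypothetical sketch, not Apple's actual implementation: a simulated device with a four-digit passcode that wipes itself after 10 failures, versus the same device with that safeguard removed.

```python
class Device:
    """Toy model of a passcode-locked device (illustrative only)."""

    def __init__(self, passcode, attempt_limit=10):
        self._passcode = passcode
        self._limit = attempt_limit  # None means the safeguard is removed
        self._failures = 0
        self.wiped = False

    def try_unlock(self, guess):
        if self.wiped:
            return False
        if guess == self._passcode:
            return True
        self._failures += 1
        if self._limit is not None and self._failures >= self._limit:
            self.wiped = True  # destroy the data after too many failures
        return False


def brute_force(device):
    """Exhaustively try all 10,000 four-digit passcodes."""
    for guess in (f"{n:04d}" for n in range(10000)):
        if device.try_unlock(guess):
            return guess
    return None


# With the 10-attempt safeguard, brute force destroys the data instead.
protected = Device("8093", attempt_limit=10)
print(brute_force(protected), protected.wiped)  # None True

# With the safeguard removed, the entire keyspace falls in well under a second.
unprotected = Device("8093", attempt_limit=None)
print(brute_force(unprotected))  # 8093
```

The safeguard does not make guessing harder; it makes guessing costly. Remove it, and a small keyspace like a numeric passcode offers essentially no protection.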
Speaking Friday at SXSW, President Barack Obama displayed the same obtuse, backward view of the issue. “I suspect the answer is going to come down to, how do we create a system that, encryption is as strong as possible, the key is secure as possible, and it is accessible by the smallest number of people possible for the subset of issues that we agree is important,” the President said. But those goals are mutually exclusive.
As with all the talk about backdoors in encryption standards, master keys, and the like, this is anathema. We work extremely hard to prevent such problems. Mandating their existence is lunacy.
There is no such thing as a safe and secure backdoor. Master keys will not remain in the possession of “trusted” authorities. If a backdoor or master key exists, it will be exploited by others. It will be used to compromise much more sensitive data than exists on a single government-owned iPhone. These are certainties. We need only look back on one of the largest data breaches in history, courtesy of the U.S. government, to know this is true.
It's in the best interests of us all that Apple -- and every technology company -- keep the data that its customers store on its products private and secure. These companies shouldn't want or need to have any sort of access to that data. We salt and hash passwords, and encrypt financial data, for the same reasons. This is basic data security and privacy protection.
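The salting-and-hashing practice mentioned above can be sketched in a few lines of Python. This is a generic illustration using the standard library, not any particular company's implementation; the iteration count is an arbitrary example value.

```python
import hashlib
import secrets


def hash_password(password: str) -> tuple:
    """Derive a salted, one-way hash of a password with PBKDF2-HMAC-SHA256."""
    salt = secrets.token_bytes(16)  # unique random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest


def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash from the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return secrets.compare_digest(candidate, digest)


salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

The point of the design is that the service can check a password without ever being able to recover it: only the salt and the digest are stored, and the hash cannot be run backward. That is the same principle at stake here -- the safest access is access that does not exist.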
This is what keeps the wolves at bay.