Everyone wants secure systems, and they want solid encryption on their computers and mobile devices to prevent corporate espionage, hacking, and so forth. Yet when a terrorist or criminal uses an encrypted phone, that's a different story -- to many people, anyhow. The challenge for OS developers is where to draw the line on being able to crack their own software.
The debate is polarizing, as is clear in the current fight between Apple and the FBI, where Apple is refusing to create a special version of iOS to let the FBI crack the iPhone used by a terrorist -- and to help the FBI break into another dozen iPhones used by criminals.
The technical fear is simple: Any backdoor or other method to get at the contents of a secured device will eventually become available to governments (good and bad), criminals, and others. A recent example of that fear being realized is the case of a backdoor in network gear that a foreign government used to access government servers. Apple says that risk is too high, so it won't go that far in helping the FBI.
Much of the tech community agrees with Apple, even if most Americans do not. But not all techies agree. Former Microsoft chairman Bill Gates has taken a more nuanced view in an interview with the Verge:
I do believe that with the right safeguards, there are cases where the government on our behalf, like stopping terrorism which could get worse in the future, that that is valuable.
Gates said the government shouldn't have to be completely blind, but that the courts and Congress would ultimately determine the matter, although it's good to have the discussion: "You want to strike that balance," he said.
Gates' view is similar to that of Microsoft, whose chief legal officer, Brad Smith, wrote that tech companies "should not be required to build in backdoors" but should also be committed to "providing law enforcement with the help it needs." He also tweeted, "Essential to have broad public discussion on these important issues."
Despite the strong feelings on both sides, it's not an easy issue. On one hand, we all want our data secure and private. On the other hand, we don't want the bad guys to have that same security or privacy.
It's easy to look at this one situation -- a terrorist attack that has already occurred, where the people responsible were killed and much of the terrorists' communication is already in the hands of the FBI, though an encrypted phone may hold additional data -- and say it's not worth the risk of privacy invasion or, worse, of a technology implementation that could make every iPhone vulnerable.
But what if the situation were different? What if there were an imminent attack expected, and the government had an encrypted phone with information that might help stop the attack if only the device could be cracked?
The debate is very similar to that around Second Amendment gun rights: Americans want the right to bear arms to be unimpeded -- except for bad guys. But how do you accomplish that? And if you revoke the right to bear arms, the fear is that only the bad guys will have them.
That's perhaps the next line of thought in the encryption debate: If there were a way to circumvent the encryption, wouldn't the bad guys simply find other ways to hide their data? Then only regular people would have their data at risk if the master key fell into the wrong hands.
Time, debate, and evolving circumstances will determine the outcome. I certainly don't have all the answers. But, as Gates says, it's good to have the discussion.