On July 8th, 2015, FBI Director James Comey testified in front of the Senate Judiciary Committee regarding perceived threats to the ability of law enforcement to monitor and collect data on encrypted communications in the course of executing legally obtained search warrants. Less than one year later, the FBI was embroiled in a heated fight with Apple after ordering the company to deliberately introduce a backdoor into iOS software in order to crack the iPhone of the perpetrator of the San Bernardino terrorism attack.
With effectively unbreakable device and data encryption becoming increasingly available, it’s a safe bet that we haven’t seen the last of the privacy-versus-safety debate. While I understand how important it is for law enforcement to be able to execute valid search warrants, I think the Apple lawsuit constituted a dangerous overreach of the FBI’s power.
Setting aside the privacy aspect of the argument for a moment: while the FBI does have the power to compel companies like Apple to comply with warrants requesting user data stored on their servers, it does not and should not have the power to conscript a company as its personal contractor, ordering it to build from scratch a feature that directly undermines the value of its products. Apple CEO Tim Cook responded to the FBI’s request: “The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.”
As one commentator put it: “A federal judge is effectively ordering these unnamed people to write code that would indisputably harm their company and that would, in their view, harm America. They are being conscripted to take actions that they believe to be immoral, that would breach the trust of millions, and that could harm countless innocents. They have been ordered to do intellectual labor that violates their consciences.”
The FBI is clearly not operating within its granted authority. It is not ordering Apple to comply with a warrant, nor to provide data to assist with the investigation; it is ordering Apple to dedicate significant time and resources to undermining the work of its own engineers and the quality of its own product. Such an order is unprecedented. When law enforcement needs to get into a locked safe, it would be ridiculous for them to approach the safe’s manufacturer and order it not only to open the safe, but to ensure that every safe it manufactured in the future had a secret second combination known only to law enforcement (and, of course, to mandate that people use only these compromised safes).
Apple even offered to help the FBI access the data on the iPhone through other means (syncing the phone to iCloud over the terrorist’s home Wi-Fi) without compromising the security of the iPhone, and entirely within the FBI’s existing powers. Yet the FBI ignored Apple’s instructions and reset the iCloud password on the account, rendering that attack vector useless. If the FBI were merely concerned with this particular iPhone, it would have accepted Apple’s help and legally retrieved the data. Clearly, though, this was about more: the FBI wanted a backdoor not just into this particular iPhone, but into every iPhone.
Perhaps this doesn’t scare you. After all, this is the American government we’re talking about. “I have nothing to hide, so I have nothing to fear,” you might say. If the government wants a backdoor to legally execute search warrants, why can’t they have it?
First, there is no such thing as a private backdoor. Companies pour billions of dollars into product security every year, and yet new breaches seem to surface every week. If we have such a hard time securing our systems now, how the hell are we going to secure a system against criminal hackers when it has a gaping security hole by design? Would you feel comfortable trusting your iPhone with your credit card information knowing that it has a vulnerability exploitable by anyone with sufficient knowledge?

At best, the group of people with “sufficient knowledge” is limited to law enforcement and Apple employees. But as we’ve seen from the Snowden leaks, the leak of NSA hacking tools last year, the myriad email leaks this year, and so on, the government simply isn’t that good at protecting data. It only takes one bad actor to leak the secret to accessing every single iPhone in circulation, and once that door is open, it will be very difficult to close. Do you trust every single person employed by the government? All 21.9 million of them? Everyone from the local police department to the FBI and NSA to the president himself? Even if you trust the government as a whole, it takes only one malicious insider abusing their access to compromise your personal privacy, or worse, to leak access to the backdoor itself.
Second, and perhaps more importantly, rights don’t just go away when you don’t feel like exercising them. Snowden once said in an interview that “arguing that you don’t care about privacy because you have nothing to hide is like arguing that you don’t care about free speech because you have nothing to say.” I would take this one step further: everyone has something to hide. If you really, honestly, truly believe you have nothing to hide, post your iPhone passcode on Facebook. Put your credit card number on a bumper sticker. Hand out printed copies of your text message history on the street corner. Obviously I’m exaggerating, but the point is that “something to hide” isn’t limited to “something criminal to hide.” Everyone has information that could be used against them, whether by governments, criminals, or even ordinary citizens, and you have a right to protect that information.
Compromising your right to privacy can also endanger your other rights, like the right to free speech. For example, the FCC is currently investigating whether law enforcement officials illegally used “stingrays” (cell-site simulators) to capture cell phone calls from protesters at the Dakota Access Pipeline last year. Even if that investigation comes to nothing, numerous reports have surfaced of law enforcement using social media to track DAPL protesters.
Finally, let’s widen our view to the rest of the world. Thankfully, as United States citizens we have rights that are protected by our government and are (ideally) not violated on a regular basis. But consider the wider global community. What happens if we take secure encryption away from a reporter in the Middle East covering the atrocities committed by ISIS? Would we then have to give access to the (hypothetical) Apple backdoor to every government that demands it, even those that commit horrendous human rights violations and would unashamedly use it to spy on their own citizens?
Privacy is not an easy subject to talk about. Cases like Apple v. FBI are surrounded by emotionally charged arguments (on both sides) designed to scare people without offering any real substance. While it’s often tempting to trade personal privacy for promises of security and public safety, all this accomplishes is to weaken the very systems we rely on and trust every day.