Last week’s big cybersecurity news was that the FBI obtained a court order to force Apple to develop new software that would bypass several iPhone security features so the FBI can attempt to unlock the work phone of one of the San Bernardino shooters. Apple plans to challenge that order. (Full disclosure: I am planning on writing a technologists’ amicus brief on Apple’s side in that challenge.)
The ruling was one of those rare moments where digital security developments grabbed a big share of the public limelight. There were technical explanations, legal explainers, and policy pieces. The editorial boards of The New York Times, The Wall Street Journal, and The Washington Post all weighed in to say they believed the government had overstepped by seeking to force Apple to write new code that would undermine the security of its devices. The House Energy and Commerce Committee’s Oversight and Investigations Subcommittee indicated it wants to jump into the mix, asking Apple CEO Tim Cook and FBI Director James Comey to testify about the challenge.
Meanwhile, the federal government is on a full public relations tear: Comey disclaims any desire to obtain legal precedent for future investigations and cloaks himself in the PR-friendly goal of ameliorating the sorrow of the San Bernardino shooting victims and their families, while the DOJ wags its finger at Apple for being motivated by business interests. The government is waging this battle for the moral high ground despite last week’s leak of a confidential National Security Council “decision memo” setting out a broader Obama administration initiative to handle the so-called “Going Dark” problem by finding new encryption workarounds and identifying laws that agencies might want to change.
This story and its subsequent developments (including the government’s motion to compel and the updated briefing schedule) have been everywhere since the news broke last Tuesday. The story will continue to unfold, and as it does, here are some things to think about.
We live in a software-defined world. In 2000, Lawrence Lessig wrote that Code is Law: the software and hardware that comprise cyberspace are powerful regulators that can either protect or threaten liberty. A few years ago, Marc Andreessen wrote that software was eating the world, pointing to a trend that is hockey-sticking today. Software is redefining everything, even national defense. But software is written by humans. Increasingly, our reality will obey the rules encoded in software, not the laws of Newtonian physics. Software defines what we can do and what can be done to us. It protects our privacy and ensures our security, or it doesn’t. Software design can be liberty-friendly or tyranny-friendly.
This battle is over who gets to control software, and thus the basic rules of the world we live in. Who will write the proverbial laws of physics in the digital world? Is it the FBI and DOJ? Is it the US Congress? Is it private industry? Or is it going to be individuals around the world making choices that will empower us to protect ourselves — for better or for worse?
Some news outlets have returned to the familiar but tired and inaccurate trope of privacy versus security. This isn’t a privacy versus security case. The FBI has a search warrant that honors and overcomes the San Bernardino shooter’s privacy interests in the phone. (Of course, there won’t be a warrant in all or even most of the cases where governments demand forensic workarounds for phone security. In the US, warrants are endangered for international communications, intelligence investigations, border crossings, and more. Outside the US, we can’t count on even democracies to have judicial review, probable cause requirements, or human rights-respecting laws.)
There are other interests at stake here too. Apple has a liberty interest in not being dragooned into writing forensic software for our government or any other. As Judge James Orenstein of the Eastern District of New York wrote in October when he sparked a conversation about the proper scope of government power over communications providers by refusing to immediately sign an order compelling Apple to unlock a handset, Apple is “free to choose to promote its customers’ interest in privacy over the competing interest of law enforcement.” For this reason, it’s surprising that the more libertarian-leaning organizations and lawmakers in our nation have not come out more strongly and persistently on Apple’s side.
Finally, there’s a public safety issue here. This is a security versus security case — the government’s interest in investigations versus the public interest in increasingly secure communications. Government demands like this have security externalities. For technical, legal, and geopolitical reasons, it’s hard — probably impossible — to break security measures for just a few devices and only under the right circumstances. This matters because we also live in a world of rampant communications insecurity. Governments exploit security vulnerabilities to surveil people, both their own citizens and foreigners, and use those vulnerabilities to conduct drone assassinations, spy on journalists, and engage in mass surveillance. And that’s just the US government. (See here, here, and here for the very tip of the iceberg elsewhere.) While the FBI’s request seems to go beyond what other governments have sought from Apple so far, if Apple is forced to develop code to exploit its own phones, it will only be a matter of time before other governments seek to do the same.
The big question then becomes: Are people going to be forced to live in a surveillance-friendly world? Or will the public be able to choose products — phones, computers, apps — that keep our private information, conversations, and thoughts secure?
Right now, the FBI wants to decide these questions with reference to a law originally passed in 1789. The All Writs Act allows courts to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.” Obviously, Congress wasn’t considering iPhone security at the time. The AWA has no internal limits and gives courts no guidance on how to weigh individual privacy interests, corporate liberty and business interests, and public safety interests against one another. It is an utterly inappropriate vehicle for compelling forensic assistance.
Where Congress has actually authorized law enforcement to make demands of providers, it’s been far more nuanced than the AWA. The Communications Assistance for Law Enforcement Act (CALEA), passed in 1994, requires telephone networks to be surveillance-friendly. CALEA is a complex statute, and its implementing regulations rest on public hearings and an explicit consideration of public security. Still, CALEA mandates have led to insecure designs and serious privacy breaches. Provider assistance provisions in the pen register statute, the Wiretap Act, and the Foreign Intelligence Surveillance Act allow the government to compel cooperation, but only from particular classes of providers and only for a limited set of data. Those statutes also limit the burden our government can impose on private entities.
This is not to say that Congress should act. Communications security is global, complicated, critical, and we are very bad at it. Government policy should be, and often is, to improve it and not to tear it down. But Congress, when confronted with this issue in the past, has done and would do a far more thoughtful and nuanced job than the FBI and DOJ are doing.
Finally, this case is not about this particular phone. Contrary to a host of statements claiming that the FBI’s request is narrow and will apply only to a single shooter’s work phone (here, here, and here), if the government wins, it will do this again. And so will others. The Manhattan DA has already indicated his appetite for such a workaround, as have foreign governments. This won’t be “exceptional access,” a phrase I would like to strangle and bury. There’s nothing “exceptional” about it. Apple has said that the software the FBI seeks would be effective on every iPhone currently on the market. As soon as the code is out there, its use will become widespread.
Some people are trying to draw a line between design mandates (which this isn’t) and obligations to create forensic tools. Design mandates are a disaster, but this is nearly as bad. As soon as the legal precedent is out there, compelled forensic workarounds will quickly become routine. Legal precedent is bigger than the particular request in a specific case. It gets handed down and applied in a variety of contexts, many of which look vastly different from the facts that originally led to its development. If the All Writs Act can be used in this way — to force a company to develop forensic software that the government wants to deploy in a single case of terrorism — it could be used in any number of other (currently unforeseen) circumstances.
In other words, design mandates will be next. In fact, maybe it’s already happening behind our backs. When the Snowden documents showed that Microsoft had created surveillance backdoors in Skype, Outlook.com, and Hotmail, the company issued a statement. It said:
“Finally when we upgrade or update products legal obligations may in some circumstances require that we maintain the ability to provide information in response to a law enforcement or national security request. There are aspects of this debate that we wish we were able to discuss more freely. That’s why we’ve argued for additional transparency that would help everyone understand and debate these important issues.”
At the Center for Internet and Society, we’ve been trying to figure out what those legal obligations are. I wonder if these AWA arguments are part of it.
To make sound policy in this space, the public needs to know the full picture of what the government is forcing companies to do. The San Bernardino case is just one salvo in the ongoing war between a surveillance-friendly world and a surveillance-resistant world. The stakes for liberty, security, and privacy — for control over our software-defined world — are high.