CDT Files Brief in Support of Apple and Strong Encryption

The ongoing quest to protect our devices from backdoor weaknesses that allow government surveillance and third party attacks has once again moved into the courts. The FBI is attempting to compel Apple to create code that would significantly weaken the security of its iPhones. CDT has long opposed government-mandated backdoors in technology, and believes that strong encryption strengthens the security of our nation.

Earlier this week, CDT collaborated with the wonderful lawyers at Wilson Sonsini to file an amicus brief in support of Apple in federal district court, asking the court to refuse the FBI’s request. While it’s highly unusual to file amicus briefs at the district court level, the issues in the case go far beyond this one investigation or a single phone. Should the government prevail, it would set a precedent under which any company could be forced to spy on unknowing customers on behalf of law enforcement, and in the process be required to override its own security measures in ways that expose its users to malicious attacks. Moreover, all of these orders could be issued behind closed doors, ex parte, with little or no opportunity for the company or the public to be heard.

This case arose out of the December 2, 2015 attack in San Bernardino, where one of the shooters had a work iPhone in his possession. The FBI approached Apple for help in its investigation, and Apple devoted substantial resources to supporting the government, including providing all data that it possessed relating to the attackers’ accounts and offering technical assistance. Unfortunately, the FBI, without consulting Apple or reviewing its public guidance regarding iOS, asked the San Bernardino local government to change the iCloud password associated with the phone. That change foreclosed the possibility of the phone initiating an automatic iCloud backup of its data, and with it the ability to pull the data off of the cloud. Apple has no access to the data stored locally on its phones, and so was unable to help the government retrieve the data from the device.

On February 16, 2016, the government persuaded a federal magistrate judge to compel Apple to assist in the government’s investigation under the authority of the All Writs Act. Without hearing from Apple or from security experts, the judge granted the government’s request and signed its proposed order, compelling Apple to create new software to allow the government to access the phone.

The government’s request should not have been granted for a multitude of reasons, including significant constitutional concerns. This week, there was an outpouring of support as civil society organizations, academics, cryptologists, and companies filed amicus briefs in support of Apple, detailing the Pandora’s box of problems that would be opened if the order stands.

CDT’s amicus brief addresses two main points:

First, ordering a private company to defeat its own security measures by creating a new version of its software is an impermissible expansion of the All Writs Act. The legal issues around the All Writs Act are complex, but at its core, the Act gives federal judges the power to issue orders compelling people to do things, within the limits of the law. It is a gap-filling statute: it allows courts to issue orders on matters that Congress has not otherwise addressed. Where a statute specifically addresses the particular issue at hand, it is that statute, and not the All Writs Act, that controls.

Here, the government’s surveillance authority has been laid out by Congress in CALEA, which does not support conscripting an innocent private company to weaken its security standards.  A federal judge in New York agreed earlier this week when rejecting a similar request brought by the DEA. As Judge Orenstein explained, the courts should decline any invitation to “transform the [Act] from a limited gap-filling statute that ensures the smooth functioning of the judiciary itself into a mechanism for upending the separation of powers by delegating to the judiciary a legislative power bounded only by Congress’s superior ability to prohibit or preempt.”

Second, compelling private companies to weaken the security of their products will undermine device security and decrease public trust in connected devices and emerging technologies. The broad new power that the government is seeking cannot be limited to a single company, and certainly not to one phone. Compelling Apple to write code to undermine its security features will set the stage for similar requests aimed at a wide range of other technology providers and connected devices. That will have pernicious and far-reaching consequences. Allowing the government to force providers to rewrite or rewire their products at the direction of law enforcement will fundamentally alter the relationship between those companies and their users. It will erode public trust across a variety of devices and applications. All of this will render those technologies—and those who use them—less secure, not just from the government but from third party attackers, thieves, and repressive regimes.

We live in a world that is increasingly interconnected. You can monitor your sleeping baby through a webcam. You can use your phone to adjust your thermostat on your drive home, and then use it to turn on your house lights. You can receive messages on your phone if your carbon monoxide alarm goes off. Your medical devices can make an emergency call for help if you become incapacitated. These are remarkable developments, and they better our lives.

But these systems need to be safe from malicious third party attacks. A decision compelling Apple to weaken critical security features on its phones will leave the creators of a wide range of Internet-connected consumer products—cars, televisions, personal fitness trackers, even refrigerators and home security systems—open to government conscription, not only by the United States but also by foreign governments, and will render these products vulnerable to malicious attacks by criminals and state actors alike. When your whole house is capable of listening to you, poor security on these connected devices will mean that you have no control over who is hearing your most private moments. And this will have been enabled by the very companies that create this technology, work hard to make it secure, and in whom users must necessarily put their trust.

Simply put: All connected technology needs to be secure, and a decision for the government in this case will erode critical safeguards, exposing all of us to unnecessary risk.  Oral arguments have been set for March 22.