Last week, a discussion draft of the “Compliance with Court Orders Act of 2016,” cosponsored by Senators Richard Burr (R-NC) and Dianne Feinstein (D-CA), was leaked to the public and met with harsh criticism from the tech policy community. It would have required device manufacturers, communications service providers, and app developers to respond to court orders for information by either providing that information in an “intelligible format” or giving the government any technical assistance necessary to make the information intelligible. In addition, it would have required “license distributors,” such as app stores, to ensure that any product, service, application, or software they distribute is capable of compliance. Although the draft promised not to impose any “design limitations,” in reality it did just that: in order to comply, every electronic communication device, communication service, and application would have to come with the very backdoor to encryption that CDT has warned would put the security of internet users at grave risk.
Yesterday, an “official” discussion draft was released. Unfortunately, the minor changes in this draft do not address the major problems with the bill. The fact that the new version contains only small, technical changes is concerning, and may be an indication that the crypto “debate” happening on Capitol Hill is not so much of a debate as it is a misunderstanding of what tech policy advocates are concerned about when it comes to backdoors. Before there can be a meaningful policy debate about encryption, there must be agreement on three important points:
First, while no one should be above the law, the law should not jeopardize the very things it claims to protect. The Burr-Feinstein draft emphasizes that “no person or entity is above the law,” and that “economic growth, prosperity, security, stability, and liberty” require adherence to the rule of law. On this point, there is widespread agreement. Whenever CDT and others have argued against mandated backdoors, it has never been out of a belief that some people should not have to comply with the law. Rather, it is because what tech companies are being asked to do in order to comply would make people all over the world significantly less safe. In other words, the “solution” proposed by this discussion draft is actually antithetical to the core mission of all federal, state, and local law enforcement agencies.
Second, forced compliance with the bill’s mandates in some cases actually forces compliance in all cases. The only substantive change in the new discussion draft purports to limit the bill’s mandates to investigations involving serious crimes (such as terrorism and threats of death or serious bodily harm). This change was presumably meant to assure people that the bill’s extraordinary requirements would apply only in extraordinary circumstances. However, our concerns were never about the types of cases in which the bill’s mandates would arise. In order to be capable of complying with a court-ordered mandate in even one serious crime case, communications service providers, device manufacturers, and app developers would have to build all of their products to accommodate that mandate. The encryption mechanisms that keep data, apps, and devices secure would have to be weakened across the board, leaving us all more vulnerable to hackers, malicious governments, and terrorists eager to exploit that weakness. Encryption prevents hackers from crippling the electrical grid by remotely adjusting “smart” home thermostats. Encryption protects every one of us from cybercriminals and hostile governments that could otherwise remotely compromise data in transit, networks, and even individual devices. Encryption is, in short, a matter of national security.
Third, banning strong encryption simply will not work. In May 2013 – one month before the first Snowden revelation – CDT coordinated a group of 20 of the world’s leading experts in cryptography and security to write a whitepaper, “CALEA II: Risks of Wiretap Modifications to Endpoints,” on how futile it would be to try to mandate backdoors. As those experts point out, much of the software that mediates secure communications is increasingly open source, where it is not possible to hide the addition of backdoor mechanisms. Moreover, it would be trivial to remove those features and build a version of the software without the backdoors. Finally, the cryptography used for secure communications is not difficult, amounting to exercises commonly assigned to high school and undergraduate computer science students. It should be no surprise that a recent survey by Schneier et al. found more than 850 encryption products worldwide, over 500 of them offered by organizations or companies outside of the United States, where this law would not apply.
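To make the whitepaper’s point about simplicity concrete, here is a hedged illustration (our own, not drawn from the bill or the whitepaper): a one-time pad, the kind of exercise routinely assigned to students. With a truly random key as long as the message, these few lines of Python produce encryption that is information-theoretically unbreakable, and any backdoor added to such trivially reimplementable code would be easy to spot and remove.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each message byte with the corresponding key byte.
    # The same operation both encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"
key = os.urandom(len(message))        # random key, same length as the message
ciphertext = xor_cipher(message, key)
recovered = xor_cipher(ciphertext, key)
assert recovered == message           # applying the cipher twice restores the plaintext
```

A student-level sketch like this is exactly why a domestic mandate cannot reach the underlying mathematics: anyone, anywhere, can rewrite it in an afternoon.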
Yes, there are many problems with this draft bill. But before diving into the weeds, let’s be clear about our fundamental concern with this proposal: our opposition does not stem from a desire to put anyone above the law, or from concern about the types of crimes in which courts will order compliance. It stems from the fact that the proposal’s requirements would undermine the security of all internet users, and that efforts to ban encryption are futile. Surely we can all agree that weakened security does not support safety.