The List-Building Has Begun: How the Tech Sector Should Respond

I recently walked into a policy meeting in Washington, DC, and a friendly voice called out, “Nuala, you know you and CDT are on the enemies list.” It was supposed to be a joke, and it was. But it wasn’t.

The power of the state lies in its ability to deprive individuals of rights, benefits, life, and liberty. This power is magnified when the government amasses hoards of information on its citizens, particularly on those citizens with whom it may disagree. Building these datasets is certainly made easier by new technologies. That is why I have always maintained that there is an important distinction between information collected by the private sector and information collected by the government.

One of the most compelling messages of the Snowden disclosures was that the bright line between those two datasets is far less clear, and far more permeable, than we had previously imagined. We now have an Administration that is starting its list-building in profound ways. There are calls to investigate alleged voter fraud, which will naturally include closely examining voter registration records across the country. We also witnessed an executive order aimed at securing our borders that adds new layers of examination not only for visitors, but also for legal visa and green card holders. We’ve heard stories of forced disclosure of social media identifiers and passwords, and of contacts from cell phones. This is how it begins.

This list-making will require help, though, and as Kate Crawford calls out in her powerful letter to Silicon Valley, the list of private-sector actors aiding and abetting government data collection is a long one. So how do we limit data collection, misuse, and potential abuse?

While individuals certainly play a role in taking control of their data, there is an essential role for industry, particularly our friends in the technology sector, to reinforce policies and enhance technologies that limit unwanted and unwarranted intrusion into our digital daily lives. Here are but a few concrete and simple — not easy, but simple — steps in that direction.

Delete the data: One of the least sexy and in many ways most difficult ideas is to simply get rid of the data you don’t need any more — data deletion or minimization. It is partnered with the “collect only the minimum needed” concepts of collection limitation and purpose specification. Asking corporate systems and marketing leaders to get rid of data they rightfully hold sounds about as appealing as asking me to get rid of any of the dozens of textbooks I have from my law school days. I might need them someday? Those notes in the margins can’t be recreated? I get that it’s challenging. There is potential value in data, though I would argue there is less and less as it ages. But we must remember that large and interesting sets of personally identifiable data are not only a potential asset; they also carry inherent risks and costs. There is a risk of breach, and, increasingly, a reputational risk should the government come a-callin’ for interesting data about a particular population. Creating a data lifecycle that categorizes data and sets parameters for its end should be a core operational practice for every company that holds personal data, as should minimization, de-identification, masking, and deletion-by-encryption. CDT will soon release a paper on this topic, and I’m more convinced than ever that one of the best defenses is disposing of data or making it irretrievable. I’ve long said — even back when I was in government service — that the best way to prevent the expansive or inappropriate use of data by any institution is to not have the data at all. And delete must really mean delete, if that is the intention of the individual.
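To make “deletion-by-encryption” concrete, here is a minimal sketch of how the pattern can work. The key store, record store, and function names are illustrative assumptions, not drawn from this article or from CDT’s forthcoming paper: each record is encrypted under its own key, and destroying that key is what renders the data irretrievable, even in backups.

```python
# A minimal sketch of deletion-by-encryption ("crypto-shredding").
# The in-memory stores and function names are illustrative only; a real
# system would keep keys in an HSM or key-management service.
from cryptography.fernet import Fernet

key_store = {}     # record_id -> per-record encryption key
record_store = {}  # record_id -> ciphertext

def store_record(record_id: str, personal_data: bytes) -> None:
    """Encrypt each record under its own key before it is ever persisted."""
    key = Fernet.generate_key()
    key_store[record_id] = key
    record_store[record_id] = Fernet(key).encrypt(personal_data)

def read_record(record_id: str) -> bytes:
    """Reading is only possible while the per-record key still exists."""
    return Fernet(key_store[record_id]).decrypt(record_store[record_id])

def crypto_shred(record_id: str) -> None:
    """Destroying the key is the deletion: the ciphertext may linger in
    backups or archives, but it can no longer be read by anyone."""
    del key_store[record_id]

store_record("user-42", b"name, address, browsing history")
crypto_shred("user-42")  # after this, "delete" really means delete
```

The same pattern extends naturally to a retention schedule: when a category of data reaches the end of its lifecycle, destroying the keys for that category disposes of every copy at once.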

Make strong technical security the default: Security of internet transactions and communications is essential to preventing inappropriate collection of personal information. CDT has taken strong stances on end-to-end encryption and HTTPS adoption. This work is founded on our belief that privacy is essential to human dignity and development. Individuals must trust that their digital lives, like their real-world lives, contain safe places for personal thoughts, communications, and collaborations. As I have said before, we are fighting to protect the autonomy, the individuality, and the creativity that come from the quiet spaces. While consumers can — and should — take some steps to adopt protections, the defaults set by companies become the norm, and the companies that hold encryption keys will also become targets. While we recognize that there are legitimate and compelling law enforcement and national security reasons to investigate criminal and terrorist acts and communications, first and foremost, we must protect the right of the individual citizen to be left alone, or we will have utterly failed in our mission to uphold American values. Companies upon which individual citizens rely to engage in ordinary, daily life in the digital era — whether hardware, software, apps, or platforms — must consider these individual expectations and external threats, and design their products and services to be responsive to both.
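As one illustration of what “secure by default” can look like at the application layer, here is a small sketch that refuses cleartext connections and tells browsers to do the same. Flask is an assumed example framework chosen for brevity; nothing here is prescribed by the article itself, and the same idea applies to any stack.

```python
# An illustrative sketch of default-on transport security: redirect any
# plain-HTTP request to HTTPS and send an HSTS header on every response.
# Flask is an assumed example framework; the pattern applies broadly.
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_https():
    # Refuse to serve anything over cleartext; send the user to HTTPS instead.
    if not request.is_secure:
        return redirect(request.url.replace("http://", "https://", 1), code=301)

@app.after_request
def set_hsts(response):
    # Instruct browsers to use HTTPS only for this host for the next two years.
    response.headers["Strict-Transport-Security"] = "max-age=63072000; includeSubDomains"
    return response

@app.route("/")
def index():
    return "Secure by default."
```

The point of the design is that the individual never has to opt in: the protection is whatever the company ships as the default.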

Reject overbroad government requests: I’m incredibly proud of some now-very-old stories I have about criticizing and rejecting government demands for data from the private sector. I understand the compelling nature of security threats, and the intensity with which we want to keep our friends and families safe. But demands on the private sector, when made by the government, must be made in accordance with law and legal process. It is very difficult to say no when you are told by someone in a position of power that “people are going to die” if they are not given what they want. I know — someone once said that to me. But we are a nation of laws and principles. While the onus is certainly on the government not to make egregious requests of the private sector to turn over data, it is also on companies to reject and push back against unreasonable requests and searches, and to demand transparency and accountability in these processes wherever possible.

In a digital age of fluid boundaries between the individual, company, and state, it is imperative that companies defend the rights of their individual customers and take steps — whether in technology design or in institutional policy — to limit disclosure of personal data to the government. The list-building has begun, but companies must not become willing partners.