

Big Data Report Shows Privacy Matters

After a ninety-day review (including our own comments), the White House review group, led by John Podesta, released its report on the challenges and opportunities of big data and how it may affect individual privacy. In tandem, the President’s Council of Advisors on Science and Technology (PCAST) released a report focused on technical issues. Many promises have been made about how big data will improve our lives, and some uses of big data will surely have positive social effects. But big data raises very real privacy concerns, and we’re pleased that the White House has raised public awareness of these issues through its review process. Users need to be able to control their personal information; big data must empower individuals, not reinforce existing power imbalances that benefit companies and governments.

A Strong Set of Policy Recommendations

At the highest level, the Review Group report makes six recommendations, all of which we support:

• Advancing the 2012 Consumer Privacy Bill of Rights

• Enacting national data breach legislation

• Extending privacy protection to non-U.S. individuals under the Privacy Act

• Ensuring that student data is used for educational purposes

• Expanding technical expertise to prevent discrimination

• Amending the Electronic Communications Privacy Act (ECPA)

These recommendations would, if properly enacted, improve existing protections for individual data. For example, the recent high-profile data breaches suffered by Target and other companies point to the need for better security for consumer data, especially when companies hold sensitive financial or location data. A federal breach law would likely preempt state laws, so a weak federal standard could actually weaken companies’ obligations. A federal law with strong provisions and coordinated enforcement between the federal government and state attorneys general would alleviate that concern and promote strong consumer protections.

The report also discusses the possibility of discrimination – on racial, ethnic, class, gender, sexual orientation, or other grounds – as a real concern when companies, governments, or other entities have access to vast amounts of data that can be used to profile individuals. There have already been reports of data brokers using categories like “Ethnic Second-City Strugglers,” “Tough Start: Young Single Parents,” and “Rural and Barely Making It.” These categories could easily be used to discriminate against minority and underserved populations, and we’re pleased to see the White House echo our calls for a prohibition on such practices. Digital redlining – the 21st-century analogue of the practice designed to keep racial minorities out of certain residential areas – is unacceptable; even the fear that such practices could occur creates chilling effects that discourage adoption of new services that rely on data collection and big data analytics. While existing laws like the Fair Credit Reporting Act do some work to prevent data discrimination, a strong framework is needed to ensure that all individuals – regardless of their background – are treated fairly in the big data context. We therefore also support the recommended extension of the Privacy Act to cover non-U.S. individuals.

Finally, the White House calls for ECPA reform, which CDT has long argued for; we wrote yesterday about why bringing ECPA, enacted in 1986, up to date with the current digital age is long overdue.

Taylor Rodriguez Doesn’t Care About Data Collection – But She Should

One of the areas where the report falls short is empowering users to make choices about their privacy. CDT has long advocated limits on the collection of data, rather than reliance on use limitations, to protect individuals. It’s disheartening that the PCAST report assumes a world in which collection is rampant, pervasive, and unceasing, and that neither report discusses government surveillance in detail.

The PCAST report imagines a future world in which a fictional character, Taylor Rodriguez, prepares for a business trip. Her bag has an RFID tag; “the cloud” knows her itinerary; and airport security has access to data collected in her home. The report argues that today, we might find this “creepy,” but that social notions of privacy change and that Taylor is probably OK with the situation because of the benefits. That’s a pretty big assumption to make.

If Taylor chooses to let “the cloud” hold her data, track her movements, and watch her in her house, there need to be limits on who has access to that data. And there needs to be a discussion about government access to that data; neither the Podesta report nor the PCAST report addresses the issue. It’s obvious that commercial collection of data and the NSA’s overbroad surveillance practices are linked. Addressing commercial collection and use of data without discussing the danger of government access is a half answer at best.

Despite PCAST’s claims in the Taylor Rodriguez story, collection of data should not be an automatic assumption. Companies, governments, and other entities that collect data from individuals need to make affirmative decisions about what data they collect, for what purposes, and for how long. While some have argued that the value of big data lies precisely in unanticipated uses, we think that context matters and that consumers should know, generally, what kinds of applications their data might be used in. That may not require delineating precisely what each piece of data may be used for; it can instead rely on context – and on the notion of contextual integrity proposed by Helen Nissenbaum.
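To make that concrete, here is a minimal sketch (our illustration, not anything proposed in either report) of what an affirmative collection decision might look like in code: every data type a hypothetical service collects is tied to a stated purpose and a retention limit, and anything outside those bounds fails closed. All of the names, categories, and retention periods are hypothetical.

```python
from dataclasses import dataclass
from datetime import timedelta

# Hypothetical sketch of an affirmative collection policy: every data
# type is tied to the purpose the user consented to and a retention cap.
@dataclass(frozen=True)
class CollectionRule:
    data_type: str        # e.g. "location"
    purpose: str          # the context the user agreed to
    retention: timedelta  # hard limit on how long the data may be kept

POLICY = {
    rule.data_type: rule
    for rule in (
        CollectionRule("location", "turn-by-turn navigation", timedelta(hours=24)),
        CollectionRule("contacts", "user-initiated sharing", timedelta(days=30)),
    )
}

def may_collect(data_type: str, purpose: str) -> bool:
    """Allow collection only when a rule exists and the stated purpose
    matches the context the user agreed to; everything else is refused."""
    rule = POLICY.get(data_type)
    return rule is not None and rule.purpose == purpose

# Collection for an unanticipated purpose is refused by default.
assert may_collect("location", "turn-by-turn navigation")
assert not may_collect("location", "ad targeting")
assert not may_collect("microphone", "voice memos")
```

The point of the sketch is the default: data outside the stated policy is never collected in the first place, rather than collected now and restricted later.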

Indeed, when companies have failed to decide affirmatively what data gets collected and how, they have often inadvertently swept up too much data – in some cases resulting in FTC fines and consent decrees. Making deliberate choices about when to collect data is important to protecting consumers, and pervasive collection is far from a foregone conclusion. Encouraging limits on data collection, rather than relying on use limitations alone, benefits both consumers and companies.

The proliferation of data-collecting devices doesn’t mean that everyone will want a world in which those devices are always on and always collecting. Automatic collection by all devices, all the time, is neither a desirable nor an inevitable outcome, despite what some may think. Allowing flexibility of choice – and empowering individuals to create privacy-protective spaces in their daily lives – will be vitally important.

The Need to Respect Privacy

There have already been instances of consumer devices collecting data out of context and without consent. The FTC settled with a company that was spying on people via laptop webcams – in some instances watching them in intimate moments in their homes. LG TVs collected data about how individuals watched TV and sent that data back to the company without telling the TVs’ owners. If a TV was connected to a home network, it would scan the network’s file list and send that to LG as well.

It should be obvious that consumers don’t want this. When consumers purchase devices and sign up for services, they should have some control over what data those devices and services collect, and they should be empowered to make affirmative choices to limit those collection and use practices. The White House report criticizes the current notice-and-choice framework as inadequate, and we agree that it needs reform given current practices. But let’s not throw the baby out with the bathwater – notice and choice should be improved, not discarded, so that it more effectively protects and empowers consumers.

Empowering Consumers in the Big Data Age

The major lesson here is that individuals need more power over their devices and services. The Podesta report recognizes this and discusses some good solutions, including endorsing Do Not Track and global privacy controls across sites. CDT has long supported the use of consumer controls – such as mobile device settings – to help consumers make choices regarding data collection and usage.
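As a rough illustration of what honoring such a signal involves, here is a minimal sketch of a server-side check of the Do Not Track header (a browser with DNT enabled sends DNT: 1 with each request). The handler and the stubbed-out functions are our own hypothetical example, not drawn from either report or from any particular service.

```python
# Minimal sketch of honoring the Do Not Track signal on the server side.
def tracking_allowed(headers: dict) -> bool:
    """Treat DNT: 1 as an opt-out of tracking; anything else permits it."""
    return headers.get("DNT", "").strip() != "1"

def record_analytics() -> None:
    """Hypothetical stand-in for whatever tracking the service performs."""

def serve_page() -> None:
    """Hypothetical stand-in for serving the requested content."""

def handle_request(headers: dict) -> None:
    if tracking_allowed(headers):
        record_analytics()  # only track users who have not opted out
    serve_page()            # the page itself is served either way

# A request carrying DNT: 1 is excluded from tracking.
assert not tracking_allowed({"DNT": "1"})
assert tracking_allowed({})
```

The design choice worth noting is that the opt-out gates the tracking call itself, not merely how collected data is used afterward.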

User controls should be intuitive, scalable, and strong. Consumers should have a clear understanding of how the controls work and what they actually mean. We recently wrote about Lookout’s open source mobile privacy policy, which communicates data collection practices and sharing provisions to users. Encouraging similar practices in the big data context will be crucial to allowing users to make effective choices about what products they use and what data gets collected and used. EFF has just released Privacy Badger, which helps users stop online tracking by blocking third-party trackers, including tracking ads and cross-site cookies. Empowering users in this manner allows for more effective choices in how individuals use services, and it also educates the public about how services actually work and what they collect.

The White House’s calls for more technical measures to empower users, and for technical analysis of how existing laws can be used to prevent discrimination in big data practices, are welcome, and we hope that government and the private sector will work together to promote research and development. Big data holds a great deal of promise, but consumer rights and civil liberties need to be placed at the center of the debate from the beginning. The White House report is a step in that direction, but actual, specific practices and protections will need to be created and implemented as soon as possible.