CDT Big Data Comments Stress Importance of Consumer Empowerment

Yesterday, CDT filed comments with the National Telecommunications and Information Administration on the implications of big data for the digital economy. These comments follow previous government inquiries into big data, including reports released in May by the White House and the President’s Council of Advisors on Science and Technology following a “big data review.” The reports raised a number of intriguing questions, many of which revolved around a central concern: can the Consumer Privacy Bill of Rights support the innovations of big data while still responding to its potential risks?

In our comments, we argue that the White House’s Consumer Privacy Bill of Rights does not need to be rewritten in the name of big data. The White House’s privacy report that introduced the Consumer Privacy Bill of Rights was written only two years ago with the potential (and pitfalls) of big data in mind. If anything, in the age of big data, privacy protections and personal control should be strengthened to empower individuals to make informed choices about how companies collect, share, and maintain their personal information. Specific points we raise in our comments include:

  • Big data doesn’t mean we lose any notion of privacy. The knowledge of some by some certainly does not mean the knowledge of all by all. While personal privacy does not necessarily override all considerations, regulations must ensure that we retain privacy rights even in information that is shared or otherwise observable.
  • Notice and choice should be improved, not abandoned. It is currently difficult for consumers to make informed privacy decisions because privacy policies are dense and inscrutable. Privacy legislation should focus on offering users more clarity and control, instead of assuming consumers can’t make those decisions on their own.
  • Responsible use and accountability are necessary but not sufficient for privacy protection. Relying entirely on responsible use presumes that institutions can perfectly control the data they obtain; data breaches and illegitimate government access show that they cannot. Additionally, shifting privacy protection to opaque internal privacy processes gives consumers and regulators less visibility into data practices and fewer opportunities to hold institutions accountable.
  • Deletion and de-identification aren’t perfect, but they provide meaningful privacy protections. Requiring companies to exercise reasonable diligence to delete data, coupled with a commitment not to try to recover it, offers consumers meaningful protection. And while de-identified data still carries risks, policy safeguards can go further: enforceable public commitments to de-identify data well and not to permit re-identification, coupled with contracts binding downstream recipients not to re-identify or further release the data. These approaches to deletion and de-identification are certainly preferable to doing nothing to reduce the amount of identifiable information held about consumers.
  • Big data should empower individuals. Technical solutions, such as legally sanctioned “privacy preference profiles” and metadata privacy rule sets, are promising concepts that may offer consumers considerable control (see the sketch after this list). Using these methods, a consumer could tell a company that he or she is comfortable sharing information with it, so long as the company agrees to delete or de-identify the data after a period of time. The company could then decide whether to offer the user service under those terms. Creating a functioning market for privacy would represent a huge advance over current practices, where users have poor visibility into commercial data practices and few options to register persistent preferences.
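
To make the “privacy preference profile” idea concrete, here is a minimal, purely illustrative sketch of what a machine-readable profile and a company-side check against it might look like. The field names, structure, and TypeScript framing are our own assumptions for illustration, not a proposed standard or part of our filed comments.

```typescript
// Illustrative sketch: a machine-readable "privacy preference profile" and a
// check a service could run before collecting data. All names and fields are
// hypothetical, not an existing standard.

type RetentionAction = "delete" | "de-identify";

interface PrivacyPreferenceProfile {
  allowedCategories: string[];   // data the user permits, e.g. ["email"]
  retention: {
    action: RetentionAction;     // what must happen to the data
    maxDays: number;             // and by when
  };
  allowThirdPartySharing: boolean;
}

interface ServiceDataPractices {
  categoriesCollected: string[];
  retentionDays: number;
  retentionAction: RetentionAction;
  sharesWithThirdParties: boolean;
}

// The company decides whether it can offer service under the user's terms.
function canOfferService(
  profile: PrivacyPreferenceProfile,
  practices: ServiceDataPractices
): boolean {
  const categoriesOk = practices.categoriesCollected.every((c) =>
    profile.allowedCategories.includes(c)
  );
  const retentionOk =
    practices.retentionDays <= profile.retention.maxDays &&
    practices.retentionAction === profile.retention.action;
  const sharingOk =
    profile.allowThirdPartySharing || !practices.sharesWithThirdParties;
  return categoriesOk && retentionOk && sharingOk;
}

// Example: a user who will share data only if it is deleted within 90 days
// and never passed to third parties.
const userProfile: PrivacyPreferenceProfile = {
  allowedCategories: ["email", "purchase-history"],
  retention: { action: "delete", maxDays: 90 },
  allowThirdPartySharing: false,
};

const servicePractices: ServiceDataPractices = {
  categoriesCollected: ["email"],
  retentionDays: 60,
  retentionAction: "delete",
  sharesWithThirdParties: false,
};

console.log(canOfferService(userProfile, servicePractices)); // true
```

For a check like this to carry weight, the machine-readable terms would of course need to be backed by enforceable legal commitments, which is why the concept depends on such profiles being legally sanctioned.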

Of course, no privacy-protective framework can allow for maximum use of data while still completely safeguarding personal privacy; trade-offs will have to be made. Ultimately, we believe that these trade-offs should be evaluated and decided by the individuals whose information is affected — not by governments or corporations.