Despite its continued transition to a data-driven economy, the US does not have a set of basic privacy protections that apply across the life cycle of consumers' data – from its creation to collection to deletion. A baseline privacy law could provide such protections. Of the 34 industrialized democracies that make up the membership of the Organisation for Economic Co-operation and Development (OECD), the United States is one of only two – along with Turkey – that have failed to implement baseline privacy protections for consumer data.
Instead of being governed by one law that sets a floor of protections, data in the United States are governed by an ineffective patchwork of laws that apply to discrete economic sectors and protect only a small subset of consumers' personal data. The absence of an adequate baseline privacy law is particularly troubling given the accelerating proliferation of tracking technologies. Consumers have been left without sufficient control over their information or the means to see how it is being used.
Incomplete Patchwork of Privacy Laws Leaves Consumers Unprotected
U.S. federal privacy law covers only a limited number of economic sectors as well as select types and uses of data. In a report released in December 2010, the U.S. Department of Commerce described why this approach is inadequate for the 21st century:
"Much of the personal data traversing the Internet falls into these gaps."
Data that falls outside these sectoral laws does receive a minimal level of protection through Section 5 of the Federal Trade Commission (FTC) Act, which allows the FTC to bring actions against companies that engage in "unfair or deceptive" practices. For the most part, however, the FTC has interpreted this statute only to prohibit companies from making affirmative misrepresentations about data use practices. As long as they don't go out of their way to lie about what they're doing with consumer data, companies can do whatever they want with their customers' information, without their customers' understanding or consent.
This has been the case for years, but the stakes are considerably higher today in the age of constant connectivity, social sharing, and low-cost storage and processing. Personal data that would have taken laborious effort to collect and correlate just a few years ago can be analyzed and stored in individualized profiles today in a matter of milliseconds. Consumers who want to find out how their data is being used and shared often can't figure it out because there's no requirement for companies to tell them in a straightforward manner. Our online (and offline) experiences are increasingly customized as a result of tracking, often by companies with which consumers have no direct relationship at all. This can lead not only to personalized content and advertisements (and personalized price discrimination as well), but also to an awareness of constant surveillance and tracking, which has a chilling effect on consumers' free speech and association, as well as on their adoption of emerging technologies.
Bad for Business
Critics of baseline privacy laws have long argued that companies can regulate themselves. However, this theory hasn't panned out in the U.S. In many industries, self-regulatory efforts have been non-existent or ineffective. Meanwhile, companies that do go out of their way to follow strong privacy policies often perceive themselves to be at a disadvantage to other U.S. companies that operate without limitations, leading to a "race to the bottom" on privacy. When consumers feel that companies are in an arms race to collect, use, and sell increasing amounts of personal data, everyone in the ecosystem loses, as people will be increasingly leery about engaging with new services and products.

Moreover, in the long term, the lack of a baseline privacy law in the U.S. has the potential to diminish U.S. companies' competitive edge. For example, leaders in the cloud computing industry have consistently voiced concern that the lack of a baseline privacy law in the U.S. is affecting their ability to compete globally. In fact, foreign cloud companies are increasingly advertising the fact that their cloud services are not based in the U.S. and are therefore better on privacy. The implication is clear: Not having a baseline privacy law is bad for business.
- CDT Op-Ed, Ars Technica: Why the US needs a data privacy law--and why it might finally get one
FAIR INFORMATION PRACTICES (FIPs)
Any discussion of consumer privacy – whether in Congress, at the FTC, or within industry – must be grounded by a comprehensive set of FIPs. FIPs have been embodied to varying degrees in the Privacy Act, Fair Credit Reporting Act, and other “sectoral” federal privacy laws that govern commercial uses of information online and offline.
The most recent government formulation of the FIPs offers a robust set of modernized principles that should serve as the foundation for any privacy legislation. These principles, as described by the Department of Homeland Security in 2008, include:
- Transparency. Entities should be transparent and provide notice to the individual regarding their collection, use, dissemination, and maintenance of information.
- Individual Participation. Entities should involve the individual in the process of using personal information and, to the extent practicable, seek individual consent for the collection, use, dissemination, and maintenance of this information. Entities should also provide mechanisms for appropriate access, correction, and redress regarding their use of personal information.
- Purpose Specification. Companies should specifically articulate the purpose or purposes for which personal information is intended to be used.
- Data Minimization. Only data directly relevant and necessary to accomplish a specified purpose should be collected, and data should be retained only for as long as is necessary to fulfill that purpose.
- Use Limitation. Personal information should be used solely for the purpose(s) specified in the notice. Sharing of personal information should be for a purpose compatible with the purpose for which it was collected.
- Data Quality and Integrity. Companies should, to the extent practicable, ensure that data is accurate, relevant, timely and complete.
- Security. Companies should protect personal information through appropriate security safeguards against risks such as loss, unauthorized access or use, destruction, modification, or unintended or inappropriate disclosure.
- Accountability and Auditing. Companies should be accountable for complying with these principles, providing training to all employees and contractors who use personal information, and auditing the actual use of personal information to demonstrate compliance with the principles and all applicable privacy protection requirements.
Properly understood, FIPs constitute a comprehensive privacy framework that federal legislation should reflect. Unfortunately, most privacy schemes to date have focused only on a subset of the FIPs: some have been confined only to notice and consent. Relying exclusively on notice, consent, and security compliance regimes places the entire burden for privacy on the consumer to navigate an increasingly complex data environment. In most instances, little practical privacy protection is achieved by reliance on this narrow set of protections. The privacy challenges posed by the vast array of 21st-century technology and business practices require a greater emphasis on a broader set of substantive protections. Notice and consent are crucial, but they are simply not enough to adequately protect consumers today.
Any baseline privacy law must set basic standards that all companies collecting data must abide by. However, no single law can be tailored to perfectly fit the practices of each individual company and industry. If a law is too detailed and prescriptive, it will apply unevenly across different industries and will be difficult to adapt to evolving technologies and practices. On the other hand, if a law is too broad and vague, companies won’t have certainty about whether their practices are illegal or not. One approach could be to allow the Federal Trade Commission to issue (and revise over time) regulations interpreting a principles-based statute; CDT supports giving the FTC such authority, but while a regulatory agency can revise rules more quickly than Congress, it would still be very challenging to keep up with every new data-intensive application.
Another idea that is gaining traction both in the U.S. and around the world is to allow for privately-run "safe harbor" programs that set guidelines for companies in specific industries to follow in order to be deemed in compliance with a privacy law. A baseline privacy bill could explicitly recognize such a process in which industry coalitions can propose—and individual businesses can apply to join—a “safe harbor” privacy program. Such programs would incentivize businesses to offer protections that exceed certain standards in the privacy law while offering flexibility to divergent industries in complying with the baseline law.
Safe harbor status should be granted only under strict conditions. First, the safe harbor program must be subject to FTC review and approval, or some other formal regulatory recognition process. Also, safe harbor programs should be accompanied by legitimate auditing and self-regulatory enforcement regimes on the part of companies. Finally, the privacy protections of any safe harbor program should be at least as strong as those required by the baseline statute.
- CDT Testimony: The Best Practices Act and Other Federal Privacy Legislation
- CDT Testimony: Privacy Implications of Online Advertising
- CDT Chart: The BEST PRACTICES ACT and Other Federal Privacy Legislation
- CDT Policy Post: Recommendations for a Comprehensive Privacy Protection Framework
- CDT Paper: Top-Level Analysis of the Commercial Privacy Bill of Rights Act of 2011
Notable Privacy Laws and Regulations
- The Fair Credit Reporting Act (FCRA) - as amended by the Fair and Accurate Credit Transactions Act of 2003, regulates "consumer reporting agencies" and restricts access to consumer reports to those with a legally permissible purpose.
- The Privacy Act - limits the information federal agencies may collect about individuals and bars agencies from disclosing records about an individual without that individual's consent.
- The Family Educational Rights and Privacy Act (FERPA) - regulates access to educational records.
- The Right to Financial Privacy Act - limits federal government access to bank customer records.
- The Cable Communications Policy Act - restricts access to cable television subscriber information.
- The Electronic Communications Privacy Act (ECPA) - prohibits the unauthorized interception of wire, oral, and electronic communications, as well as unauthorized access to stored electronic communications.
- The Video Privacy Protection Act - restricts the disclosure of records concerning consumers' video purchases and rentals.
- Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule - a national standard that protects individually identifiable health information.
- The Gramm-Leach-Bliley Act (GLBA) - mandates that financial institutions make their privacy policies available to customers and allow customers to restrict the use of their personal information. Institutions are also restricted from sharing account numbers with nonaffiliated third parties.
- The FCC’s Customer Proprietary Network Information (CPNI) Rules (2007) – require that carriers obtain opt-in consent from customers before disclosing their information to third parties “for the purpose of marketing communications-related services to that customer.”