House Committee on Commerce Subcommittee Telecommunications, Trade, and Consumer Protection
July 21, 1998
|I. Introduction and Summary|
The Center for Democracy and Technology (CDT) is pleased to have this opportunity to testify on the issue of individual privacy in the online environment.
CDT is a non-profit, public interest organization dedicated to developing and implementing public policies to protect and advance civil liberties and democratic values on the Internet. One of our core goals is to enhance privacy protections for individuals in the development and use of new communications technologies.
In the invitation I received to testify, the Committee identified several key questions that must be answered if we are to have an informed dialogue about privacy on the Internet. The crux of the Committee's questions can, I believe, be summed up as: "who should lead?"
|II. Who should lead: the role of privacy legislation in regulating private sector activities|
The Federal Trade Commission's report served as a wake up call. It jump-started industry efforts, galvanized privacy and consumer groups, and intensified the Administration's review of its chosen stance on privacy protection. As you heard this morning, the FTC's review spurred some increased efforts in the private sector. While we welcome the renewed efforts of industry, we strongly believe that industry acting alone will continue to fall short of the goal of providing pervasive privacy protections. It is clear to us that a more comprehensive and vigorous approach is required. However, we must also recognize that legislation will not on its own provide complete privacy protection. Privacy protection must build upon the strengths of existing efforts -- self-regulatory and technical -- but fold them into a comprehensive system of enforceable privacy protections.
A. A purely self-regulatory system is inadequate
Recognizing that both individual company efforts and broader industry efforts to provide clear rules to protect privacy are a necessary component of achieving privacy protection, we believe that, on its own, self-regulation will fail to provide meaningful protections for individual privacy. Some believe the current lack of privacy protection to be purely an issue of timing, and that given more time industry will successfully provide wide-reaching privacy protections. While systems to protect privacy are likely to become more prevalent and robust, there are structural flaws in a purely self-regulatory system that have repeatedly been shown to undermine consumer protection, and there are specific difficulties that arise from the nature of the Internet.
1. Systemic flaws of self-regulation
The four primary shortcomings of industry self-regulation in the privacy area have been: 1) the failure to incorporate core elements of fair information practice into substantive guidelines; 2) the lack of oversight and enforcement; 3) the absence of legal redress to harmed individuals; and, 4) the inability to set enforceable limits on government access to personal information. Self-regulatory efforts in general and those specifically presented today continue to exhibit these structural flaws.
In general, industry guidelines depart from the shared language and structure of the Fair Information Practices (FIPs) as articulated in the Code of Fair Information Practices developed by the Department of Health, Education and Welfare (HEW) in 1973 and the Guidelines for the Protection of Privacy and Transborder Flows of Personal Data, adopted by the Council of the Organization for Economic Cooperation and Development in 1980. The failure to incorporate important baseline concepts of privacy protection -- such as limits on data collection, data subject access and correction rights -- renders existing guidelines, on their face, inadequate to protect privacy. Private sector efforts, and government efforts, to address privacy concerns must be based upon the widely agreed upon Fair Information Principles. The various privacy principles provided here today muddy the water with new and less comprehensive frameworks which are likely to confuse the public, confound a business community that must deal with the looming implementation of the EU Data Protection Directive, and provide Americans with less privacy protection than appropriate. The HEW principles and the OECD guidelines should serve as the baseline for both public and private sector activities to protect privacy. None of the current models for industry self-regulation embody the FIPs. Without the ability to hold industry to these core privacy principles, no self-regulatory plan will adequately protect individual privacy.
Oversight and enforcement often have been missing components of industry self-regulation. Without a strong commitment to ensuring adherence to policies, self-regulation is doomed to be inadequate -- and will appear to the public, policy-makers, and advocates as window-dressing designed to squelch needed regulatory activity. For example, while many trade associations put forth model policies, few have the ability to enforce member companies' adherence. Even those self-regulatory regimes that do have censure policies must face the troubling prospect of having to censure companies supporting the program. Often it is precisely the lack of oversight and enforcement power that eventually drives industry good-actors to seek legislation codifying self-regulatory principles in an effort to bind industry bad-actors who are tarnishing the reputation of the industry as a whole.
Assuring compliance requires: initial assessment of compliance; ongoing review; internal and external accountability; openness; a process for investigating and responding to suspected failures; and sanctions for violations of policy. Internal assessments are a necessary element of assuring compliance with fair information practice principles; however, external methods of overseeing and monitoring compliance should buttress the system of internal audits. While industry has proposed a number of methods for ensuring compliance, CDT believes that a baseline established and enforced through law provides the soundest method of ensuring compliance without burdening smaller businesses.
The absence of effective and responsive legal redress to harmed individuals is a recurring problem with self-regulatory solutions. Industry-generated policies rarely offer consumers meaningful relief in instances of a breach of policy. Although a few companies have compensated individuals for egregious privacy lapses, industry agreement on a broad-based program which would grant consumers significant compensation (monetary or otherwise) has not yet materialized. A redress mechanism must provide individuals with a fair forum to present grievances, and must be able to mete out meaningful compensation for violations of policy. Without a means for grievances to be identified and addressed, and the ability to receive compensation for breaches of policy, consumers will find self-regulation, on its own, inadequate.
By their very nature, self-regulatory structures are incapable of devising controls to address forces outside industry. The inability of self-regulatory guidelines to set limits on law enforcement access to information held in the private sector is a structural flaw that cannot be remedied. It is important to acknowledge that a primary focus of laws protecting privacy is to limit the government's access to and use of personal information that is held by third-parties. As data collection continues to escalate in the private sector and "data mining" and other practices tying sensitive and personal information together increase, concerns over law enforcement and other government access to personal information held in the private sector grow. Self-regulatory efforts -- even if grounded in the FIPs, fully overseen and enforced, and coupled with sanctions and meaningful individual remedies -- cannot effectively limit the ability of government to access the growing stores of privately held personal information.
2. Particular difficulties of implementing policy in the online environment
Implementing policy in a global, networked environment like the Internet presents a series of challenges. The diversity and multiplicity of players, the ease of crossing national borders, and the lack of centralized control mechanisms create challenges to those seeking to regulate activities on the Internet. The increased generation and collection of information made possible by new technologies, and the increasingly revealing nature of transactional data, which is blurring traditional distinctions between the content of a communication and the transactional data used to route a message to its destination, present privacy-specific challenges. The successful protection of privacy depends upon our ability to meet these challenges. The complexity of implementing policy in this global and distributed environment suggests that rather than relying on a single tool to implement policy, we should use the strengths of each in combination. This suggests that self-regulation, regulation and technology each must play a role in protecting privacy.
Regardless of the policy at issue, questions of effectiveness and enforcement rise to a new level in the decentralized, global and borderless environs of the Internet. Traditional top-down methods of implementing policy and controlling behavior -- be they international agreements, national legislation, or sectoral codes of conduct enforced by the private sector -- offer incomplete responses to the privacy issues arising on the global information infrastructure. Providing a seamless web of privacy protection to individuals' data as it flows along this international network will require self-regulation, regulation and technology.
|III. Crafting Legislation|
Debate over the capacity of self-regulation and market forces to adequately address privacy concerns is common in the privacy and consumer protection arenas. Advocates often take the position that self-regulation is inadequate due to both a lack of enforcement and the absence of legal redress to harmed individuals. Industry tends to strongly favor self-regulation, stating that it results in workable, market-based solutions while placing minimal burdens on affected companies. These positions, while in tension, have both accurately described the self-regulatory process. A close look at the enactment of federal privacy legislation over the years reveals some common elements that move both parties toward supporting legislation, and supports CDT's belief that the time is right to enact a federal baseline of privacy protections for the electronic environment.
A. Learning from successful efforts to enact privacy legislation
Industry positions on the desirability of legislative or regulatory privacy solutions have varied. While industry has frequently opposed legislative efforts, at times, it has vigorously supported, and even actively pursued, privacy legislation where it believed a law was necessary to build public trust and confidence in a particular industry or technology. The majority of industry-supported privacy efforts have resulted in legislation that limits the ability of government -- particularly law enforcement -- to gain access to information about individuals. However, a number of industry-supported privacy laws have actually placed limits on the private sector's use of personal information. In such instances good industry actors have led the way, crafting self-regulatory policies that are the prototype for subsequent legislation supported by self-regulated players who for reasons of public trust, liability, and/or government concern want to bind bad industry actors.
It is instructive to examine the factors that have led industry and the public interest community to join together in support of privacy legislation aimed at regulating both government and private sector use of personal information.
The Electronic Communications Privacy Act of 1986 (ECPA), which updated the 1968 Wiretap Act, was the result of a collaborative public interest/private sector effort. Industry feared that without legal protection against eavesdropping and interception, consumers would be reluctant to use emerging electronic media, such as cellular phones and email, to communicate. The resulting law extended legal protection akin to that provided to First Class mail, and was developed and supported by a diverse coalition of business, civil liberties, and consumer advocates who understood that consumers would be unwilling to fully embrace electronic mail and other new technologies without strong privacy protections.
Similarly, 1995 amendments to ECPA crafted privacy protections for transactional information that was content-like in its ability to reveal facts about a person's life. In these instances, developing and enacting a legislative privacy regime was viewed by the business community as a necessary component of creating and supporting a flourishing market for their products. The nexus between privacy protection and business necessity resulted in a diverse public interest/industry coalition supporting increased protections for transactional data.
Historically, for privacy legislation to garner the support of industry it must build upon industry efforts to regulate itself -- binding bad actors to the rules being followed by industry leaders -- or, be critically tied to the viability of a business service or product as with the Video Privacy Protection Act and the Electronic Communications Privacy Act. Today we have the opportunity to support the work of privacy-aware companies and increase the viability of online commerce, by developing a regulatory framework of privacy protections in the private sector.
B. The stage is set for privacy legislation
The Federal Trade Commission's recent report to Congress, Privacy Online, confirmed what advocates, industry representatives and the public knew: privacy on the Internet was far from a reality. The Federal Trade Commission's three year focus on privacy had raised the level of attention and concern, but had not delivered concerted action by businesses operating online. Despite commendable efforts, some quite recent, such as BBB Online, the Children's Advertising Review Unit, the Online Privacy Alliance, and TRUSTe, the overwhelming majority of Web sites had yet to take the first step of "openness" about their use of personal information. The statistics were grimmer than many expected, with only 2% of the survey's sample posting comprehensive privacy policies and only 14% posting any notice at all.
The current state of self-regulatory activities and the brief review of successfully enacted privacy legislation above suggest that we have the opportunity to craft and pass legislation that will protect privacy and ensure the development of online commerce. Numerous surveys have documented the public's overwhelming concern with privacy online. Many responsible industry actors are engaged in efforts to craft privacy rules; unfortunately many more companies have yet to take any visible action to protect privacy. We have the opportunity to develop privacy rules that establish strong protections for individuals, a fair baseline for a competitive marketplace, and a framework of trust for electronic commerce.
|IV. Recommended legislative proposals to protect individual privacy including children's privacy|
At this time, legislation is needed to accomplish two of these goals: the adoption and implementation of privacy policies in the private sector and the creation of legally enforceable privacy rights for individuals.
A. Privacy rules for the private sector
Over the past three years, the FTC has accumulated the knowledge and expertise necessary to set comprehensive guidelines for privacy online and in electronic commerce. However, the FTC has indicated that it does not believe it has the authority to affirmatively establish baselines in this area. The FTC's authority over privacy should be clarified, and it should be explicitly given authority to establish baselines to protect the privacy of personal information (based on the Fair Information Practices discussed above). Specifically, the FTC should be directed to establish baselines that require companies to:
In the area of children under 13 years of age, the FTC should be directed to establish specific rules that address the inability of young children to comprehend and consent to the collection and use of personal information; the need for parental involvement in children's online activities involving personal information; the potential risk to children posed by the public posting of information that facilitates contact (both online and offline) with a child; and the need to ensure that business practices and privacy protections do not inappropriately interfere with children's ability to access information and receive information that they have requested and the benefits of interactivity.
B. Limits on use and disclosure of personal information by Web sites and Internet service providers
In addition, to ensure that personal information collected by Web sites and Internet service providers is protected from disclosure, and that government access to personal information collected and stored at Web sites is limited, the Electronic Communications Privacy Act (18 U.S.C. 2703) should be amended to:
|V. Technology and Privacy|
In addition to crafting federal rules to protect privacy, we must look to technologies that protect privacy. Such technologies can provide protection across the global and decentralized environment of the Internet where law or self-regulation may fail. Technology can provide a shield around the individual's actions, communications and identity, providing confidentiality, pseudonymity or anonymity. It can also serve as a mediator or facilitator capable of expressing and monitoring data practices and policies.
Unfortunately, society is best acquainted with technologies that enable personal activities and commercial behavior to be tracked. Traditionally, advances in technology have met the government's and private sector's proclaimed need to monitor, evaluate, and trace the behavior of individuals. Technology has eroded individual privacy by enabling massive data collection and manipulation, enhancing the ability to track activities, and fostering the use of data for purposes unintended and unforeseen by the individual data subject. In past years, national legislatures and international bodies have often stepped in to address risks to individual privacy posed by advances in these kinds of technologies.
Current trends in computing offer an opportunity to shift the relationship from privacy versus technology to privacy enhanced by technology. The effects of distributed network computing are not yet clear in the area of privacy. Interactive media has increased the non-consensual, surreptitious collection of personal information and greatly facilitated tracking of personal and commercial behavior. These trends, if left unaddressed, will continue technology's tradition of eroding individual privacy. However, there is growing evidence that the rapid decrease in cost and expertise needed to develop and use information technology coupled with the decentralized nature of the global network can be harnessed to significantly alter technology's traditional relationship to privacy.
A number of technologies have been put forward for protecting or enhancing privacy in networked environments. They vary from tools that provide near anonymity to those that seek to provide openness about data practices and foster informed decisions by individuals. The technologies differ in their ability to respond to and support the varied privacy concerns that arise in relationships, interactions and roles.
As discussed above, networks generate, collect, and store vast amounts of data -- yet individuals are rarely aware of these activities. In that context, technologies are being designed to facilitate transparency of data practices, enable consent to be withheld or communicated, minimize data collection, provide anonymity, and enable secure exchanges of information where appropriate. Many technologies that support privacy rely on cryptography. Cryptography is essential to ensuring individual privacy in network environments. Various applications of cryptography provide individuals, and entities, with mechanisms to protect communications and information while in transit and during storage, and to shield the individual's identity. It is a key element of technologies such as digital certificates and electronic cash. Cryptographic methods also offer new opportunities to minimize the collection of personal data, by enabling secure but anonymous payments, transactions, and interactions. Technology coupled with policy can play an important role in fostering the implementation of privacy protections on global information networks like the Internet.
Digital technology generates, collects, and captures a vast amount of data about the flow of information, communications, and interactions. Where the individual's identity is revealed, or can be readily derived, in connection to these activities, the digital environment creates an unsurpassed capacity for tracking of personal activities and commercial behavior. Technologies that minimize or eliminate the collection of information about the individual's identity are therefore essential to privacy protection in the online environment.
A number of technologies have been developed that eliminate the collection of identity information, thereby enabling anonymous transactions. Eliminating the collection of information eases the task of protecting privacy. Technologies that prevent entities from collecting data allow individuals to engage in activities without privacy ramifications. By eliminating data collection, tools of anonymity mitigate the need for other principles of data protection. However, anonymity alone will not support the full range of interactions, relationships, and communications individuals engage in on international networks.
"Anonymizers" protect an individual's identity while "surfing," or browsing, the World Wide Web. Functioning as a proxy or intermediary between the individual's browser and the server from which they are retrieving information, the Anonymizer removes information that could potentially reveal the individual's identity to others. In general they should be accountable for complying with measures which give effect to Fair Information Practice principles. Because they may provide a central point of information about an individual's activities, their compliance with these principles is especially important, particularly with regard to the establishment of: frequent cycles of log destruction; limits on reuse; limits on access by third-parties; and security safeguards.
Digital Cash can vastly reduce the need for the collection and revelation of identity information. By providing alternative methods of authenticating value, the online environment can afford cash-like anonymity while providing some of the protections against theft associated with traditionally data intensive payment mechanisms. The ability to engage in cash-like transactions in the online environment is important to the protection of privacy. The enhanced data generation and collection that occur during the process of browsing a virtual store front (a merchant's World Wide Web site) increases the privacy concerns associated with the revelation of identity during the payment process. The capacity to connect information far in excess of the specifics of a given financial transaction to the individual's identity increases the risks to individual privacy relative to the offline world.
Like Anonymizers, the development of electronic payment mechanisms that protect privacy hinges on the use of strong cryptography and the creation of a robust public key infrastructure to support its use.
Digital Certificates can allow for the verification of an individual's permission to engage in activities, access information, or enter restricted areas without verifying the individual's identity. They can also be used to verify identity. Digital certificates and other credentialing mechanisms can limit the need to collect personal data by verifying attributes rather than identity. However, they are just as likely to be used to tie identity and attributes together with a single certified digital identity. Digital certificates can be issued on a purpose-specific basis, in which case it would be possible to limit the collection of information that could be used for other purposes. However, digital certificates can also be designed for multiple purposes, making it harder to control the collection and use of information.
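The attribute-not-identity idea can be illustrated with a toy credential. This is a sketch under stated assumptions: the issuer key, attribute names, and use of an HMAC are all hypothetical stand-ins (real certificate systems use public-key signatures, not a shared secret); the point is only that the signed payload contains an attribute such as "over 18" and no identity.

```python
import hashlib
import hmac
import json

# Hypothetical issuer key for illustration; real systems use public-key signatures.
ISSUER_KEY = b"demo-issuer-key"

def issue_certificate(attributes: dict) -> dict:
    """Issuer vouches for a set of attributes; note no name or identity is included."""
    payload = json.dumps(attributes, sort_keys=True).encode()
    signature = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"attributes": attributes, "signature": signature}

def verify_certificate(cert: dict) -> bool:
    """A verifier checks the issuer's signature over the attributes alone."""
    payload = json.dumps(cert["attributes"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(cert["signature"], expected)

cert = issue_certificate({"over_18": True})  # attribute verified, holder anonymous
print(verify_certificate(cert))              # True
```

A purpose-specific certificate of this kind lets a site confirm eligibility without ever learning who the visitor is; a single all-purpose certified identity would have the opposite effect.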
In response to public and policy-maker concerns regarding the surreptitious collection of information some of those responsible for network specifications and standards are moving towards designs and implementations that make data generation and collection more obvious. Concern over the privacy implications of "cookies" and particularly the collection of information about individuals' activities across unrelated Web sites enabled by some implementations, caused a ripple through the technical community. The initial response was the addition of a "cookie prompt" which alerts individuals that a Web site wishes to place a "cookie" on their browser. Broader responses include the current attempt by members of the Internet Engineering Task Force (IETF) to address privacy concerns with a rewrite of the "cookie" standard, and the availability of various technological tools that allow users to delete and/or disable "cookies."
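The cross-site tracking concern can be made concrete with a sketch of the rule a privacy-conscious browser might apply: accept a cookie from the site the user is visiting, but refuse one set by an unrelated third party. This is a deliberate simplification; the actual cookie standard's domain-matching rules are more nuanced, and the hostnames below are hypothetical.

```python
from urllib.parse import urlparse

def accept_cookie(page_url: str, cookie_origin: str,
                  block_third_party: bool = True) -> bool:
    """Accept a cookie only when its origin matches the host of the page
    being viewed, unless the user permits third-party cookies."""
    page_host = urlparse(page_url).hostname
    cookie_host = urlparse(cookie_origin).hostname
    if page_host == cookie_host:
        return True
    return not block_third_party

# A cookie set by the site the user is visiting is accepted...
print(accept_cookie("http://news.example.com/story", "http://news.example.com"))  # True
# ...while one set by an unrelated advertising server is refused.
print(accept_cookie("http://news.example.com/story", "http://ads.tracker.net"))   # False
```

It is precisely this kind of third-party cookie, placed across unrelated Web sites, that enables the tracking described above.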
The World Wide Web Consortium's (W3C) Platform for Privacy Preferences (P3P) is a technical effort to provide a framework for implementing fair information practice principles on the Internet. The P3P effort attempts to leverage the unique characteristics of the Internet -- interactivity, real-time communication, and capacity to facilitate and support end-user decisions -- to facilitate privacy protection. The goal of the P3P project is to provide a common framework upon which various privacy policies and laws can be expressed, communicated, and complied with.
The Platform for Privacy Preferences provides a simple communication tool and language for the expression of data practices. In addition, the Platform for Privacy Preferences allows individuals to consider the data practices of an entity before interacting with it. Openness about data practices is likely to enhance the individual's ability to make choices that protect privacy and assist with the implementation of national and international policies.
The privacy "language" recently released by the W3C's Platform for Privacy Preferences Vocabulary Working Group is intended to be descriptive, as opposed to normative. It allows various statements of information practice, thereby supporting various policies and legal regimes. As it is intended for global use, the language was crafted with attention to existing fair information practice principles as reflected in national laws and self-regulatory codes. While it has been critiqued for being both over- and under-inclusive, the vocabulary is a first attempt to provide a language for privacy practices on the Web.
P3P does not establish preset limits on the collection of personal information; rather, it promotes the ability of the individual, or those acting on their behalf, to set their own limits on the collection of information by others.
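The basic P3P interaction -- a site declares its practices, the user agent compares them against the individual's preferences before proceeding -- can be sketched as follows. The practice names are hypothetical illustrations, not the actual P3P vocabulary.

```python
def acceptable(site_practices: dict, user_preferences: dict) -> bool:
    """True only if every data practice the site declares it engages in
    is one the user's stated preferences permit."""
    return all(user_preferences.get(practice, False)
               for practice, used in site_practices.items() if used)

# A site's declared practices (illustrative names, not the real P3P vocabulary).
site = {"collects_email": True, "shares_with_third_parties": False}

# This user permits email collection, so the interaction may proceed.
print(acceptable(site, {"collects_email": True}))   # True

# This user does not, so the user agent can warn or withhold consent.
print(acceptable(site, {"collects_email": False}))  # False
```

Note that the comparison happens before any data changes hands, which is what allows consent to be meaningfully withheld.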
The development of technological tools that enhance privacy should be promoted. Tools that facilitate anonymous interactions and those that allow individuals to control the flow of personal information once revealed are important to the protection of privacy in the online environment. The technological mechanisms examined above are responsive to some of the obstacles the Internet poses to traditional methods of policy implementation. Many can be independently deployed by the individual and require no reliance on, or agreement with, the government or other party. They may provide protection in environments that lack legal or other policy protections for privacy, lessening concerns about citizens' interactions with entities outside national borders. They may also provide protection which exceeds that available under existing law. In addition, while they may not answer the normative question, "What is the appropriate policy?" the existence of technologies that support data privacy will force decisions about data collection and use into stronger relief.
The rise of technologies that empower individuals to affirmatively control personal information on international networks presents an opportunity to fundamentally shift the balance of power between the individual and those seeking information. However, they must be viewed within the larger context of other efforts to produce cohesive privacy protections in the online environment. Currently US encryption policy is interfering with the availability of technical tools that protect privacy. Congress should seek to increase the availability of encryption and promote the development of other privacy-enhancing technologies.
Privacy protections must keep pace with changes in technology and society's use of technology. As we consider privacy in the changing communications environment we must question past assumptions and the legal distinctions based upon them. More importantly, we must ask whether they provide protections reflective of our commitment to individual privacy, autonomy, dignity, and freedom. Privacy protection in the electronic commerce environment will best be achieved through a combination of legislation, self-regulation and technology.
Establish limits on the disclosure and use of personal information by private entities. Both the Federal Trade Commission and the Department of Commerce are engaged in initiatives designed to promote "fair information practice principles" in the online environment. We are encouraged that Congress is exploring protections for individual privacy during private sector activities. In considering this issue we recommend that Congress: 1) authorize the Federal Trade Commission to establish baselines for protecting privacy grounded in the Code of Fair Information Practices developed by the Department of Health, Education and Welfare (HEW) in 1973 and the Guidelines for the Protection of Privacy and Transborder Flows of Personal Data, adopted by the Council of the Organization for Economic Cooperation and Development in 1980; and 2) amend the Electronic Communications Privacy Act to clarify limits on government access to personal information and limit disclosures to third-parties.
Encourage the development and implementation of technologies that support privacy on global information networks. Technological mechanisms for protecting privacy are critically important on the Internet and other global media. Developing meaningful privacy protections in the online environment requires us to realize that our laws and Constitutional protections may not follow our citizens, their communications, or their data as it travels through distant lands. Technology can provide protections regardless of the legal environment.
Strong encryption is the backbone of technological protections for privacy. Today technical tools are available to send anonymous email, browse the World Wide Web anonymously, and purchase goods with the anonymity of cash. The World Wide Web Consortium's Platform for Privacy Preferences, currently under development, will provide an underlying framework for privacy -- allowing Web sites to make their information practices available to visitors and individuals to set privacy rules that control the flow of data during interactions with Web sites. This effort has involved non-profit, for-profit, and government representatives.
The US should encourage the development of privacy-enhancing technologies that address the need either to eliminate data collection, or where data collection occurs: to limit the data collected; to communicate data practices; and to facilitate individualized decision-making where consistent with policy.
Thank you for the opportunity to participate in this important discussion about protecting privacy in the online environment.