Center for Democracy and Technology
Testimony of Jerry Berman, Executive Director, Center for Democracy and Technology
Before the Subcommittee on Telecommunications, Trade and Consumer Protection September 11, 1998

My name is Jerry Berman, Executive Director of the Center for Democracy and Technology. The Center is pleased to participate in this hearing at the request of the Subcommittee. We welcome the opportunity to address a critical issue: how to achieve the goal of protecting children from inappropriate material on the Internet consistent with constitutional values and the growth and health of the Internet.

The Center for Democracy and Technology (CDT) is an independent, non-profit public interest policy organization in Washington, DC. The Center's mission is to develop and implement public policies to protect and advance individual liberty and democratic values in the new digital media. The Center achieves its goals through policy development, public education, and coalition building.

From its inception in January 1995, the Center has played a leading role in policy debates on how to protect children from inappropriate material online. In particular, we view this issue through the experience of the legislative process that resulted in Congress’ first attempt to regulate content on the Internet — the unconstitutional Communications Decency Act (CDA). As the coordinator of the Citizens Internet Empowerment Coalition (CIEC), CDT joined with the American Library Association, and others, to rally civil liberties organizations, the library and publishing communities, Internet service providers, and individual users of the Internet to challenge the CDA. In federal district court in Philadelphia, the coalition undertook an educational effort to demonstrate for the judges the unique nature of the Internet — something Congress had failed to consider when it enacted the CDA. We gave the court a tutorial on the Internet. The Supreme Court decision in Reno v. ACLU (hereinafter the "CDA decision") striking down the CDA on First Amendment grounds was largely based on the factual findings of the lower court detailing the nature and characteristics of the Internet.

Our message today is simple: The legislative proposals before the Subcommittee today repeat the mistakes of the CDA. They fail to take into account the special aspects of this potentially powerful medium. They are ineffective, unconstitutional, or unnecessary.

I. The CDA decision

In the CDA decision, the Supreme Court struck down a sweeping attempt by Congress to regulate a broad and undefined category of speech, "indecency," across a wide range of Internet interactions including email, chat groups, and the World Wide Web. As the Supreme Court recognized, the Internet offers new and unique opportunities to maximize the ability of individuals and families to choose the content worthy of their attention. The Court found that users of the Internet are not assaulted by material, and that the risk of encountering unwanted "material by accident is remote because a series of affirmative steps are required to access specific material." The Court concluded that the Internet should not be treated like a broadcast medium. As the Court stated, "Unlike the conditions that prevailed when Congress first authorized regulation of the broadcast spectrum, the Internet can hardly be considered a scarce expressive commodity. It provides relatively unlimited, low-cost capacity for communication of all kinds..." The Internet is a global medium and much of the material that would be considered offensive is produced overseas. "Unlike other media, there is no technologically feasible way for an Internet speaker to limit the geographical scope of his speech…or to ‘implement [] a system for screening the locale of incoming’ requests."

II. The Current Proposals

Before the Subcommittee today are at least seven well-intentioned but flawed efforts to address the complex problem of protecting children from speech that is considered inappropriate for them. The bills take three distinct approaches, reflecting the complexity of the issue and the diversity of opinions about the role federal legislation can play in solving it. Each of the seven bills before you today is narrower than the CDA and each reveals an effort to more appropriately balance constitutional values in the effort to protect children. Nevertheless, none of the bills succeeds in this effort.

The bills can be placed in three general categories:

The Child Online Protection Act (H.R. 3783), sponsored by Representative Oxley (R-OH), requires entities that sell or transfer information considered "harmful to minors" to restrict access to those under 17. The bill seeks to erect virtual walls around inappropriate information on the World Wide Web that those under 17 cannot climb — in effect zoning the Web.

Two bills, the Safe Schools Internet Act (H.R. 3177) sponsored by Representative Franks (R-NJ) and the Child Protection Act (adopted by the Labor, Health and Human Services, and Education Appropriations Subcommittee) sponsored by Representative Istook (R-OK), would condition federal e-rate funding for schools and libraries on the use of filtering technology. A third bill, the E-Rate Policy and Child Protection Act (H.R. 3442) sponsored by Representative Markey (D-MA), conditions e-rate funding on the adoption of policies outlining "acceptable use" of the Internet.

Three bills — the Internet Freedom and Child Protection Act (H.R. 774) sponsored by Representative Lofgren (D-CA), the Communications Privacy and Consumer Empowerment Act (H.R. 1964) sponsored by Representative Markey (D-MA), and the Family-Friendly Internet Access Act (H.R. 1180) sponsored by Representative McDade (R-PA) — require Internet access providers to make filtering software designed to limit children’s access to inappropriate information available to subscribers at the time they sign up for service.

III. By what standards should we measure the proposals?

There are three key factors to consider in weighing proposals to protect children from inappropriate content: their effectiveness, their impact on constitutionally protected speech, and the burdens they place on those who provide access to the Internet.

Assessing the seven bills in terms of effectiveness, protection of constitutionally protected speech, and burdens on those who provide access to the Internet, CDT concludes that the bills fall short. Some appear unconstitutional on their face while others are likely to be applied in a fashion that violates First Amendment and privacy values. Others, while probably constitutional, are unlikely to substantially address the problem at issue. Several require the private sector, libraries and schools to engage in efforts that are already well underway.

The bills will prove ineffective at meeting the goal of protecting children. In the past, it was assumed that governments could control print or broadcast material within their borders, and that publishers had some ability to control and direct the distribution of their materials. The physical nature of the media by which information and ideas were produced and disseminated meant that they were controllable.

On the Internet, neither governments nor publishers can control the distribution of material made available over the Web. As the findings in the CDA case state, "Once a provider posts its content on the Internet, it cannot prevent that content from entering any community. Unlike newspaper, broadcast station, or cable system, Internet technology gives a speaker a potential worldwide audience."

The global and decentralized nature of the medium and the fact that it does not allow publishers to easily discern who is seeking and requesting information are barriers to the effective implementation of laws to protect children from information online. In the CDA decision, the Court found that objectionable information is likely to come from outside the US and be unreachable by US laws. "The district court found that a ‘large percentage, perhaps 40% or more, of content on the Internet originates outside the United States’…" "Because of the global nature of the Internet, material posted overseas is just as available as information posted next door." In addition it is difficult to discern, and make access decisions based on, age.

Several of the proposals will limit access to constitutionally protected speech and are not narrowly tailored to meet the government’s interest in protecting children from inappropriate information.

Three of the bills — the Child Online Protection Act (H.R. 3783; Oxley), the Safe Schools Internet Act (H.R. 3177; Franks), and the Child Protection Act (Istook) — will in their application limit adults’ and older minors’ access to constitutionally protected information. The "harmful to minors" bill attempts to nationalize a standard the Supreme Court has always tied to local community standards. The school filtering bills are likely to result in the filtering of speech far beyond what is considered obscene or harmful to minors. As the Supreme Court restated in the CDA decision, "‘the level of discourse reaching a mailbox simply cannot be limited to that which would be suitable for a sandbox,’ and this is so ‘regardless of the strength of the government’s interest’ in protecting children." Despite the more limited scope of these bills, we believe they are not narrowly tailored to address the problem at issue.

Several of the proposals will burden those who provide access to the Internet with little benefit to children.

Under four of the bills, Internet access providers are required to take steps to control the content available to minors. While seeking to exercise control over content through ISPs may at first glance seem attractive, making them responsible for information that merely travels through their systems would fundamentally change the nature of the Internet and is practically impossible. ISPs cannot easily monitor the enormous quantity of network traffic to stop the incoming flow of material. Selectively disabling access or limiting transmission to particular users is complicated and in many cases practically impossible. Electronic networks typically do not allow for the identification of particular users or their geographical location.

The goal of providing children with enriching experiences on the Web that reflect the norms and values of their parents, and the communities in which they live, is one shared by many organizations who oppose these bills, including CDT. CDT believes that as a society we have a responsibility to protect children from information deemed inappropriate, and to provide those responsible for our children’s well being with the information, resources and tools to accomplish this goal. But we firmly believe that achieving this goal must be accomplished in a manner that is consistent with First Amendment values and respects the diversity of parental and community values across the nation.

IV. Analysis of proposals

A. Harmful to minors

The Child Online Protection Act (H.R. 3783; Rep. Oxley) is narrower than the CDA. It requires entities engaged in the business of transferring or selling over the World Wide Web information deemed "harmful to minors" to place that information behind a barrier surmountable only by those over 17. Unlike the "indecency" standard of the CDA, the bill employs a term that courts have recognized: "harmful to minors." However, the bill strays from existing "harmful to minors" law, which is based upon local community standards, by seeking to establish a national definition of information that is considered "harmful to minors."

Harmful to minors should be based on community norms, not a national standard

The core of the Child Online Protection Act (H.R. 3783; Rep. Oxley) is to set a single national standard defining speech that cannot be made available to minors over the World Wide Web. The creation of a national "harmful to minors" standard will constrain the ability of communities to determine what information is appropriate for their children. Centralizing content decisions in the federal government runs counter to existing "harmful to minors" law as articulated by the Supreme Court.

The US Supreme Court has never approved of a single, national obscenity standard, nor has it approved a "harmful to minors" statute based on a national, as opposed to local, standard. The Court’s decisions defer to local community standards. As the Court stated in the landmark obscenity decision Miller v. California, there cannot be:

fixed, uniform national standards of precisely what appeals to the "prurient interest" or is "patently offensive." These are essentially questions of fact, and our Nation is simply too big and too diverse for this Court to reasonably expect that such standards could be articulated for all 50 states…

Replacing local decision-making with federal standards will have unintended consequences. It may create a "lowest common denominator" effect, in which the community least tolerant of speech is able, by default, to set the national standard. This could greatly reduce the amount of information that children and adults can access in areas with greater tolerance for speech. Alternatively, a national standard may limit conservative communities’ ability to adopt standards that go beyond a federally defined baseline.

In addition, the novel approach of a national "harmful to minors" standard raises vagueness concerns. How can affected entities determine what a cross section of the nation will find harmful to minors? Without clearer guidance — which the bill on its face suggests is necessary — this novel national standard provides little information about the activities proscribed by the bill.

Broad scope

While the bill seeks to govern commercial actors, it covers all entities engaged in "the business of selling or transferring" material that is "harmful to minors" by means of the Web. The entities affected by the bill go well beyond commercial pornographers: the definition potentially includes Internet access providers, bookstores, and non-profits that offer items for sale. The bill places in jeopardy not only the creator of the content but also all who may sell or transfer it over the Web, whether or not doing so is their business and regardless of whether money is exchanged. ISPs do not know what information is transferred across their systems. Many of the entities likely to be affected by the bill are unable to make use of the age verification techniques that comprise the affirmative defenses, due to cost and/or availability.

The affirmative defenses found in the bill will spur the collection of personal information about individuals and their First Amendment activities.

Under the First Amendment, a barrier to accessing information must be the least restrictive form that the medium supports, and the Court has struck down barriers that condition access in ways that may chill individuals’ exercise of their First Amendment right to read or access information. Due to the state of current age verification systems, the affirmative defenses found in the bill will push individuals into the position of having to disclose personal information — in some instances including name, address, and Social Security number, in addition to a credit card number — to the publisher or a third party in order to access information. Current age verification technologies tend to be identity driven. Reliance on such systems will create records of individuals’ First Amendment activities. Currently there are no rules limiting the private sector’s use of such information, and it is unclear whether law enforcement access to these records would be constrained by existing law. Conditioning adult access to constitutionally protected speech on a disclosure of identity raises troubling First Amendment and privacy issues. The defenses pose a Faustian choice to individuals seeking access to information: protect privacy and lose access, or exercise First Amendment freedoms and forgo privacy.

The bill does not use the least restrictive means.

The CDA decision sent a clear signal to Congress that, when seeking to regulate speech on this new medium, government must use the least restrictive means available. As the Court restated in the CDA decision, "‘the level of discourse reaching a mailbox simply cannot be limited to that which would be suitable for a sandbox,’ and this is so ‘regardless of the strength of the government’s interest’ in protecting children."

While H.R. 3783 seeks to regulate access to a narrower category of speech than the CDA, that does not mean it will pass the least restrictive means test. The burdens the bill places on speech may be found too great in light of the inability of national censorship laws to affect the availability of information from non-domestic sites on the World Wide Web and from a variety of other Internet media — such as Usenet newsgroups, chat, bulk email, and electronic bulletin boards — not to mention non-electronic media.

The CDA decision, and the findings of fact upon which it is based, identified filtering and blocking technologies as a narrow, media-appropriate way of providing families with the means to protect their children while meeting the diversity goals of the First Amendment. Congress has not held hearings to determine whether technical tools or this bill would be the least restrictive means of protecting children. There has been no study, no discussion, and no comparison of the effectiveness of various approaches, their likely impact on speech, and their appropriateness for the Internet.

B. Protecting children in the school and library setting

The Safe Schools Internet Act (H.R. 3177; Franks) and the Child Protection Act (adopted by the Labor, Health and Human Services, and Education Appropriations Subcommittee; Istook) condition federal e-rate funding for schools and libraries on the use of filtering technology. In contrast, the E-Rate Policy and Child Protection Act (H.R. 3442; Markey) conditions e-rate funding on the adoption of an acceptable use policy. While all three are aimed at ensuring that libraries and schools take steps to protect children from inappropriate information when they are outside their parents’ eyes, they are likely to have very different impacts on constitutionally protected speech. Of all the bills, the E-Rate Policy and Child Protection Act is likely to be the most respectful of local authority and is least likely to pose constitutional problems.

Requirements to adopt filtering technology will effectively usurp local communities’ ability to set standards that reflect their values.

While a goal of the Safe Schools Internet Act (H.R. 3177; Franks) and the Child Protection Act (Istook) is to maintain local autonomy, the actual impact of the bills is likely to mirror the Child Online Protection Act’s drive toward a national standard. Unlike the national "harmful to minors" standard discussed above, the bills on their face are quite protective of community prerogatives. However, due to several factors, the impact of the bills is unlikely to match this intent.

The impact of requiring schools and libraries to implement filters is likely to be the replacement of the existing diversity of local community norms with the narrower set of views offered by companies that provide off-the-shelf filtering and blocking tools. In order to maintain funding, libraries and schools may find themselves out of step with their communities’ values. This in turn may subject them to litigation.

Similarly, the requirement to install filtering software interferes with decisions by local communities, educators, and librarians to protect children through other means. These institutions are actively pursuing solutions that are responsive and appropriate to their specific missions, goals, and constituencies. Thoughtful local decision-making would be replaced by the decisions made by private companies — many of which are shut off from public scrutiny due to lack of disclosures about the process or guidelines for blocking sites. The prospect of schools and libraries being forced by budgetary constraints to choose between forgoing funding or delegating their traditional power to unchecked private entities raises troubling First Amendment issues.

Restricting speech

While the Supreme Court has upheld the government’s right to restrict speech that it funds where the speech reflects government policy, the government may not restrict speech where the purpose of funding is to propagate a diverse range of private views. E-rate funding is explicitly designed to facilitate access to the Internet — a broad range of ideas and views — not to express a specific government policy. Several studies of commercially available filters suggest that they curtail access to information on topics ranging from gay and lesbian issues to women’s health to conservative politics, among many others. If libraries and schools are faced with a limited set of options, this approach may force them to censor more than they would choose and, in effect, discriminate against specific viewpoints.

The bills will alter adults’ ability to access constitutionally protected material in ways that will constrain and in some instances violate their First Amendment rights.

Currently, adults and children are able to access information that falls into the "harmful to minors" category in the same way they access other information online. In schools and libraries with only one terminal, the requirement to install and activate filtering software will require adults and older minors to affirmatively request access to constitutionally protected information. As noted above, the Court has stated that the government may not require adults to affirmatively request controversial but protected material in order to receive it. Acceptable use policies would avoid this problem.

C. Providing parents with access to content selection software

The Internet Freedom and Child Protection Act (H.R. 774; Lofgren), the Communications Privacy and Consumer Empowerment Act (H.R. 1964; Markey), and the Family-Friendly Internet Access Act (H.R. 1180; McDade) are aimed at making screening software designed to limit children’s access to inappropriate or unsuitable information more readily available to parents in the home. They require Internet access providers to offer subscribers such software at the time they sign up for service. These proposals are unnecessary. Private sector efforts are already well advanced to place technical tools within easy reach of parents. Congress would be wise to let the market continue on its own.

V. Alternatives to legislation

While the Congress and courts around the country have been debating whether censorship laws can protect children online, companies and non-profit organizations have responded with wide-ranging efforts to create child-friendly content collections, teach children about appropriate online behavior, and develop voluntary, user-controlled technology tools that offer parents the ability to protect their own children from inappropriate material. Unlike legislative approaches, these bottom-up solutions are voluntary. They protect children and assist parents and caretakers regardless of whether the material to be avoided is on a US or foreign Web site. They respond to local and family concerns. And they avoid government decisions about content. We would like to describe some of these initiatives to emphasize their diversity, their user-controlled nature, and their responsiveness to parental concerns.

Education, Green Spaces, and Other Initiatives

Many public-private initiatives are underway to help parents and children learn to navigate the Web safely, to create kid-friendly content zones, and to work with law enforcement to ensure children’s safety.


In addition to ongoing efforts to develop resources, educational tools and child-friendly materials, the Internet community has sponsored several public events to highlight the issue of children’s safety online, including access to inappropriate content, and to inform the public of the resources and tools to address it. The Internet Online Summit: Focus On Children was held on December 1-3, 1997. More than 650 participants representing over 300 organizations came together to ensure that steps were taken to make the online experience safe, educational and entertaining for children. Several major initiatives emerged from the Summit.

Next week, America Links Up: A Kids Online Teach-In, a broad-based public awareness campaign to ensure that every child in America has a safe, educational and rewarding experience online, kicks off. Based upon the findings, recommendations and commitments made during the December 1997 Summit, the America Links Up coalition has committed to working with the online industry, families, teachers, librarians and other children's advocates.

The campaign begins with a National Town Hall meeting in Washington, DC. The teach-in will discuss the importance of the Internet to our children's future, the pitfalls that parents and teachers should be aware of, and how adults can keep children safe when they are online. Participants will include parents and kids, industry leaders, government experts, children's advocates, teachers and librarians. The meeting will also feature the unveiling of several new resources and tools.

Acceptable use policies

Schools, libraries, and other educational and cultural community centers are already seeking ways to provide children with enriching and safe online experiences. A central component of these efforts is protecting children from inappropriate information. Approaches range broadly.

The United States Catholic Conference has developed an "Ethical Internet Use" policy under which each school or diocese adopts a policy detailing the rights and responsibilities of students, parents and teachers in Internet use. The policies are buttressed by contracts signed by students, parents and teachers. Similarly, Fremont Public Schools in Fremont, Nebraska, like many other public institutions, uses Acceptable Use Policies that educate students on how to access appropriate information and emphasize classroom supervision.

Other schools have chosen to incorporate into their Internet strategies tools that filter access at the desktop or network level and/or monitor access by students. School districts such as the New Haven Unified School District in Union City, California offer schools the ability to choose from filters that help limit access to content and access logs that help teachers monitor classroom use to ensure children’s safety. Others, such as Macomb County, Michigan, have established a countywide Internet filtering solution but allow individual schools to decide whether to employ it.

Voluntary use of blocking and filtering technology

Blocking and filtering technologies offer parents who voluntarily choose to use them an additional method of addressing children’s access to information online. While filters may be considered over- or under-inclusive by various individuals and communities, for some parents they offer a useful tool.

Filtering is widely available today. Every family that brings Internet access into the home for children has the option, often at no cost, to filter out information judged inappropriate for children and invite in that which is appropriate according to that family’s own values. In the United States, filtering software is readily available to Internet families.

Blocking and filtering technologies are easy to use and more effectively shield children from inappropriate material than a law can. Filtering software is able to keep up with the proliferation of content from millions of Internet sites around the world and across jurisdictional boundaries, and it can block inappropriate material coming from foreign Web sites.

Filtering software is capable of accommodating a diversity of family values and educational needs. As filtering software and services develop, they enable parents to share their children’s Internet experiences as appropriate to the particular child’s upbringing and maturity level.


However, to ensure that the development of filtering technologies moves forward in service of the free flow of information and the protection of children, it is crucial that parents be offered a diverse range of products and clear information about how those products operate.

Parents who choose to use these tools will only be able to choose ones that support their values if information about the products is available.

VI. Conclusion: What should Congress do now?

The infirmities of the proposed legislation ought not to lead to the conclusion that there is nothing to be done about the very real problem of Internet speech that is inappropriate for children. While communities across the country are grappling with this issue, Congress has yet to provide a forum for sustained, substantive dialogue. For this reason, the Subcommittee’s attention to this issue is particularly welcome. Increased awareness, generated by local communities, advocates, and the activities described above, has encouraged parents around the country to become more involved in their children’s use of the Net and spurred the development of voluntary blocking, filtering, and other content selection tools that assist parents in creating a positive experience for their children. Support from Congress would further and speed these important efforts.

This Subcommittee could provide a needed forum for a serious discussion of this important issue. It could begin the process of examining the alternatives available to achieve the goal of protecting children. Can we zone the Internet, and what are the risks of doing so? Should we seek to verify the age of those seeking certain materials, or will doing so create new problems? Should we develop resources and tools for parents and communities that are easier to access and use? What approach will effectively achieve our shared goals? Such an effort, not a continuing cycle of hasty legislation and time-consuming litigation, is the process through which we will ultimately make the Internet a safe place for children and realize our most cherished First Amendment values.



The Center For Democracy And Technology
1634 Eye Street NW, Suite 1100
Washington, DC 20006
(v) +1.202.637.9800 (f) +1.202.637.0968

For more information, write [email protected]