
Proposals to Children’s Privacy Rule Pose Real Problems for Free Expression and Innovation

The FTC is proposing changes to the Children’s Online Privacy Protection Act (COPPA) rule that would increase uncertainty for website operators and app developers and could bring a whole new set of sites and services into COPPA’s scope. COPPA requires operators of websites and online services that are directed to children, or that know a particular user is a child under the age of 13, to obtain verifiable parental consent before collecting the child’s personal information.

A lot has changed about the collection and use of personal information online since COPPA was enacted in 1998, and the FTC started the current Rule review process in 2010. CDT weighed in on previous rounds of comments, recognizing the need to bring COPPA up to date but cautioning the FTC that changes to COPPA’s age limit or the range of sites it covers would have severe consequences for minors’ and adults’ First Amendment rights.

The FTC has been a strong voice in keeping COPPA focused on children under 13, but, as we discussed in Ars Technica last week, several of its most recent proposals introduce vagueness and uncertainty into COPPA’s scope, which could have real impacts on online innovation and free expression. CDT, joined by the American Library Association, filed comments yesterday discussing how.

How do plugin providers know where on the web their code is installed?

Some plugins require websites that want to use their code to sign up and receive permission (e.g., through a mechanism like an API key). Many plugins, such as those used by YouTube, don’t: anyone can embed a YouTube video by copying a bit of code and pasting it into their web page.

When you visit a website and your browser activates a plugin, the plugin’s originator is sent the site’s URL, and nothing more, in what’s known as a “referrer header.” A referrer that points to, say, “moshimonsters.com” doesn’t tell the plugin operator whether the site serves up animated cartoon characters or the characteristics of classic cars. For that kind of data, plugin operators have to look elsewhere, such as to analytics providers.
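To make this concrete, here is a minimal sketch, in Python with Flask, of what a plugin operator’s server actually sees when a page embeds its code. The /widget.js endpoint is hypothetical; the point is simply that the request carries the embedding page’s URL in the referrer header and nothing about that page’s content.

```python
# Minimal sketch of a hypothetical plugin endpoint (Flask assumed).
# When a browser loads an embedded widget, the request carries a
# "Referer" header (the spelling is a quirk of the HTTP spec) naming
# the page that embedded it -- a URL, and nothing more.
from flask import Flask, request

app = Flask(__name__)

@app.route("/widget.js")
def widget():
    # The embedding page's URL, e.g. "http://moshimonsters.com/games".
    referrer = request.headers.get("Referer", "(no referrer sent)")

    # All the operator learns here is the URL itself; whether that page
    # serves cartoon characters or classic-car listings is not part of
    # the request.
    app.logger.info("widget embedded on: %s", referrer)

    # Serve the plugin's JavaScript as usual.
    return "/* plugin code */", 200, {"Content-Type": "application/javascript"}
```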

However, an analytics provider may provide only an estimate of a site’s typical user. For example, Quantcast provides this type of estimated data for moshimonsters.com. Such estimates may not be precise enough to measure an audience “disproportionately” made up of kids, as contemplated by the FTC’s proposed update to the COPPA rule.

Who Counts as an “Operator”?

One of the biggest changes the FTC proposes is to require the operator of a third-party plugin – meaning analytics providers, advertising networks, social widgets, and any other third-party code running on a given site – to obtain verifiable parental consent if it “knows or has reason to know” it is collecting information through a site directed to children. The provision is vague: operators are given no clear guidance on what type of notice would be sufficient to trigger it, only a warning that they will not be able to ignore “credible information.” But even if the FTC proposed a thoroughly clear notice-and-action regime for plugin operators, it would still be unfair to put these obligations on third parties that aren’t targeting children themselves and can’t control whether child-oriented first parties use their code. [See sidebar for more.]

Fear of these consequences could prevent plugin services from sharing their code with other sites and services, since it’s not clear that there’s anything the plugin developer could do to avoid incurring COPPA obligations. CDT argues that the responsibility for complying with COPPA should lie with the first-party operators who have the direct relationship with users, except in rare circumstances when a plugin purposefully targets children or has actual knowledge that it’s collecting children’s information.

The issue arises only because of another of the Commission’s proposed changes: adding “IP address or other persistent identifier” to COPPA’s definition of personal information, except where this information is collected to support internal operations. In the broader consumer privacy context, CDT has argued for recognizing that pseudonymous identifiers can act as “personally identifiable information” in certain circumstances. In the COPPA context, we have persistently raised the concern that wholesale coverage of IP addresses as COPPA-covered “personal information” would lead to the unintended consequence that sites directed to children could not comply with the law, because they would necessarily collect IP addresses before having any opportunity to obtain parental consent.
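The timing problem is visible at the protocol level. Below is a minimal sketch, again in Python with Flask and a hypothetical consent page, showing that the client’s IP address is in the operator’s hands on the very first request a child’s browser makes, including the request for the parental-consent page itself, before any consent could exist.

```python
# Sketch: under a wholesale reading of "IP address = personal information",
# even the page that asks for parental consent has already "collected"
# personal information, because every HTTP request carries the client's IP.
# (Flask assumed; the route and message are hypothetical.)
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def consent_page():
    # Available before the user has typed anything and before any
    # parental consent has been given or even requested:
    client_ip = request.remote_addr

    # Ordinary web servers also record this automatically in access logs.
    app.logger.info("request from %s", client_ip)

    return "Please ask a parent to review and approve our consent request."
```

This is why the internal-operations exemption matters: without it, the bare act of answering an HTTP request would already count as collecting “personal information.”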

The FTC’s proposal to provide exemptions for uses of persistent identifiers to “support the internal operations of a site or service” is a good approach, and these exemptions should be clearly and specifically stated by the FTC. We have advocated for exemptions for operations such as content delivery, site analytics, contextual advertising, identity transaction, and fraud prevention in our work on a universal Do Not Track tool.

But plugins themselves must have the exemption, too: running on the first-party website, they will be “collecting” personal information under COPPA through no effort of their own, as IP addresses are automatically transmitted to them. Without an exemption for the basic, functional uses that plugins make of IP addresses and other persistent identifiers, it will be difficult for either first-party operators or the plugin operators themselves to understand when the use of such information is exempted and when it is not. Children’s sites should be able to do contextual advertising and analytics through plugins. The effect of COPPA should not be to make children’s sites shoddy and impoverished, but a failure to extend the exemption to plugin operators would make it very difficult for plugins to comply.

We asked the Commission for another key point of clarification, regarding the liability it envisions for platform operators – general-audience services, like the Apple App Store, that can support development of a wide range of content, including content aimed at children. Fundamentally, we think responsibility under COPPA should lie with the entity making the decision to collect data from children, be it the first-party children’s site that chooses to use plugins and ad networks or the app developer who chooses to make apps for children.

What Sites Are Covered?

Another recent proposal would muddy the understanding of which sites and services the FTC considers to be “directed to children.” The FTC proposes expanding the definition of “directed to children” to include sites and services that are “likely to attract an audience that includes a disproportionately large percentage of children under 13 as compared to the percentage of such children in the general population.” Yet the FTC gives no sense of what would count as “disproportionate”, and does not adequately address its own previous acknowledgement that demographic data “is neither available for all websites and online services, nor is it sufficiently reliable, to adopt it as a per se legal standard”. It would be exceptionally complicated for site operators to gauge what proportions of their audience fall into precise age categories. And attempts to get more information about site demographics would just result in more tracking and data collection from all users.

This “disproportionate” standard would blur the line that the FTC has drawn over the years between sites intentionally, actively aiming for an audience of children and the rest of the Internet. The FTC’s current test for “directed to children” involves a number of variables – including whether the site has cartoon characters or celebrities that appeal to children, uses language pitched at a young audience, or deals in subject matter designed for children – that, taken as a whole, identify sites that appeal to children and likely don’t appeal to anyone else. Shifting away from that standard toward one that pulls in sites aimed at a general audience that happen to appeal to children as well as teens or adults would radically upset the balance that COPPA has thus far achieved.

The FTC goes on to propose that sites that may fall into the “disproportionate” category could be saved from liability if their operators ask all users for age information prior to collecting any personal information. Implementing age-screening technology would place financial and resource burdens on operators. However, between the potential breadth of the “disproportionate” standard and operators’ general inability to determine whether they meet it, this age-screening carve-out would likely seem the least of three evils to operators attempting to know where they stand under the law.

But the decade-long litigation over the Child Online Protection Act (the confusingly similarly named COPA) established that federal laws burdening operators’ ability to provide constitutionally protected material are suspect under the First Amendment. Further, requiring the provision of personal information prior to accessing protected speech is a violation of users’ First Amendment right to access information anonymously. The FTC is putting COPPA on a dangerous path by introducing even a soft version of an age-verification mandate into the Rule.