
Do Not Track Kids Bill Revives Minors’ Online Privacy Debate

The Do Not Track Kids Act (DNTK) has resurfaced, bringing the debate over minors’ online privacy back to the federal level. Sponsored by now-Senator Markey and Representatives Barton and Rush, this year’s bill is largely the same as the Markey-Barton bill of 2011. As we noted in 2011, the DNTK bill’s use of the Fair Information Practice Principles framework is a good approach to protecting the privacy of users’ information – but extending those protections only to users in a certain age bracket raises significant complications for users and operators alike.

The 2013 bill also brings back the “Eraser Button” concept (though it’s now simply called “Removal of Content”). Online ‘eraser buttons,’ and their European counterpart, the ‘right to be forgotten,’ inherently raise critical questions about the interaction between one user’s privacy interest in data she’s shared and another’s free expression right to quote or comment on public information. I’ll discuss the challenges raised by ‘eraser buttons’ in more detail below, but one thing is clear: Piecemeal privacy regulations – whether a state-by-state approach or laws that only protect certain age groups – are not going to achieve the kind of comprehensive protection for personal data that Internet users deserve. Congress should focus its attention on developing and passing a baseline consumer privacy bill that recognizes the same privacy rights for everyone and avoids drawing operators into a tangled thicket of regulations.

“Fair Information Practice Principles” at the heart of the bill

CDT has long advocated that the Fair Information Practice Principles (FIPPs) should guide the development of flexible, substantive protections for user privacy, and it is heartening to see legislators using this framework when they contemplate privacy rules. The FIPPs were codified by the OECD in its 1980 guidelines on the protection of privacy in transborder data flows, and they have since been used around the world as organizing principles for data protection. Broadly speaking, the FIPPs require companies to provide users with notice and choice about what data companies will collect, use, and retain; to institute meaningful security and redress measures; and to promote data hygiene. The FIPPs have been embraced by many as the basis for data privacy protection, in large part because they create transparency around companies’ data practices and allow users to entrust their personal data to companies through affirmative notice and consent.

The DNTK Act of 2013 would codify FIPPs protections by establishing several principles that companies must follow in protecting minors’ data: collection limitation, data quality, purpose specification, retention limitation, security safeguards, openness, and individual participation. In general, the principles described in the bill encompass the typical FIPPs framework of privacy protections that CDT has supported in previous baseline privacy legislation.

Age-based privacy regulation is not the right approach

One critical fault with DNTK 2013 as a privacy bill, though, is that it would only require website operators to provide FIPPs-based protections if their sites are “directed to minors” or if they know that a particular user is a minor – and the bill defines “minor” as someone “over the age of 12 and under the age of 16.” Setting aside the significant issue of determining when a site is “directed at” (only?) 13-, 14-, and 15-year-olds, this narrow scope produces an absurd outcome: I doubt anyone involved in drafting this legislation really believes that a website operator should protect user data only when it knows a user is between the ages of 13 and 15. Such an arbitrary and narrow scope would leave the majority of users outside the important FIPPs protection framework.

DNTK 2013 is framed, in part, as an update to the Children’s Online Privacy Protection Act (COPPA), meaning the bill is stuck with COPPA’s existing legal definition of the term “child” as an individual under the age of 13. COPPA requires parental consent for the collection of children’s personal information, which is appropriate for younger children. But as we have argued and the FTC has recognized, it would be inappropriate to require parental consent for older teens to share their own personal contact information, since teens have independent First Amendment rights to access information. In trying to embody both parental-consent obligations for younger children and user-consent requirements for teen users, however, the DNTK 2013 bill creates a new legal age grouping, 13-to-15-year-olds, and then awkwardly applies certain protections only to them.

All of this illustrates why age-based privacy protections are a bad idea: They require legislators to justify why the government has an interest in extending certain protections only to certain age groups; they raise the potential for significant confusion among users of all ages about when and how their data is protected; and they place disparate regulatory burdens on operators. This incentivizes operators either to collect more age or date-of-birth information from all of their users, or – as we’ve seen with COPPA’s under-13 category – to simply prohibit users in the defined age group from using the site or service. Neither outcome benefits the privacy or the free expression and access-to-information rights of users of any age, which makes this kind of age-based regulation difficult to justify.

Eraser Buttons aplenty

DNTK 2013 also takes an age-based approach in the content-removal section of the bill, creating a right for users of all ages to “erase or otherwise eliminate” content or information, submitted by the user to the site, that contains or displays information about children or minors under 16. This provision is something of an improvement over the 2011 bill’s eraser button, in that it more narrowly scopes the erasure right to information the user herself provided to the operator. (Without this narrowing, an erasure right could allow a user to order the takedown of truthful public information about her that another user – a reporter, a political opponent, or a dissatisfied customer, among others – has posted in their own online space.)

The scoping of the DNTK 2013 content-removal right is a bit odd, focusing only on content that includes statutorily defined personal information about someone under 16. The Eraser Button law passed in California this summer, in contrast, covers any content or information posted by a user, but is available only to users under 18 who live in California. With other states considering their own versions of this law, the potential for inconsistent state legislation in this arena is obvious. The DNTK 2013 bill, as federal law, would preempt inconsistent state law, but it’s far from clear how the overlapping rights granted by California (let alone other states) and this federal provision would interact.

The comparison also raises fundamental questions about the purpose of a content-removal right for users: ‘Eraser button’ laws seem to be primarily concerned with allowing users to remove information they have previously made public. The California law provides for removal of any content or information posted by a minor user, likely under the theory that a minor user should be able to revise and amend her digital identity as she ages. The federal proposal would provide for removal of any personal information (defined in COPPA as the categories of information that allow a child to be contacted) about people under 16, possibly under a child-safety theory that treats this type of information as more sensitive. Both of these approaches are necessarily limited by third-party users’ rights to display and discuss truthful public information posted by or about other users, even minor users. And neither approach deals with operators’ retention of this information or with law enforcement’s ability to demand access to information held on third-party servers. In short, eraser-button regulations may change the experience for some users in a way that makes them feel they have more control over the information they share, but they are far from effective at truly “erasing” information from the Internet.

What this really points to is the need for Congress to focus on passing comprehensive user privacy legislation at the federal level. Piecemeal protections – those that apply only certain FIPPs, cover only certain subgroups of users, or protect only residents of certain states – will not deliver the clear and consistent privacy protections that all users deserve. They are more likely to pull operators, states, and users into protracted legal wrangling as online service providers struggle to comply with a tangled thicket of regulations and users remain confused about what rights they actually have.

We’d much rather see Congress take this bill’s good idea of using FIPPs as a framework and use it to develop privacy protections that will benefit all of us.