Senate Commerce Should Reject Bills Jeopardizing Online Safety for Kids and Adults

After numerous letters signed by national civil rights groups opposing the Kids Online Safety Act (KOSA; S.1409), the bill goes to markup again this week along with the Children and Teens Online Privacy Protection Act (COPPA 2.0; S.1418).

Although protecting kids online is a vital goal, the version of KOSA introduced in May does little to address the concerns CDT and many other groups have raised repeatedly: the bill jeopardizes the privacy and safety of all internet users; effectively requires online services to use invasive filtering and monitoring tools, which will restrict users’ ability to access information and speak freely online; and creates disproportionate risks for already vulnerable children.

In November 2022, more than 100 civil rights organizations sent a letter to Congress opposing KOSA. We followed up in December 2022 with a letter clarifying that the amendments to the bill did not resolve our concerns. Just last month, a number of the LGBTQ+ organizations that signed onto previous letters reaffirmed their opposition to the current version of KOSA. The latest draft of the bill still does not resolve several key concerns:

KOSA will lead online services to implement age verification, jeopardizing the privacy and safety of all users

Compared to the version of the bill debated last Congress, new language in Sec. 14(b)(2) of KOSA expressly states that nothing in the bill should be interpreted as mandating age verification functionality. But that disclaimer is essentially meaningless when the very nature of the bill requires online services to treat minors differently from adult users, because doing so requires online services to know the ages of their users, adults and children alike.

For example, KOSA requires online services to limit by default minors’ ability to communicate with other users and to enable a parent or caregiver account to manage their child’s privacy and account settings. Such settings would not be appropriate for adult users’ accounts: they would limit key functionality for adults and put adults’ privacy and safety at risk by giving another user the ability to control their communications. Applying these restrictive settings only to the accounts of minors, and not all users, will require online service providers to collect additional data from all users in order to distinguish adults from minors.

Methods of age verification that require users to hand over sensitive personal information or ID numbers raise serious privacy concerns and will exclude users who lack the proper credentials, such as adults facing homelessness or children. Moreover, requiring users to provide identifying information threatens their ability to access information and speak anonymously, which could have harmful chilling effects. Age assurance methods that use machine learning systems to predict the ages of all users raise additional concerns of bias and error: these systems often perform less accurately when assessing the ages of people with disabilities, nonbinary people, and people with darker skin tones. And in order to extend parental supervision tools to the appropriate adult account holders, online services may even have to collect sensitive documents like birth certificates or adoption paperwork to verify the parent-child relationship between two users, resulting in still more collection of sensitive personal information.

COPPA 2.0 presents similar age verification and privacy issues. It applies to websites that are directed to children (under the age of 13) and teens (between 13 and 16 years old), or that are reasonably likely to be used by children and teens. Under the original COPPA, the distinction between websites directed to young children and other websites was at least somewhat clear (think Dora the Explorer versus a sports drink website). If COPPA 2.0 passes, that distinction will be much less clear, because 16-year-olds engage with almost all of the internet.

COPPA 2.0 tries to address this through the category of “mixed audience” websites: under COPPA 2.0, a website may be directed to children or teens even if they are not its primary audience. However, mixed audience websites must age-gate (or otherwise determine the age of their users) before they can collect any data from their users, so these websites will have a strong incentive to collect age information from all of their users in order to continue collecting data from adults. And similar to KOSA, COPPA 2.0 would require parental action through “verifiable parental consent.” Truly establishing parentage may require people to show birth certificates or other documents that prove one user is another user’s parent (though in practice, the verifiable parental consent mechanisms in COPPA do not always guarantee a parent-child relationship between two users and could in some cases allow any apparent adult to provide consent for a child user). Thus, this privacy law could end up either costing most users their privacy or being easily circumvented.
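
To make that incentive concrete, here is a minimal, hypothetical sketch (in Python, with made-up function names and rules, not anything drawn from the bill text) of the logic a mixed audience site would need. Because the service cannot know which data-collection rules apply until it knows a user’s age, the gate ends up demanding a birthdate from every visitor, adult and minor alike:

```python
from datetime import date

# Hypothetical age bands mirroring the ranges described above:
# under 13 = "child", 13 through 16 = "teen", otherwise "adult".
def age_band(birthdate: date, today: date) -> str:
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    if age < 13:
        return "child"
    if age <= 16:
        return "teen"
    return "adult"

def data_collection_policy(birthdate: date, today: date) -> dict:
    """Return the data-collection rules that apply to one user.

    The structural problem: to apply the child/teen rules at all, the
    site must first learn EVERY user's age, so the gate itself collects
    a new piece of personal data (a birthdate) from adults too.
    """
    band = age_band(birthdate, today)
    return {
        "band": band,
        "behavioral_tracking_allowed": band == "adult",
        "verifiable_parental_consent_required": band == "child",
    }

today = date(2023, 7, 1)
# Every visitor, minor or adult, must answer the gate before the site
# can collect anything else.
print(data_collection_policy(date(1985, 3, 9), today))   # adult
print(data_collection_policy(date(2012, 11, 2), today))  # child
```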

KOSA gives broad power to state Attorneys General to enforce a “duty of care” against online services, which risks further restrictions on speech that disproportionately affect marginalized groups 

As we have outlined in the past, KOSA establishes a vague and overly broad “duty of care” for online service providers to “prevent and mitigate” a suite of harms to minors, and gives enforcement power to state Attorneys General (AGs). In practice, this would enable a state AG to pressure online services with the threat of litigation if they did not block or remove hosted content, available to minors, that the AG asserted could give rise to one of the listed harms. As states across the country consider laws to limit minors’ and adults’ access to information related to gender-affirming care, menstruation and reproductive healthcare, and Black history, KOSA would supercharge state AGs’ ability to coerce online service providers into blocking access to such information.

New language in KOSA’s Sec. 3 “duty of care” requires online services to prevent and mitigate certain mental health disorders “consistent with evidence-informed medical information”. Because the bill never defines “evidence-informed medical information”, this section continues to give state AGs broad latitude to selectively cite studies in support of dubious efforts to limit access to lawful and life-saving information, without judicial oversight. As ProPublica has reported, representatives in Tennessee are already doing exactly this, cherry-picking data points from unsound studies that draw unscientific conclusions about the safety of abortions and other reproductive care.

The obligation not only to mitigate but to actually “prevent” harms related to online content will lead online service providers to rely on imprecise automated tools to filter content at scale. These filtering tools are invasive and unable to parse intent: they are likely, for example, to treat content that offers support to those suffering from eating disorders the same as content that promotes or exacerbates eating disorders, resulting in over-removal of helpful resources. The shortcomings of content filters are compounded at scale, and past experience indicates that these tools are likely to disproportionately remove content from already marginalized communities, including LGBTQ+ youth. The use of such filtering technologies will also have a chilling effect on user speech: in a poll conducted by CDT, 71% of parents agreed that using monitoring and filtering technology would make their children less likely to be open and expressive in sharing their personal ideas.
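
The intent problem is easy to demonstrate. The toy filter below is a hypothetical illustration, not any platform’s actual system: it flags posts by keyword matching, which is roughly how the bluntest filtering tools work, and it cannot tell a recovery-support message from a harmful one:

```python
# Toy keyword filter: a hypothetical illustration of why blunt,
# scale-oriented filtering cannot parse intent. Real platform systems
# are more sophisticated, but exhibit versions of the same failure.
BLOCKED_TERMS = {"eating disorder", "purging", "self-harm"}

def should_remove(post: str) -> bool:
    """Flag a post if it contains any blocked term, regardless of intent."""
    text = post.lower()
    return any(term in text for term in BLOCKED_TERMS)

supportive = "If you're struggling with an eating disorder, this helpline helped me recover."
harmful = "Tips for hiding your eating disorder from your parents."

# Both posts contain the same keyword, so the filter treats the
# supportive resource exactly like the harmful content.
print(should_remove(supportive))  # True -- helpful resource removed too
print(should_remove(harmful))     # True
```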

KOSA proposes a one-size-fits-all approach to the multidimensional issue of children’s safety, which will make already vulnerable children less safe

This bill rests on the dangerous premise that all family dynamics are similarly healthy and that a one-size-fits-all approach will succeed in protecting vulnerable young people online. Although Sec. 4(e)(3)(B) limits parents’ ability to access “a minor’s browsing behavior, search history, messages, contact list, or other content or metadata of their communications”, parents will still be given significant and potentially sensitive information about their children’s and teens’ internet use. KOSA requires parental consent for account creation by children, and requires providers to equip parents of children and older minors with the ability to toggle privacy settings on any service their child uses. This requires, at the very least, sharing which sites or online services a minor is using. That information can be sensitive when a minor is using online services to access resources or care, such as Youthline, which provides peer-to-peer support for youth facing distress or abuse.

The amended KOSA continues to require platforms to expand parents’ ability to surveil their children’s activity online. While parental control tools can be important safeguards for young children who are new to the internet, KOSA’s requirements cover older minors as well and would chill the ability of older teens, aged 15 and 16, to access information privately. This will disproportionately affect already marginalized teens, particularly those who do not have healthy relationships with their parents, such as LGBTQ+ teens who are not yet out to their parents or young people experiencing domestic or parental abuse. Subjecting older teens’ online activity to surveillance will undermine trust in even the healthiest families and is likely to result in greater negative mental health effects for marginalized teens.

KOSA undermines the core operations of the education sector 

Finally, KOSA’s latest provisions undermine the work of key institutions that serve minors, including schools. Students increasingly use a wide range of technology in school settings, including learning platforms, student information systems, online gradebooks, and parental portals. Although KOSA exempts schools themselves from its provisions, it does not account for companies and other third parties acting on behalf of schools. Most notably, the provision in KOSA that broadly permits parents and children to delete minors’ accounts and related personal data applies to this kind of education technology.

That provision would allow students and their families to request the deletion of permanent records, such as unflattering grades or disciplinary actions, as well as information required to provide critical services to students, such as arranging transportation, providing nutrition benefits, or even taking attendance. Moreover, despite the bill’s overly broad and likely unintended consequences for the education sector, the proposed KOSA council includes no meaningful representation from that sector: it mentions “educators” in passing, but includes no one from the U.S. Department of Education and none of the school administrators who will find themselves on the front line of mitigating harm if this bill becomes law. KOSA, as currently written, is not ready to become law; its overbroad provisions, which lack tailoring for schools, may unintentionally burden the provision of educational and other services to students.

Protecting children’s safety online is a vital priority, but KOSA is not the way to do it. KOSA threatens the privacy of children and adults and risks depriving vulnerable youth of the access to information and secure communications technologies that they need to stay safe.