
Recent Court Opinions Cast Additional Constitutional Doubt on KOSA’s Duty of Care

The Kids Online Safety Act (KOSA), passed by the Senate earlier this year, may be considered this week in the House Energy and Commerce Committee. KOSA's path to this point has been long and winding and fraught with controversy, but this is the closest the bill has come to enactment since it was introduced over two years ago. The outcome of a House markup, should one happen, will shape its chances of becoming law.

There’s a new hitch, though. Recent court decisions from the Supreme Court and the Ninth Circuit Court of Appeals lend support to CDT’s and others’ concerns that a critical portion of the legislation likely violates the First Amendment because it will require covered platforms to censor content based on vague standards. If lawmakers are serious about protecting free expression and children online, they will need to course-correct.

KOSA’s Duty of Care

Under the Senate-passed version of KOSA, covered services would be required to exercise “reasonable care” in the creation or implementation of any design feature (defined by the bill to include features such as personalized recommendation systems, notifications, and appearance-altering features) to prevent and mitigate certain harms to minors, including mental health disorders like anxiety and depression and online bullying.

The Federal Trade Commission would be charged with enforcing this provision. State Attorneys General would not be empowered to enforce the federal duty of care, though they would be able to enforce other sections of the bill.

Since the bill’s introduction and through its various iterations, CDT and many of our allies have consistently said that the duty of care provision raises significant free expression concerns because it will force platforms to decide what categories of content cause “anxiety” or “depression” or otherwise harm minors, and to downrank or remove that content from recommendation systems (which is how most users get their content). These decisions about what content is suitable for children will often be influenced — and perhaps even dictated — by political actors in government who might take the position that content related to sexual education (e.g., reproductive care and gender identity), LGBTQ support and communities, global violent conflicts, climate change, or really any issue that goes against a government official’s worldview is harmful to minors because contemplating it may cause anxiety or depression. The vast majority of this content, though it may be politically controversial, can nonetheless be critically important to minors’ health and safety. It is also constitutionally protected, and the government cannot restrict it, even as to minors, without a compelling reason and narrow tailoring to avoid unnecessary censorship.

Recent Court Decisions Related to Restrictions on Content Moderation

Two recent court decisions lend support to the arguments CDT and our partners have been making about the constitutionality of KOSA’s duty of care and shed light on how a future court would likely analyze that provision. The first is the Supreme Court’s recent decision in Moody v. NetChoice, in which the Court explained that social media platforms’ moderation of content on traditional newsfeeds is editorial decisionmaking protected by the First Amendment. The second and potentially more relevant decision is NetChoice v. Bonta, in which the Ninth Circuit upheld a preliminary injunction against the Data Protection Impact Assessment (DPIA) requirement of California’s Age-Appropriate Design Code Act (CAADCA), a requirement that is similar in many respects to KOSA’s duty of care.

In Moody v. NetChoice, the Supreme Court wrote clearly and for the first time that the process of culling and organizing user posts and the enforcement of rules about what content is appropriate on a platform — in other words, content moderation — is an editorial process that receives First Amendment protection. The Court specifically applied its analysis to what it described as “traditional newsfeeds,” such as Facebook’s newsfeed or YouTube’s homepage, where the services arrange content to display (or not display) to users based on various factors. Consequently, it is now clear that any attempt by the government to regulate that process by requiring that certain content be restricted from those newsfeeds will confront the First Amendment.

Enter the CAADCA, the subject of NetChoice v. Bonta. That law requires all covered platforms to create a DPIA report for each product or service that they offer that is likely to be accessed by children and to submit that report to the state of California. The report must, among other things, address whether the design of the product could harm children, including by showing them “harmful content.” Covered platforms must also create a timed plan to mitigate or eliminate the risks they identify and include that plan in the DPIA report.

In its decision analyzing this provision, the Ninth Circuit panel held that the DPIA requirement likely violates the First Amendment because the DPIA report constitutes compelled speech that does not relate solely to covered entities’ commercial interests and is therefore subject to strict scrutiny. Furthermore, to create the report’s contents, companies must develop opinions regarding content that might be “harmful to children” and must lay out their plan of action to mitigate those harms, presumably by censoring that content. This second requirement to implement a plan to mitigate content harms “deputizes covered businesses into serving as censors for the state,” likely in violation of the First Amendment.

Potential Application to KOSA’s Duty of Care

KOSA’s duty of care, much like the CAADCA’s DPIA report, would require platforms to develop opinions about what kinds of content lead to harms to children like anxiety and depression and to then mitigate those harms in design features like personalized recommendation systems (i.e., newsfeeds). Put another way, through KOSA, the government would be requiring covered services to act as censors for the state. 

Moody v. NetChoice indicates that this requirement would be subject to strict First Amendment scrutiny, requiring the government to prove that it has a compelling interest in enacting the requirement, that the requirement will advance that interest, and that it is the least restrictive means of achieving that goal. The Ninth Circuit’s decision in the CAADCA case applied this standard to the DPIA requirement and found that it likely would not survive such scrutiny.

Applying the same reasoning to KOSA’s duty of care, at least to the extent that it would require covered platforms to change their choices about what content can be delivered to minors through their newsfeeds, the provision appears likely to meet the same fate as the CAADCA’s DPIA requirement.

That does not mean the government is unable to act in this area. The duty of care’s application to design features that do not implicate content, for example, may be constitutional. Moreover, as we explained in our Bonta amicus brief with the Electronic Frontier Foundation, finding that the content-restriction aspects of the CAADCA are unconstitutional should not jeopardize the constitutionality of properly crafted privacy laws; the same is true of KOSA’s duty of care. Although both KOSA and the CAADCA have been characterized by their supporters as attempts to protect children’s privacy online, the duty of care and the DPIA requirement are not aimed at protecting children’s data, but at restricting the content children can view. The Ninth Circuit’s decision on the CAADCA proceeded with appropriate caution on this front, finding only that the DPIA requirement is likely unconstitutional and remanding analysis of the statute’s other provisions to the district court.

If KOSA’s duty of care is not the right path forward, what is?

As CDT has previously written, Congress has walked this path of attempting to protect children online before, when it passed the Communications Decency Act. Many of that law’s provisions prohibited or criminalized the communication of constitutionally protected expression, and those provisions were eventually struck down by the Supreme Court. The law contained another provision, however: Section 230.

Section 230, in addition to protecting free expression online by ensuring that users and interactive computer services would not be liable for others’ content, also sought to protect children and everyone else online by incentivizing the moderation of harmful content and the creation of tools that enable users to better control their online experiences. User empowerment is unquestionably a less restrictive means of achieving the government’s goal of protecting children online. The Ninth Circuit said as much when it wrote in the DPIA case that “incentivizing companies to offer voluntary content filters or application blockers” would be a less restrictive means of achieving legitimate ends.

Rather than requiring censorship that is likely to face constitutional challenges, Congress should turn its attention toward creating additional incentives for the development of tools and services that help users control their online experiences. At the time of Section 230’s enactment, many of its supporters envisioned that it would encourage a vibrant marketplace of user tools aimed at improving online experiences and safety for children. That marketplace has not yet materialized. Congress could seize this opportunity to provide further support for these kinds of tools, which would help protect kids and everyone else online without running headlong into the First Amendment.