Much of the internet’s success has been due to Section 230, the federal law that has encouraged the development of user-generated content platforms for more than twenty years. As CDT has long observed, Section 230, which immunizes online platforms from liability for unlawful content posted by their users, has encouraged the growth of platforms that otherwise might face crushing liability due to user activity. One of the hallmarks of Section 230 is its preemption of conflicting state and local laws — effectively, no state or municipality can enforce a law that conflicts with the provisions of Section 230.
Unfortunately, that hasn’t stopped some states and cities from trying. As technology companies begin to operate in spaces traditionally regulated by states and cities, we have seen many instances in which those governments attempt to regulate service providers, some of whom fall within Section 230’s protections. We have provided guidance to legislators and regulators on the best ways to regulate while still protecting individual rights, though problematic proposals persist. Two recent examples highlight the challenges of crafting 230-compliant legislation, which I discuss here in detail in the hope that other policymakers will avoid these pitfalls.
First, a proposal in San Francisco that would regulate short-term rentals clearly conflicts with Section 230. Under current San Francisco law, individuals who plan to offer their homes on short-term rental platforms must register with the city and obtain a registration number. The proposal would require platforms to pre-screen user listings to confirm that the registration number is included, and would impose fines on platforms whose listings lack that number.
This imposition of liability clearly goes against Section 230, which states in (c)(1) that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” — meaning that, if an information content provider, typically an individual user, posts something illegal, the interactive computer service, typically a website, can’t be held liable for it. Moreover, under (e)(3), “no liability may be imposed under any State or local law that is inconsistent with this section.” States and localities can pass laws that are consistent with Section 230, but anything inconsistent with it — like the imposition of liability on a website operator for user-generated content — is preempted and unenforceable. From a logistical perspective, this makes a great deal of sense. If states and cities could enact a variety of conflicting laws, the whole point of Section 230 would be undermined. As a global medium, the internet wouldn’t work if it were subject to piecemeal regulation by every state and city within the US.
The escape routes that the San Francisco proposal offers don’t insulate it from Section 230 preemption, either. The proposal attempts to avoid the imposition of liability in two ways: a platform can pre-screen content itself, or it can submit listings to the city for verification of the registration number before posting. But Section 230 doesn’t permit a state to create liability so long as it also offers an alternative path to compliance — it precludes the imposition of liability, full stop.
The other recent proposal, from Chicago, creates similar issues by holding platforms liable for user content. Like the San Francisco proposal, it uses fines as the leverage to require platforms to ensure that listings on a platform have been approved by the city. And, as with the San Francisco proposal, the structure of the liability regime runs afoul of Section 230’s preemption clause. The problematic language in this legislation, Section 4-13-250, states: “It shall be unlawful for any licensee … to list, or permit to be listed, on its platform any short term residential rental that the commissioner has determined is ineligible for listing”; the penalty for violations, under Section 4-13-410, is “a fine of not less than $1,500.00 nor more than $3,000.00 for each offense. Each day that a violation continues shall constitute a separate and distinct offense.” This essentially creates a strict liability regime for website operators based on third-party content: if a user uploads a non-compliant rental listing, the site operator would immediately be in violation of this provision, regardless of whether it was aware of the posting or its ineligible status. No matter the amount of the potential fine, this imposition of liability clearly contravenes Section 230.
Enforcing the laws of a city or state is an important goal, especially when those laws are designed to promote compliance, safety, and non-discrimination. Yet it is equally important to ensure that the internet remains an open platform for innovation and exchange, which requires ensuring that intermediaries are not held legally responsible for content they did not author. In enacting Section 230, Congress made this value the law of the land, and cities and states must abide by superseding federal law. As states and cities continue to float proposals targeting companies operating in highly regulated areas like housing and transit, they must work to ensure that those proposals are consistent with laws like Section 230 and with best practices concerning individual speech and privacy.