San Francisco’s Homesharing Ordinance Conflicts With Federal Legal Protection for User-Generated Content
From harassment on Twitter and doxxing on Yelp to the spread of terrorists’ propaganda on Facebook and YouTube, internet companies are facing unprecedented pressure to assume responsibility for the activities of their users. The trend toward new liability for online intermediaries is perhaps most striking in the wave of legislation across the U.S. cracking down on networks for short-term rental listings, or homesharing.
State and local governments are considering a variety of legislative and regulatory efforts aimed at homesharing. Some approaches would require providers like Innclusive, HomeAway, and Airbnb to ensure the listings on their websites comply with local health, safety, building, fire protection, and rent-control laws. Others require homesharing services to collect short-term rental taxes on behalf of the municipal tax authority. Still others require that residents who wish to rent out their homes on a short-term basis register with the government – and levy steep fines against the websites and applications if their users fail to do so.
Some of these approaches, however, conflict with existing federal law – Section 230 of the Communications Act – which is designed to shield online content hosts from liability for their users’ speech. Technology companies are closely watching Airbnb’s litigation strategy to protect its homesharing model, viewing it as a bellwether for the future of Section 230.
Last week, CDT joined EFF and other Section 230 experts in an amicus brief in Airbnb’s challenge to the San Francisco homesharing ordinance, which would alter the legal relationship between homesharing sites and the third-party content they host. In our brief, we emphasize that forcing intermediaries to ensure that users’ listings abide by the city’s content specifications contravenes both the letter and spirit of Section 230.
The San Francisco ordinance seeks to ensure homeowners comply with city licensing requirements by penalizing sites like Airbnb and Innclusive when listings appear without a valid registration number. In a sense, San Francisco wants Airbnb and Innclusive to step into the shoes of the short-term rental authority to enforce its licensing laws, and to use homeshare ads, reviews, and listings as a proxy for homeowners’ registration.
But this ordinance also puts homesharing sites in the shoes of their users, creating liability for the site operator based on what its users write (or fail to write). It’s true that e-commerce businesses like homesharing services occupy a different niche online than pure content hosts like social media websites. But their claim to Section 230’s protections against content liability is just as strong. As we argue in our brief:
At issue here is not whether the San Francisco ordinance implicates the free speech interests that underlie Section 230. It is whether the law targets [Airbnb] in its role as a categorically protected publisher of speech created by others.
In other words, Section 230 asks whether a law holds intermediaries legally responsible for the content of speech provided by third parties. Because the San Francisco ordinance penalizes homesharing sites when their users upload unlawful posts, we believe that it does just that, and is therefore inconsistent with (and preempted by) Section 230.
Beyond the legal analysis, the policy reasons for protecting these kinds of transactional intermediaries from content-enforcement requirements are equally compelling.
First, when Section 230 preempts civil liability in state and federal courts, it protects intermediaries from a patchwork of inconsistent tort regimes for content provided by thousands, perhaps millions or billions, of users. This assurance of uniformity encourages investors to support startups experimenting with new ideas like homesharing. This in turn ensures that internet users have access to a diversity of innovative products and services that they can use to become more empowered consumers, creators, communicators, and small business owners.
Second, when Section 230 is applied against criminal liability at the state and municipal levels, it empowers internet companies to resist efforts by governments to co-opt their services as extensions of law enforcement. This in turn protects users by ensuring that the intermediaries they rely on are not compelled to monitor their postings or regulate their speech at the behest of local authorities. It also ensures that if the police decide to take action against harassing trolls or unlicensed landlords, they must target the individuals responsible for the unlawful content rather than the platform whose services they used.
These policy arguments hold regardless of whether a service calls itself a content host or an e-commerce site facilitating online transactions. We’ve seen a number of instances of governments targeting an intermediary’s technical or transactional services in order to suppress lawful content. The experience of Backpage.com demonstrates that trying to separate transactional and content-hosting functions at the state and local level poses a serious risk to users’ free speech.
Section 230 has long been a target for political leaders and litigants looking to turn intermediaries into regulators of their online communities. But there are good reasons why we shouldn’t want these intermediaries to become gatekeepers and regulators of user-generated content. As state and local regulators contemplate the on-demand economy, they must understand that the federal framework for shielding intermediaries – and promoting free speech online – means that some options are off the table.