
Florida Social Media Law Prioritizes Politicians Over the Public

A political candidate posts a conspiracy theory to Facebook claiming that the 9/11 attacks were “faked.” Another candidate tweets an anti-Semitic claim about his critics and a link to a website with their names, email addresses, and phone numbers. Yet another advertises an AR-15 giveaway on social media as part of his campaign.

Social media platforms commonly remove, make less visible, or label posts like these for violating their terms of service or other content rules. But a new Florida law, S.B. 7072, would restrict platforms’ ability to moderate this content or permanently ban the posters, simply because they are political candidates. NetChoice and the Computer and Communications Industry Association have challenged the law in federal court, arguing it violates social media companies’ First Amendment rights. 

CDT joined an amicus brief, led by the Reporters Committee for Freedom of the Press, that explains that S.B. 7072 violates the First Amendment rights of online services and puts press freedoms at risk. Just as importantly, S.B. 7072 is disastrous for social media users and the public. The court should enjoin enforcement of the law and ultimately strike it down. 

At bottom, S.B. 7072 gives political candidates and “journalistic enterprises” special treatment when it comes to content moderation, all because some Florida officials disagree with certain high-profile content moderation decisions made by a handful of companies. In addition, by requiring social media platforms to apply content moderation decisions in a “consistent manner,” the law will make those decisions a minefield for protracted and expensive litigation, which will discourage platforms from engaging in any moderation at all. 

Like former President Trump’s “Executive Order on Preventing Online Censorship,” Texas Attorney General Ken Paxton’s investigation into Twitter, and a slew of other state and federal legislative proposals, S.B. 7072 is part of a disturbing trend of politicians trying to either control social media platforms’ content moderation practices or browbeat them into ending such practices altogether. In pursuit of this goal, Florida has enacted a legal regime that ultimately will hurt regular social media users and members of the public of all political stripes.

A few of the more concerning parts of S.B. 7072, which is set to take effect on July 1, include:

  • A “must-carry” provision prohibiting certain social media platforms from permanently banning political candidates or suspending them for more than 14 days. With respect to posts by, or even just about, candidates during an election cycle, the law also prohibits these platforms from using automated tools to make such posts more or less prominent in a newsfeed, view, or search results, or to limit or eliminate their display to other users (called “shadow banning” under the law), no matter how abusive or illegal the posts may be. 
  • A section barring certain social media platforms from taking almost any action on content posted by “journalistic enterprises”—a term that covers entities doing business in Florida that either publish a certain number of words or hours of video or audio and have a certain number of readers or viewers; have an FCC broadcast license; or operate a cable channel. The law prohibits social media companies from, among other things, deleting content, making content less visible to other users, or even merely labeling content that was posted by these journalistic enterprises. The law prohibits social media platforms from taking these actions against even the most offensive, defamatory, or illegal content by journalistic enterprises, unless it’s legally obscene.    
  • A requirement that certain social media platforms apply content moderation decisions “in a consistent manner among its users on the platform” and a ban on platforms changing their content rules more than once every 30 days.

The problems with these provisions are numerous. Perhaps most seriously, the must-carry provision for political candidates may make the content on social media platforms more dangerous for users and the public, both inside and outside of Florida. S.B. 7072 allows candidates to post disinformation, incitement to violence, and harassing content with near impunity, since it prohibits platforms from banning or suspending candidates as they would regular users. Candidates who know they can post such content without facing the same serious repercussions as other users will have little incentive to refrain from making these posts, meaning that more of them will appear on social media.  

And because candidates—who often include incumbents—are among the most influential and widely read users on social media, their posts can be especially problematic for users who are targeted by them and for the public at large. What’s more, under S.B. 7072, a political candidate is any person who files qualification papers and subscribes to a candidate’s oath as required by Florida law, a low bar that almost anyone can meet. Bad actors who clear that bar will be able to exploit the law to violate social media platforms’ rules.

Additionally, the law’s ban on automated prioritization or “shadow banning” of content by or about political candidates will make social media platforms less relevant and useful for users. As CDT has explained, automated tools for both text and multimedia have serious limitations; however, many social media sites rely on them to organize users’ personalized feeds and to flag or even filter or take down content that violates their rules. 

Prohibiting platforms from applying automated tools to content by or about political candidates will make it difficult if not impossible for social media to incorporate these posts into users’ personalized feeds. Some platforms may try to comply with the law by featuring such content in all users’ feeds even if particular users want to avoid it; as a result, S.B. 7072 may decrease the ability of social media users annoyed by political memes and rants to avoid them. Other platforms may refrain from including posts by or about political candidates in users’ feeds at all because they cannot do so without giving the posts more or less priority over other content, ultimately making those posts harder to find. 

In addition, S.B. 7072’s prohibition on automated “shadow banning” requires social media platforms to rely entirely on human review to screen and remove posts by or about candidates, a daunting proposition for platforms on which vast amounts of user-generated content are posted each day. Automated tools can be essential to platforms that, for example, bar some or all political content. S.B. 7072 makes running such platforms much more difficult and expensive, and may leave users with fewer options to participate in non-political online forums as a result. And it would be difficult for any provider to guarantee that no post by or about a candidate will end up flagged by its automated systems and then reviewed by a staff member, making it legally risky for sites to use automated tools to fight hate speech, harassment, suicide threats, or other serious issues.

S.B. 7072’s definition of “journalistic enterprises” and prohibition on moderating those entities’ content also pose significant risks to users and the public. Under the law, social media platforms could not delete or downrank posts by certain groups that purposefully peddle disinformation and rack up hundreds of thousands of views—including disinformation campaigns by state-controlled media. S.B. 7072 prevents even labeling posts by news outlets that are graphic or disturbing or placing them behind a warning, to the detriment of users who are especially sensitive to such content, including marginalized groups and children.

In addition, S.B. 7072’s ban on updating content moderation policies more than once every thirty days would make social media platforms less nimble in adjusting their rules in response to new challenges, to the detriment of users and the public. Social media platforms often must update their content moderation rules frequently and on short notice to respond to new and evolving issues, such as upcoming elections or public health emergencies.

S.B. 7072’s most insidious effect, however, may be how it discourages social media platforms covered by the law from engaging in any content moderation at all. It does this through, among other things, a vague requirement that social media platforms apply their content moderation decisions in a “consistent” manner. The law does not explain what it means by consistency.

At the same time, content moderation decisions can be extremely context-dependent, and often rely on judgment calls about close cases. As a result, the law’s consistency requirement creates untenable uncertainty about whether particular content moderation decisions will be considered “consistent” and in compliance with the law. Coupled with the prospect of defending against expensive civil lawsuits authorized to enforce this provision—at least some of which may be meritless and harassing—some social media platforms may decide the better course of action is to stop moderating content on their platforms entirely.

Creating an incentive to end content moderation entirely harms all social media users. While having some places for unmoderated discussion online is beneficial, internet users gain just as much from having moderated forums for sharing and accessing user-generated content. Content moderation means that users can choose to participate in online communities with rules and approaches that meet their specific interests and needs, whether that be limiting discussion to certain topics like parenting or books, or prohibiting unwanted content like pornography.

For all of these reasons, S.B. 7072 puts the interests of politicians ahead of the public. The law is yet another partisan attempt to make social media platforms moderate content in a manner preferred by the government, or to pressure them into not moderating content at all. It will take away critical tools these companies use to fight disinformation, hate speech, violent speech, and more. The court should not hesitate to strike down this unconstitutional and unwise law.