After Disgraceful Attack on Capitol, Changes to Tech Policy Should Balance Action with Respect for Civil Liberties

As Congress considers changes to technology laws and policies in the aftermath of the January 6 attack on Capitol Hill, lawmakers should ensure that our rights to peaceful protest and passionate dissent remain central parts of our democracy. 

It would be unwise to give in to the temptation to make hasty and sweeping changes to laws that can affect the freedom of speech and association for millions of Americans. As we saw after 9/11, such changes tend to disproportionately affect racial, ethnic, and religious minorities. 

Social media will loom large in the discussion, with Congress already primed to tackle questions of content moderation and consolidation of corporate power. There is understandable horror in seeing how easily online incitement can translate into offline violence. At the same time, the de-platforming of President Trump and the removal of Parler from app stores and from Amazon Web Services’ content-hosting platform illustrate the power of these companies to shape the online information environment. Indeed, that was their goal: AWS, Twitter, Apple, and others explained their decisions as efforts to limit further incitement and disinformation and to prevent additional offline violence.

Such displays of corporate power raise significant policy questions, but there’s no quick legislative “fix” to this complex situation. The First Amendment limits what speech Congress can restrict and what speech it can compel private companies to make available on their services. And Congress cannot authorize holding intermediaries liable for hosting or enabling access to lawful speech, even if that speech is abhorrent. 

Private companies, on the other hand, can and do go much further than the First Amendment in limiting the speech they’re willing to host, and can move much more quickly than any legal process in deciding that some speech is too dangerous to continue distributing. Section 230, enacted as part of the Communications Decency Act of 1996, protects these moderation decisions while also shielding companies from liability for content posted by others. Changes to Section 230 intended to make it riskier for sites to host speech that might lead to offline violence or other harms would create strong incentives for companies to suppress controversial speech from a broad spectrum of organizers, activists, and other voices.

Instead, Congress should focus on promoting greater transparency from these services about how they make important moderation decisions, so that everyone can understand where the lines are being drawn and the services can be subject to more effective oversight. Congress should also take steps to enable independent research that increases platforms’ accountability, and that can help platforms better identify when online chatter is mobilizing offline harm. 

Any potential solutions need to account for salient differences: the “right” answer for large social media companies with ample resources may not be workable for a small website that features restaurant reviews. And the appropriate policy may differ for app stores, hosting services, and other providers that are higher in the Internet “stack,” where companies’ decisions impact the fate of entire websites.

Lawmakers should also look to other important areas of law that affect our online information ecosystem. Federal privacy legislation, for example, could put real limits on services’ ability to collect and share people’s personal information. That would fundamentally change the behavioral advertising landscape, and thus bad actors’ ability to target different groups with disinformation. Congress should also make competition a top priority, as the risks to free expression are exacerbated when just a small handful of companies are making moderation decisions.

Another issue that has arisen in the aftermath of the attack on the Capitol is the role of surveillance technology. Here again, policymakers need to ensure that they do not inadvertently harm civil liberties in a rush to respond.

Already, there are dangerous calls to use facial recognition technology to scan social media content and identify people who stormed the Capitol. But facial recognition technology is unreliable, especially in a context like this, where a single face must be matched against many under environmental conditions in which lighting and face orientation cannot be controlled. Moreover, facial recognition technology has been shown to be less accurate when used on Black people, creating a risk of misidentifications that perpetuate racial biases in policing.

Also, as insurrectionists respond to tech companies’ moderation and de-platforming efforts by migrating from open social media platforms to encrypted systems, we are likely to see renewed calls for encryption backdoors to permit law enforcement access. Congress should resist those calls, because a backdoor designed for law enforcement can also be exploited by criminals, malicious hackers, and hostile foreign governments. Weakening cybersecurity is the wrong response to the attack on the Capitol.

Instead, Congress should focus on examining why the considerable surveillance authorities and resources that law enforcement agencies already have were not effectively utilized to prevent the Capitol attack, especially given the easily observable organizing that took place on social media. 

At this point, there is simply no evidence that additional surveillance authorities are needed, especially when they would be prone to abuse that would inevitably focus disproportionately on marginalized communities, journalists, and activists.

As the new administration and new Congress assume power, they must grapple with the assault on our democracy while protecting the values that our democracy holds dear. A commitment to policy discussions based on a nuanced understanding of how these technologies work, and how policy changes would impact the varying users and marginalized voices who rely on those technologies, is a good place to start. 

Read more from Alex about CDT’s 2021 tech policy roadmap for a new White House & Congress.