
Facebook’s Fight to Block Violent and Hateful Content

Voice of America:

In early May, Facebook said it was hiring an additional 3,000 people to identify and remove violent and hateful content.

The announcement came after several people posted videos of murders and suicides that stayed on Facebook for hours.

In a Facebook post, CEO and co-founder Mark Zuckerberg called the videos “heartbreaking.” He said the new employees will work with about 4,500 existing ones to find and remove such content as quickly as possible.

Facebook receives millions of reports about content each week, said Zuckerberg.

In addition to human reviewers, the company uses technical tools to identify questionable material. But with Facebook’s nearly 2 billion monthly active users, identifying and blocking banned content is difficult.

Emma Llanso is director of the Free Expression Project at the not-for-profit Center for Democracy & Technology. She says the Facebook documents show how difficult it can be for social media companies to balance free expression with content controls.

“I think it can help more people understand what a challenging task we all face in figuring out what are the sorts of speech and the kinds of content that we all find acceptable on our social media services…”

She said one way for social media companies to balance these issues is to provide better filtering tools for users to block content themselves.

“You have got to look more to the ability for people to create their own filters or blacklists, or categories of things that they just don’t want to see. That’s the sort of response that really puts the power in the hand of the individual user…”
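What Llanso describes is essentially a per-user filtering layer. As a rough illustration only, and not anything Facebook has published, a user-maintained blocklist might work along these lines (all names and fields below are hypothetical):

```python
# A minimal sketch of a user-controlled content filter: each user keeps a
# personal blocklist of terms and category labels, and posts matching that
# list are hidden from that user's feed. Illustrative only.

from dataclasses import dataclass, field


@dataclass
class UserFilter:
    """Per-user blocklist of words or category labels the user opts out of."""
    blocked_terms: set[str] = field(default_factory=set)
    blocked_categories: set[str] = field(default_factory=set)

    def allows(self, text: str, categories: set[str]) -> bool:
        """Return True if the post should be shown to this user."""
        lowered = text.lower()
        if any(term in lowered for term in self.blocked_terms):
            return False
        if self.blocked_categories & categories:
            return False
        return True


# Example: a user who opts out of graphic-violence content.
my_filter = UserFilter(
    blocked_terms={"graphic footage"},
    blocked_categories={"graphic-violence"},
)

posts = [
    {"text": "Community fundraiser this weekend", "categories": set()},
    {"text": "Warning: graphic footage from the scene",
     "categories": {"graphic-violence"}},
]

visible = [p for p in posts if my_filter.allows(p["text"], p["categories"])]
print([p["text"] for p in visible])  # only the fundraiser post remains
```

The point of such a design is the one Llanso makes: the filtering decision moves from the platform to the individual user, who chooses which categories of content to screen out.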

Full story here.