By Andre Jones
Facebook announced in a blog post Tuesday that it has deleted around 66,000 posts per week over the last two months as it gets serious about cracking down on what it deems hate speech.
The social media giant said that deleting posts can “feel like censorship” but that it is working on explaining the vetting process better in order to improve its enforcement of hate speech. Let’s look at how Facebook is currently defining hate speech.
According to Facebook, hate speech is defined as an attack on people based on their race, sexual orientation, or other “protected characteristics.” Because Facebook relies mostly on its almost 2 billion users to report hateful posts, the 4,500 employees reviewing those reports are bound to make some mistakes. Some maintain that those “mistakes” are themselves hate-filled and slanted in favor of white people who use Facebook.
Last year, Facebook deleted a post by Black Lives Matter activist and poet Didi Delgado which read, “All white people are racists. Start from this point, or you’ve already failed.” Her post was promptly removed, and she received this warning: “You recently posted something that violates Facebook policies, so you’re temporarily blocked from using this feature. For more information, visit the Help Center.”
Facebook Vice President Richard Allan had this to say, “We know that these kinds of mistakes are deeply upsetting for the people involved and cut against the grain of everything we are trying to achieve at Facebook.”
Facebook’s algorithm for “protected” characteristics does a lot to explain the “mistaken” deletion of posts that call out racism and injustice and the alarming absence of deletions for well-known white hate groups and violent racists. If a post or page is considered an attack on a protected category, it is, by Facebook definition, hate speech and has a good chance of being deleted. In Delgado’s case, it is helpful to know that “white people” are a protected category.
“The policies do not always lead to perfect outcomes,” said Monika Bickert, head of global policy management at Facebook. “That is the reality of having policies that apply to a global community where people around the world are going to have very different ideas about what is OK to share.”
According to The Root, though Facebook uses the State Department’s list of terrorist organizations to ban certain groups, a casual search on Facebook returned over 122 Ku Klux Klan groups and more Nazi groups than can be counted.