Facebook removes anti-racist content flagged by terms like "skinhead" due to a lack of contextual review
Summary: Social media platforms are constantly seeking to remove racist, bigoted, or hateful content. Unfortunately, these efforts can cause unintended collateral damage to users who share surface similarities to hate groups, even though many of these users take a firmly anti-racist stance.
A recent attempt by Facebook to remove hundreds of pages associated with bigoted groups resulted in the unintended deactivation of accounts belonging to historically anti-racist groups and public figures.
Hundreds of anti-racist skinheads are reporting that Facebook has purged their accounts for allegedly violating its community standards. This week, members of ska, reggae, and SHARP (Skinheads Against Racial Prejudice) communities that oppose white supremacy are accusing the platform of wrongfully targeting them. Many believe that Facebook has mistakenly conflated their subculture with neo-Nazi groups because of the term “skinhead.”
Dozens of Facebook users from around the world reported having their accounts locked or their pages disabled due to their association with the "skinhead" subculture. This subculture dates back to the 1960s and predates the racist/fascist tendencies now commonly associated with that term.
Facebook’s policies have long forbidden the posting of racist or hateful content. Its ban on "hate speech" encompasses the white supremacist groups it targeted during this purge. The removal of accounts not linked to racism, but linked to the term "skinhead", was apparently accidental, presumably triggered by a term now commonly associated with hate groups.
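The failure mode described above can be illustrated with a minimal sketch of keyword-based moderation. Everything here is a hypothetical assumption for illustration: the function names, the term lists, and the counter-context rule are invented, and Facebook's actual systems are certainly far more complex.

```python
# Hypothetical sketch of keyword-based flagging, with and without a crude
# context check. All names and rules are illustrative assumptions, not
# Facebook's actual moderation logic.

FLAGGED_TERMS = {"skinhead"}

# Cues suggesting the page opposes, rather than belongs to, hate groups.
COUNTER_CONTEXT = {"anti-racist", "against racial prejudice", "ska", "reggae"}

def naive_flag(text: str) -> bool:
    """Flag any text containing a targeted term, regardless of context."""
    lowered = text.lower()
    return any(term in lowered for term in FLAGGED_TERMS)

def context_aware_flag(text: str) -> bool:
    """Flag a targeted term only when no counter-context cue is present;
    ambiguous pages would go to human review instead of auto-removal."""
    lowered = text.lower()
    if not any(term in lowered for term in FLAGGED_TERMS):
        return False
    return not any(cue in lowered for cue in COUNTER_CONTEXT)

page = "SHARP: Skinheads Against Racial Prejudice - anti-racist ska community"
print(naive_flag(page))          # True  - a purely lexical filter flags this page
print(context_aware_flag(page))  # False - the context check spares it
```

Even this toy version shows why a purely lexical purge sweeps up SHARP pages: the term matches, and nothing in the pipeline asks what the page actually stands for.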
Questions to consider:
- How should a site handle the removal of racist groups and content?
- Should a site use terms commonly associated with hate groups to search for content/accounts to remove?
- If certain terms are used to target accounts, should moderators be made aware of alternate uses that may not relate to hateful activity?
- Should moderators be asked to consider the context surrounding targeted terms when seeking to remove pages or content?
- Should Facebook provide users whose accounts are disabled with more information as to why this has happened? (Multiple users reported receiving nothing more than a blanket statement about pages/accounts "not following Community Standards.")
- If context or more information is provided, should Facebook allow users to remove the content (or challenge the moderation decision) prior to disabling their accounts or pages?
Resolution: Facebook's response was nearly immediate. The company apologized to users shortly after OneZero reported the apparently erroneous deletion of non-racist pages. Guy Rosen, Facebook's VP of Integrity, also apologized on Twitter to the author of the OneZero post, saying the pages had been removed in error during the mass deletion of white supremacist pages and accounts, and that the company was looking into the mistake.