Telegram gains users but struggles to remove violent content (2021)

Summary:

After Amazon refused to continue hosting Parler, the Twitter competitor favored by the American far right, former Parler users looking to communicate with one another while dodging strict moderation adopted Telegram as their go-to service. Following the attack on the Capitol building in Washington, DC, the chat app added 25 million users in a little over 72 hours.

Telegram has long been home to far-right groups, which often find their communications options limited by moderation policies that, unsurprisingly, remove violent or hateful content. Telegram’s moderation is more lax than that of several of its social media competitors, making it the app of choice for far-right personalities.

But Telegram appears to be attempting to handle the influx of users, along with the accompanying influx of disturbing content. The increasingly popular chat service has begun flexing its rarely used moderation muscle, removing some channels that broadcast extremist content. According to the service, moderators took down at least fifteen channels, some of them filled with white supremacist content.

Unfortunately, policing the service remains difficult. While Telegram claims to have blocked “dozens” of channels containing “calls to violence,” journalists have had little trouble finding similarly violent content on the service, content that has either eluded moderation or been ignored by Telegram. While Telegram appears responsive to some notifications of potentially illegal content, it also appears inconsistent in applying its own rule against inciting violence.

Decisions to be made by Telegram:

  • Should content in private chats (rather than public channels) be subject to the same rules concerning violent content?
  • Given that many of its users migrated to Telegram after being banned elsewhere for posting extremist content, should the platform increase its moderation efforts targeting calls for violence?
  • Should a process be put in place to help prevent banned users/channels from resurfacing on Telegram under new names?

Questions and policy implications to consider:

  • Does Telegram’s promise of user security and privacy dissuade it from engaging in more active content moderation?
  • Is context considered during moderation, to avoid accidentally blocking people who share content because they find it concerning, rather than to promote it or endorse its message?
  • Do reports of mass content violations (and lax moderation) draw extremists to Telegram? Does this increase the chance of the moderation problem “snowballing” into something that can no longer be managed effectively?

Resolution: 

Telegram continues to take a mostly hands-off approach to moderation but appears more responsive to complaints about calls to violence than it has been in the past. As the service continues to draw users, many of them kicked off other platforms, its existing moderation problems are only going to grow.


Written by The Copia Institute, January 2021
