Telegram Shifts Stance on Content Moderation Following CEO’s Arrest
In a significant policy change, messaging platform Telegram has quietly removed language from its FAQ page that previously stated private chats were shielded from moderation requests. The change comes in the wake of CEO Pavel Durov’s recent arrest in France over allegations that he allowed criminal activity on the platform.
Breaking his silence for the first time since the arrest, Durov issued a public statement promising increased content moderation on Telegram. This marks a notable departure from the company’s earlier position that it had “nothing to hide.”
In his statement, Durov acknowledged that Telegram’s rapid growth to 950 million users has made preventing criminal abuse of the platform more difficult. He committed to “significant improvements” in content moderation, described preventing such abuse as a personal goal, and promised to share more details on the effort soon.
The changes are reflected in Telegram’s updated FAQ page, which now includes instructions on how to report illegal content through the app’s “Report” buttons. This contrasts with the previous version, which stated that Telegram did not process requests related to private chats and group chats.
French authorities arrested Durov on charges of allowing the platform to be used to distribute child sexual abuse material and to facilitate drug trafficking. Telegram has historically taken a hands-off approach to content moderation, a stance that has also made it a critical source of information about Russia’s war in Ukraine.
The policy shift is a marked reversal for Telegram, which has long prided itself on its privacy-focused approach. As the platform continues to grow, it faces mounting pressure to balance user privacy against the need to prevent illegal activity.