The messaging app Telegram has announced that it will now share users' IP addresses and phone numbers with authorities who present valid search warrants or legal requests. CEO Pavel Durov stated in a Telegram post that this change in the platform's terms of service and privacy policy is intended to "discourage criminals." He emphasized that while 99.999% of Telegram users are not involved in criminal activities, the 0.001% engaged in illicit behavior tarnish the platform's reputation and endanger the interests of its nearly one billion users.
This announcement marks a significant shift for Durov, the platform’s Russian-born co-founder, who was recently detained by French authorities at an airport near Paris. Following his arrest, he was charged with allegedly enabling criminal activity on Telegram, including complicity in the distribution of child abuse images and drug trafficking, as well as with failing to comply with law enforcement.
Durov, who denies the allegations, criticized authorities for holding him accountable for crimes committed by third parties on the platform, calling it "surprising" and "misguided." Critics have pointed out that Telegram has become a hub for misinformation, child pornography, and terrorism-related content, partly due to its feature allowing groups of up to 200,000 members, compared to WhatsApp's limit of 1,000.
Telegram was also criticized last month for hosting far-right channels that incited violence in English cities. Recently, Ukraine banned the app on state-issued devices to reduce potential threats from Russia.
Durov's arrest has sparked concerns about free-speech protections on the internet. According to John Scott-Railton, a senior researcher at the University of Toronto's Citizen Lab, many users are now questioning whether Telegram remains a safe space for political dissidents. He noted that the latest policy change has raised alarm in various communities, as Telegram had marketed itself as a platform resistant to government demands, attracting users in places with repressive governments, such as Russia, Belarus, and parts of the Middle East.
Scott-Railton pointed out that it remains unclear how Telegram will respond to requests from authoritarian regimes in the future. Furthermore, cybersecurity experts have highlighted that Telegram's moderation of extremist and illegal content is weaker than that of other social media platforms. Before this policy change, Telegram disclosed user information only in cases involving terror suspects.
In response to the scrutiny, Durov stated that the app now uses a dedicated team of moderators and artificial intelligence to make problematic content less visible in search results. However, experts like Daphne Keller from Stanford University’s Center for Internet and Society argue that simply obscuring illegal content may not meet legal obligations under French or European law. She emphasized that Telegram must remove any content that its employees can reasonably identify as illegal and notify authorities about certain types of serious illegal content, such as child sexual abuse material.
Keller questioned whether the company's changes would satisfy authorities seeking information about investigation targets, including their communications and message content, suggesting that the commitment appears to be "less than what law enforcement wants."
Credit: MyJoyOnline.