Russia-born tech tycoon Pavel Durov, the founder of Telegram, was arrested in Paris on August 24, 2024. French authorities announced that Mr. Durov was being investigated for a litany of serious crimes, including enabling the distribution of child sexual abuse material on the app, facilitating drug trafficking, and refusing to cooperate with law enforcement. Should digital platform owners be held responsible for user-generated content? Pranesh Prakash and Rohit Kumar discuss the question in a conversation moderated by Aaratrika Bhaumik. Edited excerpts:
Do Telegram’s content moderation policies and its refusal to cooperate with law enforcement agencies justify the charges against its founder?
Pranesh Prakash: It is not clear whether Telegram’s content moderation policy is lax or whether there is an actual reluctance to cooperate with law enforcement agencies. In 2022, Germany reported that Telegram had complied with requests to remove 64 channels that potentially violated German hate speech laws. In this particular case, beyond the list of charges, many facts and circumstances remain ambiguous. However, I would say that unless there is personal or direct involvement, the founder of a messaging platform should not be held criminally liable for the actions of the platform’s users.
Rohit Kumar: While it’s understandable that Telegram aims to promote free speech, it’s important to acknowledge the real-world harms associated with unregulated messaging platforms. Ideally, directors and founders should not be held personally liable. If there is clear evidence of involvement or direct knowledge, however, criminal liability can be imposed, though the threshold for such liability is generally set very high and requires sufficient evidence.
From a policy perspective, to what extent should social media intermediaries be held accountable for the content they host?
Rohit Kumar: A well-established principle is safe harbor, which holds that platforms should not be held responsible for user-generated content because they are only intermediaries. Privacy must also be preserved, meaning that platforms must avoid excessive monitoring or interception of user communications. That said, platforms can adopt design-level measures to limit harm. For example, when misinformation spread on WhatsApp during elections in India, the platform limited the number of groups a message could be forwarded to at once and reduced group sizes. In addition, a platform must have a compliance officer or designated representative to cooperate with law enforcement when proceedings require it. Ensuring these steps are in place and establishing clear procedural protocols should be a key focus for messaging platforms.
Pranesh Prakash: In the case of end-to-end encrypted platforms, their ability to see reported messages and take action is limited. Additionally, platforms designed to record minimal or no metadata face significant obstacles when cooperating with law enforcement agencies on requests for user data. Under European Union (EU) law, there is a clear prohibition on requiring platforms to generally monitor their users. As for Telegram, while it maintains the confidentiality of one-on-one and group conversations and does not take enforcement action on those communications, it does allow content scrutiny in public channels.
Are even liberal democracies now demanding stricter content moderation from these platforms? Does the passage of the Digital Services Act (DSA), the EU’s latest attempt to rein in Big Tech, signal a wider shift in this direction?
Pranesh Prakash: I don’t believe so. For example, in 2000, a French court ordered Yahoo! Inc. to block French users from accessing an auction of Nazi memorabilia on a US-based website – an instance of direct content regulation by the courts in a liberal democracy. This shows that content regulation is not a new development. What has changed is that many who once defended freedom of speech now seem to prioritize the harms of ‘disinformation’ over the need for freedom of expression. This shift amounts to complicity in the over-regulation of free speech, which the DSA also embodies to an extent.
Rohit Kumar: The main difference between then and now is the pace at which disinformation spreads. This is not just a conflict between the desire to protect freedom of speech and the need to regulate disinformation; it goes beyond a simple political narrative. As incidents of real-world abuse and harm increase, the argument for stricter oversight becomes stronger. For example, the decision to de-platform Donald Trump after the last US presidential election was made by Twitter (now X) itself. But should platforms have the power to determine who has a voice and who doesn’t? We need greater procedural clarity about how these decisions are made, who makes them, where responsibility lies, and when government intervention is appropriate.
Does Telegram’s laissez-faire approach to content moderation jeopardize the safe harbor protection under the Information Technology (IT) Act, 2000, in India?
Pranesh Prakash: Telegram is not compliant with certain provisions of the IT Rules, 2021, which mandate additional due diligence requirements for intermediaries operating in India. In fact, very few global companies comply with these regulations, especially those without a physical presence in India. A parallel can also be drawn with the situation in France. One of the main charges levied by the French authorities against Telegram is that it provided a cryptology service aimed at guaranteeing confidentiality without a license. However, these regulations are outdated and inconsistently applied. For example, anonymity networks such as Tor have not been targeted in France. A similar pattern of selective enforcement may follow in India if the government decides to target Telegram under the IT Act.
Rohit Kumar: The Ministry of Electronics and Information Technology has announced that it is investigating Telegram over concerns that the app is being used for illegal activities such as extortion and gambling. Moreover, some of the requirements under the IT Rules, 2021, such as submitting transparency reports and appointing a compliance officer, are quite extensive. While the Indian government maintains that Telegram complies with these regulations, I agree with Pranesh that there is a risk of selective prosecution.
Could the threat of personal liability push tech executives to reevaluate the risks of unregulated content?
Rohit Kumar: In the Indian context, the threat of personal liability has been invoked several times. For example, the former IT Minister issued a warning to X for not complying with the new IT Rules. Arrests of senior executives are sure to cause concern among social media companies, especially if they happen in liberal democracies. That said, there is a broad consensus among stakeholders in India that personal liability for regulatory violations should not be imposed. It may be more effective to levy higher penalties for repeated violations or to consider banning entities that are persistently non-compliant.
Pranesh Prakash: It certainly will. However, governments should also reconsider their approach. One potential consequence is that more messaging platforms may adopt end-to-end encryption and minimize metadata storage so that they are simply unable to assist law enforcement. This kind of willful blindness will only emerge more quickly if founders face personal liability for user-generated content.
Do you think this could be an isolated incident or is it becoming the norm?
Pranesh Prakash: As the moral panic over disinformation grows, we can expect to see more such arrests. However, in Durov’s case, French prosecutors have framed the charges as relating to specific crimes rather than speech-related offenses. We will also witness increased censorship and more restrictions on applications like Telegram and WhatsApp. Telegram has already been banned in more than ten countries.
Rohit Kumar: Social media intermediaries will re-evaluate their systems and procedures more carefully. This could lead to greater adoption of encryption, which platforms have been promoting as a marketing point. In addition, major platforms may rush to negotiate safeguards with various governments to prevent abuse of power on either side. This issue has evolved beyond free speech alone to include questions of sovereignty.
Listen to the conversation on The Hindu Parley podcast
Pranesh Prakash is a co-founder and former policy director of the Centre for Internet and Society; Rohit Kumar is a founding partner of The Quantum Hub