What Does the Arrest of Telegram CEO Mean for Free Speech?

Telegram CEO Pavel Durov was arrested on Saturday shortly after landing at Le Bourget Airport near Paris. French authorities have cited his alleged failure to mitigate criminal activity on the popular messaging app as the reason for his arrest.

Durov, a Russian-born entrepreneur who founded Telegram in 2013 alongside his brother Nikolai, lives in Dubai, where the company is based, and holds dual French and Russian citizenship.

While the move by French authorities makes a significant and controversial statement, it doesn’t come as a great surprise to those who have kept up with recent developments.

Indeed, Telegram, alongside other social media platforms including Elon Musk’s X, has recently been accused of insufficient moderation of harmful user activity, attracting heat from EU authorities and social activist groups alike.

What Is Pavel Durov Accused Of?

Moderation has become a buzzword in the world of social media of late, with plenty of conflicting opinions regarding free speech and the social responsibilities of popular social media platforms.

The European Union has notoriously taken a hard stance on the issue, imposing regulations including the Digital Services Act (DSA) in an attempt to prevent the circulation of illegal and harmful content.

Indeed, these are the regulations Pavel Durov is accused of contravening: Telegram, his ever-growing social media and messaging platform, has purportedly failed to impose proper restrictions on the spread of harmful content.

According to French officials, the investigation into Durov’s supposed inaction relates specifically to the app’s failure to adhere to rules imposed by law enforcement that pertain to fraud, drug trafficking and child abuse.

In the most basic terms, the argument is that it is Durov’s responsibility to prevent such content from being circulated and perpetuated via his app, and that he has failed to curb such criminal activity.

However, details of these accusations are yet to materialise, and because of Pavel Durov’s dual French-Russian citizenship, the Russian embassy in France has issued a statement advising the public that it has not yet received clarity on the specific conditions of his arrest.

Amid the uncertainty surrounding the accusations against Durov, speculation has emerged about the punishment he and Telegram may face as a result.

According to the terms set out by the DSA, EU authorities may impose a monetary fine on Telegram of up to 6% of its annual turnover, along with further investigation, additional regulatory controls and prolonged monitoring of the app’s services.

However, given how recent the arrest was along with the significant amount of controversy surrounding the EU’s actions, the specifics of such penalties and punishment remain to be seen.

The most pressing question to arise in the last 24 hours, however, is what the implications of Pavel Durov’s arrest will be for free speech and the future of social media regulation.

The Implications of Durov’s Arrest on Free Speech

Many people, both within the industry and outside of it, have been quick to voice their opinions on the EU’s actions and the broader implications of Durov’s arrest.

Well-known figures including X’s Elon Musk and NSA whistleblower Edward Snowden immediately made public statements via various platforms regarding Durov’s arrest, describing it as a major blow to freedom of speech.

Snowden asserted that the move made by French authorities was “…an assault on the basic human rights of speech and association”.

He went on to state that this monumental decision would also have far-reaching implications for France’s reputation, a sentiment that has been echoed by many others.

So, what’s the crux of the issue, and why is everybody so upset?

What the EU Expected Durov and Telegram To Do

Since the exact conditions surrounding Durov’s arrest haven’t yet been clarified, there is a lot of speculation. Essentially, though, the EU expected Durov to enforce restrictions on Telegram users’ freedom to participate in large groups and to share files through those channels.

The accusations pertain to the fact that Telegram’s large groups (which allow up to 200,000 members) supposedly provide users with a quick and easy way to spread misinformation and harmful content via the app’s broadcasting functions.

Indeed, social activists, including the anti-racism group Hope Not Hate, have asserted that Telegram has recently become a virtual meeting place for racists and extremists to spread dangerous ideology and potentially organise events that may threaten public safety, with the recent London riots cited as an example.

The argument from the EU and French authorities, then, is that it was Durov’s responsibility to prevent this from happening and to mitigate the risks of dangerous content being shared via Telegram.

However, the way Durov and Telegram were expected to “mitigate” these risks seemed to involve broad, sweeping limitations on users’ ability to communicate via large groups – essentially, as some have argued, keeping everybody quiet in an attempt to control a small minority of bad apples.

This, for many including Durov and Telegram’s leadership, was the primary issue upon which they weren’t willing to waver.

But is this true? Is Telegram really being used by individuals and groups with nefarious intentions to organise criminal activity and spread hate speech?

One group may argue yes, another may argue no, but it’s the opinion of a third, increasingly vocal group that makes the most controversial assertion of all.

That is, whatever one’s opinion may be, one has the right to express it in the name of free speech, and the widespread imposition of bans on broad methods of communication equates to mass censorship – or at least the potential for it.

Public Safety vs. Free Speech

Indeed, many argue that whether Telegram is being used for abhorrent discussions, or is failing to mitigate the risks such activity brings, isn’t really the question.

Rather, the question that needs to be asked is: should social media platforms ever prevent people from expressing their opinions at all?

Some immediately assert that when hate speech, violence and criminal activity are involved, the spreading of such harmful opinions should, undoubtedly, not be allowed.

But, who is the arbiter of what can and cannot be shared? Who decides what is right and what is wrong? Is it okay to impose blanket restrictions on all speech for the purpose of restricting some?

The concern among many, especially the likes of Snowden and Musk, who have been incredibly outspoken on the matter, is that it’s a slippery slope.

While intentions may start out as pure, once certain groups are given the power to restrict the sharing of ideas and the general principles of free speech, there’s no telling where it will end.

It creates the potential for political silencing and the suppression of dissent, which is, arguably, one of the greatest threats to democracy.

So, is it possible to manage objectively criminal activity while still allowing free speech to thrive?

Pavel Durov and the Future of Free Speech

The opposition from public figures, social media platforms and members of the public to the EU’s Digital Services Act and Pavel Durov’s arrest is not about the attempt to control criminal activity or the incitement of violence by extremists.

Rather, the growing concern centres on the widespread restrictions being imposed on industries as part of the EU’s overly extreme regulations and policies, and on how those policies create very real potential for the elimination of free speech.

Indeed, there’s a lot to be discussed about ways in which the EU’s regulations may be refined to prevent some of the problems currently being experienced, to achieve a healthy balance all around.

Many advocates for stricter regulation point to WhatsApp as an example of how this may be done, referring specifically to its flagging and moderation system.

Essentially, WhatsApp allows recipients to flag specific messages, which automatically forwards the content in question to WhatsApp moderators, who evaluate it and decide whether or not it falls within the bounds of the law.
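
For illustration only, here is a minimal sketch of what such a flag-and-forward moderation flow might look like, written in Python. The class and function names are invented for the purpose of the example; this is not WhatsApp’s (or Telegram’s) actual implementation or API, it simply shows the idea of a recipient explicitly reporting a message so that moderators see only what was reported.

```python
# Hypothetical sketch of a recipient-side "report message" flow, loosely modelled
# on the flagging system described above. All names are invented for illustration;
# this is not WhatsApp's or Telegram's actual code or API.

from dataclasses import dataclass, field
from typing import Optional, List, Dict


@dataclass
class Message:
    sender: str
    recipient: str
    ciphertext: bytes                 # all the server normally sees in an encrypted chat
    plaintext: Optional[str] = None   # available only on the sender's and recipient's devices


@dataclass
class ModerationQueue:
    """Server-side queue reviewed by human moderators. In this sketch it only
    ever receives content that a recipient has explicitly chosen to report."""
    reports: List[Dict[str, str]] = field(default_factory=list)

    def submit(self, reporter: str, reported_text: str) -> None:
        self.reports.append({"reporter": reporter, "content": reported_text})


def report_message(msg: Message, queue: ModerationQueue) -> None:
    # The reporting client already holds the decrypted text, so it can forward
    # that plaintext to moderators. Whether a scheme like this is compatible with
    # "true" end-to-end encryption is precisely the point of contention discussed
    # in the surrounding article.
    if msg.plaintext is None:
        raise ValueError("only a device holding the decrypted message can report it")
    queue.submit(reporter=msg.recipient, reported_text=msg.plaintext)


if __name__ == "__main__":
    queue = ModerationQueue()
    msg = Message(sender="alice", recipient="bob",
                  ciphertext=b"\x8f\x1a\x00", plaintext="example of a reported message")
    report_message(msg, queue)
    print(queue.reports)   # moderators see only what bob chose to report
```

The design choice worth noting in this sketch is that nothing leaves a user’s device unless the recipient actively reports a message; how far that goes towards answering the privacy objections raised below is exactly what remains contested.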

The immediate pushback against this process, which has gone hand-in-hand with many industry experts’ recent move away from WhatsApp, is that it violates user privacy. Most importantly, critics argue, the implication is that if such moderation is possible, WhatsApp doesn’t in fact use true end-to-end encryption.

This supposedly simple solution, then, doesn’t seem to be in the pipeline for Telegram, as the platform has famously been vehemently opposed to violating user privacy in this way, always prioritising end-to-end encryption.

Of course, this is but one possible solution to finding a balance between over-regulation and the spreading of dangerous criminal content. It’s certainly plausible to expect that there may be alternatives more suitable to Telegram and its users.

As it stands, however, outrage surrounding Durov’s arrest is aimed directly at the regulations that are currently in place and are being enforced upon Telegram and its CEO, highlighting a concerning future for free speech, or the lack thereof, in the Western world.

Regardless of what happens next, it’s safe to say that Pavel Durov’s arrest will forever be inextricably linked to the future of EU regulatory policy and free speech in the European Union.