Last week’s dramatic arrest of Pavel Durov, the self-styled libertarian founder of Telegram, has sent shockwaves throughout the tech and free expression communities. Durov’s arrest in France represents the first time a tech founder has been personally targeted for harbouring illegal content on their platform.
There is no suggestion that Durov himself was personally engaged in unlawful conduct. France’s investigation—and the proposition that platform owners can face criminal jeopardy for the mere fact of providing encryption services—threatens to create a hostile precedent for both free expression generally and the viability of communicating without government surveillance specifically.
Telegram offers encrypted and unencrypted private chats as well as open public channels. Unlike Signal or Apple's iMessage, end-to-end encryption for private chats is not the default setting on Telegram; users must opt in to a “secret chat” feature to enable it. The material freely flowing through the app’s public channels is not for the faint of heart. Hamas openly broadcasts celebrations of its terror in daily updates.
Out of curiosity, I began following the channel of Aleksandr Dugin, who has been described as “Putin’s brain,” in 2022 after Russia invaded Ukraine. I left the channel after a few weeks of being avalanched with “Z” pro-regime propaganda, memes lionizing Putin, and anti-NATO screeds.
France 24 reported that its investigation of Durov concerns suspected “complicity” in various crimes including running an online platform that allows illicit transactions, child pornography, drug trafficking, and fraud, as well as the refusal to communicate information to authorities, money laundering, and providing cryptographic services to criminals.
In response, Telegram claimed that the company abides by European Union laws and that its moderation was “within industry standards and constantly improving.” In a public statement, it said, “It is absurd to claim that a platform or its owner are responsible for abuse of that platform.”
The basket of charges that Durov has been smacked with, though, appears to be a classic motte-and-bailey. Allowing child sexual exploitation materials and drug trafficking networks to flourish—or refusing to cooperate with law enforcement on properly specified warrants—should appropriately attract legal sanctions, although the normal way of enforcing them has been through “jawboning” platforms, threatening them with fines or further regulation for non-cooperation.
Even Section 230 of the United States’ Communications Decency Act, which was designed to allow tech platforms to operate freely without the threat of liability for user conduct, does not shield platforms from liability for facilitating criminal conduct. If France is alleging specific crimes for which law enforcement has produced proper warrants and been stonewalled by Telegram, it should say so in more specific terms. And even if that is the case, it doesn’t justify the dramatic display of state coercion in arresting Durov and holding him in custody.
But merely “providing cryptographic services”—i.e., allowing private encrypted messages—suggests the French warrant is aiming at a broader goal of asserting state control which should concern anybody who cares about free expression and privacy. It goes to the very raison d’être of apps like Telegram—evading government surveillance.
Telegram has been used to organize protests against authoritarian regimes and preserve access to information in the face of blocked news websites. In Hong Kong, it was used to organize a protest marking the anniversary of the Tiananmen Square massacre. Anti-authoritarian activists in Belarus and Myanmar have also relied on the app.
And, of course, the same encryption services that are used by activists—or more benignly, people who do not wish their private communications to be subject to state surveillance—are also sometimes used by criminals. Governments have frequently sought to override privacy protections to seize information from both.
Putin attempted—and failed—to ban Telegram. The app is now used equally by Russian and Ukrainian fighters as well as dissidents in both countries to communicate, despite Telegram being pressured to disclose information about anti-war activists to the Kremlin (Durov fled to France in the first place after refusing to comply).
There is ample evidence that the political theatre of Durov’s arrest is itself the point. Law enforcement can and does infiltrate Telegram, including in Canada, where in June a Quebec man was charged with wilful promotion of hatred against Jewish people for comments he is alleged to have made on Telegram. A few days ago, the American Drug Enforcement Administration announced the arrest of a man who was found to be selling drugs and machine guns on the app.
In the wake of Durov’s arrest, French President Emmanuel Macron tweeted that “France is deeply committed to freedom of expression and communication” and denied that the arrest was politically motivated. But France, like other European countries, has none of the instinctive reticence against regulating speech—and, more broadly, big tech—that the U.S. traditionally has had. France has recently censored information for blatantly political ends: in May, it blocked access to TikTok in its Pacific colony of New Caledonia for two weeks to prevent protesters from organizing over voting reforms.
Indonesian Communication and Information Minister Rudiantara shakes hands with Telegram co-founder Pavel Durov during their meeting in Jakarta, Indonesia, Aug. 1, 2017. Tatan Syuflana/AP Photo.
Not coincidentally, the European tech industry is paltry compared to that in the U.S., having been hollowed out by overzealous regulation and a generally hostile business environment.
This is not to say that the U.S. is immune from the same censorship instincts. The day after Durov was arrested, Mark Zuckerberg publicly confirmed in a letter to Congress that Meta was pressured to censor speech related to the pandemic by the U.S. government. And vice presidential nominee Tim Walz recently mused, incorrectly, that “There’s no guarantee to free speech on misinformation or hate speech and especially around our democracy” (indeed, even lying, or uttering false information unknowingly, is considered protected expression in the U.S.).
We don’t hold internet service provider CEOs personally liable for facilitating crimes over their networks, nor does France. Assigning criminal liability to Durov for merely providing encryption technology, without anything more specific connecting him to crimes, is just as remote and disconcerting.
The geopolitical dimensions of Durov’s arrest suggest that he may be well positioned to strike a deal to avoid trial and further punishment if he is so inclined: Telegram’s servers presumably hold information, perhaps from Russian actors, that the French authorities would very much like access to, and there are rumours that the timing of Durov’s arrest is aimed at scrambling Russian war communications.
But whether or not he pleads guilty, there is a wider threat of bad precedent arising from this whole affair. If Durov can be held personally and criminally culpable by France for providing encryption, on the pretext that bad actors sometimes use it, other Western governments will be emboldened to act similarly. And there will be an immediate chilling effect across all encrypted apps, including Meta’s WhatsApp and Signal, and their activities across the world.
Should Canadians be concerned? Our government has generally taken the approach of targeting platforms, rather than CEOs, for putatively illegal content. Bill C-63, the Online Harms Act, would expand the scope of content that attracts sanctions, with hefty fines for failure to block “content that foments hatred.” But the Online Harms Act was inspired by similar legislative movements in Europe, like Germany’s NetzDG and the EU’s Digital Services Act. If France’s gambit goes ahead, we can expect Canada to eventually follow in its footsteps.