Thomas
@aviationdoctor.eth
1/ France is once again in the spotlight for unfortunate reasons — this time, the arrest of Telegram’s CEO, Pavel Durov. I’ve done a bit of digging among French sources to figure out what’s happening. A 🧵
17 replies
31 recasts
65 reactions

Thomas
@aviationdoctor.eth
2/ Durov was arrested at Le Bourget airport near Paris last night after flying there from Baku (Azerbaijan) on his business jet. He was accompanied by his bodyguard and personal assistant, and was planning to have dinner in Paris that evening.
2 replies
0 recast
9 reactions

Thomas
@aviationdoctor.eth
3/ The arrest was motivated by an outstanding search warrant against Durov. It is unclear whether Durov was aware of the warrant before deciding to fly to France.
1 reply
0 recast
5 reactions

Thomas
@aviationdoctor.eth
4/ The warrant was issued by a judge based on a request by the “Minors Office” (OFMIN). In that sense, the arrest is not arbitrary, political, or unlawful.
1 reply
0 recast
7 reactions

Thomas
@aviationdoctor.eth
5/ OFMIN is a 40-strong specialized police unit reporting to the Ministry of the Interior, created in November 2023 to investigate online crimes against minors.
1 reply
0 recast
6 reactions

Thomas
@aviationdoctor.eth
6/ As far as I can tell, OFMIN is not accusing Durov himself of crimes against minors. Instead, Durov is being accused of either refusing to filter Telegram for child sexual abuse material (CSAM), or refusing to cooperate with French police on specific CSAM investigations involving Telegram, or both.
1 reply
1 recast
10 reactions

Thomas
@aviationdoctor.eth
7/ The CSAM problem is real and widespread. The French OFMIN received 318,000 reports in 2023, up from 227,000 in 2022. Not all those involve Telegram, of course.
1 reply
0 recast
6 reactions

Thomas
@aviationdoctor.eth
8/ 90% of those reports actually originate from the U.S. nonprofit National Center for Missing & Exploited Children (NCMEC), and are forwarded to OFMIN only because either a perpetrator or a victim used an IP geolocated in France.
1 reply
0 recast
8 reactions

Thomas
@aviationdoctor.eth
9/ There are precedents for platform providers attempting to moderate CSAM. Most famously, Apple engineered an iCloud photo scanning tool designed to detect CSAM.
1 reply
0 recast
6 reactions

Thomas
@aviationdoctor.eth
10/ That project launched in August 2021 and was terminated two years later. Apple determined that there was no way to scan for CSAM without making unacceptable tradeoffs to user privacy, and that privacy outweighed the benefits of scanning.
2 replies
0 recast
4 reactions

Cassie Heart
@cassie
The "determination" was more that the media was hellbent on not letting it happen than that Apple decided it was a bad tradeoff. Apple's approach was in every way _better_ than the existing (and still current) approach of iCloud Photos being scanned server-side, with Apple holding the encryption keys for the photos. If you want full encryption on iCloud Photos, you have to enable Advanced Data Protection, which nearly nobody does, and which comes with usability drawbacks most people would not be willing to deal with. The more usable alternative died under friendly fire of misinformation.
0 reply
0 recast
1 reaction