Founder of Telegram arrested in France

Some steps. If the reporting is true, Telegram refuses to take any steps. Exactly what steps is way above my pay grade. At least Meta and Twitter take some steps to remove such content, but IMO they should be working much harder at it. Platform owners shouldn't be given a pass on this just because they claim not to be publishers. Legislation needs to be updated, as the likes of Section 230 are so out of date for what the internet has become.
I'm always sceptical when it comes to reporting, especially on technology. Do we know whether Telegram refuses to take any steps WRT the public or the private messaging part of its platform? If it's the former, there's an argument that they're not doing enough to remove publicly available illegal content; if it's the latter, there's really not much they can do.
 
Trying to prevent child porn being distributed and expecting these platforms to aid authorities in identifying perpetrators is irrational :rolleyes: Tbh your post sounds irrational.
Nope, I think that is a secondary reason for why you support this and, like I said, the child porn stuff is a trojan horse. I see you are trying to reframe my post so you could include a quippy ending remark.

Your bringing this up unprompted seems to show that the association with Russia has influenced your support of France's actions. I have no proof, but I reckon that is your main reason for supporting this.
Russian state has been using Telegram as they commit war crimes in Ukraine. This guy is in Putin's pocket. Zero sympathy for him.
 
I'm always sceptical when it comes to reporting, especially on technology. Do we know whether Telegram refuses to take any steps WRT the public or the private messaging part of its platform? If it's the former, there's an argument that they're not doing enough to remove publicly available illegal content; if it's the latter, there's really not much they can do.


According to reports, they refuse/ignore requests to join or report things to organisations such as the IWF and other organisations:
The app is not a member of either the National Centre for Missing and Exploited Children (NCMEC) or the Internet Watch Foundation (IWF) - both of which work with most online platforms to find, report and remove such material.

....


The BBC understands that NCMEC has repeatedly asked Telegram to join to help tackle child sexual abuse material (CSAM) but it has ignored requests.

Telegram also refuses to work with the Internet Watch Foundation, which is the UK’s equivalent of NCMEC.

An IWF spokesperson said: “Despite attempts to proactively engage with Telegram over the last year, they are not members of the IWF and do not take any of our services to block, prevent, and disrupt the sharing of child sexual abuse imagery."

By not being an active part of IWF or NCMEC, Telegram is not able to proactively find, remove or block confirmed CSAM which is categorised and added to lists compiled by the charities.

IWF said that the company did remove CSAM once material was confirmed but said it was slower and less responsive to day-to-day requests.

The BBC contacted Telegram for comment about its refusal to join the child protection schemes and received a response after publication which has been included.

Telegram is also not a part of the TakeItDown programme that works to remove so-called revenge porn.
SOURCE


I am cynical and suspicious of people using the terrorism and child-protection arguments to push agendas; however, in the case of Telegram, they really don't help themselves or the wider E2EE argument with respect to surveillance.



EDIT:

Trying to prevent child porn being distributed and expecting these platforms to aid authorities in identifying perpetrators is irrational :rolleyes: Tbh your post sounds irrational.

And this is where I have a problem with it... Arguments like the one above skirt close to accusing people of enabling child abuse just because they argue against surveillance of E2EE.
 
According to reports, they refuse/ignore requests to join or report things to organisations such as the IWF and other organisations:
IMO if the content they're being asked to report is meant to be private, then they should refuse/ignore such requests; if that content is public, on the other hand, it's fair game, but then the only real thing the authorities would need are details, if any, of who posted it.

Basically private content should remain private.
 
IMO if the content they're being asked to report is meant to be private, then they should refuse/ignore such requests; if that content is public, on the other hand, it's fair game, but then the only real thing the authorities would need are details, if any, of who posted it.

The IWF and other organisations keep lists of CSAM that they are made aware of via various methods (actively looking for it, material reported to them, etc.). They make hashes of the images/videos and these are used to detect CSAM more easily online, including by other platforms.

I think part of the issue is that Telegram refuses to work with these organisations to help them collate and therefore hash new CSAM, even when the CSAM is reported to Telegram, i.e. made public to them. These new hashes could then be used by other platforms to identify and remove the same CSAM from their own platforms, so Telegram could be seen as being obstructive in these instances.
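To make the hash-list idea concrete, here is a minimal sketch in Python of how a platform could check uploads against a shared blocklist. This is an illustration only: I'm assuming a plain SHA-256 blocklist, whereas real systems (e.g. Microsoft's PhotoDNA, used with NCMEC lists) use perceptual hashes so that re-encoded or slightly altered copies still match. The single example entry below is just the SHA-256 digest of the string "test", not a real list entry.

```python
import hashlib

# Hypothetical blocklist of hex digests, as would be compiled and
# distributed by organisations like the IWF or NCMEC.
# This example entry is simply SHA-256("test").
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_csam(data: bytes) -> bool:
    """Check an upload against the shared blocklist before publishing it."""
    return sha256_of(data) in KNOWN_HASHES

# A platform would run this check at upload time and block matches:
print(is_known_csam(b"test"))   # True  - digest is on the example list
print(is_known_csam(b"other"))  # False - not on the list
```

The point of the shared list is exactly what the post describes: once one member organisation hashes newly confirmed material, every participating platform can block re-uploads automatically without ever exchanging the material itself.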

This is my thinking behind it but, until more info is released by the French authorities, we can't be sure exactly what they are looking at.
 
The IWF and other organisations keep lists of CSAM that they are made aware of via various methods (actively looking for it, material reported to them, etc.). They make hashes of the images/videos and these are used to detect CSAM more easily online, including by other platforms.

I think part of the issue is that Telegram refuses to work with these organisations to help them collate and therefore hash new CSAM, even when the CSAM is reported to Telegram, i.e. made public to them. These new hashes could then be used by other platforms to identify and remove the same CSAM from their own platforms, so Telegram could be seen as being obstructive in these instances.

This is my thinking behind it but, until more info is released by the French authorities, we can't be sure exactly what they are looking at.
That is my guess.

IIRC all the "social media" platforms over a certain size are expected to have policies and staffing in place to deal with certain legal obligations, and one of the most basic ones, common across pretty much the entire western world, is enforcement of the laws regarding CSAM.
So if Telegram is not taking action on CSAM when it is technically able to (i.e. when it's posted in one of their "public" areas) once they've been made aware of it, then law enforcement are likely to take a very dim view of it. Likewise, if law enforcement have a warrant or similar for a user's IP address or other information that the service holds and is technically able to provide, but the service just ignores it, then the bosses are going to get into trouble.
 
IMO if the content they're being asked to report is meant to be private, then they should refuse/ignore such requests; if that content is public, on the other hand, it's fair game, but then the only real thing the authorities would need are details, if any, of who posted it.

Basically private content should remain private.

Your emails are private, aren't they? Your telephone calls and text messages are private. Are you saying the State shouldn't be able to gain access to any of this if they suspect you of criminal activity? With a court order, and as long as the law is applied correctly, you have to allow the State to gain access to such content. Sending illegal content by private message shouldn't put it beyond the law.
 
Nope, I think that is a secondary reason for why you support this and, like I said, the child porn stuff is a trojan horse. I see you are trying to reframe my post so you could include a quippy ending remark.

Your bringing this up unprompted seems to show that the association with Russia has influenced your support of France's actions. I have no proof, but I reckon that is your main reason for supporting this.

People are allowed more than one reason you know. If people are suspected of breaking the law in this country then as long as the law is followed no platform should be beyond the reach of the law here.

Yes, I think Telegram is being used by the Russian state to aid its war in Ukraine. That isn't the reason I think these platforms have gone way past what the protections of Section 230 and similar legislation were designed for. Your phone can be tapped, your emails read, your browser history searched with a court order. Why should this platform be any different? And if these platforms refuse to play ball, there should be consequences for the owners. No social media company should have more power than sovereign states.
 
The UAE aren't very happy with France and have pulled a $20 billion contract for French army aircraft.

This whole thing is very odd and very French, but I can see both sides of the coin; I'm still on the fence as to where I sit.
 
Some steps. If the reporting is true, Telegram refuses to take any steps. Exactly what steps is way above my pay grade. At least Meta and Twitter take some steps to remove such content, but IMO they should be working much harder at it. Platform owners shouldn't be given a pass on this just because they claim not to be publishers. Legislation needs to be updated, as the likes of Section 230 are so out of date for what the internet has become.

You would think so. I reported blatant racism in a post the other day and got the response back that it didn't breach their terms and conditions.
 
Your emails are private, aren't they? Your telephone calls and text messages are private. Are you saying the State shouldn't be able to gain access to any of this if they suspect you of criminal activity? With a court order, and as long as the law is applied correctly, you have to allow the State to gain access to such content. Sending illegal content by private message shouldn't put it beyond the law.
No. Targeted requests directed at people suspected of criminal activity are perfectly fine, but we're not talking about that.

The main, in fact AFAIK the only, way an anonymous online person would come to be suspected of criminal activity is if you monitored all anonymous people; think about it. To suspect something, to think or believe something to be true or probable, you need some sort of evidence, because there should be a presumption of innocence.

If you suspect someone of criminal activity then you either already know who that person is or what the criminal activity is; in other words, the only way you can know that is if you've heard or seen something that led you to think or believe it. The thinking or believing something to be true or probable comes before the request to listen to/read private content, not after.

Edit: Put it this way: what is the probable cause that you (the authorities) have for wanting to access the private content of an anonymous online person?
 
People are allowed more than one reason you know.
Yes, I know, which is why I used the words “secondary” and “main”: to differentiate between the two reasons. Try actually reading what I wrote.


Yes, I think Telegram is being used by the Russian state to aid its war in Ukraine. That isn't the reason I think these platforms have gone way past what the protections of Section 230 and similar legislation were designed for. Your phone can be tapped, your emails read, your browser history searched with a court order. Why should this platform be any different? And if these platforms refuse to play ball, there should be consequences for the owners. No social media company should have more power than sovereign states.
I’m not sure why you decided to monologue your love of giving the state more control (as that had nothing to do with my point), but if you feel so strongly about it you can always book yourself a one-way ticket to China, since you want to live in an authoritarian country. Just to get a taste of what it’s like.
 
No. Targeted requests directed at people suspected of criminal activity are perfectly fine, but we're not talking about that.

The main, in fact AFAIK the only, way an anonymous online person would come to be suspected of criminal activity is if you monitored all anonymous people; think about it. To suspect something, to think or believe something to be true or probable, you need some sort of evidence, because there should be a presumption of innocence.

If you suspect someone of criminal activity then you either already know who that person is or what the criminal activity is; in other words, the only way you can know that is if you've heard or seen something that led you to think or believe it. The thinking or believing something to be true or probable comes before the request to listen to/read private content, not after.

Edit: Put it this way: what is the probable cause that you (the authorities) have for wanting to access the private content of an anonymous online person?

If you know that a user has, say, sent illegal imagery to another user, then there is your probable cause. You should then be able to gain access to everything the platform has on that user, their IP etc. Or say that person has been in contact with a known terrorist who is suspected of planning an attack; that would likely be enough in many jurisdictions to obtain a warrant.

It seems this guy refuses to moderate any content or work with the authorities when they want details on certain users. I'm not suggesting authorities should be able to read content at will, only with a court-authorised warrant.
 
Yes, I know, which is why I used the words “secondary” and “main”: to differentiate between the two reasons. Try actually reading what I wrote.



I’m not sure why you decided to monologue your love of giving the state more control (as that had nothing to do with my point), but if you feel so strongly about it you can always book yourself a one-way ticket to China, since you want to live in an authoritarian country. Just to get a taste of what it’s like.

LOL, the old "go live in China" line. The state can already read your emails and text messages and listen to your calls with a warrant. Why should messaging apps be any different? Unless you don't think the state should be able to do any of that, in which case this conversation is pointless.
 
Right, but how do you know that they've done that?

Well, it's at least a two-person event, right? Maybe the other person grasses them up.

Also, Telegram isn't just a DM platform; there are channels with thousands of users in them, some of which likely contain illegal content. If law enforcement manage to get into such a channel, they can monitor it and they'll have probable cause on everyone who takes part. Do you agree that at that point a platform should be handing over all the information it has on those people? And that isn't even taking into account that channels with illegal content aren't being shut down by the platform.
 