> What are you on about?

ITS A MATRIX ATTACK!!!!!!!!!!
> Yes you do.

Why are you shouting about matrix attacks? What does that have to do with the discussion? You sound absolutely off your rocker.
> I'm always sceptical when it comes to reporting, especially when it comes to technology. Do we know whether Telegram refuses to take any steps WRT the public or the private messaging part of its platform? If it's the former then there's an argument that they're not doing enough to remove publicly available illegal content; if it's the latter then there's really not much they can do.

Some steps. If reporting is true, Telegram refuses to take any steps. Exactly what steps is way above my pay grade. At least Meta and Twitter take some steps to remove such content, but IMO they should be working much harder at it. Platform owners shouldn't be given a pass on this because they claim not to be publishers. Legislation needs to be updated, as the likes of Section 230 is so out of date for what the internet has become.
> Nope I think that is a secondary reason for why you support this and like I said the child porn stuff is a Trojan horse. I see you are trying to reframe my post so you could include a quippy ending remark.

Trying to prevent child porn being distributed and expecting these platforms to aid authorities in identifying perpetrators is irrational? Tbh your post sounds irrational.
Russian state has been using Telegram as they commit war crimes in Ukraine. This guy is in Putin's pocket. Zero sympathy for him.
SOURCE:

> The app is not a member of either the National Centre for Missing and Exploited Children (NCMEC) or the Internet Watch Foundation (IWF) - both of which work with most online platforms to find, report and remove such material.
> [...]
> The BBC understands that NCMEC has repeatedly asked Telegram to join to help tackle child sexual abuse material (CSAM) but it has ignored requests.
> Telegram also refuses to work with the Internet Watch Foundation, which is the UK's equivalent of NCMEC.
> An IWF spokesperson said: "Despite attempts to proactively engage with Telegram over the last year, they are not members of the IWF and do not take any of our services to block, prevent, and disrupt the sharing of child sexual abuse imagery."
> By not being an active part of IWF or NCMEC, Telegram is not able to proactively find, remove or block confirmed CSAM which is categorised and added to lists compiled by the charities.
> The IWF said that the company did remove CSAM once material was confirmed, but said it was slower and less responsive to day-to-day requests.
> The BBC contacted Telegram for comment about its refusal to join the child protection schemes and received a response after publication which has been included.
> Telegram is also not part of the Take It Down programme that works to remove so-called revenge porn.
> IMO if what they're being asked to report is content that's meant to be private then they should refuse/ignore such requests; if that content is public, on the other hand, it's fair game, but then the only real thing the authorities would need is details, if any, of who posted it.

According to reports, they refuse/ignore requests to join or report to organisations such as the IWF (see the SOURCE quote above).
> That is my guess.

The IWF and other organisations keep lists of CSAM that they are made aware of via various methods (actively looking for it, stuff reported to them, etc.). They make hashes of the images/videos, and these hashes are used to detect CSAM more easily online, including by other platforms.

I think part of the issue is that Telegram refuses to work with these organisations to help them collate, and therefore hash, new CSAM even when the CSAM is reported to Telegram, i.e. made public to them. These new hashes could then be used by other platforms to identify and remove this CSAM from their own platforms, so it could be seen as Telegram being obstructive in these instances.

This is my thinking behind it but, until more info is released by the French authorities, we can't be sure exactly what they are looking at.
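For anyone curious, the hash-list matching described above can be sketched in a few lines of Python. This is purely a hypothetical illustration (the block-list entry here is just the SHA-256 of the bytes `b"test"`, made up for the demo); real deployments such as Microsoft's PhotoDNA or Meta's PDQ use perceptual hashes that survive resizing and re-encoding, whereas a plain cryptographic hash like SHA-256 only catches byte-identical copies:

```python
import hashlib

# Hypothetical block list: in practice these entries would come from a list
# compiled by an organisation such as the IWF or NCMEC and distributed to
# member platforms. This entry is simply the SHA-256 of b"test" for the demo.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of an upload's bytes as lowercase hex."""
    return hashlib.sha256(data).hexdigest()

def is_on_block_list(data: bytes) -> bool:
    """True if the upload's hash matches an entry on the shared list."""
    return sha256_hex(data) in KNOWN_HASHES

print(is_on_block_list(b"test"))      # True: matches the demo entry
print(is_on_block_list(b"harmless"))  # False: not on the list
```

The catch, of course, is that a platform can only do this matching if it has access to the shared list (hence membership of the IWF/NCMEC schemes) and can see the content at upload time, which is exactly where end-to-end encrypted private messaging complicates things.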
IMO if what they're being asked to report is content that's meant to be private then they should refuse/ignore such requests; if that content is public, on the other hand, it's fair game, but then the only real thing the authorities would need is details, if any, of who posted it.

Basically, private content should remain private.
Nope, I think that is a secondary reason for why you support this and, like I said, the child porn stuff is a Trojan horse. I see you are trying to reframe my post so you could include a quippy ending remark.

You bringing this up unprompted seems to show that the association with Russia has influenced your support of France's actions. I have no proof, but I reckon that is your main reason for supporting this.
> No. Targeted requests directed at people suspected of criminal activity are perfectly fine, but we're not talking about that.

Your emails are private, aren't they? Your telephone calls and text messages are private. Are you saying the State shouldn't be able to gain access to any of this if they suspect you of criminal activity? With a court order, and as long as the law is applied correctly, you have to allow the State to gain access to such content. Sending illegal content by private message shouldn't put it beyond the law.
> Yes I know, which is why I used the words “secondary” and “main”. To differentiate between the two reasons. Try actually reading what I wrote.

People are allowed more than one reason, you know.
> I’m not sure why you decided to monologue your love for giving the state more control (as that had nothing to do with my point) but if you feel so strongly about it you can always book yourself a one-way ticket to China, since you want to live in an authoritarian country. Just to get a taste of what it’s like.

Yes, I think Telegram is being used by the Russian state to aid its war in Ukraine. That isn't the reason I think these platforms have gone way past what the protections of Section 230 and similar legislation were designed for. Your phone can be tapped, your emails read, your browser history searched with a court order. Why should this platform be any different? And if these platforms refuse to play ball there should be consequences for the owners. No social media company should have more power than sovereign states.
The main, in fact AFAIK the only, way an anonymous online person would come to be suspected of criminal activity is if you monitored all anonymous people. Think about it: to suspect something, to think or believe something to be true or probable, you need some sort of evidence, because there should be a presumption of innocence.

If you suspect someone of criminal activity then you either already know who that person is or what the criminal activity is; in other words, the only way you can know that is if you've heard or seen something that led you to think or believe it. The thinking or believing something to be true or probable comes before the request to listen to/read private content, not after.

Edit: Put it this way: what probable cause do you (the authorities) have for wanting to access the private content of an anonymous online person?
> Right, but how do you know that they've done that?

If you know that a user has, say, sent illegal imagery to another user, then there is your probable cause.
> Implicating them in a crime? Doubtful. Is it safe to assume, as that's the only example you can think of, that you agree there is no probable cause, that there are no other reasons for the authorities to access the private content of an anonymous online person? That the only reason they'd need a service provider to supply them with the details of a user is if they had already breached that person's right to privacy?

Well it's at least a two-person event, right? Maybe the other person grasses them up.
> I thought I'd already addressed that point when I said this...

Also, Telegram isn't just a DM platform; there are channels with thousands of users in them, some likely with illegal content. If law enforcement manage to get into such a channel then they can monitor it and they'll have probable cause on everyone who takes part. Do you agree that at that point a platform should be handing over all the information they have on those people? And that isn't even taking into account that channels with illegal content aren't being shut down by the platform.
> On facebook some local idiot on a city page was threatening me with physical violence a few months back.

You would think so; I reported blatant racism on a post the other day and got the response back that it met their terms and conditions.
> CIA funded that, or at least the US gov.

Presumably they will be arresting the creators of the TOR network too.