Alex Jones..

Soldato | Joined: 29 Dec 2014 | Posts: 5,781 | Location: Midlands
Well, it looks like the media giants have finally pulled the plug on one of the world's biggest cretins. YouTube, Facebook, iTunes and others have all blocked him from their platforms;

https://www.bbc.co.uk/news/technology-45083684

I absolutely cannot stand the man, or that awful InfoWars company - but I'm not sure blocking his content from all the platforms sits well with me.

When all the media companies group together and decide to do a mass 'nope' towards someone, the amount of power they exercise is enormous, and it comes with practically zero input from any actual authority or judicial system. Instead it seems, to me, to be a culture of intolerance: if you can't deal with it, ban it.

When the big media companies act this way, I feel as though the ability to make up my own mind about something is being taken away and handed to some other agency with its own rules and agendas, as though they want to spoon-feed everyone their definition of all of the nice things.

I think there are some cases where big media companies should remove things - for example if it actually breaks the law of the land [child abuse, terrorism, etc.] - but in many cases it doesn't; it's just a bit nasty or naughty.

Who the hell wants to live in a world where anything controversial or unsavoury (but legal) is immediately blocked?
 
Soldato (OP)
We'll hear plenty more about it (in the form of whinging) but there's nothing he can do, because the law does not dictate that private corporations must provide a platform for everyone.

That's probably the most interesting aspect of the whole thing - the law doesn't dictate that private corporations must provide a platform for everybody. However, if you take a step back, the private corporations [FB, YT, iT, TW, SP, etc] together wield more real power than most legal systems combined. In my eyes they've not only become more powerful than authority - they're actually altering society, by enforcing standards and policies which are more powerful and wider-reaching [globally] than anything any lawmaker could ever manage.

In one swoop, these companies have the power to totally alter what everybody sees <in the whole world>, with no real debate or wider engagement, merely a statement that reads something along the lines of 'due to violations of xyz' or 'we want to provide a safe environment for everyone' etc.
 
Soldato (OP)
It's all very well having a moral position on 'free speech', but when the real world effect is a resounding negative, such as with this chap, then you should really sit back and question the rigidity of your morals.

That applies to any firmly held moral belief. We should be flexible, or else we are just more fundamentalists.

Which is the biggest negative? Living in a world where legal things are censored according to 'terms & conditions' which aren't open to any outside debate whatsoever, with the intention of sanitising all content so that everything is perfectly safe 100% of the time?

Or

Living in a world where people can make up their own minds and exercise freedom of choice, where we accept the fact that sometimes what people say might hurt a little, or even a lot?
 
Soldato (OP)
Last time I checked, these corporations did not control what everybody sees in the whole world. They control what people see on their platforms, and who generates the content on their platforms? Their users.

In practical terms, they collectively have more power to control what people see than any other single entity - Facebook alone has a global user count of 2.3 billion; nobody else has as much control over what people consume, in terms of media.

That video you posted is interesting, because it's similar to something I pointed out ages ago about the weird, backward policies of Google image search and censorship. Essentially, you have people making up rules as to who can see what, according to very strange principles and morals that I don't understand and that make no sense to me.

Some things are OK - Alex Jones = BAD, holocaust denial = OK, soft porn = BAD, Islamic terror images = OK... and so on.

 
Soldato (OP)
So... what was your point again?

I'm pointing out that with Google SafeSearch turned on, you can view pictures, videos and content of Islamic terrorism and beheadings - yet it won't return a single image or anything at all that might relate to soft porn.

My point being that the policy which dictates what SafeSearch deems 'safe' or not is totally broken; as Dowie pointed out, it's mostly down to US culture rather than actual common sense.
 
Soldato (OP)
I don't care that he's right wing.

I care that he makes up lies to fool morons into attacking victims and family members of school shootings.

Then makes money off the back of it.

There are people who are far worse than Alex Jones, who get away with far worse, more dangerous things every single day.

The world is literally rammed with people who are exploiting others, telling lies to make money and saying nasty things to make money. It's the price you pay for having human beings with free will and some degree of freedom of speech - bad things are going to happen, it's life.

Am I defending Alex Jones? No - I think he's an absolute piece of garbage, and what he said about Sandy Hook was unconscionable. But do I think he should be shut down and blocked from all platforms? I'm not 100% sure.

Not because I'm some sort of foaming right-winger, but because I can see a situation further down the line where it becomes standard practice to silence the 'unsavoury' but legal views and opinions of people with questionable outlooks. I think I'd rather have the ability to decide for myself whether I wish to silence them or not, by simply watching them - or not watching them.

Also, by attempting to block or shut him down, you run the risk of making him more popular - look at the amount of publicity he's managed to gain from it. A number of people in this thread have said "no idea who he is"; well, now they know. Whereas if he'd just been left alone to rant to his followers - most of whom are stuck in echo chambers - that might have been the safer option.
 
Soldato (OP)
I haven't read the entire thread and I don't know anything about this fellow. That said, the argument about private companies having no duty to provide a platform is a troubling one with regards to free speech. Between a small number of tech giants there is a virtual cartel on access to information: Google with search and YouTube, Facebook on social media, Apple with its massive user base and Spotify with its huge user base. Arguments about the duty of a private company to provide the platform were rational in a world with large numbers of broadcasters, where plurality of supply was guaranteed. The new tech giants have such a disproportionate share of certain groups that if they blacklist people it undermines free speech. We may not squeal too much about some horrid conspiracy theorist, but what if more legitimate areas of discussion are so constrained? The risks associated with the offendosphere lobbying non-"sanctioned" ideas out of public discourse might be low at the moment, but are surely growing if our media is any fair reflection.

Good post - this is exactly the reason I created the thread.

For me it's not entirely about Alex Jones - I merely use him as the main example. It's about the discomfort around how these gigantic companies, with unprecedented levels of data, information and arguably control, are able to impose restrictions or controls on certain individuals or groups when no laws are being broken and they are under no legal duty to act. The ramifications of this becoming the norm are significant and worrying in my view.

It seems that having an 'off' switch, or the ability to look away or turn the volume down, isn't enough - we seem to be going down the road of handing that ability to third parties with unknown agendas and interests, while taking our own ability to make decisions out of the process, which I think could in the end be undesirable for society.

For me, the only point where those things are desirable is when the content actually breaks the law - as prescribed by the legal system, not by a media company's 'danger' algorithm or terms and conditions.
 
Soldato (OP)
I'm not sure anyone is arguing about whether or not he broke the Ts & Cs. For me it's more about how these companies, which arguably provide connectivity to billions of people, are at the same time exerting more censorship power than any law or government in the world. Essentially they're becoming the decision makers in who gets to see and hear what. Whether it's due to Ts & Cs or community policy is irrelevant; in practical terms, the impact of these policies is far-reaching and borderless - and without any external influence, nobody else has a say.

I'm not for one second going to argue that people like Alex Jones have legitimate viewpoints; I'm just worried about who's calling the shots and how things may play out in future with these companies, which are in effect exercising far more censorship power than any government or authority.

My personal view is that Alex Jones's madness should be protected - that way, when it's out in the open, it can be taken for the trash that it is. Attempting to shut it down risks creating far more hullabaloo than simply allowing it in the first place - maniac's gonna maniac, and all that.

OK, yes - he said crazy, unforgivable things about Sandy Hook, but meh - for me it's far less harmful when he's out in the open spouting it than being forced underground; trying to shut him down essentially reinforces many of the positions he and his supporters take.

idk - just not convinced it's the right approach.
 
Soldato (OP)
So the right approach would be forcing companies to provide him a platform?

Nobody is 'forcing' these companies to provide any platform at all - they do it to make lots of money and gather information on people. That said, these companies now find themselves in the position of exercising far more censorship and real control than any companies or governments that have existed before.

Another problem is that, as far as I can tell, de-platforming people has never really solved a problem in the history of things - there might be a collective sigh of relief when someone swings the ban hammer, but overall I think trying to silence voices like Alex Jones causes more problems than it solves.

Oh please... he and his supporters will never listen to reason; the fact that a lot of them actually believe his crap conspiracies is telling enough of how exceptionally dumb they are. The rest are just trolls who also happen to be racist *****.

There is no longer any legitimate way to reach consensus, it's just violence from here on, it's bloody brilliant.

Of course they won't listen - they're all idiots. But to me it sounds like you have some misplaced belief that the majority of people are all super clever and get their information from good sources; it seems to me that you're angling for perfection and balance in a world full of clowns.

In my eyes it would be far simpler, easier and probably safer in the long run to let him run his mouth off - idiots will idiot, but they'll be doing it out in the open where it's easier to discuss and laugh at - rather than sidelining them, sending them underground and potentially making things more dangerous.
 
Soldato (OP)
No one is actually trying to silence scumbags like Jones.
Google's advertisers simply don't want their ads associated with such disgusting views. Google doesn't want to lose advertising revenue. They also don't want to appear to be providing free services that support the spread of vile misinformation, because it will have a long-term financial cost.

I think this is nonsense.

Put yourself in the shoes of an advertising company: you think you're going to care if your ads appear in unsavoury places? Rubbish - ads are there to be seen and clicked on, because they generate revenue. Nobody is getting salty and upset because the people doing the clicking happen to be total cretins; their clicks are worth just as much as anyone else's.
 
Soldato (OP)
So I ask again - do you think that Youtube should have to allow Alex Jones to post his videos on their website? Especially when he has his own website?

I think if they're going to provide a platform for everybody, then yes - provided those videos don't break the law, I don't see why YouTube should become the arbiter of what people can see or hear. If people use the platform, they should have the ability to make up their own minds. If it's nasty or offensive - turn it off.

It's a strange culture we're headed for, where we seem incapable of ignoring or turning away from things we don't like - instead we have to go after them, shut them down and attempt to silence things we don't agree with. It seems overreactive.
 
Soldato (OP)
And if you're okay with OcUK's stance on swearing but not okay with YT's stance on posting utter trash that precious few advertisers¹ want to be within nine miles of, where d'you draw that particular line?

¹ - advertising being the thing that keeps the lights on at YT after all...

It's not a very good comparison. Simply living within the rules of the forum by not swearing isn't de-platforming someone, or really censoring them - it's just making sure everyone moderates their language and keeps it a family-friendly place - it doesn't stifle or prevent any specific narrative from being spoken.

I'm not going to try it (lol), but I'm pretty sure that, like some other people on here, if I rant and rave about conspiracy theories, nonsense and other such things - provided I don't swear, threaten or spam - I wouldn't be banned. [There are many examples of this with previous posters in other threads]
 
Soldato (OP)
Well, yeah. That would be my point. Which is why I was asking Screeeech :)

See my previous replies. I repeat:

I don't think it's wise to de-platform people who have crazy or toxic views, so long as what they're saying doesn't break the law. It really is that simple; I've said it enough times now - it's a clear line.

Saying something like "their platform, their rules" is fine when you apply it to somewhere like OCUK, because the impact is very limited, especially when the only real rules that are enforced are for things such as bad language, threats or personal insults - it's hardly the same as de-platforming someone because of their crazy political views or conspiracy theories.

I think what you've failed to grasp is that platforms like Facebook, YouTube etc. provide communication and a platform to roughly a third of the planet's population - they're operating at a scale few people could even imagine 15 years ago.

When they decide to step in and start restricting speech based on their own Ts and Cs, I think that could be troublesome further down the line.
 
Soldato (OP)
The "bigger picture" fundamentally is that if you break their rules, they have the right to boot you off their service. You've agreed to those rules when signing up or by continuing to use and pay for the service, and you're free to find a provider that doesn't impose such rules if you so desire.

To be fair, I don't think that's the point in contention. Yes it's their platform, yes they have Ts & Cs, and yes, if you break those Ts & Cs you can be removed from the platform - I don't think anybody is arguing that any of that is false.

The question isn't whether they can enforce those Ts & Cs. It's whether, given the direction in which society is moving, combined with the size, scale and culture of these platforms, it is good, helpful or useful that they've decided to start imposing censorship and de-platforming people who aren't breaking any laws.

My problem is the arbitrary application of these Ts & Cs, which seem to be enforced in some cases but not in others. For example, everyone is more than happy that Alex Jones has been removed, and everyone was also happy when Britain First were removed from Facebook. However Antifa - who have been labelled as domestic terrorists by the US government - are still able to use those platforms, yet as so-called terrorists, in the eyes of the law they're breaking more rules than Alex Jones, who in my mind is just a toxic, mental individual with a big mouth.

I think I agree with Jordan Peterson's analysis: now that these social media companies have stepped in and decided to become the arbiters of what is and what is not allowed, rather than be neutral platforms, they've potentially bitten off more than they can chew. Because once you start censoring one mental group, you have to start censoring them all or it becomes unfair - essentially they'll run out of manpower before they can do it properly, and I think it's a mistake.

I'm worried that we're removing the ability for people to simply turn things off or look the other way. For me it's far better to engage in debate and defeat it with an opposing narrative, out in the open, rather than rely on arbitrarily applied filters and policies to ensure that I'm operating in a 100% trigger-free safe-zone at all times.

I think it's potentially more dangerous to de-platform these groups and push them into closeted echo-chambers, where their insane views won't be challenged or argued against in the way they would be on a large open platform.
 
Soldato (OP)
They are the arbiters of what is and what is not allowed ON THEIR OWN PLATFORM and nowhere else. Alex Jones is not in any way (and rightly so) having his right to free speech taken away from him.

I understand they can allow or deny whatever they wish, arbitrarily, on their own platforms according to their Ts & Cs. That's not the point, though - the point is: is that a good thing?

I'm not arguing that Alex Jones is having his freedom of speech taken away (I don't believe I've said or implied this anywhere), because he can set up his own website and preach nonsense from it. I'm saying that I have concerns about how these rules are applied on these platforms, their effectiveness and their impact on other groups - for example, I assume you're fine with Alex Jones being banned, whilst Antifa post all they like and use those platforms to organise riots and other such things?

What you seem to be proposing is that anyone with a website should be allowed to host any content, even if said content is hate speech and threats. That is ridiculous. The fact is that Facebook, YouTube and Twitter can rightly decide what is and what is not hosted on their own websites (within the law, and in terms of not being discriminatory).

No I'm not - I've said at least four or five times and been very clear: I think that content should be allowed provided it doesn't break the law [which would cover the relevant types of hate speech and threats]. If Alex Jones has broken the law then he should be blocked/removed - but as far as I can tell he hasn't, though I'd happily stand corrected if he has.

I don't class OCUK censoring swearwords as a competing argument. Merely keeping things family-friendly by removing swearwords isn't de-platforming people - there have been many crazy people filling threads with conspiracy-theory nonsense, and even posts containing racism from people like a certain Mr Wilson, but they've not been removed.
 
Soldato (OP)
Not at all. Is it even true though? Do Antifa publicly call for and organise riots on these platforms?

If so, I would have thought publicly calling for people to break the law and incite violence would be against their terms, so they should be banned also.

Yeah, there are loads of them, many of which are country-specific - but they do contain information about marches and protests, where they're occurring, etc. (once you dig through all the other stuff). They've also been the subject of front-page global news on a number of occasions, so it's not exactly 'under the radar', so to speak...

https://twitter.com/antifa_riot?lang=en

https://twitter.com/antifausa?lang=en

Of course, we're all able to read between the lines - when they say a protest or march, we all know exactly what they're angling for (a good punch-up).

And so if you think that Antifa or BLM should be banned because they're using these platforms to propagate or incite nastiness, as they evidently do (just like Alex Jones), then surely you see my point: these rules and Ts & Cs aren't being applied fairly, aren't being applied according to any coherent framework or authority - they seem to be applied arbitrarily. Which they can do, because those platforms are private, but that doesn't seem very 'fair', does it?

Isn't it also interesting how many of these hard-right organisations (Alex Jones, Britain First, etc.) get de-platformed, while the hard-left ones seem free to go on doing as they please, even when labelled as domestic terrorists?

Even more interesting is how those large social media and tech companies are jam-packed to the rafters with hard-left employees (I work in tech, frequently in Los Angeles, San Jose and that area - I have many friends at Facebook, Twitter, Google etc. - and they're very left-leaning organisations, to the point where I think it's become unbalanced). Maybe, just maybe, that has some influence on how those Ts & Cs are enforced...?

Then you haven't been paying attention. He got banned from Apple and Facebook for breaking their hate speech policies.

It isn't as if they just banned him straight away either. He made repeated violations and had warnings.

https://www.cnbc.com/2018/08/06/apple-pulls-alex-jones-infowars-podcasts-for-hate-speech.html

I repeat: did Alex Jones break the law? Was he arrested and charged with any offences? Because as far as I can tell he's only ever been arrested once in his life - a misdemeanour for using a megaphone without a permit.

What? Alex Jones would have been banned on OCUK in an instant if he posted here what he posts on other social media platforms.

Jones was banned from YouTube for cyberbullying and harassment. You would get banned from here for the same thing, I'm sure.

It's difficult to know, and it's why the whole comparison of OCUK to a social media platform like Facebook is a poor one. OCUK isn't really compatible with the sorts of things that Alex Jones does or says - he's basically a TV show that screams nonsense and sells snake oil, and you can't really do that on here. The best he could do would be to post nasty things, and as I've pointed out, many people have posted nasty things on here and not been banned, including racist and homophobic things - they're still here, they just didn't use swear words or threaten individual members.
 
Soldato (OP)
Ah I see, so you have a preconceived and assumed conclusion that these protests and marches are just riots. That would open up an entirely different can of worms for YouTube/Facebook etc. Are they the ones to decide whether a protest will end up in violence or not?

I agree with you though, if those accounts break their terms of service, they should be banned too.

If a group is using a platform to organise "events" which frequently turn violent and end up in riots (regardless of how they're labelled on the platform), then it only takes an ounce or two of due diligence to conclude what's going on: use of the platform to organise and create disturbances and unrest. Again, groups like Antifa and BLM frequently make front-page news globally with their protests, riots and disturbances - it's not exactly unheard of for these groups to be smashing things up on the streets, just as it is for the right-wing neo-nazis, yet it always seems to be the latter that gets banned.

Yes, it is interesting that well-educated professionals who work for successful technology companies are more left-leaning. Very.

But that doesn't make the application of the rules fair on society - it means they're enforcing their worldview on people, applying their own narrative and shutting down anybody with an opposing view because it supposedly breaks Ts & Cs. This is evident when other groups, which are arguably just as bad but share a similar worldview, are allowed to continue.

Maybe it's true that lots of clever people work for the big tech companies, but does that mean society should be happy to hand the rules of censorship to an untouchable West Coast elite, with no real say in the application of those rules? Just because they all have an IQ over 9000, we should just leave everything to them, right?

You brought up hate speech. He was banned for hate speech, even after warnings etc.

I don't agree with you that the litmus test for being banned from a privately owned social media platform is breaking the law. That is ridiculous.

I'll agree there's probably no perfect system at this point, and maybe having nothing other than 'the law' by itself wouldn't be completely ideal, but in my opinion it would be a decent starting point, because it is to a degree impartial - and the lawmakers themselves [the government] are accountable to the people. The tech companies can pretty much do as they please with their private infrastructure and are largely accountable to nobody; they can enforce any rules they like, at any time, to suit any narrative they please - maybe they even have political agendas? Who knows.

You are disingenuously downplaying how bad Alex Jones actually is. Without a shadow of a doubt, I think he would be banned from here if he continually posted the same things that he has done in the past. Perhaps a mod could settle this for us?

I think Alex Jones is absolutely awful - I'm not downplaying him at all, I think he's one of the worst people I've ever seen in a video. However, in the final analysis, I'd rather be left to make my own mind up about him than have San Jose make it up for me, or be forced to go trawling through some hell-hole of maniac websites to find his content because I want to have a good laugh.

I think that if someone came in here (as they have done before) and started talking about conspiracy theories, or how Sandy Hook was fake or whatever, the thread would most likely just get locked because it would descend into utter chaos and toxicity on all sides. I could also understand it, because OCUK is a small business that relies on customers buying computer stuff; I doubt they'd want their shop forum hosting threads full of utter chaos, because that could harm their business.

That would obviously be censorship, but it could perhaps be argued that it's more a method of self-protection, to prevent things being said on its web forum from hurting its sales, because its primary reason for existence is to sell computer hardware, not provide a social media platform.

There's a gulf of difference between a company like OCUK, which sells computer hardware and runs an internet forum of 130k members, and a global social media company which exists to provide a platform, with 2-3 billion users, that houses entire political campaigns, global news media, celebrity stuff - everything under the sun from all walks of life, for almost everybody. The difference is incomparable, which is why I think comparing OCUK to the large social media companies is totally facile and not really helpful to either side of the debate.
 
Soldato (OP)
That still opens up an even bigger can of worms though, as then the companies themselves have to decide whether a march or protest that is being organised will become violent or has definite violent intent. The difficulty, also, is that the described intention is often ultimately noble. They are against racism, white supremacy, fascism, sexism, antisemitism etc., all of which are demonstrably bad things. I don't agree that anything should be solved with violence, but that leaves it up to the companies themselves to work out whether there is true intent to cause and incite violence.

I think you're being naive if you think Antifa are truly against all of those things - some of them are, but some of them just want to smash things up and have a good old riot. As for decision making: if the US authorities have labelled them a terrorist organisation, then as far as I'm concerned the decision has been made already.

What do you mean, their own narrative? As it is their own company, they can apply their own "narrative" as much as they want to their own platform. You seem very confused. You seem to be genuinely advocating for there to be some government rule to force social media companies to host anything and everyone and not be allowed to ban people who violate their terms of service. That truly would be a violation of their own free speech and rights.

You also have to remember that the "narrative" is set against lies, bullying, racism, hate speech, white supremacy, fascism, sexism, antisemitism etc. This is a good thing.

By narrative, I mean a hard-left worldview.

Of course it's their own company - and yes, they can apply their own narrative as much as they want on their own platform. I'm not talking about anybody being forced to do anything; I just think that a platform such as Facebook, YouTube or Twitter should attempt to be as neutral as possible, and that it is in fact a mistake for them to impose Ts & Cs which favour any specific narrative or worldview, because then it's no longer a neutral platform and the Ts & Cs won't be applied fairly to everyone. If you start to ban one group of people, you have to go after all of them, and it can't be done properly at Facebook scale.

Simply saying "they're against lies, bullying, racism, hate speech and white supremacy, so we should just let them swing the hammer at anybody who's part of those groups" is problematic, because as soon as someone from Antifa or Black Lives Matter starts engaging in hate speech or bullying, or even arguably racism (BLM), they get let off scot-free - because the people swinging the ban hammer aren't applying the rules in a uniform manner; they're skewed towards their own prejudice.

Given the choice, I would prefer for us to be governed by intelligent and well educated people, yes.

Intelligent and well-educated people are one thing, but when they come from a society which is so totally backward and broken (American society), I'd question a lot of the policies and practices which these companies may implement in their platforms.

Like Google SafeSearch - with it turned on, you can find videos of people being killed, Islamic State beheading people, murder, killing, hatred - yet not a single bare breast or backside.

All developed by super-clever people with totally backward and broken views. It's nonsense - yet it's the same for everyone if you turn it on.

Ultimately (and I point to Werewolf's insightful post about it again), they are comparable because a large part of the decisions are based on making money. Hate speech and racism etc. are fortunately not popular and will potentially affect advertising revenue for them, in the same way it would likely affect OCUK's sales if this forum was associated with that sort of thing.


What is your ultimate proposal to combat this? Do you think that by law, private companies or individuals with social media websites should be forced to host anything, as long as it is within the law?

I don't think anybody should be forced to do anything. Instead, people using these global social media platforms should learn to deal with things they find offensive without relying on some sort of higher authority to sanitise everything for them - look the other way, turn it off, unsubscribe, change the channel, don't click on it, or just deal with it.

It reminds me of that episode of Black Mirror (Arkangel) where a child gets fitted with an implant which can filter out dangerous, disturbing or upsetting images and sounds from her perception in real life - which at first seems great, but ultimately ends up crippling her and becomes a massive problem, because she's no longer able to make her own mind up about anything; it's all being done by a third party.
 
Soldato (OP)
I just discovered that a left-leaning YouTuber I like, cultofdusty, has been banned from several platforms, because the alt-right types he upsets have been trawling through his posting history and reporting anything remotely questionable. Apparently he was banned from Twitter for referring to someone as a cowardly bitch.

Now, I don't think they should have banned him, and I don't think they can justify it within their own Ts & Cs like they can with far more egregious violators, but they are perfectly within their rights to do so...

To be fair, Twitter is really, really weird when it comes to accounts being suspended. A friend of mine had their account suspended for calling someone a **** when they smashed up a new iPhone for fun with a hammer, yet other people seem to post all manner of horror without any repercussions at all.
 