Social media bosses could be liable for harmful content

Look, first things first ... yes, it's the Grauniad, but let's move on from that.

Social media executives could be held personally liable for harmful content distributed on their platforms, leaked plans for a long-awaited government crackdown obtained by the Guardian reveal.

and

The regulator – likely initially to be Ofcom, but in the longer term a new body – will have the power to impose substantial fines against companies that breach their duty of care and to hold individual executives personally liable.

This sounds like a good idea to me. Thoughts?
 
Duty of care. That's the thing, where do you draw the line? And what exactly is harmful on social media?

Sticks and stones may break my bones, but mean words may land you in prison... I think it might be banned here, but "man the fudge up" or stfu.
 
But who defines what 'harmful content' is? Illegal content fair enough. But something doesn't become 'harmful' simply because a few idiots on the internet take umbrage with it. Let's not forget, in this country we have the police chasing people down purely because they retweeted something that a few cretins didn't like. It wasn't illegal, it wasn't a hate crime, it wasn't even them that originally posted it yet the police thought it necessary to track them down because they wanted to 'check his thinking'.
 
They need to be liable for censoring stuff that's clearly not illegal as well. Censoring legal content on ideological grounds while leaving up things that are clearly illegal is the way they seem to operate at the moment; they are a law unto themselves. TBH the sooner there is some kind of Bill of Rights for the internet the better, gone are the days of meetings in town halls.

All the hysteria is about Russia interfering in elections, when the biggest influencers these days are clearly the social media corporations themselves, through filters, banning, censorship, de-ranking, de-monetising etc. under the guise of protecting elections. It's funny how the left always claim to be fighting that which they are guilty of.
 
But who defines what 'harmful content' is? Illegal content fair enough. But something doesn't become 'harmful' simply because a few idiots on the internet take umbrage with it. Let's not forget, in this country we have the police chasing people down purely because they retweeted something that a few cretins didn't like. It wasn't illegal, it wasn't a hate crime, it wasn't even them that originally posted it yet the police thought it necessary to track them down because they wanted to 'check his thinking'.

Couldn't have said it better myself.
 
Only as far as providing suitable systems for content to be reported and moderated in a timely fashion. People need to get a grip and realise technology isn't magic. Also, often there are underlying agendas with these proposals, such as making a case for further-reaching control of the internet, rather than genuine concerns about inappropriate content :(
 
I personally don't think it's fair to place all the responsibility on the sites hosting the content/facilitating the discussion.

It's like blaming the atmosphere for carrying the sound waves made when we speak (OK it's not really).

I think we can realistically expect firms to keep making progress in this regard. I don't think we can expect them to solve the problem overnight.

It's like saying we'll fine physicists every year they fail to make a working fusion reactor. They're making progress - that's all we can ask.
 
Owners of physical spaces that are open to the public have a degree of responsibility over what happens in those spaces. I don't see why virtual spaces are any different in that regard.

"We're just a social conduit" wouldn't work for a pub landlord if punters were openly supporting terrorism, for example.
 
Owners of physical spaces that are open to the public have a degree of responsibility over what happens in those spaces. I don't see why virtual spaces are any different in that regard.

"We're just a social conduit" wouldn't work for a pub landlord if punters were openly supporting terrorism, for example.

Same thing though - if the owner of a social media site was aware of it and didn't do anything, then sure, they'd be held accountable even now; likewise if the content was reported and nothing was done about it. But the systems are there to make the company aware of such content.
 
Owners of physical spaces that are open to the public have a degree of responsibility over what happens in those spaces. I don't see why virtual spaces are any different in that regard.

"We're just a social conduit" wouldn't work for a pub landlord if punters were openly supporting terrorism, for example.
Well, for one, you don't need sophisticated algorithms to overhear conversations in public spaces.

Secondly, you can't be expected - as, e.g., a pub landlord - to overhear all conversations at once and not let anything get past your ears.

So yes, there are fundamental challenges with social media that do not translate into the physical world.
 
But who defines what 'harmful content' is? Illegal content fair enough. But something doesn't become 'harmful' simply because a few idiots on the internet take umbrage with it. Let's not forget, in this country we have the police chasing people down purely because they retweeted something that a few cretins didn't like. It wasn't illegal, it wasn't a hate crime, it wasn't even them that originally posted it yet the police thought it necessary to track them down because they wanted to 'check his thinking'.

Precisely. People have differing tolerances for what they find harmful or upsetting.
We've all seen how we have some members on here who get their noses out of joint at the slightest thing and start throwing around shouts of racist or Nazi whenever someone doesn't agree with them or they don't agree with someone.
Do we bow to the lowest common denominator? In which case any opinion that doesn't fit the current narrative or broad consensus is basically harmful and therefore not to be tolerated.
 
Precisely. People have differing tolerances for what they find harmful or upsetting.
We've all seen how we have some members on here who get their noses out of joint at the slightest thing and start throwing around shouts of racist or Nazi whenever someone doesn't agree with them or they don't agree with someone.
Do we bow to the lowest common denominator? In which case any opinion that doesn't fit the current narrative or broad consensus is basically harmful and therefore not to be tolerated.
I don't think we've seen Nazi or racist in this thread, apart from your comments.

And you're a bright guy but this latching onto "narrative" ... it's just tiring, you know?
 
I don't think we've seen Nazi or racist in this thread, apart from your comments.

And you're a bright guy but this latching onto "narrative" ... it's just tiring, you know?

I didn't say we had seen it in this thread. But we frequently see it in others.
I'd say it's actually often more insulting than the comments that trigger it.
 
I wonder if this will lead to stricter identity verification on social media platforms; if they are going to be held liable for what you post, then I would have thought they'd want to be able to track you down. Breaking site rules could end up in a lawsuit :eek:

Out of interest, how far can enforcement of site rules go? If you post something that deliberately gets around content filters and brings the site's reputation into disrepute, can they do anything legally to punish you? Or would they just have to pass details on to the police for anything linked to hate crime etc.?
 
Same thing though - if the owner of a social media site was aware of it and didn't do anything, then sure, they'd be held accountable even now; likewise if the content was reported and nothing was done about it. But the systems are there to make the company aware of such content.

I'm not sure they are. There are online spaces which advertise themselves as "pro free speech" where very little seems off-limits.

Well, for one, you don't need sophisticated algorithms to overhear conversations in public spaces.

Secondly, you can't be expected - as, e.g., a pub landlord - to overhear all conversations at once and not let anything get past your ears.

So yes, there are fundamental challenges with social media that do not translate into the physical world.

There are challenges in managing this kind of thing in both the physical and online world.
 