- Joined: 29 Aug 2007
- Posts: 28,730
- Location: Auckland
> Yes.
What kind of constraints would be legally enforceable and workable? In your view.

> What kind of constraints would be legally enforceable and workable? In your view.
That's the kind of question that would keep a lawyer employed for a good few years... none of us have a bat's chance in hell of giving a meaningful answer.
> I believe content that is illegal should be removed and the operators of social media should be obligated to do this.
How would you achieve the removal of illegal content?
> Imo they should be treated like publishers. This free pass they were given gives them zero incentive to police the content on their sites.
In which case the amount of content on social media sites will literally drop off a cliff.
> @FoxEye a user flagging system with staff to then review it.
How many thousands of staff to review every post that gets flagged on Facebook?

> How many thousands of staff to review every post that gets flagged on Facebook?
Why should this be an issue?
> I believe content that is illegal should be removed and the operators of social media should be obligated to do this.
Well, at least that is clear.
I do not believe that 'harmful' content should be covered under these regulations.
As an example, someone may not approve of homosexuality. They may state that they find it disgusting or wrong or abhorrent. I do not agree with them. I do think they should be freely allowed to hold and state those opinions however.
If they then state that homosexuals should be harmed that has crossed a line in to illegality. That's the point at which that content should be removed.
As for enforcement...community support helps. If something has been flagged then perhaps x number of days should be allowed to review it.
Edit: apologies if you miss this. As a further example of why I don't think 'harmful' works, due to individual subjectivity:
Recently on here we had a poster say that a child should be tortured and killed. That post was seen as being acceptable. We had another poster use a term which the dictionary defines as not being offensive and yet the posts referencing it were deleted.
I think that's completely back to front.
Dis said:
> As for enforcement...community support helps.
In which case the amount of content on social media sites will literally drop off a cliff.
Publishers employ people to proofread, edit, re-draft... all their publications.
But who defines what 'harmful content' is? Illegal content, fair enough. But something doesn't become 'harmful' simply because a few idiots on the internet take umbrage with it. Let's not forget, in this country we have the police chasing people down purely because they retweeted something that a few cretins didn't like. It wasn't illegal, it wasn't a hate crime, it wasn't even them that originally posted it, yet the police thought it necessary to track them down because they wanted to 'check their thinking'.
I don't like your examples by way of exemption; I think they're subjective and quite telling. But I get it, again.
What do you mean by this?
> How many thousands of staff to review every post that gets flagged on Facebook?
It'd be healthier and better for all involved if they were simply banned outright.

Facebook already review flagged content and remove harmful and illegal posts, so for the likes of Facebook there is likely only a moderate increase in personnel. However, I could find no data on the number of people banned following New Zealand for re-distributing content (noted this morning the BBC's using its name).
It probably also helps to have a system where, when people flag content and that content is actually removed, their future flags carry an increased weighting, to help expedite review of content that is more likely to need removal.
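The weighting idea above can be sketched in a few lines. This is a minimal illustration, not any platform's actual system, and all the names (`FlagQueue`, the 1.5/0.9 multipliers) are hypothetical: flaggers whose reports lead to removals earn extra trust weight, and posts are reviewed in order of the combined weight of their flags.

```python
from collections import defaultdict

class FlagQueue:
    """Toy reputation-weighted moderation queue (illustrative only)."""

    def __init__(self):
        self.weights = defaultdict(lambda: 1.0)  # flagger -> trust weight
        self.flags = defaultdict(list)           # post id -> list of flaggers

    def flag(self, post_id, flagger):
        self.flags[post_id].append(flagger)

    def priority(self, post_id):
        # A post's review priority is the summed trust of everyone who flagged it.
        return sum(self.weights[f] for f in self.flags[post_id])

    def review_order(self):
        # Posts flagged by historically accurate flaggers rise up the queue.
        return sorted(self.flags, key=self.priority, reverse=True)

    def resolve(self, post_id, removed):
        # Reward flaggers whose report was upheld; mildly discount the rest.
        # The multipliers are arbitrary assumptions for the sketch.
        for f in self.flags.pop(post_id):
            self.weights[f] *= 1.5 if removed else 0.9
```

So if one user's earlier flag led to a removal, anything they flag next is looked at before a post flagged by someone with no track record.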
Owners of physical spaces that are open to the public have a degree of responsibility over what happens in those spaces. I don't see why virtual spaces are any different in that regard.
"We're just a social conduit" wouldn't work for a pub landlord if punters were openly supporting terrorism, for example.
What kind of constraints would be legally enforceable and workable? In your view.