The ongoing Elon Twitter saga: "insert demographic" melts down

Status
Not open for further replies.
Soldato | Joined: 10 May 2012 | Posts: 10,062 | Location: Leeds
Humans are expected to make mistakes: we get distracted, we think we are better than we are, we are human; it's why we have to have insurance and pass tests. Autonomous vehicles, however, don't suffer from those human traits/weaknesses. To allow vehicles with no human in them, they have to be infallible. The public won't accept an autonomous vehicle making an error and killing someone. If a human does it, the human pays a price: they lose their licence and likely their liberty as well. Are you going to put that vehicle in jail? Or the CEO of the company that made it? As for the all-conditions point: we drive in all conditions, and in some conditions these vehicles struggle massively.

Would you rather have a pilot who has trained and passed the required tests flying your plane, or software that isn't infallible and struggles to know what is going on around it when it's dark/raining/foggy/snowing? Although for a plane there aren't other planes coming at it just half a metre away at 60mph, or planes pulling out of junctions, or kids running into the road. We won't see robotaxi-style vehicles for years and years, other than maybe in some states that put corporations above public safety. And the first time it goes badly wrong and someone gets killed or badly hurt, the public backlash will be massive.

I think for all intents and purposes self-driving vehicles will be infallible; the accidents will happen due to a human error somewhere in the chain. A self-driving car can understand that the conditions are affecting its ability to sense the environment, at which point it can just stop moving. There will always be accidents and deaths when you move people around; self-driving will substantially reduce those deaths in the future, and accidents will be a point for the engineers to learn from; though I expect AI will just make self-driving so good that this won't actually be an issue.
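The "just stop moving when sensing degrades" idea in the post above can be sketched as a minimal fallback policy. This is purely illustrative: the function name, confidence values, and threshold are all hypothetical, not any manufacturer's actual logic:

```python
# Minimal sketch of a degraded-mode fallback policy: if the fused
# sensor confidence drops below a threshold, the planner requests a
# minimal-risk stop instead of continuing. All values are made up.

def plan_action(sensor_confidence: float, threshold: float = 0.8) -> str:
    """Return the planner's high-level decision for this tick."""
    if sensor_confidence < threshold:
        # Sensing is degraded (e.g. heavy rain or fog): pull over and stop.
        return "minimal_risk_stop"
    return "continue"

# Example: clear conditions vs. heavy fog
print(plan_action(0.95))  # continue
print(plan_action(0.40))  # minimal_risk_stop
```

Real systems would obviously fuse many signals and degrade gradually rather than flip on one threshold, but the shape of the decision is the same.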
 
Caporegime | Joined: 20 May 2007 | Posts: 39,703 | Location: Surrey
I think for all intents and purposes self-driving vehicles will be infallible; the accidents will happen due to a human error somewhere in the chain. A self-driving car can understand that the conditions are affecting its ability to sense the environment, at which point it can just stop moving. There will always be accidents and deaths when you move people around; self-driving will substantially reduce those deaths in the future, and accidents will be a point for the engineers to learn from; though I expect AI will just make self-driving so good that this won't actually be an issue.

Hard disagree

Most of my tech doesn't work properly, has a design flaw, or goes wrong in some way, or has done at some point.

You only have to have limited experience of the modern world to realise that our tech goes wrong all the time, and that software is never fully finished or debugged and needs seemingly never-ending updates.

Your faith that something so complicated (and something that will be mass-produced as well) will be infallible is astounding.
 
Soldato | Joined: 3 Oct 2007 | Posts: 12,097 | Location: London, UK
I think for all intents and purposes self-driving vehicles will be infallible; the accidents will happen due to a human error somewhere in the chain. A self-driving car can understand that the conditions are affecting its ability to sense the environment, at which point it can just stop moving. There will always be accidents and deaths when you move people around; self-driving will substantially reduce those deaths in the future, and accidents will be a point for the engineers to learn from; though I expect AI will just make self-driving so good that this won't actually be an issue.

Ah, so we do agree it will need to be infallible. The problem is getting to that place, as they are miles away from it at the moment.
 
Caporegime | Joined: 24 Oct 2012 | Posts: 25,063 | Location: Godalming
Hard disagree

Most of my tech doesn't work properly, has a design flaw, or goes wrong in some way, or has done at some point.

You only have to have limited experience of the modern world to realise that our tech goes wrong all the time, and that software is never fully finished or debugged and needs seemingly never-ending updates.

Your faith that something so complicated (and something that will be mass-produced as well) will be infallible is astounding.

Your tech wasn't made by Tesla tho, clearly inferior innit
 
Soldato | Joined: 6 Feb 2019 | Posts: 17,600
It won't take long till we start seeing Cybertrucks full of rust then. What the Tesla user manual is essentially saying is that you have to wash the thing like twice a week if you drive it every day.
 
Soldato | Joined: 10 Mar 2003 | Posts: 6,744
I think for all intents and purposes self-driving vehicles will be infallible; the accidents will happen due to a human error somewhere in the chain. A self-driving car can understand that the conditions are affecting its ability to sense the environment, at which point it can just stop moving. There will always be accidents and deaths when you move people around; self-driving will substantially reduce those deaths in the future, and accidents will be a point for the engineers to learn from; though I expect AI will just make self-driving so good that this won't actually be an issue.

And who built the software? There will always be a human in the chain. The software would have to be 100% infallible, secure, and able to interact with every other car maker on the planet's different software. To ensure the ambulance and other emergency services got anywhere, it would then have to link in with those services as well to create a better 'get out of the way' system. Besides, there is no such thing as driverless: there has to be a licenced driver at the controls of the vehicle. If you're not paying attention and have an accident, you're the one who is going to be looked at; whether the car did or didn't do something, you're the one at fault.


M.
 
Soldato | Joined: 3 Oct 2007 | Posts: 12,097 | Location: London, UK
And who built the software? There will always be a human in the chain. The software would have to be 100% infallible, secure, and able to interact with every other car maker on the planet's different software. To ensure the ambulance and other emergency services got anywhere, it would then have to link in with those services as well to create a better 'get out of the way' system. Besides, there is no such thing as driverless: there has to be a licenced driver at the controls of the vehicle. If you're not paying attention and have an accident, you're the one who is going to be looked at; whether the car did or didn't do something, you're the one at fault.


M.

We aren't going to see driverless vehicles in the next 15 years, IMO. Yes, there might be the odd state that bows to corporate pressure, but not mainstream. And I'd want to see legislation that fully holds the producer responsible for any accidents caused by their vehicles. For once, don't give these companies and their executives a pass.
 

JRS | Soldato | Joined: 6 Jun 2004 | Posts: 19,535 | Location: Burton-on-Trent
We aren't going to see driverless vehicles in the next 15 years, IMO. Yes, there might be the odd state that bows to corporate pressure, but not mainstream. And I'd want to see legislation that fully holds the producer responsible for any accidents caused by their vehicles. For once, don't give these companies and their executives a pass.

Driverless cars will be uninsurable until the matter of who is liable in the event of a crash (manufacturer, software provider, 'driver', vehicle owner) is resolved.
 
Joined: 12 Feb 2006 | Posts: 17,227 | Location: Surrey
Honestly, I hope automated driving comes to this country soon, even what Elon has to offer. I don't think it'll be a better driver than me, but there are absolutely tons of friends and family I know that I'd rather have a computer driving than them.
 
Caporegime | Joined: 29 Jan 2008 | Posts: 58,912
Why would the bar for AI self-driving be "infallible in all conditions"? Surely the bar is just "better than humans", at which point by not using self-driving you're actually costing lives.

Correct, it would clearly cost lives and it's an absurd position.

To allow vehicles with no human in them, they have to be infallible. The public won't accept an autonomous vehicle making an error and killing someone. If a human does it, the human pays a price: they lose their licence and likely their liberty as well. Are you going to put that vehicle in jail? Or the CEO of the company that made it? As for the all-conditions point: we drive in all conditions, and in some conditions these vehicles struggle massively.
[...] We won't see robotaxi-style vehicles for years and years, other than maybe in some states that put corporations above public safety.

This is flawed and you're not up to date on this at all: robotaxis already exist in San Francisco, and you've got the public safety aspect completely backwards; moving to self-driving removes human errors. It doesn't mean they are going to be completely infallible, just that they'll make roads much safer than they are currently.
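The "better than humans means lives saved" point is just arithmetic over crash rates. A back-of-envelope sketch, where every number (the per-mile fatality rates and total mileage) is made up purely for illustration, not a real statistic:

```python
# Back-of-envelope comparison of expected fatalities under human vs.
# autonomous driving. All numbers below are hypothetical.

def expected_fatalities(rate_per_billion_miles: float, miles: float) -> float:
    """Expected deaths = rate per billion miles * billions of miles driven."""
    return rate_per_billion_miles * miles / 1e9

HUMAN_RATE = 12.0  # hypothetical fatalities per billion miles, human drivers
AV_RATE = 6.0      # hypothetical: AVs half as deadly, but not infallible
MILES = 300e9      # hypothetical total miles driven per year

human = expected_fatalities(HUMAN_RATE, MILES)  # 3600.0
av = expected_fatalities(AV_RATE, MILES)        # 1800.0
print(f"lives saved per year: {human - av:.0f}")  # lives saved per year: 1800
```

The point of the sketch: even a fallible system that merely halves the rate saves lives at scale; insisting on zero errors before deployment forgoes that difference.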


Secondly, the other point about putting people in jail seems like a total red herring; this isn't some new problem you need to solve.

If you live on a hill and your handbrake fails, sending your car into someone's house down the hill and causing significant damage, then you're liable as it's your car. That no one is going to jail over it, versus if you'd been in the car and driving recklessly or drunk, is irrelevant.
 
Caporegime | Joined: 29 Jan 2008 | Posts: 58,912
Honestly, I hope automated driving comes to this country soon, even what Elon has to offer. I don't think it'll be a better driver than me, but there are absolutely tons of friends and family I know that I'd rather have a computer driving than them.

It won't be long until it's not only a better driver than you but a better driver than the top F1 drivers, if, say, racing were the goal.
 
Caporegime | Joined: 18 Mar 2008 | Posts: 32,747
So what stops someone hacking into these things and killing dozens of pedestrians, or companies using it to scare politicians into supporting them for less money?
 
Caporegime | Joined: 30 Jun 2007 | Posts: 68,784 | Location: Wales
Would you rather have a pilot who has trained and passed the required tests flying your plane, or software that isn't infallible and struggles to know what is going on around it when it's dark/raining/foggy/snowing?
Tbf those are the conditions where it's usually the plane flying, because the software can see things the human can't; it isn't limited to vision.


The human pilot will kill you much more often than the software.
 