EV general discussion

Humans are far from perfect drivers, so surely the benchmark is being better than humans, not zero collisions? By that logic, some collisions are therefore acceptable.

I also don’t think it’s unreasonable to expect the autonomous systems to make different mistakes from those humans would ordinarily make.

It’s not actually an autonomous car, so it’s sort of a moot point anyway. The human is meant to be in control and is ultimately at fault, but the car’s system was clearly the second contributing factor, followed by the road layout.

There is definitely a long way to go on these automated features, no matter what their CEOs tell investors.
You can't apply a principle of it being simply acceptable that autopilot is better than 'humans'.

Driving standards vary wildly so are we simply saying that if it crashes less than the average driver then that's ok?

With life-and-death stuff like this you can't take a black-and-white view that X people die on the roads now and it would be X - 10% if everyone used autopilot. That isn't good enough. Nor is accepting that the systems make different mistakes; they need to make no mistakes.

I accept that crashes will happen (software can't work beyond the laws of physics) but we can't have a situation where the computer having a hiccup ends up with someone losing their life.

"I'm terribly sorry that your son/daughter died because of a duff bit of code. Hopefully it will be of some comfort that generally the system is better than an average driver." Imagine the litigation, if nothing else, although I'd prefer to focus on the human question of whether I could forgive a mistake by an individual versus one by a car company working to a budget, trying to shift the next shiny bit of tech.
 
As for the crash, stuff happens. Humans crash all the time and rarely does anyone bat an eyelid. It even says as much in the linked article.

Self driving vehicles crash at over double the rate of human drivers though, and that statistic includes the substantial proportion of accidents due to drugs, alcohol, and other impairments. I'm not against it in principle but it's currently not fit to be used on the roads.

I’m not sure you can attribute the crash to it only using cameras, the underlying software is orders of magnitude more important than the sensor suite. The cameras would have seen the lane markings etc long before the car hit the object. The software ultimately decided to ignore them.

Maybe, maybe not, for that particular example but Teslas absolutely have had crashes due to the bad decision to only use cameras.
 
Self driving vehicles crash at over double the rate of human drivers though, and that statistic includes the substantial proportion of accidents due to drugs, alcohol, and other impairments. I'm not against it in principle but it's currently not fit to be used on the roads.

Strange source to pick for this but I’ll work with it.

Your source states they have a lower killed or seriously injured rate, is that not the point?

It also states the most common type of accident is a human driver rear ending the autonomous vehicle and the second is a human driver side swiping the autonomous vehicle. Those two alone account for 83% of collisions.

It sounds like human drivers are the issue here.

You can't apply a principle of it being simply acceptable that autopilot is better than 'humans'.

Driving standards vary wildly so are we simply saying that if it crashes less than the average driver then that's ok?

With life-and-death stuff like this you can't take a black-and-white view that X people die on the roads now and it would be X - 10% if everyone used autopilot. That isn't good enough. Nor is accepting that the systems make different mistakes; they need to make no mistakes.

I accept that crashes will happen (software can't work beyond the laws of physics) but we can't have a situation where the computer having a hiccup ends up with someone losing their life.

"I'm terribly sorry that your son/daughter died because of a duff bit of code. Hopefully it will be of some comfort that generally the system is better than an average driver." Imagine the litigation, if nothing else, although I'd prefer to focus on the human question of whether I could forgive a mistake by an individual versus one by a car company working to a budget, trying to shift the next shiny bit of tech.

We’ve already had the scenario where the computer made a mistake and someone lost their life (the Uber autonomous vehicle collision with a pedestrian).

People are still going to be killed or seriously injured by fully autonomous cars, no one sensible is arguing that they will not.

Part of the objective of these systems is to lower the KSI rate but not eliminate it. It’s not unusual for safety systems themselves to cause different problems. One of the arguments against mandatory seatbelts was that they can cause injuries which otherwise wouldn’t occur but it was decided the pros outweigh the cons.

I know the 10% number you used was a strawman but I agree the KSI reduction would need to be significantly larger than that due to the potential impact of errors.

However, I don’t think you can set a hard line of no errors; that’s just not realistic, and humans make loads of errors all the time that result in people being killed or seriously injured.

Fortunately for you, there isn’t any realistic prospect of autonomous cars becoming mandatory any time soon.
 
Your source states they have a lower killed or seriously injured rate, is that not the point?

Not really comparable data, because autonomous driving is only available on high-end, recent cars, which have higher-than-average crash safety thanks to improvements in safety technology. You're much less likely to be injured or die if you're sitting in a newer car.

It also states the most common type of accident is a human driver rear ending the autonomous vehicle and the second is a human driver side swiping the autonomous vehicle. Those two alone account for 83% of collisions.

It sounds like human drivers are the issue here.

Not really, it sounds like the autonomous vehicles are behaving erratically and unpredictably leading to a higher accident rate. Yes, if you're driving behind a car you should be able to stop at any time if they slam the anchors on for any reason and you're responsible for any resulting accident, but realistically a sudden unnecessary hard braking manoeuvre is always going to cause problems.
 
I have the Hypervolt as I didn't want a charger that looked like a toilet seat on my wall! The Hypervolt 3 is compatible with Octopus and has solar integration.

Mine's covered over since I never have to mess with it.

I refuse to fly on any plane that has an auto pilot in use.

A few years ago we had to wait 5 hours for a co-pilot to arrive.
When we landed at Manchester the Pilot announced "We really didn't have to wait for him, the plane took off itself, flew itself and landed itself".
I don't know how true all that was but somebody will pick it apart, probably Jonnycoupe :)
 
Cool story bro. Pilots are glorified bus drivers. It has to be the most disappointing journey from schoolboy dream to reality.

I'm one of the few people who tried to help you work out how to drive your own car... so stay in your lane before you start calling people out by name.
 
A few years ago we had to wait 5 hours for a co-pilot to arrive.
When we landed at Manchester the Pilot announced "We really didn't have to wait for him, the plane took off itself, flew itself and landed itself".

It's mostly true; in ordinary operation the pilot is mainly needed to communicate with Air Traffic Control and react to their instructions. However, things aren't always in "ordinary operation".
 
I think you’ll be travelling by bus and rail from now on then as autopilot is routinely engaged for sections of even short haul flights.

Same goes for a lot of shipping.
It was a joke. Planes largely fly themselves these days; the pilots are there just in case and need to pay attention at all times. A bit like the state of self-driving, except people forget the second part and blame the car.
 
I think you’ll be travelling by bus and rail from now on then as autopilot is routinely engaged for sections of even short haul flights.

Same goes for a lot of shipping.
And rail isn't?

A lot of systems are automated now; however, those systems don't have kids, people, or animals that just run out in front of them, and they aren't routinely blocked or similar.
 
Not really comparable data, because autonomous driving is only available on high end and recent cars which have higher than average crash safety due to improvements in safety technology. You're much less likely to be injured or die if you're sitting in a newer car.



Not really, it sounds like the autonomous vehicles are behaving erratically and unpredictably leading to a higher accident rate. Yes, if you're driving behind a car you should be able to stop at any time if they slam the anchors on for any reason and you're responsible for any resulting accident, but realistically a sudden unnecessary hard braking manoeuvre is always going to cause problems.

I’m not sure you can draw any such conclusions because there simply isn’t enough detail in your source, it’s just some basic high level statistics which lack context. They are also taken from seemingly lots of different sources.

For example, it doesn’t make clear what’s included in the autonomous driving and what isn’t.

Is it just the likes of Waymo, or does it include Level 2 cars in public hands? I suspect it's the former, not the latter; the latter isn't actually autonomous driving and doesn't need to be reported to anyone.

The general accident rates are pulled from the likes of insurance and police data, and we know that not everyone reports all accidents, particularly minor ones. Waymo etc. have to report everything in detail, including sub-1mph collisions, which make up over 40% of all collisions they are involved in.

Waymo (et al.) cars are only driven in urban environments, where accident rates per mile driven are much higher than for driving overall, which includes high-speed roads that generally have lower accident rates.
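To make the reporting and road-mix point concrete, here's a toy calculation. Every number below is made up purely for illustration (fleet sizes, mileages, and shares are not from the source, except the "over 40% sub-1mph" claim above); it just shows how a raw per-mile crash rate can flip once you strip out the minor contacts that human drivers rarely report:

```python
# Toy illustration: why raw crash-rate comparisons mislead when
# reporting thresholds differ. All figures are hypothetical.

def crashes_per_million_miles(crashes, miles):
    """Normalise a crash count to a per-million-miles rate."""
    return crashes / miles * 1_000_000

# Hypothetical AV fleet: must report every contact, however minor.
av_total_crashes = 150
av_sub_1mph_share = 0.40        # per the claim that >40% are sub-1mph bumps
av_miles = 50_000_000           # all urban driving

# Hypothetical human baseline over the same mileage: minor knocks
# often go unreported, and the mileage mixes in low-risk motorway miles.
human_reported_crashes = 120
human_miles = 50_000_000

raw_av_rate = crashes_per_million_miles(av_total_crashes, av_miles)
human_rate = crashes_per_million_miles(human_reported_crashes, human_miles)

# Strip the sub-1mph contacts that a human driver would rarely report,
# to get a like-for-like count before comparing rates.
comparable_av_crashes = av_total_crashes * (1 - av_sub_1mph_share)
adjusted_av_rate = crashes_per_million_miles(comparable_av_crashes, av_miles)

print(round(raw_av_rate, 2), round(human_rate, 2), round(adjusted_av_rate, 2))
# 3.0 2.4 1.8 -- the raw rate looks worse, the adjusted rate looks better
```

With these made-up inputs the AV fleet looks worse on raw counts (3.0 vs 2.4 crashes per million miles) but better once only comparably reportable crashes are counted (1.8), which is exactly why the headline "double the rate" figure needs context.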

It’s widely reported that a lot of the crashes involving Waymos are largely down to distracted drivers and not something the car has done.

E.g. https://arstechnica.com/cars/2024/09/human-drivers-are-to-blame-for-most-serious-waymo-collisions/

Their version of the data they report is here: https://waymo.com/safety/impact/

So yes the data isn’t directly comparable but not for the reasons you think.
 