Self-driving Uber identified woman as a false positive (a category which includes plastic bags)

An Uber self-driving test car which killed a woman crossing the street detected her but decided to ignore her.

The modified self-driving Volvo XC90 was happily cruising along at a leisurely 40 mph when it collided with a 49-year-old woman pushing a bike across the road. She later died from her injuries.

The car's sensors detected the woman, but the software that decides how the car should react was tuned too far in favour of ignoring things that might be plastic bags. LINK
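
Roughly the trade-off being described, as a toy Python sketch (the labels, names and numbers below are all invented; this is nothing like Uber's actual code):

[code]
# Hypothetical sketch of a perception -> planning handoff, showing how a
# false-positive threshold can be "tuned too far". All names and numbers
# are made up for illustration.

IGNORE_CONFIDENCE_THRESHOLD = 0.9  # raise this and more real obstacles get ignored

def should_brake(detection):
    """detection: dict with 'label' and 'confidence' from the classifier."""
    ignorable = {"plastic_bag", "leaf_litter", "exhaust_plume"}
    if (detection["label"] in ignorable
            and detection["confidence"] >= IGNORE_CONFIDENCE_THRESHOLD):
        return False  # treated as a false positive: carry on at speed
    return True       # anything else: brake / hand off to the planner

# A pedestrian misclassified as an ignorable object sails straight through:
print(should_brake({"label": "plastic_bag", "confidence": 0.95}))  # False -> no braking
[/code]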
 
Well, not specifically I guess, but it classified her as some sort of object it could ignore/drive through.

Has the human supervisor, who was still supposed to be in charge of the vehicle, been charged?
 
Has the human supervisor, who was still supposed to be in charge of the vehicle, been charged?
Charged with what exactly? Manslaughter?
 
Occasionally self-driving cars will hit things, especially as the technology is new and still being refined. Why is it news every time it happens? There are millions of accidents every day involving human drivers being utterly idiotic.
 
In much the same way a surgeon occasionally saws off the wrong leg? Just put it down to a statistical anomaly, you think?

Well, not really: you take measures to try to reduce the odds of it happening in future.

For example, the surgeon confirms with the patient which leg they're operating on before the operation (it sounds silly, but they did this before some keyhole surgery on my knee), and then they write on that leg with a pen.

The surgeon literally saw me in my room before the op, talked me through the procedure, confirmed the leg etc., then confirmed the same again just before I went under.
 
Surely self-driving cars should have heat cameras fitted as part of their detection system? It would be pretty hard to mistake a human for a car then.
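
Something like a simple thermal veto on top of the visual classifier, say. A toy Python sketch (all names and thresholds here are invented; real systems fuse lidar/radar/camera in far more involved ways):

[code]
# Hypothetical sketch: let a thermal camera veto an "ignore it" decision.
# A plastic bag sits at ambient temperature; a person's skin and clothing
# are noticeably warmer, so a warm blob overlapping the detection is a
# strong hint the object is alive.

AMBIENT_TEMP_C = 18.0
WARM_BODY_MARGIN_C = 8.0   # how far above ambient counts as "warm"

def thermal_veto(object_temp_c, ambient_temp_c=AMBIENT_TEMP_C):
    """Return True if the thermal camera says this is likely a living thing."""
    return object_temp_c - ambient_temp_c >= WARM_BODY_MARGIN_C

def should_brake(classifier_says_ignorable, object_temp_c):
    # Never trust an "ignorable debris" classification for a warm object.
    if thermal_veto(object_temp_c):
        return True
    return not classifier_says_ignorable

print(should_brake(classifier_says_ignorable=True, object_temp_c=31.0))  # True
print(should_brake(classifier_says_ignorable=True, object_temp_c=18.5))  # False
[/code]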
 
Occasionally self-driving cars will hit things, especially as the technology is new and still being refined. Why is it news every time it happens? There are millions of accidents every day involving human drivers being utterly idiotic.

Because the point of a robotic car is to remove the driver. Doing this requires proving that it is safe to remove the driver.

A robotic car failing to acknowledge someone in the road and hitting them at 40 mph without touching the brakes does not support this.

Since the robot has no capacity to "think", the decision to do nothing was part of its coding, and this means it will do the same thing every single time in the same circumstances.

It's news because this was not an individual fault in the way a human doing something stupid would be; a human at least knows that what they're doing is wrong.
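
That "every single time" point is actually testable: replay the same recorded inputs and you must get the same decision out. A toy Python sketch (hypothetical, nothing to do with Uber's real stack):

[code]
# Toy illustration of the "same inputs, same decision, every time" point.
# A pure function of its inputs cannot behave differently on a replay.

def plan_action(detections, speed_mph):
    """Hypothetical deterministic planner: brake only for recognised hazards."""
    hazards = {"pedestrian", "cyclist", "vehicle"}
    if any(d in hazards for d in detections):
        return "brake"
    return f"hold {speed_mph} mph"

recorded_frame = (["plastic_bag"], 40)  # what the classifier (wrongly) reported

# Replay the recorded frame any number of times: the outcome never changes.
assert all(plan_action(*recorded_frame) == "hold 40 mph" for _ in range(1000))
print(plan_action(*recorded_frame))  # "hold 40 mph" -- every single time
[/code]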
 
Because the point of a robotic car is to remove the driver. Doing this requires proving that it is safe to remove the driver.
Even if there are accidents, surely driverless cars would still be statistically safer than human-driven cars? Nothing's ever 100% safe; it's all about mitigating risk.
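
As a rough back-of-envelope on what "statistically safer" would take (the human baseline of very roughly one death per 100 million vehicle-miles is the commonly quoted US figure; the fleet mileage below is an invented round number):

[code]
# Back-of-envelope: US human drivers manage very roughly one fatality per
# 100 million vehicle-miles (the commonly quoted ballpark figure).
# Fleet mileage below is invented, purely for illustration.

HUMAN_FATALITY_RATE = 1 / 100_000_000   # deaths per mile, approximate
fleet_miles = 3_000_000                 # hypothetical self-driving test miles

expected_human_deaths = HUMAN_FATALITY_RATE * fleet_miles
print(f"Expected deaths if humans drove those miles: {expected_human_deaths:.2f}")
# ~0.03 -- so even one fatality in a few million test miles is far above the
# human baseline, and it takes hundreds of millions of miles before the
# statistics can actually show the robot is safer.
[/code]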
 
Since the robot has no capacity to "think", the decision to do nothing was part of its coding, and this means it will do the same thing every single time in the same circumstances.

Well, that isn't necessarily true: it is constantly learning, so it won't necessarily do the same thing every single time in the same circumstances.
 
Even if there are accidents, surely driverless cars would still be statistically safer than human-driven cars? Nothing's ever 100% safe; it's all about mitigating risk.

A robotic car needs to be safer, because it doesn't have the excuse of being human.

I didn't say robotic cars were not OK; I was replying to why it's news when they fail.

Well, that isn't necessarily true: it is constantly learning, so it won't necessarily do the same thing every single time in the same circumstances.

Well, that isn't necessarily false, but "learning" is a highly abused term when it comes to computers.

In this example of keeping a steady 40 mph through a pedestrian, I would be amazed if the car's programs had any say in the variables that allowed it to happen.
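
For what it's worth, the usual pattern is that any "learning" happens offline in the lab; the copy that ships in the car is frozen, so on the road it really is a fixed mapping from inputs to outputs. A minimal, generic Python sketch of that split (not Uber-specific):

[code]
# Minimal sketch of the usual train-offline / deploy-frozen split.
# "Learning" adjusts the weights in the lab; the shipped copy never changes.

class TinyClassifier:
    def __init__(self):
        self.weight = 0.0

    def train_step(self, x, label, lr=0.1):
        # Offline only: nudge the weight towards the labelled answer.
        self.weight += lr * (label - self.predict(x)) * x

    def predict(self, x):
        # On the road this is all that runs: pure, repeatable arithmetic.
        return 1 if self.weight * x > 0.5 else 0

model = TinyClassifier()
for _ in range(100):               # offline training loop
    model.train_step(1.0, 1)

frozen_weight = model.weight       # ship this; no train_step() in the car
assert all(model.predict(1.0) == 1 for _ in range(1000))  # same answer, always
print("deployed weight:", frozen_weight)
[/code]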
 
Those CAPTCHA things used on websites demonstrate the issues computers have in reliably identifying objects in a number of different contexts (yes, I know many humans struggle with them as well!).
 
Shouldn't the title be 'woman walks in front of car and dies'?
Harsh, but true? Wasn't it a dark/unlit road, and she decided to cross away from a crossing, with no lights etc. on her? :confused: It's a great shame, but hardly proof that driverless cars aren't safe.
 