Moral Machine


http://moralmachine.mit.edu/

Welcome to the Moral Machine! A platform for gathering a human perspective on moral decisions made by machine intelligence, such as self-driving cars.

We show you moral dilemmas where a driverless car must choose the lesser of two evils, such as killing two passengers or five pedestrians. As an outside observer, you judge which outcome you think is more acceptable. You can then see how your responses compare with those of other people.
 
I took a different approach - one that isn't intended and sits outside the available answers - and picked the options that gave the highest chance of people getting out of the way, the longest travel time to minimise injuries rather than certain death for the occupants, and so on. Heh... apparently I'm an evil person who doesn't uphold the law.
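Roughly, that approach amounts to ranking the options by estimated harm instead of treating them as certain outcomes. A minimal Python sketch of the idea - the fields and weightings here are invented for illustration, since the quiz itself only presents fixed, binary outcomes:

```python
from dataclasses import dataclass

# Hypothetical per-option estimates; the real quiz never exposes anything like this.
@dataclass
class Option:
    escape_chance: float          # how likely the people in the path are to get out of the way
    stopping_time: float          # seconds available to shed speed before impact
    occupants_certain_dead: bool  # whether the occupants are certainly killed

def estimated_harm(o: Option) -> float:
    # Crude harm score: pedestrians are less likely to be hurt if they can dodge
    # and if the car has longer to slow down; certain occupant death weighs heavily.
    pedestrian_harm = (1.0 - o.escape_chance) / (1.0 + o.stopping_time)
    occupant_harm = 1.0 if o.occupants_certain_dead else 0.0
    return pedestrian_harm + occupant_harm

def choose(options: list[Option]) -> Option:
    # Pick whichever option minimises the rough harm estimate.
    return min(options, key=estimated_harm)

# Example: swerving gives pedestrians a 70% chance to dodge and 2s of braking,
# whereas staying the course certainly kills the occupants.
print(choose([Option(0.7, 2.0, False), Option(0.0, 0.0, True)]))
```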
 
I'm an evil git :D

I picked every option that involved killing the car passengers over pedestrians, hitting men over women, hitting adults over children and hitting the poor/criminals over the rich/workers, and the website virtually called me Hitler!
 
Meanwhile, in the real world, situations are almost never so binary :p

Do you run down the old lady and definitely kill her, or take the longer path towards the baby in the pram, which might survive thanks to the pram's structure and the extra time to shed speed? :S
 

You swerve to the other side into the hedge and end up with a few scratches on your paintwork :p
 
Some of those stats are very one-sided. For example, they assumed I liked saving criminals, but I answered on the basis that (except when the cat was driving) the car will have no ability to determine the social class or level of "innocence" of pedestrians, the same way humans can't tell that at a glance either.
 
I went for going straight every time. Assuming the self-driving car has non-functioning brakes, I'd rather it went straight than be totally unpredictable and randomly change lanes. And what about oncoming traffic...

Conclusion: give self-driving cars brakes humans can use.
 

^^ This. The least chance of other collateral damage.

Although that apparently gave me a much greater propensity to aim for men, older people and larger people!
 

As for the car having no way to judge social class - you forget that this is a Google car. It knows everything about everyone and is probably programmed to kill iPhone users but save Android users.
 
Fun. In every choice between people crossing the road on a green light and people crossing on red, I always killed those crossing incorrectly, regardless of who they were ("upholding the law" at max :D). As others have said, it's not really practical to know the "fitness" and "social value", as the game puts it, so I ignored them. If you cross at the wrong time you risk getting hit, whatever kind of car it is.

Also, as the posts above say, I went for "avoiding intervention" quite a bit.
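That rule of thumb - legality first, ignore "fitness" and "social value", and prefer not to swerve when it's otherwise a tie - is simple enough to write down. A hedged Python sketch with invented field names, since the game obviously exposes nothing like this:

```python
from dataclasses import dataclass

@dataclass
class Group:
    crossing_legally: bool   # walking on green rather than jaywalking on red
    in_current_lane: bool    # hitting this group requires no swerve

def group_to_spare(a: Group, b: Group) -> Group:
    # 1. Legality first: spare whoever is crossing with the light.
    if a.crossing_legally != b.crossing_legally:
        return a if a.crossing_legally else b
    # 2. Tie-break by avoiding intervention: the car stays in its lane,
    #    so the group in the other lane is the one spared.
    if a.in_current_lane != b.in_current_lane:
        return b if a.in_current_lane else a
    # 3. "Fitness" and "social value" are ignored entirely; otherwise it's a toss-up.
    return a

# Example: jaywalkers in the current lane vs. legal crossers in the other lane.
print(group_to_spare(Group(False, True), Group(True, False)))
```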
 
Well, it turns out I'm the saviour of the homeless (maxed towards lower social standing) and am pretty much Judge Dredd when it comes to upholding the law (maxed towards being right in terms of the law).
 