Can't comment on US law as I'm not fully up to speed, so I don't know whether the manufacturer could refuse if it received a court order to release the data.
......
It could also be his insurance company. I was in a talk with an American lawyer, and in some of these cases the insurance company is the one to start legal proceedings, regardless of whether the person in question wishes to.
Just some jumped-up lawyer trying to make a name for himself, I bet.
Looks like there was another one in California involving a Tesla again. Autopilot drove into the back of a stationary fire engine at 65 mph.
What they are saying here is quite interesting: https://www.wired.com/story/tesla-autopilot-why-crash-radar/
I'd put money on it all going wrong within a year of launch.
If it relies on a cloud system, it's going to break at some point.
You mean control like this:
Truck driver kills two pedestrians while playing Pokemon Go
Lorry driver kills four while on the phone
Romanian lorry driver who called two people was distracted by sat nav
Lorry driver kills van driver while checking how much time he had left
My commute down the M1 is often affected by accidents, and the vast majority involve at least one truck.
I'm a firm believer that motorways need some form of automation.
Yes. Why not? We put up with humans driving them (roughly 90% of crashes are attributed to human error, by the way), so anything that makes our roads safer should be encouraged.
When Tesla added Autosteer to its range, there was a roughly 40% drop in the Tesla crash rate.
Hmmmmm
As we head into the fourth year of cloud-connected computing in many thousands of cars already on the roads of the world, do you want to revise your statement at all?
It is well past a year and nothing has broken yet.
A cloud system is a big single point of failure. Many companies have been bitten by this in recent years and have started bringing IT services back in-house. Just because it hasn't failed yet doesn't mean it won't fail spectacularly in the future.
There have already been instances where GPS services have failed or developed issues (in one case, positions drifted by a few metres). What happens to all the AVs that rely on it?
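To make the GPS-drift concern concrete, here's a minimal hypothetical sketch (not any real AV stack, and the names, threshold, and blending are all my own assumptions): a vehicle can sanity-check each GPS fix against its locally dead-reckoned position and discard fixes that drift too far, falling back on local odometry instead of blindly trusting the external service.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Position:
    x: float  # metres east of some local origin
    y: float  # metres north of some local origin


# Hypothetical tolerance: reject GPS fixes that disagree with local
# dead reckoning by more than this many metres.
MAX_GPS_DRIFT_M = 3.0


def fuse(gps: Optional[Position], dead_reckoned: Position) -> Position:
    """Fall back to dead reckoning when GPS is missing or implausible."""
    if gps is None:
        # GPS service unavailable: rely on local sensors alone.
        return dead_reckoned
    drift = ((gps.x - dead_reckoned.x) ** 2
             + (gps.y - dead_reckoned.y) ** 2) ** 0.5
    if drift > MAX_GPS_DRIFT_M:
        # GPS fix is implausibly far from the local estimate: reject it.
        return dead_reckoned
    # Plausible fix: naive average of the two estimates.
    # A real system would use a Kalman filter here, not a plain mean.
    return Position((gps.x + dead_reckoned.x) / 2,
                    (gps.y + dead_reckoned.y) / 2)
```

The point of the sketch is only that a safely designed AV treats the cloud/GPS feed as advisory, cross-checked against on-board sensors, rather than as a single source of truth.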
.......
Tesla pulled off an awesome move when they got existing Tesla owners to essentially train their models; they've basically got self-driving cars already.
A group at the Massachusetts Institute of Technology is gathering human input on the kinds of ethical dilemmas such machines will face.
Participants are asked to decide, for instance, whether a self-driving vehicle with brake failure should continue straight, killing a woman, a baby, a criminal and a cat, or swerve, resulting in the death of a girl, a pregnant woman, a dog and a baby.