Bank Holiday Horror

The drunk driver was parked in lane 1... the 2nd driver is out on bail until his trial. So he might not be guilty of anything; just procedure for such a big accident?
 
the 2nd driver is out on bail until his trial. So he might not be guilty of anything; just procedure for such a big accident?
No, he's been charged and is going to court, so the police must have sufficient evidence against him and the CPS must think there is a reasonable chance of conviction for it to get that far.

Edit: Innocent until proven guilty and all that, but yeah.
 
From the report in the DM, the FedEx driver was about to overtake the minibus when it swerved into his lane to avoid the parked HGV.
I would imagine the FedEx driver might be prosecuted for driving without due care and attention, because he should've been aware of what was about to happen.

But then he might be completely exonerated, as it obviously caught him out just as much as the minibus driver.

Who knows.
 
From the report in the DM, the FedEx driver was about to overtake the minibus when it swerved into his lane to avoid the parked HGV.
I would imagine the FedEx driver might be prosecuted for driving without due care and attention, because he should've been aware of what was about to happen.

But then he might be completely exonerated, as it obviously caught him out just as much as the minibus driver.

Who knows.

Horrid situation - I've seen it a couple of times (fortunately without major incident) where the person coming up behind the lorry is on their phone, then notices and swerves out into the other lane at the last moment :( If you're already in the other lane there is only so much you can do to accommodate them.
 
From the report in the DM, the FedEx driver was about to overtake the minibus when it swerved into his lane to avoid the parked HGV.
I would imagine the FedEx driver might be prosecuted for driving without due care and attention, because he should've been aware of what was about to happen.

But then he might be completely exonerated, as it obviously caught him out just as much as the minibus driver.

Who knows.

So:

Driver A is parked in L1.
Driver B, travelling in L1, fails to register this until he is too close to stop and swerves into the path of Driver C.
Driver C collides with Driver B and (presumably, in this case) crushes his vehicle into Driver A's parked vehicle.

Driver B must have been "right" on top of Driver A's vehicle for the crash to happen this way.

Now, the question is: had Driver B survived this accident, would he be facing charges too?
 
He must have already been overtaking the minibus, or have seen the truck stopped in lane 1 - otherwise why would he be in lane 2?

Unless lane 1 is only for the next junction and merges into the off slip?
 
Not familiar with the road, but if the stopped lorry had no lights on then it could have been tricky to see, as more and more of the motorway network lighting is being turned off. Plus no one would be expecting a parked vehicle on the motorway.

Horrible accident; people just don't seem to realise how much force an HGV has behind it.
 
Ah OK, I had heard it was the other one! He deserves everything he gets then. It'll be interesting to see what the other lorry driver did to be charged as well.
 
The table below defines each level of automation.

[Image: table defining the levels of driving automation, including the "Fallback Performance of Dynamic Driving Task" column]

Everyone keeps going on about the technical hurdles which fully automated vehicles have to overcome, yet these aren't the problem at all. Yes, of course all the technical problems will be solved - the real issues are legal.

The "Fallback Performance of Dynamic Driving Task" is the key issue here. As soon as this is no longer a human being, you've got a huge legal issue. Right now, every automated vehicle on public roads has to have a human driver ready to take over. Crucially, this driver is legally responsible should a serious accident occur.

What happens when a level 3 or above vehicle kills someone? Who's responsible then? The car manufacturer? The software engineers? This will be a legal minefield and one far more difficult to resolve than a few technical issues.

There's currently a big debate going on about autonomous military systems and whether a computer should be able to autonomously decide to take a life. I find it very ironic that Musk himself has recently been calling for an outright ban on such military systems, yet how will these be any different to the fully automated vehicles that he's working towards? If an automated car has the ability, by virtue of its decisions and actions, to cause death, how is that any different to an autonomous military system?
 
What happens when a level 3 or above vehicle kills someone? Who's responsible then? The car manufacturer? The software engineers? This will be a legal minefield and one far more difficult to resolve than a few technical issues.

The problem is that levels 3 and 4 require an alert driver at all times; only level 5 doesn't.

Someone has already died while in a level 3 Tesla running its 'Autopilot': a 'semi' cut across the front of the Tesla, and the car couldn't 'see' it against the sky and ploughed straight into it.

Also, because lorries in the US are not required to have frames around the bottom of the trailers to stop cars going under them, unlike in the EU, it was quite nasty: the front of the car didn't stop until it hit the rear axle, and the back of the trailer was already penetrating the cabin.

Ultimately it was the driver's fault: it was only level 3 autonomy, which requires a driver to be alert and able to take over. It turns out that the driver was not alert and was actually watching a film on a tablet or DVD player at the time, not looking at the road. So the crash could have been avoided if the driver had been alert. Likewise, the death could have been avoided if the trailer had had the safety features mandated in the EU.
 
I'm thinking of the videos of robots trying to pick up a box with symbols printed on it, taking ages, dropping it, etc.!

Try thinking about something that isn't from the 1980s then.

This is what I thought. Look at this thing, over 6 years ago...


Will a driver-less car be able to look at the youths in the car next to it at the traffic lights, see their agitated state, their blinged up car and suspect the driver may do something rash based on past experience of groups of youths in blinged up cars? Will it see the elderly lady with the hat, and judge her reactions may be suspect due to age and maybe poor eyesight? Will it look for shadows of vehicles hidden around blind bends, or look for HGVs across hedgerows and anticipate meeting them? Will it be able to judge if said vehicle is stationary or moving? Will it see the poorly secured sheet of steel on the artic trailer that the wind is starting to lift and might scythe the roof off a following vehicle? Will it intuitively correct a skid or decide to plant the car in the hedge rather than hit an oncoming vehicle with the combined speed of impact, or perceive an oil film from diesel spillage on a wet road? I'll take a competent human behind the wheel of a vehicle I am an occupant of for the foreseeable future, thanks very much.

Short answer: yes, I think they will be able to, with maybe the sheet steel being the only thing that might catch them out.
 
Will a driver-less car be able to look at the youths in the car next to it at the traffic lights, see their agitated state, their blinged up car and suspect the driver may do something rash based on past experience of groups of youths in blinged up cars? Will it see the elderly lady with the hat, and judge her reactions may be suspect due to age and maybe poor eyesight? Will it look for shadows of vehicles hidden around blind bends, or look for HGVs across hedgerows and anticipate meeting them? Will it be able to judge if said vehicle is stationary or moving? Will it see the poorly secured sheet of steel on the artic trailer that the wind is starting to lift and might scythe the roof off a following vehicle? Will it intuitively correct a skid or decide to plant the car in the hedge rather than hit an oncoming vehicle with the combined speed of impact, or perceive an oil film from diesel spillage on a wet road? I'll take a competent human behind the wheel of a vehicle I am an occupant of for the foreseeable future, thanks very much.
Yes to all those things, and I doubt most humans could handle all those things well to be honest.
 
The problem is that levels 3 and 4 require an alert driver at all times; only level 5 doesn't.

Someone has already died while in a level 3 Tesla running its 'Autopilot': a 'semi' cut across the front of the Tesla, and the car couldn't 'see' it against the sky and ploughed straight into it.

Also, because lorries in the US are not required to have frames around the bottom of the trailers to stop cars going under them, unlike in the EU, it was quite nasty: the front of the car didn't stop until it hit the rear axle, and the back of the trailer was already penetrating the cabin.

Ultimately it was the driver's fault: it was only level 3 autonomy, which requires a driver to be alert and able to take over. It turns out that the driver was not alert and was actually watching a film on a tablet or DVD player at the time, not looking at the road. So the crash could have been avoided if the driver had been alert. Likewise, the death could have been avoided if the trailer had had the safety features mandated in the EU.

Actually it appears to be levels 4 and 5 which don't require a driver as a fallback.

I remember the above accident, and I also note that, whenever an autonomous vehicle is involved in an accident, the manufacturer is always at pains to point out how it wasn't at fault and/or that a human driver wouldn't have been able to do any better, etc.

This is irrelevant. When fully autonomous cars which don't have a driver fallback are allowed on the road, sooner or later one will kill someone and it will be the autonomous vehicle that is deemed to be at fault. What happens then has to be addressed, discussed and resolved before such vehicles will be allowed anywhere near public roads.
 
That is the reason you don't see them on the roads already. The biggest hurdle is not technical; Google and Tesla say they are ready to go with their implementations. The only reason they are not on the roads right now is down to legislation and liability.
 
When fully autonomous cars which don't have a driver fallback are allowed on the road, sooner or later one will kill someone and it will be the autonomous vehicle that is deemed to be at fault. What happens then has to be addressed, discussed and resolved before such vehicles will be allowed anywhere near public roads.

And that is being discussed now between car manufacturers, insurers, lawyers and the Government.

Have a read of the bill in parliament at the moment:

https://publications.parliament.uk/pa/bills/cbill/2016-2017/0143/cbill_2016-20170143_en_2.htm
 
And that is being discussed now between car manufacturers, insurers, lawyers and the Government.

Have a read of the bill in parliament at the moment:

https://publications.parliament.uk/pa/bills/cbill/2016-2017/0143/cbill_2016-20170143_en_2.htm

I never doubted it was being discussed; I'm merely saying that this is the big hurdle that must be overcome - the technical ones are child's play by comparison and will be sorted far sooner.

Shouldn't realistically take more than 5 years to sort it out.

Yep, sounds about right.

Driver fallback... does that mean no wheel/pedals?

The "fallback" column in the above chart refers to cars which don't rely on a driver being able to take control at any moment. This is what's required before we can have things like autonomous taxis, cars which can drive empty and so forth.
 