Aye, I like the idea of not being trained properly being the pilot's fault.
That sounds worryingly like the old Airbus problem where, IIRC, the autopilot under certain specific conditions could not easily be overridden, or where it wasn't clear that the autopilot was still in partial control, or had disabled some of its controls due to accidental input from the pilot (so the pilot thinks the autopilot is doing X when it isn't, or the autopilot is doing Y when the pilot thinks it's fully off).

I read earlier that the MCAS system used to be overridden by manual inputs, but with the MAX that changed. Now the system has to be disabled before it stops interfering, but Boeing didn't initially update their manuals to mention this fact. That would have a significant impact on pilot reaction, surely?
From what I remember, pretty much every crash tends to have multiple points of failure, and usually, if the pilot makes a serious mistake that "causes" the crash, it's because other things have gone wrong as well. For example, I can remember one crash where the pilot made a mistake that was pointed out by another member of the cockpit crew, but because the pilot was a senior one, the junior pilot's concerns were ignored despite him actually being correct.
Sounds like the 1977 Tenerife crash you're thinking of. The KLM captain, impatient to take off, started his take-off run without getting permission from the tower.
His junior co-pilot warned him that they should wait for final clearance, but the captain ignored him, the junior officer sadly didn't press it any further, and the KLM plane ploughed into a taxiing Pan Am 747.
583 people died, the worst single air crash ever; not even sure if the KLM plane left the ground.
I believe the specific crash he might be referring to is more recent, and involved an Asian crew where there was a culture of subordination between pilot and co-pilot. I seem to remember it might have been the crash where the jet landed short into a breakwater at the end of the runway (San Francisco, maybe?).

The KLM 747 was within 100 m (330 ft) of the Pan Am and moving at approximately 140 knots (260 km/h; 160 mph) when it left the ground. Its nose landing gear cleared the Pan Am, but its left-side engines, lower fuselage, and main landing gear struck the upper right side of the Pan Am's fuselage, ripping apart the center of the Pan Am jet almost directly above the wing. The right-side engines crashed through the Pan Am's upper deck immediately behind the cockpit.
As a side note: if you automate everything, then 100% of all accidents will be down to the automation.
Explain this agenda to me, please.

So don't make a thread then. Because 99%, if not 100%, of those posting are just giving their opinion, nothing more.
A bit ridiculous, to be honest. But that's fine; I'm happy with my comment, and when the news comes out I'll either eat my words or tell you all to get ******.
You all make out that the airlines don't have an agenda in this... hilarious.
Boeing is expected to release a software patch to the system to deal with this issue, according to Reuters.
Just like new cars/software/whatever, aircraft manufacturers carry out hundreds of thousands of hours of testing and simulations to ensure that systems are all working as designed.
And yet, once unleashed on the real world, with real people, all the bugs suddenly come flying out of the woodwork!
Unlike professional testers, real people are unpredictable and, no matter how rigid the training, are likely to behave in ways no designer can ever fully predict. It isn't just a case of making equipment idiot-proof; you have to make it proof against smart people too (and that is very much harder to achieve!). And the more sophisticated the machine, the more utterly bizarre, incomprehensible and unimaginable ways there are for somebody to screw it up (and, as I said, especially if you are dealing with smart people rather than the run of the mill!).
Mostly they are minor issues. But sometimes they can be catastrophic:/
I recall that years ago Mercedes, I think it was, had the idea of an emergency braking system that would try to guess when a driver was attempting an emergency stop and apply maximum braking automatically until the vehicle came to a stop. This was deemed a "good idea" because it had been concluded that, even in emergencies, drivers would hesitate for a second or so before really hammering on the brakes and allowing the ABS etc. systems to do their work.
It worked great during testing, but in the real world it resulted in drivers randomly emergency-stopping without warning, particularly in urban areas.
The typical scenario was where somebody was looking for an unfamiliar address or a side road and would jab on the brakes a bit sharpish if they thought they were going to overshoot it. The brake ECU would interpret this as an emergency stop and bring the car to a screeching halt.
During all the testing and development, nobody had actually thought of this one!
In aircraft there is this story... (And no, it isn't funny! Not at all! No really!)
http://www.grumpyoldsod.com/mohammedair.asp
But it illustrates nicely how the unpredictability of people can be the downfall of even the most carefully designed machine or system!
Or the time the rudder actuator on one model could freeze up and, once it started working again, could do so in the opposite direction to what the controls were telling it. Pilot error, in that the pilots didn't recognise this in the seconds they had (it took the investigators about four crashes over three years), but equipment failure in that it performed in a completely different way to how it was intended to.
Yep, the Civil Aviation Authority has banned the 737 MAX from operating in and around the UK, alongside many other authorities. That's quite a step, but expected nonetheless.
I would now expect the aircraft to be grounded worldwide shortly.
http://www.grumpyoldsod.com/mohammedair.asp
Cut and paste job; doesn't seem to be the whole story:
https://www.snopes.com/fact-check/etihad-a340-accident/