And on several occasions pilots have tried to take over when the autopilot goes wrong and been unable to: the computers have locked everything out and simply dived the plane into the ground. When that happens, 200 to 300 or more people die, not just one or two.
A study commissioned by the Federal Aviation Administration concluded that "pilots sometimes rely too much on automated systems and may be reluctant to intervene". The study added that, due to inadequate training methods, many pilots "lack sufficient or in-depth knowledge and skills" to properly control their plane's trajectory should they need to fly manually.
One report that examined over 200 plane crashes across a 40-year period found that autopilot issues were a major contributing factor in around two-thirds of them, and that the pilots were unable to control the plane manually.
It seems to me that what you have described, with aircraft pilots' inability to intervene successfully, is comparable to a road vehicle with Level 2 or Level 3 autonomy. What has been determined on the road is that human drivers who rely on such autonomy, even though they are required to stay alert, cannot be relied upon to take over the vehicle in time to prevent an accident. As we know, this is precisely why Waymo has shunned Level 2 and Level 3 solutions and moved straight to at least Level 4 autonomy.