(March 19, 2018 at 11:24 pm)Anomalocaris Wrote:
(March 19, 2018 at 11:11 pm)Tiberius Wrote: Self-driving cars are actually very good. They have been around for a while now, and this is the first case of a death.
What I really want to know is why the human driver didn’t brake. The whole reason there’s still a human behind the wheel is to act when the car fails. There’s clearly still work to do, but we also shouldn’t overreact to one case.
It is often found during pilot training that inexperienced flight instructors tend to delay taking over control of the aircraft when the student pilot does something dangerous. Instructors are trained to be vigilant for such situations and to promptly take control when the student pilot screws up. But when someone or something is apparently operating a vehicle normally, it seems to be reflexive for humans to give the controller more latitude than is warranted when the situation suddenly changes.
Yes, of course. Complacency is natural for a human; it's part of the attraction of self-driving cars in the first place. When the computer gets it right 99.9% of the time, the human overseer loses interest and is not a reliable fail-safe.
I've always thought that the most difficult part of developing this tech is going to be identifying the vanishingly rare moments when it will fail so that you can correct the program. You can't really be aware of the flaw until it actually occurs. You hope your human overseer will kick in and prevent the actual accident, but they are going to get complacent because they are human.
Only two things are infinite, the universe and human stupidity, and I'm not sure about the former.
Albert Einstein