RE: Should driverless cars kill their own passengers to save a pedestrian?
November 17, 2015 at 2:55 am
(November 17, 2015 at 2:35 am)heatiosrs Wrote:
(November 17, 2015 at 2:10 am)RaphielDrake Wrote: That's true. A properly constructed machine will always have a lower risk of failing than a human being's cognitive faculties. It's part of the reason I think governments should be run by AI.
Very true, but based on these responses we should probably work on making the cars safe so we don't have to deal with a situation like this one.
Another issue, though: if a pedestrian could make sure someone in a car crashes just by stepping in front of it at an extremely busy junction, with no risk to themselves, wouldn't that be an extremely reliable method of murder? Potentially mass murder?
Hm, no. I think pedestrians should probably take responsibility for their own fuck-ups instead of the consequences being inflicted on what is essentially an innocent passenger.
In all honesty, the car would likely go off the cliff well before hitting the human. You're not going to program a car with the ability to reason, and I don't think death would even be presented as an option, so if a death happens it will most likely be an accident (the passenger's).
To an extent we can actually program machines to reason; it's just usually not as abstract as our own ability to do so. A binary system can encode all kinds of checks and balances that would allow it to do so, as in the sketch below. As the checks and balances get more in-depth, more complex and more varied, we may actually verge on creating a consciousness that rivals, if not surpasses, ours.
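Just to make concrete what I mean by "checks and balances": here's a minimal, hypothetical sketch of a layered rule system for a car deciding how to react to an obstacle. Everything in it is invented for illustration (the Situation fields, choose_action, the rule ordering); it's not taken from any real autonomous-vehicle software, just a toy showing how ordered binary checks can stand in for a crude kind of reasoning.

```python
# Hypothetical sketch only: layered "checks and balances" as ordered rules.
# All names and the rule ordering are invented for illustration.

from dataclasses import dataclass


@dataclass
class Situation:
    """A toy snapshot of what the car believes about the world."""
    obstacle_ahead: bool
    clear_lane_left: bool
    clear_lane_right: bool
    can_stop_in_time: bool


def choose_action(s: Situation) -> str:
    """Walk the checks in order; each rule only fires if the ones above it fail."""
    if not s.obstacle_ahead:
        return "continue"           # nothing to react to
    if s.can_stop_in_time:
        return "brake"              # simplest safe response wins
    if s.clear_lane_left:
        return "swerve_left"        # only swerve into space believed clear
    if s.clear_lane_right:
        return "swerve_right"
    return "emergency_brake"        # last resort: shed as much speed as possible


if __name__ == "__main__":
    # Obstacle ahead, can't stop, left blocked, right clear -> swerve_right
    print(choose_action(Situation(True, False, True, False)))
```

Each extra sensor or rule you bolt on is another check or balance; pile on enough of them, made complex and varied enough, and the behaviour starts to look less like a lookup table and more like judgement.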
"That is not dead which can eternal lie and with strange aeons even death may die."
- Abdul Alhazred.