RE: Should driverless cars kill their own passengers to save a pedestrian?
November 17, 2015 at 1:08 pm
(November 17, 2015 at 12:53 pm)DespondentFishdeathMasochismo Wrote: If the driverless car couldn't stop in time to avoid hitting a pedestrian, would it be any less culpable than a manually driven car that couldn't stop in time before hitting the pedestrian? In my eyes the outcome is the same, except one outcome favors the pedestrian's life over the driver's life. Your analogy about yelling out the window is hilarious though
Intentions matter here, but so does knowledge beforehand about whether there is any likelihood of someone being run over.
Suppose I somehow knew with 100% certainty that if I got in my driverless car on day X, the car would necessarily run over someone, but that it wouldn't on any other day, and suppose I could (hypothetically) prove I have that knowledge. Then IMHO, if I go and drive anyway on that one day when I am certain my driverless car will run someone over, I am morally culpable. I should take a break on that day if I really have that knowledge; the fact that the car does it for me is irrelevant in this case. Intentionally killing is bad, but so is knowingly allowing someone to be killed, even if you don't directly do it yourself. The point is: if you can prevent a death and you know how, then not doing so is equivalent to killing, from a consequentialist perspective.