RE: Should driverless cars kill their own passengers to save a pedestrian?
November 17, 2015 at 10:33 pm
(This post was last modified: November 17, 2015 at 10:35 pm by Aroura.)
(November 17, 2015 at 10:16 pm)IATIA Wrote:(November 17, 2015 at 9:40 pm)Aroura Wrote: So....that's my response as well. I have to listen to the man constantly on topics like this, and the truth is....I agree. Not just because he's my hubby (although he may be my hubby because we tend to agree on stuff like this, if you see what I mean).
I do not necessarily disagree with you or hubby; however, this is a new field. With a video game, it is not as if immoral/errant programming is going to let the bad guy loose in our world to wreak havoc. With electric plants, it is not as if immoral/errant programming is going to allow the power lines to slither around and attack people. Brownouts need to consider hospitals and public safety, but that is pretty much cut and dried.
The one point I disagree on is that there is a moral consideration, due to the fact that the programmer does have to consider life-and-death situations that can be directly attributed to the device and its software rather than being an indirect result. The program itself will have no concept of that, or of anything for that matter; it will all be up to the programmer, who is far removed from any situation that may arise. Granted, after the fact the programmers will be intimately involved, along with the company, lawyers, family, and anyone else who has suffered a loss or has something to gain.
In the first trial of 'death by driverless vehicle', any bets on how many times "moral" will be said?
Of course morals are involved...I even linked the ethical chart and talked about why the car will be programmed to attempt to stop and not to swerve, and the ethical reasons (not just the logical ones) why that is. But your scenario poses no DILEMMA, that's all we are saying. The only dilemma here is whether there should even be cars in the first place.
Driverless cars will reduce the number of automobile fatalities. Period. By a LOT. There will be times they cause them, just like seat belts kill people instead of saving them 1 time in 10,000 accidents. And of course, some people will freak out, just like there are morons out there who won't wear a seat belt because of that one time in 10,000.
I expect this will eventually be like vaccines. Driverless cars will actually be SO good at reducing fatalities, we will forget (look, some of you already forgot and this is all just hypothetical) how many humans cause automobile accidents every year. So people will rail against the thing that saves them, irrationally claiming it is the thing causing harm.
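The seat-belt comparison above is really an expected-value argument: a safety system can directly cause a small number of deaths and still be an overwhelming net win. A minimal sketch of that arithmetic, with entirely made-up illustrative numbers (neither figure is a real statistic):

```python
# Hypothetical seat-belt-style trade-off: out of some sample of accidents,
# the safety system itself causes a rare death but prevents far more.
# Both counts below are invented purely for illustration.

accidents = 10_000
deaths_caused_by_system = 1      # the rare "the belt killed him" case
deaths_prevented_by_system = 500  # hypothetical lives saved in the same sample

net_lives_saved = deaths_prevented_by_system - deaths_caused_by_system
print(net_lives_saved)  # prints 499: a large net benefit despite the rare harm
```

The same shape of argument applies to driverless cars: the rare machine-caused fatality is visible and dramatic, while the much larger number of prevented fatalities is invisible, which is exactly why people misjudge it.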
Here is the thing. When I go out and drive, I am trusting fallible people, including meth addicts, drunks, tons of people on ADD meds, overworked tired people, stressed-out moms, etc., to pay attention and not kill me and my family in the car. When I walk on a sidewalk, same thing.
Driverless cars are not just some mindless machine. They are programmed by human beings who will do their VERY BEST (because of lawsuits and such, as well as because their families will be using the damned things) to make sure they don't cause unnecessary fatalities, and indeed in most cases, will prevent them. And they will be implemented on devices which are far less fallible than human beings.
So, we all put our lives in the hands of other people when we use the roads and sidewalks. I would prefer to place my life in the hands of a small group of engineers who will be trained to think about these things, than the drunk lady next door, or that meth head in the car with no muffler and bad brakes because his money is going to more meth instead of repairing his brakes.
So again, I get that morals are involved, I just don't see the dilemma.
“Eternity is a terrible thought. I mean, where's it going to end?”
― Tom Stoppard, Rosencrantz and Guildenstern Are Dead