Should driverless cars kill their own passengers to save a pedestrian?
#71
RE: Should driverless cars kill their own passengers to save a pedestrian?
(November 17, 2015 at 2:02 am)heatiosrs Wrote: I would still say that, assuming the car is going at a speed where hitting the pedestrian will result in death, hitting the car is the best outcome given the original question.

Yeah, but Dad's reaching in the trunk while Mom is out in the street dragging her kid back out of the road. (I have seen small kids run into the road all the time, and the parents, if they are even around, will usually just call out to them.)
You make people miserable and there's nothing they can do about it, just like god.
-- Homer Simpson

God has no place within these walls, just as facts have no place within organized religion.
-- Superintendent Chalmers

Science is like a blabbermouth who ruins a movie by telling you how it ends. There are some things we don't want to know. Important things.
-- Ned Flanders

Once something's been approved by the government, it's no longer immoral.
-- The Rev Lovejoy
Reply
#72
RE: Should driverless cars kill their own passengers to save a pedestrian?
(November 17, 2015 at 8:48 pm)IATIA Wrote:
(November 17, 2015 at 2:02 am)heatiosrs Wrote: I would still say that, assuming the car is going at a speed where hitting the pedestrian will result in death, hitting the car is the best outcome given the original question.

Yeah, but Dad's reaching in the trunk while Mom is out in the street dragging her kid back out of the road.  (I have seen small kids run into the road all the time, and the parents, if they are even around, will usually just call out to them.)
You might want to be more specific when posing a question like this; you said "or", not "and".
Which is better:
To die with ignorance, or to live with intelligence?

Truth doesn't accommodate to personal opinions.
The choice is yours. 
--------------------------------------------------------------------------------

There is God and there is man, it's only a matter of who created whom

---------------------------------------------------------------------------------
The more questions you ask, the more you realize that disagreement is inevitable, and communication of this disagreement, irrelevant.
Reply
#73
RE: Should driverless cars kill their own passengers to save a pedestrian?
(November 17, 2015 at 8:07 pm)IATIA Wrote:
(November 17, 2015 at 12:31 am)Aroura Wrote: ...  I agree that this is not some moral issue, like the train track dilemma.  But this one fails to give me much of a dilemma.  The car will do what it is programmed to do, which will be to try and avoid killing anyone.

As you say, "The car will do what it is programmed to do", but that is where the moral issue resides.  There is a programmer, or team of programmers, that must consider the outcome of the decision-making process that will ultimately control the vehicle.  I do not know if there are any real programmers on board, but programming must encompass worst-case scenarios.

In the case of computer software and games, the general rule of thumb is that any input which fails to align with the intended programming is just shoved in the bit bucket, and the software resumes polling inputs.  In the case of cars and planes, however, it would be disastrous to ignore unforeseen inputs, so worst-case scenarios must be considered.  They can usually be grouped into similar algorithms, but that is where the problem comes in.  Simple algorithm: something in the way, dodge it.  OH NO! Cliff, too late.

It does not matter how fast the computer is, because it must abide by physics and causality, i.e. the reason for stopping and the stopping distance.  Even if it can calculate, for the sake of argument, the exact stopping distance, it is still bound by the laws of physics.  That does not account for tire wear, that patch of oil (hit one on a bike once, no fun) or any other unknowns.

Because the car cannot think and can only do what is programmed, ultimately the programmer has to decide whether to program for the safety of the occupants or the safety of the greater number, or whatever.  This is where the morality lies, and ultimately the obligatory lawsuits.  It is obvious from some of the posts that some posters did not really read the article linked at the beginning.  So yes, from the programmer's standpoint, this is exactly the trolley problem.
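The physics point above can be sketched with a back-of-the-envelope calculation: total stopping distance is reaction distance plus braking distance, so no matter how fast the computer decides, the braking-distance term is fixed by physics. The reaction times and deceleration value below are illustrative assumptions, not figures from the thread.

```python
def stopping_distance(speed_ms, reaction_s, decel_ms2):
    """Reaction distance plus braking distance, in metres.

    speed_ms   -- initial speed in m/s
    reaction_s -- delay before the brakes engage (human or computer)
    decel_ms2  -- braking deceleration in m/s^2 (~7.5 on dry pavement)
    """
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)

speed = 50 / 3.6  # 50 km/h expressed in m/s
human = stopping_distance(speed, reaction_s=1.5, decel_ms2=7.5)
computer = stopping_distance(speed, reaction_s=0.1, decel_ms2=7.5)
print(f"human: {human:.1f} m, computer: {computer:.1f} m")
# → human: 33.7 m, computer: 14.2 m
```

The braking term (about 12.9 m here) is identical for both; only the reaction term shrinks. That is exactly the "obligated by the laws of physics" point.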

Ok, my husband is a programmer working toward becoming a computer engineer (to be completely honest, he is in his senior year of computer science, currently working on his senior project, which will be used for this kind of decision system in robotics: ROS.org; he has also been programming computers since the '80s), and he has some things he'd like me to relay in this thread.  First, he just finished an ethics in technology and engineering class, so this kind of stuff is all fresh in his mind.  For the following, please take into account that I'm trying to say what he says, but in words I understand better.  His thought on the subject is that no engineer is going to program the car to swerve, because swerving opens up a whole host of unpredictable outcomes.  Swerving might aim the car at another car, or another pedestrian, etc.

It will have as simple a set of programs as possible.  Much like EXISTING technology that AUTOMATICALLY stops new cars when they sense something in front of them, so you as a human don't have to, a Google car will most likely just do its best to stop.  Even if it does not successfully stop before colliding with the pedestrian, it will do a better job than a human being of sensing the object or pedestrian and braking in time.  Its speed at impact will be far lower than it would be if a human driver were the one forced to react to the same situation.

In short, it would hit the pedestrian if it indeed did not have time to come to a full stop after sensing them, but it would react so much quicker than a person that the chance of injury or death would be greatly reduced compared to the same situation with a human driver, for both the pedestrian AND the driver.
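The point about reduced impact speed follows from the same kinematics: braking earlier leaves more distance in which to shed speed, so even when a full stop is impossible, the collision is slower. A minimal sketch, with the gap, reaction times and deceleration all assumed for illustration:

```python
import math

def impact_speed(speed_ms, gap_m, reaction_s, decel_ms2=7.5):
    """Speed (m/s) remaining when the car reaches an obstacle gap_m ahead."""
    braking_gap = gap_m - speed_ms * reaction_s   # distance left once brakes engage
    if braking_gap <= 0:
        return speed_ms                           # impact at full speed
    v_sq = speed_ms ** 2 - 2 * decel_ms2 * braking_gap
    return math.sqrt(v_sq) if v_sq > 0 else 0.0   # 0.0 means it stopped in time

speed = 50 / 3.6  # 50 km/h
print(impact_speed(speed, gap_m=20, reaction_s=1.5))  # slow human: still at full speed
print(impact_speed(speed, gap_m=20, reaction_s=0.1))  # fast reaction: full stop
```

With a 20 m gap at 50 km/h, a 1.5 s reaction consumes the whole gap before the brakes even engage, while a 0.1 s reaction stops the car entirely, which is the "better outcome for both the pedestrian AND the driver" being described.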

So, my husband says there is still no moral dilemma, not really.  (Also, he said, what if there is a whole family in the car?  Or an oncoming car, or even a bus full of nuns?  lol.  This is why automated cars will not be programmed to swerve.)
Current technology for the situation you are describing already exists, is entering mass use, and has been shown to lower the death toll of drivers and pedestrians.  That's why so many new cars have automatic brakes.  The technology you are using for your dilemma scenario poses no dilemma that he can find.  It is widely agreed that computers react faster, and that automatic braking is much more likely than a human driver to save the life of both the driver and the pedestrian.  Deaths will still happen, though.  That's a dilemma for driving itself, not for automated cars.

Sorry that was a bit rambly, my husband is talking to me while I type, and he's always hard for me to translate, lol. 

Oh, he just sent me this table for how engineers determine some ethical situations as well.
Workable Ethical Theories

So....that's my response as well.  I have to listen to the man constantly on topics like this, and the truth is....I agree.  Not just because he's my hubby (although he may be my hubby because we tend to agree on stuff like this, if you see what I mean). Big Grin

Updated to correct some confusing wording.
“Eternity is a terrible thought. I mean, where's it going to end?” 
― Tom Stoppard, Rosencrantz and Guildenstern Are Dead
Reply
#74
RE: Should driverless cars kill their own passengers to save a pedestrian?
(November 17, 2015 at 9:40 pm)Aroura Wrote: So....that's my response as well.  I have to listen to the man constantly on topics like this, and the truth is....I agree.  Not just because he's my hubby (although he may be my hubby because we tend to agree on stuff like this, if you see what I mean). Big Grin

I do not necessarily disagree with you or hubby; however, this is a new field.  A software game?  It is not as if immoral/errant programming is going to let the bad guy loose into our world to wreak havoc.  Electric plants?  It is not as if immoral/errant programming is going to allow the power lines to slither around and attack people.  Brownouts need to consider hospitals and public safety, but that is pretty much cut and dried.

The one point I disagree on: there is a moral consideration, because the programmer does have to consider life-and-death situations that can be directly attributed to the device and software rather than being an indirect result.  The program itself will have no concept of that, or of anything for that matter; it is all up to the programmer, who is far removed from any situation that may arise.  Granted, after the fact the programmers will be intimately involved, along with the company, lawyers, family and anyone else who has suffered a loss or has something to gain.

In the first trial of 'death by driverless vehicle', any bets on how many times "moral" will be said?
You make people miserable and there's nothing they can do about it, just like god.
-- Homer Simpson

God has no place within these walls, just as facts have no place within organized religion.
-- Superintendent Chalmers

Science is like a blabbermouth who ruins a movie by telling you how it ends. There are some things we don't want to know. Important things.
-- Ned Flanders

Once something's been approved by the government, it's no longer immoral.
-- The Rev Lovejoy
Reply
#75
RE: Should driverless cars kill their own passengers to save a pedestrian?
(November 17, 2015 at 10:16 pm)IATIA Wrote:
(November 17, 2015 at 9:40 pm)Aroura Wrote: So....that's my response as well.  I have to listen to the man constantly on topics like this, and the truth is....I agree.  Not just because he's my hubby (although he may be my hubby because we tend to agree on stuff like this, if you see what I mean). Big Grin

I do not necessarily disagree with you or hubby; however, this is a new field.  A software game?  It is not as if immoral/errant programming is going to let the bad guy loose into our world to wreak havoc.  Electric plants?  It is not as if immoral/errant programming is going to allow the power lines to slither around and attack people.  Brownouts need to consider hospitals and public safety, but that is pretty much cut and dried.

The one point I disagree on: there is a moral consideration, because the programmer does have to consider life-and-death situations that can be directly attributed to the device and software rather than being an indirect result.  The program itself will have no concept of that, or of anything for that matter; it is all up to the programmer, who is far removed from any situation that may arise.  Granted, after the fact the programmers will be intimately involved, along with the company, lawyers, family and anyone else who has suffered a loss or has something to gain.

In the first trial of 'death by driverless vehicle', any bets on how many times "moral" will be said?

Of course morals are involved... I even linked the ethics chart and talked about why the car will be programmed to attempt to stop and not to swerve, and the ethical reasons (not just the logical ones) why that is.  But your scenario poses no DILEMMA; that's all we are saying.  The only dilemma here is whether there should even be cars in the first place.

Driverless cars will reduce the number of automobile fatalities.  Period.  By a LOT.  There will be times they cause them, just as seat belts kill people instead of saving them in 1 accident in 10,000.  And of course, some people will freak out, just as there are morons out there who won't wear a seat belt because of that one time in 10,000.

I expect this will eventually be like vaccines.  Driverless cars will be SO good at reducing fatalities that we will forget (look, some of you already forgot, and this is all just hypothetical) how many automobile accidents humans cause every year.  So people will rail against the thing that saves them, irrationally claiming it is the thing causing harm.

Here is the thing.  When I go out and drive, I am trusting fallible people, including the meth addicts, drunks, tons of people on ADD meds, overworked tired people, stressed-out moms, etc., to pay attention and not kill me and my family in the car.  When I walk on a sidewalk, same thing.
Driverless cars are not just some mindless machine.  They are programmed by human beings who will do their VERY BEST (because of lawsuits and such, as well as because their own families will be using the damned things) to make sure they don't cause unnecessary fatalities, and who indeed, in most cases, will prevent them.  And their work will be carried out by devices that are far less fallible than human beings.

So, we all put our lives in the hands of other people when we use the roads and sidewalks.  I would rather place my life in the hands of a small group of engineers trained to think about these things than in those of the drunk lady next door, or that meth head in the car with no muffler and bad brakes because his money is going to more meth instead of repairs.

So again, I get that morals are involved, I just don't see the dilemma.
“Eternity is a terrible thought. I mean, where's it going to end?” 
― Tom Stoppard, Rosencrantz and Guildenstern Are Dead
Reply
#76
RE: Should driverless cars kill their own passengers to save a pedestrian?
(November 17, 2015 at 4:33 pm)Rhythm Wrote: You mean..if the sky were full of human error.........you'd favor yet more potential for human error........

Wait, what?

I see your point; I just don't see how you will get the streets "cleaned up" from human error and randomness to that extent.
The fool hath said in his heart, There is a God. They are corrupt, they have done abominable works, there is none that doeth good.
Psalm 14, KJV revised edition

Reply
#77
RE: Should driverless cars kill their own passengers to save a pedestrian?
(November 17, 2015 at 12:55 pm)Rhythm Wrote: I have to ask, if we're all going to be in driverless cars someday, why don't we just cut to the chase and build better public transit systems...?  Wtf is wrong with a train...lol?

Ha!  I was just mentioning this to my hubby.  I wonder if driverless cars could lead us to more mass transit?  Instead of owning cars, could I call for one like a taxi to come pick me up and tell it where to drop me off?  There would still be a lot of individual cars needed in this scenario, but far fewer than if everyone owned one (or even two).  There might be driverless buses as well.

I'm all for going back to the days of trolleys and streetcars, but with modern tech.  Screw everyone having to own their own car.

But..capitalism will keep that from happening. Sad
“Eternity is a terrible thought. I mean, where's it going to end?” 
― Tom Stoppard, Rosencrantz and Guildenstern Are Dead
Reply
#78
RE: Should driverless cars kill their own passengers to save a pedestrian?
(November 17, 2015 at 1:24 pm)DespondentFishdeathMasochismo Wrote: It just seems like a spectacularly well-timed homicide if you know exactly when to get into your car in order to make it override its own programming of not hitting pedestrians, in order to kill a pedestrian. I just don't see how someone could possibly use the car to their advantage in that way.

I made it clear that I was being hypothetical. I was demonstrating a point.
Reply
#79
RE: Should driverless cars kill their own passengers to save a pedestrian?
(November 17, 2015 at 8:25 pm)IATIA Wrote: And they thought Apple was wasting time and effort with the iPad.

More than a little different, if you ask me.  iPads don't have the same propensity to affect transport-related deaths as airships or driverless cars do.  My whole point centred on the fact that people think it's a nice idea until someone is killed as a result of it not being safe enough for mass use.
Reply
#80
RE: Should driverless cars kill their own passengers to save a pedestrian?
8 pages in and I'm still amazed that driverless cars are a thing. Cheers from Underarocksville .-.
Reply


