RE: Trolley problem: 2035 version
May 27, 2015 at 11:35 am
(This post was last modified: May 27, 2015 at 12:14 pm by Pyrrho.)
First of all, this is nothing like the trolley problem, or at least not in any obvious way.
Second, your problem is really more complicated than people seem to be treating it. The serious replies so far seem to regard using the nanobots as a violation of personal autonomy. [Edit: I see that while I was writing this, someone else has posted something bringing up the issue in this post.] However, there is more at play here.
You don't say how the nanobots are administered, so I will just make up something convenient for explaining why it is more complicated than just a matter of personal autonomy.
Suppose we can just disperse the nanobots into the air, and they will seek out hosts. Now, imagine spraying a bunch of these nanobots over an area with religious strife, like Syria and Iraq. Do you think that if everyone in Syria and Iraq suddenly became atheists, they would kill each other less? Or suppose we got them to Saudi Arabia: do you think the religious oppression would stop if everyone there suddenly became an atheist?
My point is, this is not merely an issue of personal autonomy, but also about how these things will affect others. I am not saying (yet) that these things should be done if they were possible, but I am saying that one should consider more than just the issue of personal autonomy. Is, for example, the personal autonomy of a religious murderer more important than the lives of his victims?
So, you really have a hypothetical issue that has a good deal of complexity to it.
"A wise man ... proportions his belief to the evidence."
— David Hume, An Enquiry Concerning Human Understanding, Section X, Part I.