
Poll: Is internal manipulation of personal beliefs ethically acceptable?
This poll is closed.
YES! Send in the 'bots. - 1 vote (20.00%)
NO! It is a violation of personal autonomy. - 3 votes (60.00%)
Other: please explain in thread below. - 1 vote (20.00%)
Total: 5 vote(s), 100%

Trolley problem: 2035 version
#1
Trolley problem: 2035 version
Hypothetical scenario:

Congratulations! Your DARPA-funded project is a complete success!
You have discovered and fully characterized two neural subnets in the Allen catalog of the human brain.
1) A sub-ganglion whose configuration and activity fully characterize and implement gender beliefs and behavior.
2) A network which implements unique religious faith.

Fellow researcher, "Bob" has developed nano scale robots which are able to manipulate neurons at the molecular level.  Applying these to the above networks gives a capability of fully determining future beliefs and behavior.  Correctly used, this process is fully controlled and without deleterious side effects (impossible, I know, but this is my hypothetical.)  For example:
1) You can fully convert homosexual desire and behavior to stereotypically heterosexual.
2) You can convert any true believer in any religious faith to any other or remove any religious bias whatsoever, leaving the person atheist.

Is it ethically acceptable to use the nanobots to make the above modifications?
So how, exactly, does God know that She's NOT a brain in a vat?
Reply
#2
RE: Trolley problem: 2035 version
Nah... it's so much more fun to laugh at creationists.
However, I see few people laughing at gays...

Can you also use the same sort of nanobots to cure cancer? That would be soooo much more helpful.
Reply
#3
RE: Trolley problem: 2035 version
And here I was, waiting to decide whether I should deflect the missile set to destroy planet X, with its several billion inhabitants, toward spaceship Y with only a thousand people on board. . .

So, getting away from the trolley and turning to the mind-altering bots. . .

No, altering someone's mind to change their beliefs and/or desires is a terrible idea. Using robots makes it no less horrible than if it were done in a "liberation camp." The only moral use I can see for such a bot is voluntary change on the part of the person whose mind is to be altered. And even then, I would think serious counseling would be in order before use.
If there is a god, I want to believe that there is a god.  If there is not a god, I want to believe that there is no god.
Reply
#4
RE: Trolley problem: 2035 version
I do think the world would be a better place without religion, but only because that world would require the people in it to have been enlightened and rational enough to leave their religion behind on their own. Artificially manipulating someone's mind, or enforcing a ban on religion, would defeat the purpose.
“The larger the group, the more toxic, the more of your beauty as an individual you have to surrender for the sake of group thought. And when you suspend your individual beauty you also give up a lot of your humanity. You will do things in the name of a group that you would never do on your own. Injuring, hurting, killing, drinking are all part of it, because you've lost your identity, because you now owe your allegiance to this thing that's bigger than you are and that controls you.”  - George Carlin
Reply
#5
RE: Trolley problem: 2035 version
If I make everyone exactly like me, and I still fucking hate humanity, then what will that say?

I'd rather not find out.
Reply
#6
RE: Trolley problem: 2035 version
This does raise a question about homosexuality, though. Sexual orientation is not a choice now, but such a bot would make it a choice. That raises no real ethical dilemma for me, as I don't see anything immoral about same-gender sex. But there are people, mostly Christians I imagine, who excuse homosexuality only because it is not a personal choice, or who condemn it because they think it is a personal choice. Such a bot might set gay rights back decades.
If there is a god, I want to believe that there is a god.  If there is not a god, I want to believe that there is no god.
Reply
#7
RE: Trolley problem: 2035 version
(May 27, 2015 at 9:56 am)JuliaL Wrote: [...]

Is it ethically acceptable to use the nanobots to make the above modifications?

Yup, totally fine.  But then again - I'm evil.
"The fact that a believer is happier than a skeptic is no more to the point than the fact that a drunken man is happier than a sober one." - George Bernard Shaw
Reply
#8
RE: Trolley problem: 2035 version
I'm not clear whether you are asking:

1) Is it permissible for me to do for my own reasons?

2) Is it permissible for me to share the technology with a person who wishes for one of these outcomes?


To 1 I would answer no. To 2 I would want a legally binding disclaimer signed and witnessed as well as a suitable fee for the service.

(Of course, Bob and I would probably both have to sign off on it.)
Reply
#9
RE: Trolley problem: 2035 version
(May 27, 2015 at 9:56 am)JuliaL Wrote: [...]

Is it ethically acceptable to use the nanobots to make the above modifications?

In most cases, no.  But never say never.

Let's say that in 2036, a Mars-sized rogue planet is detected heading towards Earth on a collision course. It is vastly beyond human capacity to avert the big splat. There is nowhere nearby that we can escape to. It would take essentially the wholehearted effort of all mankind to build a generation ship that might allow a small human seedling population to depart on a long trip and survive. That's humanity's only chance. But half the population would rather pray for salvation at the end times. I would say send in the nanobots.

BTW, there is an answer to a more classical trolley car problem in there somewhere.
Reply
#10
RE: Trolley problem: 2035 version
First of all, this is nothing like the trolley problem, or, at least, is not in any obvious way like the trolley problem.


Second, your problem is really more complicated than people seem to be taking it.  The serious replies so far seem to regard using the nanobots as a violation of personal autonomy.  [Edit:  I see that while I was writing this, someone else has posted something bringing up the issue in this post.]  However, there is more at play here.

You don't say how the nanobots are administered, so I will just make up something convenient for explaining why it is more complicated than just a matter of personal autonomy.

Suppose we can just disperse the nanobots into the air, and they will seek out hosts. Now, imagine spraying a bunch of these nanobots over an area with religious strife, like Syria and Iraq. Do you think that if everyone in Syria and Iraq suddenly became atheists, they would kill each other less? Or suppose we got the nanobots to Saudi Arabia: do you think religious oppression would stop if everyone there suddenly became an atheist?

My point is, this is not merely an issue of personal autonomy, but also about how these things will affect others.  I am not saying (yet) that these things should be done if they were possible, but I am saying that one should consider more than just the issue of personal autonomy.  Is, for example, the personal autonomy of a religious murderer more important than the lives of his victims?

So, you really have a hypothetical issue that has a good deal of complexity to it.

"A wise man ... proportions his belief to the evidence."
— David Hume, An Enquiry Concerning Human Understanding, Section X, Part I.
Reply


