
Poll: Is internal manipulation of personal beliefs ethically acceptable?
This poll is closed.
  • YES! Send in the 'bots. (1 vote, 20.00%)
  • NO! It is a violation of personal autonomy. (3 votes, 60.00%)
  • Other: please explain in thread below. (1 vote, 20.00%)
Total: 5 vote(s), 100%
* You voted for this item.

Trolley problem: 2035 version
#11
RE: Trolley problem: 2035 version
Thanks all for the answers.


pocaracas Wrote: Can you also use the same sort of nanobots to cure cancer? That would be soooo much more helpful.
Cancer was cured in 2023.  Try to keep up.

Jenny A Wrote: This does raise a question about homosexuality though. Sexual orientation is not a choice now, but such a bot would make it a choice. That raises no real ethical dilemma for me, as I don't see anything immoral about same-gender sex. But there are people, mostly Christian I imagine, who excuse homosexuality only because it is not a personal choice, or who condemn it because they think it is a personal choice. Such a bot might set gay rights back decades.
I worry more about creation of zealot soldiers for the empire.  Whether or not they choose to be gay is a secondary issue.

Homeless Nutter Wrote: Yup, totally fine. But then again - I'm evil.
Yep, evil.  But are you chaotic, neutral or lawful evil?

(May 27, 2015 at 10:56 am)whateverist Wrote: I'm not clear whether you are asking:
1)  Is it permissible for me to do for my own reasons?
2)  Is it permissible for me to share the technology with a person who wishes for one of these outcomes?
To 1 I would answer no.  To 2 I would want a legally binding disclaimer signed and witnessed as well as a suitable fee for the service.
(Of course, Bob and I would probably both have to sign off on it.)
The motives of the operator of the technology are not material unless, as JennyA pointed out above, they were those of the subject.

Chuck Wrote: But never say never.

Let's say in 2036, a Mars-sized rogue planet is detected heading towards Earth on a collision course. It is vastly beyond human capacity to avert the big splat. There is nowhere nearby that we can escape to. It would take essentially the wholehearted effort of all mankind to build a generation ship that might allow a small human seedling population to depart on a long trip to survive. That's humanity's only chance.
"It depends" is generally a valid answer.  I'll chalk that up as the only current vote for "Other."

(May 27, 2015 at 11:35 am)Pyrrho Wrote: First of all, this is nothing like the trolley problem, or, at least, is not in any obvious way like the trolley problem.

Do you think that if everyone in Syria and Iraq suddenly became atheists, they would kill each other less? Or take Saudi Arabia: do you think the religious oppression there would stop if everyone suddenly became an atheist?

My point is, this is not merely an issue of personal autonomy, but also of how these things will affect others.

It's just an ethical dilemma problem. Not all trolley problems have trolleys.
I expect human conflict will continue regardless of religious orientation, forced or otherwise.
Another vote for "it depends."

How come I can identify three arguments for "other," but there is only one vote for that in the poll?
So how, exactly, does God know that She's NOT a brain in a vat?
#12
RE: Trolley problem: 2035 version
Are we talking about a future where authority doesn't have control?
If so, then it's irrelevant; it comes down to personal preference.
If, in this same future, people are still killing over religion, then we won't have a choice, as the authorities will have already banned the technology for security reasons.
No God, No fear.
Know God, Know fear.


