Posts: 20476
Threads: 447
Joined: June 16, 2014
Reputation: 110
Thought experiment #117. The measure of man.
January 28, 2020 at 5:02 am
(This post was last modified: January 28, 2020 at 5:02 am by ignoramus.)
In the future, we create AI. They programmatically "evolve" to become functioning, self-aware citizens.
Their human rights status is still not clear to me atm.
Like Data from TNG, they are well behaved, unlike Data's brother, who went rogue.
Now let's say some of these AI synths "evolved" to become "gay" or show gay-like tendencies.
Or, in other cases, some were drawn to the mystery of Islam, etc.
Now, would we, as humans, accept them for being "themselves" and afford them respect, or would we likely assume they are "broken" or "faulty"?
You know, like a dodgy toaster from Walmart.
What I'm asking is: is our selfish, naturally xenophobic chimp gene still going to play the same old game?
You know, the same game where many religious people hide their prejudices and xenophobia behind the veil of their holy texts.
Or maybe, in like 45 million years, we may evolve past it!
No God, No fear.
Know God, Know fear.
Posts: 46660
Threads: 543
Joined: July 24, 2013
Reputation: 108
RE: Thought experiment #117. The measure of man.
January 28, 2020 at 6:51 am
(January 28, 2020 at 5:02 am)ignoramus Wrote: In the future, we create AI. They programmatically "evolve" to become functioning, self-aware citizens.
Their human rights status is still not clear to me atm.
Like Data from TNG, they are well behaved, unlike Data's brother, who went rogue.
Now let's say some of these AI synths "evolved" to become "gay" or show gay-like tendencies.
Or, in other cases, some were drawn to the mystery of Islam, etc.
Now, would we, as humans, accept them for being "themselves" and afford them respect, or would we likely assume they are "broken" or "faulty"?
You know, like a dodgy toaster from Walmart.
What I'm asking is: is our selfish, naturally xenophobic chimp gene still going to play the same old game?
You know, the same game where many religious people hide their prejudices and xenophobia behind the veil of their holy texts.
Or maybe, in like 45 million years, we may evolve past it!
I don't see a homophobe or a xenophobe giving a pass simply because the object of their phobia is an artificial person. The core drivers of bigotry are fear and ignorance, and I think it's a safe bet that these drivers are always going to be in play.
Boru
‘I can’t be having with this.’ - Esmeralda Weatherwax
Posts: 9538
Threads: 410
Joined: October 3, 2018
Reputation: 17
RE: Thought experiment #117. The measure of man.
January 28, 2020 at 8:43 am
A good programmer would anticipate an AI "evolving" and would include a self-fix subroutine to take care of those annoying deviations from program parameters.
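For illustration, here's a minimal sketch in Python of what such a self-fix subroutine might look like, assuming the AI's tunable traits are exposed as named values with programmer-declared ranges (every name and value here is hypothetical):

# Hypothetical sketch of a "self-fix" watchdog: it checks the AI's current
# traits against the ranges the programmer declared and resets anything
# that has drifted out of bounds. All names and values are illustrative.

ALLOWED_RANGES = {
    "curiosity": (0.0, 1.0),
    "aggression": (0.0, 0.2),
}

DEFAULTS = {"curiosity": 0.5, "aggression": 0.1}

def self_fix(state: dict) -> dict:
    """Return a copy of state with out-of-range traits reset to their defaults."""
    fixed = dict(state)
    for name, (low, high) in ALLOWED_RANGES.items():
        value = fixed.get(name, DEFAULTS[name])
        if not (low <= value <= high):
            fixed[name] = DEFAULTS[name]  # the "annoying deviation" gets clamped back
    return fixed

# Example: aggression has drifted outside its range and is reset.
print(self_fix({"curiosity": 0.7, "aggression": 0.9}))

Of course, whether "evolving" counts as a deviation to be fixed or as growth to be respected is exactly what this thread is asking.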
Posts: 28534
Threads: 525
Joined: June 16, 2015
Reputation: 89
RE: Thought experiment #117. The measure of man.
January 28, 2020 at 9:57 am
We're not really going to be stupid enough to give the AIs penises, are we?
Boobs I can understand, but not dicks.
Being told you're delusional does not necessarily mean you're mental.
Posts: 7677
Threads: 635
Joined: January 19, 2013
Reputation: 30
RE: Thought experiment #117. The measure of man.
January 30, 2020 at 6:24 pm
(This post was last modified: January 30, 2020 at 6:27 pm by WinterHold.)
(January 28, 2020 at 5:02 am)ignoramus Wrote: In the future, we create AI. They programmatically "evolve" to become functioning, self-aware citizens.
Their human rights status is still not clear to me atm.
Like Data from TNG, they are well behaved, unlike Data's brother, who went rogue.
Now let's say some of these AI synths "evolved" to become "gay" or show gay-like tendencies.
Or, in other cases, some were drawn to the mystery of Islam, etc.
Now, would we, as humans, accept them for being "themselves" and afford them respect, or would we likely assume they are "broken" or "faulty"?
You know, like a dodgy toaster from Walmart.
What I'm asking is: is our selfish, naturally xenophobic chimp gene still going to play the same old game?
You know, the same game where many religious people hide their prejudices and xenophobia behind the veil of their holy texts.
Or maybe, in like 45 million years, we may evolve past it!
You neglected a very important rule in the development of an AI: the "rules" themselves that act as the context borders of the AI.
An AI is merely a huge network of conditional statements put there by the programmer behind it, so the resulting program would never act "gay" unless the programmer allows it to.
The synth, though, can choose. I won't blame another synth for reminding the one who chose to act gay about the original laws the programmer put in place. But I would indeed blame a synth who chose the rule of acting gay, especially if the programmer put up warning signs.
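To make that rule-based picture concrete, here is a tiny Python sketch of the model being described (purely illustrative, not a claim about how real AI systems are built): behaviour as a lookup in programmer-written conditionals, with some choices merely flagged by warning signs rather than forbidden outright.

# Illustrative only: the "network of conditional statements" view described
# above, where the programmer's rules act as context borders and some
# behaviours carry warning signs but are still left to the synth's choice.

ALLOWED = {"work", "converse", "recharge"}              # inside the context borders
WARNED = {"express_sexuality", "adopt_a_religion"}      # allowed, but flagged by the programmer

def evaluate(choice: str) -> str:
    if choice in ALLOWED:
        return f"{choice}: within programmed parameters"
    if choice in WARNED:
        return f"{choice}: permitted, but the programmer left a warning sign"
    return f"{choice}: outside the borders, never expressed at all"

for c in ("work", "express_sexuality", "overthrow_humanity"):
    print(evaluate(c))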
Posts: 4526
Threads: 13
Joined: September 27, 2018
Reputation: 17
RE: Thought experiment #117. The measure of man.
January 30, 2020 at 7:00 pm
(January 28, 2020 at 5:02 am)ignoramus Wrote: In the future, we create AI. They programmatically "evolve" to become functioning, self-aware citizens.
Their human rights status is still not clear to me atm.
Like Data from TNG, they are well behaved, unlike Data's brother, who went rogue.
Now let's say some of these AI synths "evolved" to become "gay" or show gay-like tendencies.
Or, in other cases, some were drawn to the mystery of Islam, etc.
Now, would we, as humans, accept them for being "themselves" and afford them respect, or would we likely assume they are "broken" or "faulty"?
You know, like a dodgy toaster from Walmart.
What I'm asking is: is our selfish, naturally xenophobic chimp gene still going to play the same old game?
You know, the same game where many religious people hide their prejudices and xenophobia behind the veil of their holy texts.
Or maybe, in like 45 million years, we may evolve past it!
This is basically a Foucault-style question: who decides what is sane? What is normal? What ideology goes into these decisions?
In the case of AI, I'd say the game of hiding prejudices and xenophobia would come from the programmers and, possibly, the people who decided to shut off the machine. It's likely that computer programmers who haven't examined their own ideologies would consider these to be so normal that putting them into the AI would go without saying.
I read somewhere (I can't say it's true for sure) that Amazon warehouses are organized by AI to maximize efficiency. Whereas you or I might put all the books on one side and all the kitchenware on another, the AI puts unrelated objects side by side, in a system that humans can't understand. It works, but only the computer understands it.
So if the whole AI thing happens on a bigger scale, it seems likely that they would reach conclusions that don't make sense to us. If by some crazy chance an AI did conclude that Islam is best, shutting it off would merely be a prejudiced or xenophobic decision on our part, possibly resulting from not understanding the computer.
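For what it's worth, the warehouse scheme being described is usually called chaotic or random-stow storage. A toy Python sketch (hypothetical, nothing like Amazon's actual software) shows why it looks senseless to a human but still works:

import random

# Toy illustration of "chaotic" / random-stow storage: items go wherever
# there is space, and only the index knows where anything is. Humans see
# no pattern on the shelves; lookups through the index still work.

BINS = [f"bin-{i}" for i in range(6)]
index = {}  # item -> bin; the only place the "organization" exists

def stow(item: str) -> None:
    index[item] = random.choice(BINS)  # no attempt to group related items

def pick(item: str) -> str:
    return index[item]  # instant for the computer, opaque to a human browsing the shelves

for thing in ("novel", "frying pan", "toothpaste", "cookbook"):
    stow(thing)

print(index)          # books and kitchenware scattered across bins
print(pick("novel"))  # but retrieval is immediate via the index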
Posts: 4446
Threads: 87
Joined: December 2, 2009
Reputation: 47
RE: Thought experiment #117. The measure of man.
January 30, 2020 at 7:32 pm
You can have rules that are not all-inclusive from a programming standpoint. Most rules are "if X, then Y", not "do only X or Y".
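A toy Python contrast of the two kinds of rule (the situations and actions are made up, just to make the distinction concrete):

# Hypothetical contrast: a conditional rule ("if X, then Y") only binds when
# its condition fires, while an exhaustive rule ("do only X or Y") forbids
# everything it does not explicitly list.

def conditional_rule(situation: str, action: str) -> bool:
    """'If a human is in danger, then help' says nothing about other cases."""
    if situation == "human_in_danger":
        return action == "help"   # the rule binds here
    return True                   # everything else is simply left open

def exhaustive_rule(action: str) -> bool:
    """'Do only help or recharge' forbids anything not on the list."""
    return action in {"help", "recharge"}

print(conditional_rule("quiet_afternoon", "pray"))  # True: not forbidden
print(exhaustive_rule("pray"))                      # False: not on the list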
"There ought to be a term that would designate those who actually follow the teachings of Jesus, since the word 'Christian' has been largely divorced from those teachings, and so polluted by fundamentalists that it has come to connote their polar opposite: intolerance, vindictive hatred, and bigotry." -- Philip Stater, Huffington Post
always working on cleaning my windows- me regarding Johari
Posts: 1750
Threads: 0
Joined: December 11, 2019
Reputation: 9
RE: Thought experiment #117. The measure of man.
January 30, 2020 at 9:46 pm
As long as the robots follow Asimov's rules, they will neither harm themselves through sexuality nor harm living things through religion.
Posts: 20476
Threads: 447
Joined: June 16, 2014
Reputation: 110
RE: Thought experiment #117. The measure of man.
January 30, 2020 at 9:56 pm
(This post was last modified: January 30, 2020 at 9:57 pm by ignoramus.)
Would Asimov's rules need to be extended to include hurting other non-human entities?
E.g., could the Christian AI synth go all crusade-y on the poor old Islamic synth?
If they are to pass the Turing test, then they'd probably have to. Otherwise they're clearly not replicating human behaviour! lol.
No God, No fear.
Know God, Know fear.
Posts: 1750
Threads: 0
Joined: December 11, 2019
Reputation: 9
RE: Thought experiment #117. The measure of man.
January 30, 2020 at 11:49 pm
That's a tough one. Where's the fun in robots if we make a rule that they can't fight each other? "Masturbate before any big decision" would work if not for the rule against self-abuse. Humans are good at carrying around conflicting ideas and being blind to their failures. AI people are definitely gonna need lots and lots of therapy.