
Poll: Will artificial intelligence ever achieve true sentience?
This poll is closed.

- There are good reasons to think this will never happen. (3 votes, 11.11%)
- I can't prove it, but absolutely not. The idea of artificial sentience is absurd. (3 votes, 11.11%)
- There is no telling what the future may hold. It's a coin flip. (4 votes, 14.81%)
- Yes, smart machines are trending in that direction already. (12 votes, 44.44%)
- Absolutely yes, and I can describe to you the mechanisms which make it possible in principle. (2 votes, 7.41%)
- Other. (Please explain.) (3 votes, 11.11%)

Total: 27 vote(s) (100%)

Will AI ever = consciousness or sentience?
#70
RE: Will AI ever = consciousness or sentience?
(December 3, 2012 at 4:12 am)DoktorZ Wrote:
Quote: Free will alert, free will alert... abort, abort. That was close.

Lol, exactly. How do we meaningfully distinguish a system that adjusts its behavior to produce optimal outcomes from a system that "desires" those outcomes--if such desires are rooted in a material thing like a brain?

Now, I carry merrily along in life as if I truly possess free will, in part because of the ethical ramifications of not having it. But the more dynamic systems theory I read, the less liberal humanism I am actually willing to accept. Some of the justifications I've read for free will seem to be the deformed discursive offspring of theology, never adequately separated from it -- sort of like the film "Basket Case."

More importantly, much of the advocacy for intention and free will seems to be rooted in our need for ethics. I have a pet theory that the gradual examination of modern horrors like the Holocaust and the Great Famine in China has (re-)created a moral need for theories of "free will" and other sentimentalities of that sort.

I'm of the opinion that much of the defense of free will takes the form of a frantic attempt to save a particular view of ethics. Like Christians who hypothesize that you cannot have goodness and morality without God, people can't imagine how you can have a workable society without the ethics of personal responsibility, or personal responsibility without free will. I'm persuaded, though, that when you look at the practical application of ethics, that of controlling behavior through law, incentives, and punishments, the need for free will evaporates. As noted elsewhere, modern criminal punishment has four primary goals:

1) protection of society, by removing dangerous elements;
2) retribution, an eye for an eye;
3) rehabilitation, changing a problem behavior;
4) deterrence, providing an incentive for people to avoid those behaviors.

Of these four goals, only retribution seems to depend on free will and moral culpability, and it has long been recognized as the theory of punishment with the most practical and ethical problems.



@whateverist:

I think you're engaged in a bit of question begging and a failure of imagination. You're enacting the very fear that DoktorZ voiced: that if you only acknowledge sentience of a certain pattern, the human pattern, you will blind yourself to other equally valid patterns. The questions of mortality and pain bring this to the fore. There's an analogous situation in robotics: people often imagine a robot as a box with wheels, and that mindset limits what they can imagine a robot to be.

At bottom, unless you're advocating that we are something supernatural or basically inexplicable, we too are machines; biological machines, but machines nonetheless. If other machines are not considered capable of feeling pain, then we don't "really" feel pain either, and this pain you speak of becomes a mere placeholder for 'is human' or 'is biological'. At bottom, we are nothing more than boxes with wheels, too: our skulls and chests are the box, our brains the computer, and our arms and legs the wheels. Anything you deny a non-biological machine, you must either equally deny to humans and animals, or find some supernatural explanation for.

Because if pain is just an idea in a biological machine prompted by certain dispositions of its systems, which it pretty clearly is, I don't see why you think a machine intelligence would be incapable of similar ideas. Perhaps you're imagining a machine intelligence as a box with wheels, a mere box, without receptors and sensors and actuators and the ability to monitor the status of its systems, just as our pain receptors do for us. If so, then you're illegitimately sneaking in rather self-serving limitations on this machine sentience and falling into the trap that DoktorZ outlined.

Let me offer a couple of hypotheticals.


First, imagine a future in which we've determined that the important functions of our biological life form are all macroscopically well described, that quantum effects play no significant role. We've progressed to the point that we can scan and digitize the entirety of a human being and store it, or process that scan for diagnostic purposes. Suppose, furthermore, that we can take these scans and recreate the pattern contained in the image: we can create more of you just by scanning you and printing off copies. Now suppose you go off to explore a planet for a few years and you're eaten by a grue or something. But we still have your scan from your last physical, so the doctors simply print off a new copy of you, pat you on the head, and tell you to go on your merry way. Does this mean that you didn't die? Does this mean you are no longer mortal? If not, how is this any different from disrupting the systems of a sentient machine and then resetting it to a state it might have been in at a prior time? (And if current machine intelligence is any indication of the trend, recreating a meaningfully recent state of a machine intelligence would be just as difficult as recreating you is today: very difficult to impossible.)
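For what it's worth, the distinction this hypothetical turns on, identity of state versus identity of the thing itself, is familiar from programming. A toy sketch in Python (the `Agent` class is purely illustrative, not a claim about how minds work): a copy restored from a snapshot is indistinguishable in state from the original, yet is a distinct object.

```python
import copy

class Agent:
    """A toy stand-in for any system whose full state can be captured."""
    def __init__(self, memories):
        self.memories = memories  # the "scan": everything that defines the agent

    def __eq__(self, other):
        # Two agents are indistinguishable if their state matches exactly.
        return self.memories == other.memories

original = Agent(memories=["explored the planet", "met a grue"])
snapshot = copy.deepcopy(original)   # the "last physical" scan

# The original is destroyed; a copy is "printed" from the snapshot.
restored = copy.deepcopy(snapshot)

print(restored == original)   # True  -- identical state
print(restored is original)   # False -- a distinct object
```

Whether "identical state" is enough to say you didn't die is exactly the question the hypothetical poses; the code only shows that the two notions of sameness come apart.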


One suggested scenario is that someday, instead of relying on humans to fight our wars, to go into urban centers and root out insurgents, we might create monkey soldiers to perform this job. This would be accomplished by attaching computers to a monkey's brain and nervous system, either externally or as implants, to monitor, adjust, and direct the monkey's brain so that the monkey performs the tasks we want it to perform. When we want it to enter and clear a building, we overlay its perception of the building and the human insurgents with patterns its monkey brain understands, like sexual rivals or the presence of a monkey from an enemy social group. Similarly, we can expect this computer adjunct to monitor the monkey's biological systems: its heart rate, its blood oxygen levels, indicators of fatigue, and so on. It would seem a given that we would monitor its network of pain receptors to maintain an accurate understanding of the status of the monkey's biological tissues.

Moreover, let's assume that this computer is not just a passive computational device, a box with wheels so to speak, but rather a sentient machine. Is there any practical limitation that would prevent the monkey's copilot from experiencing the status of the monkey's pain receptors as pain? Let's take the hypothetical one step further and suppose an improvement upon this design: instead of using non-biological computational devices, which are expensive to build, program, and use, the designers learn to substitute a biological computer. In place of a machine brain, they use specially developed and genetically modified cat brains that have been custom programmed and hooked up to perform the same computational tasks that the non-biological brain performed. If you say the non-biological "copilot brain" couldn't feel pain, then you'd seem obligated to conclude that the cat-brain copilot can't feel pain either.

And if the cat brain can't feel pain as a copilot, how is it able to feel pain as a normal cat brain? Where does this "the system is incapable of feeling pain" enter into the question of machine sentience, both biological and non-biological, and what are the minimums for a system to "feel pain"?








