Poll: Will artificial intelligence ever achieve true sentience?
This poll is closed.
  • There are good reasons to think this will never happen. (3 votes, 11.11%)
  • I can't prove it but absolutely not. The idea of artificial sentience is absurd. (3 votes, 11.11%)
  • There is no telling what the future may hold. It's a coin flip. (4 votes, 14.81%)
  • Yes, smart machines are trending in that direction already. (12 votes, 44.44%)
  • Absolutely yes and I can describe to you the mechanisms which make it possible in principle. (2 votes, 7.41%)
  • Other. (Please explain.) (3 votes, 11.11%)
Total: 27 vote(s), 100%

Will AI ever = consciousness or sentience?
#1
Will AI ever = consciousness or sentience?
On another website I visit it seems most people think our machines will be joining us as sentient beings in their own right any day now. I'd like to poll our larger group to find out what the prevailing opinion is here.

Can machines even be said to think? What counts as thinking? If the execution of decision trees is thinking then indeed they already do 'think'. A program that can diagnose diseases strikes me as very intelligent, but its intelligence of course reflects that of its programmer.
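
(To make that concrete: here's a minimal sketch, in Python, of the sort of hard-coded diagnostic decision tree I have in mind; the symptoms and conditions are invented for illustration. Every branch was written out in advance by a person, which is exactly why the program's 'intelligence' is really the programmer's.)

[code]
# A toy diagnostic "expert": nothing but nested if/else branches.
# Symptom names and rules are invented for illustration only.
def diagnose(symptoms):
    if "fever" in symptoms:
        if "rash" in symptoms:
            return "possible measles"
        if "cough" in symptoms:
            return "possible flu"
        return "unspecified infection"
    if "headache" in symptoms and "aura" in symptoms:
        return "possible migraine"
    return "no diagnosis"

print(diagnose({"fever", "cough"}))    # possible flu
print(diagnose({"headache", "aura"}))  # possible migraine
[/code]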

I'm not convinced that machines are or ever will be up to all the tasks we might describe as thinking, but I'll concede that at least some 'thinking' tasks can be accomplished by machines.

Even so, is there any reason to think a program will ever experience subjective states or become self aware or have an identity crisis or be said to exhibit wisdom that does not directly reflect that of its programmer? I see that as a very different question than asking whether a machine could be programmed in such a way as to fool us into thinking these things are going on. I'm skeptical to the point of finding the idea absurd.

I can't make an airtight argument against the possibility, but I don't believe it is or ever will be possible. What do you think?
#2
RE: Will AI ever = consciousness or sentience?
Is this other website a Star Trek site by chance? Wink

I'm with you on this one. The programmer would have to successfully replicate human feelings. I don't know that that will ever be possible.
#3
RE: Will AI ever = consciousness or sentience?
Did someone say Star Trek?!


"The lord doesn't work in mysterious ways, but in ways that are indistinguishable from his nonexistence."
-- George Yorgo Veenhuyzen quoted by John W. Loftus in The End of Christianity (p. 103).
#4
RE: Will AI ever = consciousness or sentience?
I gotta go with "no."

[Image: Michele-Bachmann-Quotes-Free-Wallpaper-2.jpg]
#5
RE: Will AI ever = consciousness or sentience?
How can you say no? If we create them to be simple machines, they will be simple machines. But if we create them to attain self-awareness and act of their own free will, then yes. It just depends on what you want your machine to do. Right now scientists are trying to create a brain based on our own, so it will probably act like a brain.
Live every day as if already dead, that way you're not disappointed when you are. Big Grin
#6
RE: Will AI ever = consciousness or sentience?
(November 28, 2012 at 1:28 am)whateverist Wrote: Even so, is there any reason to think a program will ever experience subjective states or become self aware or have an identity crisis or be said to exhibit wisdom that does not directly reflect that of its programmer? I see that as a very different question than asking whether a machine could be programmed in such a way as to fool us into thinking these things are going on. I'm skeptical to the point of finding the idea absurd.

I don't think that a program or a machine could ever be self-aware or have a consciousness in the same way that we do. I think that the level of self-referentiality that exists in the human mind is much deeper than the level of self-referentiality that exists in machines and/or computer programs.
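
(As a rough illustration of what I mean by the shallow self-referentiality a program can already have, here's a minimal sketch in Python; the class and its attributes are made up for the example. The program can read and report its own state, but that is about as deep as its 'self-reference' goes.)

[code]
# A toy "self-referential" object: it can inspect and report on its own
# state, but this shallow introspection is all the self-reference it has.
class Agent:
    def __init__(self, name):
        self.name = name
        self.memory = []

    def observe(self, fact):
        self.memory.append(fact)

    def introspect(self):
        # The agent "talks about itself" only by reading its own attributes.
        return f"I am {self.name} and I hold {len(self.memory)} memories."

a = Agent("toy-agent")
a.observe("the sky is blue")
a.observe("I just observed something")
print(a.introspect())  # I am toy-agent and I hold 2 memories.
[/code]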

Interestingly, however, I've read in a few articles that a computer program can be thought to have consciousness - or a mind of its own, so to speak - depending on how you define the word "conscious". There are certain definitions of consciousness, framed in terms of computational properties, under which a computer program's behavior can qualify it as "conscious" or "self-aware." You can see some of those definitions and their applications on page four of the link below:

Conscious Machines and Consciousness Oriented Programming
#7
RE: Will AI ever = consciousness or sentience?
(November 28, 2012 at 1:28 am)whateverist Wrote: On another website I visit it seems most people think our machines will be joining us as sentient beings in their own right any day now.
I think that it is theoretically possible, but very, very difficult to do. Any day now? Definitely not. In the distant future? Only if it is done on purpose. If, theoretically, a machine were created to [perfectly] replicate the human brain, couldn't said machine be called sentient?
#8
RE: Will AI ever = consciousness or sentience?
(November 28, 2012 at 1:28 am)whateverist Wrote: On another website I visit it seems most people think our machines will be joining us as sentient beings in their own right any day now. I'd like to poll our larger group to find out what the prevailing opinion is here.

Can machines even be said to think? What counts as thinking? If the execution of decision trees is thinking then indeed they already do 'think'. A program that can diagnose diseases strikes me as very intelligent, but its intelligence of course reflects that of its programmer.

I'm not convinced that machines are or ever will be up to all the tasks we might describe as thinking, but I'll concede that at least some 'thinking' tasks can be accomplished by machines.

Even so, is there any reason to think a program will ever experience subjective states or become self aware or have an identity crisis or be said to exhibit wisdom that does not directly reflect that of its programmer? I see that as a very different question than asking whether a machine could be programmed in such a way as to fool us into thinking these things are going on. I'm skeptical to the point of finding the idea absurd.

I can't make an airtight argument against the possibility, but I don't believe it is or ever will be possible. What do you think?

Well, I think that with a couple of tiny tweaks to -your- programming the idea will seem a hell of a lot less absurd.

- Do you exhibit any wisdom that does not directly reflect your programmer? More aptly, your various programmers. Is there some part of your thought process that you feel arose all on its own, without instruction or structure from your genetics or your environment? These things could be called your programmers, even if you're not accustomed to considering them as such. Of course, we could consider ourselves as another programmer in that group (but this wouldn't matter...because our machines are already capable of altering their own code).

- Is there any "fooling us into thinking that these things are going on" going on? Is there anything other than the effect which we refer to when we consider something to be intelligent, to be thinking? Are we just tricking each other and ourselves when we "think"?

I think that your skepticism has been subverted by anthropic bias: the assumption that human thought doesn't have a programmer, while machines would have to "trick" us.

Now, for what I think about this. Currently, it would take a very large machine (packed into a very small space) and an extremely robust software suite to accomplish this. We have a head start on the order of 3.5 billion years (and the field testing, QA, and tech departments were downright murderous)...but look how quickly our machines have been able to play catch-up, and yes, even overtake us. We do seem to be trending towards denser, smaller hardware (which allows us to process more data, more quickly), and our ability to create software with things like pattern recognition (one of the keys to the ability to "learn") does seem to be improving. Something I do wonder about AI (as I see it as a definite possibility) is what strange human quirks we would be programming into it (even unintentionally) as its programmers, just as our environment and genetics programmed tics into us.
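
(For instance, here's a minimal sketch of the kind of pattern recognition I mean: a toy perceptron in Python that adjusts its own weights from examples rather than having the rule written out branch by branch. The data, learning rate, and epoch count are invented for illustration.)

[code]
# Toy perceptron: "learns" the AND pattern by adjusting its own weights
# whenever its prediction on an example is wrong.
def train(examples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in examples:
            pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = target - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Training data for the AND pattern, invented for illustration.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(examples)
for (x1, x2), _ in examples:
    print((x1, x2), 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0)
[/code]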
I am the Infantry. I am my country’s strength in war, her deterrent in peace. I am the heart of the fight… wherever, whenever. I carry America’s faith and honor against her enemies. I am the Queen of Battle. I am what my country expects me to be, the best trained Soldier in the world. In the race for victory, I am swift, determined, and courageous, armed with a fierce will to win. Never will I fail my country’s trust. Always I fight on…through the foe, to the objective, to triumph overall. If necessary, I will fight to my death. By my steadfast courage, I have won more than 200 years of freedom. I yield not to weakness, to hunger, to cowardice, to fatigue, to superior odds, For I am mentally tough, physically strong, and morally straight. I forsake not, my country, my mission, my comrades, my sacred duty. I am relentless. I am always there, now and forever. I AM THE INFANTRY! FOLLOW ME!
#9
RE: Will AI ever = consciousness or sentience?
(November 28, 2012 at 10:27 am)Darkstar Wrote:
(November 28, 2012 at 1:28 am)whateverist Wrote: On another website I visit it seems most people think our machines will be joining us as sentient beings in their own right any day now.
I think that it is theoretically possible, but very, very difficult to do. Any day now? Definitely not. In the distant future? Only if it is done on purpose. If, theoretically, a machine were created to [perfectly] replicate the human brain, couldn't said machine be called sentient?

I agree that machines are not going to accidentally become aware. If it is to happen at all, we would have to make a very purposeful effort. Let's imagine we're willing to do that and manage to put the best program possible into a robotic body equipped to physically do what we can do, with the best sensory input devices we can find. While we should be able to fool a few people, is that really the only test? Are we sentient because we are able to present ourselves in such a way as to fool others? If we fool anyone it should be because we seem to have qualities of a certain kind. Surely it can't all be a matter of imitation if in the end there isn't something in particular which is being imitated. What is that something?

If the machine has sentience then it cares about what happens and it has subjective states arising from its perceptual input. How exactly do we program those in? We could probably program it to deliver variations on the Macbethian theme and to utilize reflective language (no small feat) in response to selected criteria, but it is hard for me to see how this gets at the experience of sentience itself and not just its appearance.

The other site hasn't a Star Trek theme but it does harbor many who think free will is illusory. These guys are monists who probably do think their own subjective states are programmed into them and no more or less genuine than that which we could program into a machine. (I think they need to get outside more often.)
#10
RE: Will AI ever = consciousness or sentience?
We are able to construct our own abstractions. At present it isn't possible for a machine to do that. To do that a machine would need to learn, to feel the world around it. I don't know of any technology capable of that at the moment.



Possibly Related Threads...
  • Uploading Conciousness to Computer | Author: AFTT47 | Replies: 26 | Views: 7752 | Last Post: Faith No More, January 29, 2015 at 3:50 pm
  • The burden of proof relating to conciousness, free choice and rationality | Author: marx_2012 | Replies: 107 | Views: 33815 | Last Post: robvalue, December 6, 2014 at 12:40 am
  • Sentience and Love | Author: BrokenQuill92 | Replies: 6 | Views: 1497 | Last Post: bennyboy, March 23, 2014 at 6:50 pm
  • conciousness | Author: justin | Replies: 18 | Views: 3612 | Last Post: ManMachine, February 24, 2013 at 7:28 pm
  • Sentience | Author: Captain Scarlet | Replies: 17 | Views: 5142 | Last Post: Edwardo Piet, December 29, 2010 at 7:51 am


