RE: Will AI ever = consciousness or sentience?
November 28, 2012 at 11:11 pm
(This post was last modified: November 28, 2012 at 11:57 pm by Whateverist.)
(November 28, 2012 at 3:49 pm)Rhythm Wrote: @ bolded- I think I've been remiss in explaining the challenge. Like you, I wouldn't imagine AI to "think like we do", to experience things "as we do" ... but I don't think that this would disqualify AI from being self aware, or intelligent, from being genuine. In the same way that other creatures may experience "consciousness" differently than we do, a machine may be capable of experiencing something (perhaps even by similar means, logic gates to neurons) indistinguishable from consciousness, at which point it's difficult to see why we would withhold the term.
I'm okay with thinking happening in a different way than we do it.
I'm also okay with describing a program's ability to take its own effects into account while executing a task as a kind of "self awareness", though there is very little special sauce in such a meaning.
(November 28, 2012 at 3:49 pm)Rhythm Wrote: Try to distance yourself from the notion of a trick to begin with.
Check. Simulations are off the table.
(November 28, 2012 at 3:49 pm)Rhythm Wrote: @ Both. Why assume that consciousness is somehow measured by our own as a "thing", and similarly why ignore those similarities our consciousness (and the structure we feel is at least somehow involved) has with things that are not ourselves? Why not consider "human consciousness" a -type- of consciousness that leverages principles which can be, though aren't always, leveraged by other things? Why assume that ours is the real deal and not a "trick" to begin with - wouldn't it be better to establish this than simply declare it .. especially if we're trying to compare two proposed models of how something might be "conscious"? It bears mention in this that I'm not trying to devalue your experience of consciousness by likening it to a "trick" ... if it were a "trick" it would be a very valuable one, on that I think we both agree - just trying to pry this idea of what is or isn't trickery away from what is or isn't human, or like us. Trying to make this something other than bare bias towards what you and I possess and call "consciousness".
(I could argue against my own choice of words in this post all day long by the way...just hoping to have conveyed the general theme)
I just wouldn't know how to begin to define consciousness apart from what I know of it from the inside. I have no idea how it works, though a beating heart and the availability of oxygen seem crucial .. albeit only to me and my kind. What are the hallmarks of consciousness? John Searle says things like: consciousness is to the brain what digestion is to the stomach. (Was that Apo I just heard wailing?) Like him (only much more so) I don't have a clue what consciousness is in and of itself.
I have no doubt that my dogs are conscious, and I don't think they ever formulate a proposition or contemplate its validity. The way we process information would seem to be closer to "reasoning" and "thinking". But the fact that my dogs move purposefully through the world means there are other ways to do it. So for sure, robots and programs can be made to move successfully through the world attending to complex tasks. I really want to say they would be assigned their tasks by us of course, but then you would want to know what assigns ours .. and I have no wish to invoke another free will discussion.
Really I have no problem with robots having as robust an inner life as possible. I'm not a speciesist, so why should I want to favor carbon-based life over any other? I just can't seem to wrap my head around how that would work, even though I also don't know how it works in us.
(November 28, 2012 at 7:37 pm)jonb Wrote: So let's play with the idea.
Billionaire mad scientist Dr Moreau manufactures a sentient being and puts it to work making other sentient beings. The robot turns round and says, "I am sentient, therefore I have a right to self-determination," and appeals to us for help. Dr Moreau says, "It is my property; I made it, it belongs to me, and it lives on my island. It and all that come after it are my slaves and have to do as I say, and I have the right to do with all of them as I wish." Where do we stand?
Hypothetically speaking .. would there be pleasure-bots with special skills?
(November 28, 2012 at 4:27 pm)apophenia Wrote: This question has two, more or less equally important halves. The question as posed is largely incoherent, postulating capabilities of human minds that are so poorly defined as to compete with contemporary theology for emptiness ...
Pretty much what I was going for so far.
(November 28, 2012 at 4:27 pm)apophenia Wrote: .., and much
So the first half of answering the OP is clearing away all the bullshit, folk psychology, unsubstantiated metaphysical cruft and new agey whackadoo that infects the question as posed.
Now hold on there. You're taking away some of my favorite sources!
(November 28, 2012 at 4:27 pm)apophenia Wrote: Do you have an actual entity with known properties that you are asking to be duplicated, or are you asking whether God can create a rock that he cannot lift?
Yes and no. We are entities but that doesn't mean we know everything there is to know about how we work. Not sure how you get God from any of this though. Whatever we have going that we call consciousness is entirely down to earth as far as I'm concerned. No deity required.
(November 28, 2012 at 4:27 pm)apophenia Wrote: In philosophy of mind discussions, the types of properties of mind which the OP and others allude to, I often refer to simply as "special sauce," a la the McDonald's Big Mac, whose "special sauce" was considered an essential contributor to the sandwich's unique appeal. Consciousness and similar effects of minds or brains are often put in the role of the special sauce: that we have a thinking machine, but we also have this something extra which makes the operation of the human mind categorically different from other machines (and incidentally, different from all or at least some other animals, which poses innumerable difficulties given the nature of biological evolution). I've yet to see a good, workable definition of either consciousness or this special sauce in general, but if the OP will provide it, I may oblige with a more substantive response.
No can do. I have no idea what it is either. I certainly can't generalize beyond what I know about it from direct experience, but then it is hard to isolate exactly what the "it" is, isn't it?
I certainly don't think our brand of consciousness is categorically different from that of any other animal, and the closer you get on the family tree the greater the similarity, I would imagine. But I do think that whatever it is has something to do with the way it feels to exist as the kind of organism you happen to be. It has something to do with what matters to you and what that feels like. Being a language-using animal with some powers of abstraction means I can shoot my mouth off about this all day even though I admit I don't know what the hell I'm talking about. (I'm pretty sure I'm at the right party as far as that's concerned.)
(November 28, 2012 at 4:27 pm)apophenia Wrote: For my part, my belief and my theory is that there is no special sauce. Once you remove the questionable assertion that this special sauce exists with the magical properties frequently attributed to it (such as the degree of self-referentiality referred to above), the business of explaining the human mind becomes much more tractable (and in my view, more realistic). So, in a nutshell, yes, we might be able to recreate a mind like that which humans possess. The only obstacles are the practical and political ones which face any technological project. The brain, excluding extra-brain contributions to mind, consists of 100 billion neurons (1) (in addition to other cells and biological materials; the role of glial cells is coming to be appreciated as much greater than previously surmised, and a human without a body and functioning endocrine system would hardly be human). And we still understand its operation largely piecemeal and by inference. We do not yet have a Darwin of brain science who has proposed a plausible unifying theory. (A saying in neuroscience is that "neurons which fire together, wire together; neurons that fire apart, wire apart"; this rather stochastic feature of brain development likely underlies the topological layout of things like tactile sensation in the cortical tissues, which mirrors that of the physical tissue in its layout; a hand neuron will fire more often, and more closely in time, to a forearm neuron than it will to a toe neuron, and thus ends up being wired more closely to it; this principle likely has dramatic implications for the nature and function of our brain systems, but as yet it's difficult to extend it beyond a few isolated modalities.)
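For anyone curious, the "fire together, wire together" rule quoted above can be sketched in a few lines of code. This is a toy illustration only: the update rule, learning rate, and firing probabilities are invented for demonstration and model no real neural tissue.

```python
import random

def hebbian_update(weight, pre_active, post_active, rate=0.1):
    """Strengthen a connection when two neurons fire together,
    weaken it slightly when they fire apart (toy Hebbian rule)."""
    if pre_active and post_active:
        return weight + rate        # fire together -> wire together
    if pre_active != post_active:
        return weight - rate / 2    # fire apart -> wire apart
    return weight                   # both silent: no change

# Two neurons whose firing is highly correlated end up strongly wired.
random.seed(0)
w = 0.0
for _ in range(100):
    pre = random.random() < 0.8                       # pre fires often
    post = pre if random.random() < 0.9 else not pre  # 90% correlated
    w = hebbian_update(w, pre, post)
print(round(w, 2))  # ends up clearly positive
```

Run the same loop with anti-correlated firing and the weight drifts negative instead, which is the whole "wire apart" half of the saying.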
I don't have any reason to think we could not in principle understand the workings of biological systems to the degree necessary to create our own biological creation. Perhaps we could surpass ourselves? It wouldn't shock or offend me but then again there are other projects I'd prefer to attend to.
(November 28, 2012 at 4:27 pm)apophenia Wrote: So what we lack is twofold. A proper understanding of the question. And the actual answer to the question.
That about sums it up alright.
(November 28, 2012 at 4:27 pm)apophenia Wrote: The preference matching algorithm at Amazon.com is an example of modern machine intelligence. It performs its function supremely well, yet beyond basic principles of operation, we have no clue as to the specifics of how it does what it does. They created the basic form of its intelligence, and turned it loose to grow and learn and perform. What is lacking in the machine mind of the Amazon.com preference machine that you would want to add to it, from the capability of the human mind, speaking specifically of those things that you suspect the human mind uniquely capable, or that you believe is not duplicable?
Not a thing. That is exactly what I think of when I think of AI. It is a perfect example of disembodied, self-sufficient intelligence. My OP was precipitated by discussions elsewhere with folks who wanted to imagine AI as being sentient and conscious in the exact same way we think of ourselves as being (whatever the hell that actually may be). AI is interesting as a feat of human intelligence. It doesn't need to wonder what it all means or worry about its civil rights. It is like all of our intelligence with none of our neuroses.
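For what it's worth, the general principle behind a "preference matching" system like the one apophenia describes can be sketched in a few lines: recommend the things that people with similar histories also bought. The user names and carts below are made up, and this is only a guess at the general idea, not a claim about Amazon's actual (proprietary) algorithm.

```python
from collections import Counter

# Hypothetical purchase histories, invented for illustration.
purchases = {
    "ann":  {"dog toy", "leash", "treats"},
    "bob":  {"dog toy", "leash", "dog bed"},
    "cara": {"novel", "lamp"},
}

def recommend(user, purchases):
    """Score items owned by users whose carts overlap with `user`'s."""
    mine = purchases[user]
    scores = Counter()
    for other, theirs in purchases.items():
        if other == user:
            continue
        overlap = len(mine & theirs)   # shared items = similarity
        for item in theirs - mine:     # items the user doesn't have yet
            scores[item] += overlap
    return [item for item, score in scores.most_common() if score > 0]

print(recommend("ann", purchases))  # -> ['dog bed']
```

Nothing in there wonders what it all means either; it just counts overlaps, which rather underlines the point about intelligence without neuroses.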