RE: Will AI ever = consciousness or sentience?
December 1, 2012 at 12:00 pm
(This post was last modified: December 1, 2012 at 12:46 pm by Whateverist.)
(December 1, 2012 at 5:46 am)DoubtVsFaith Wrote: @ the OP.
Yes.
Why not? If life came from "non-life" (or life so lifeless that it's virtually and practically "non-life"), then why can't conscious, sentient biological life develop into conscious, sentient non-biological mechanical life? (And I don't mean mechanical in a "bad" way; mechanisms are more complicated than that. After all, suppose a paradoxical mechanism developed that was both free and orderly?)
Interesting. I certainly believe we have seen, on this planet, a progression from inorganic to preorganic to organic and, somewhere along the way, sentience. Hardware and software start off as inorganic, so what is preventing them from proceeding in the same way? (I'm not sure whether software starts off as inorganic, virtual, or both.) I've got nothing in principle to offer against it, so at this level I have to concede the possibility. The transformation, if it comes, would be no more remarkable than the progression of life. You may have just budged me. Let me sleep on it.
(November 29, 2012 at 1:55 pm)apophenia Wrote: First of all, let me say that I'm not suggesting (as has been done elsewhere if not here), that if you have a sufficiently complex system of computation, perhaps performing a specific set of functions that allows it to map inputs to outputs (behaviors) in certain ways that you will have what we call consciousness or sentience. It requires a complex computational machine, yes, but a complex computational machine with the right "program" (much as I hate to use that word in this context, as it misleads). This is a first fundamental distinction which needs to be made. Again, with caveats regarding the chosen metaphor, there are those who believe that consciousness is (largely) the result of the specific "software" that our brains run, and that the hardware is effectively irrelevant, and therefore it can indeed be duplicated on another, non-biological platform. Then there are those, like Searle, who argue that the specific nature of our biological hardware is essential to the genesis of consciousness (though in what way he does not say). There is a further camp which argues that not only is it the specific hardware that is essential, but that consciousness depends on properties of that hardware and their functions which science does not yet understand or appreciate adequately (quantum consciousness, microtubules and Crick's resonance hypothesis being examples).
I believe that consciousness results from a certain configuration of computational processes, but that there is nothing unique to the computational abilities of its putative host, the human brain, which make it uniquely capable of carrying out these processes. Nonetheless, consciousness, in my view, is a result of specific processes and their supporting processes, and that without them, or something functionally equivalent, you will not have a machine possessed of consciousness and sentience, just an intelligent machine. Consciousness and sentience are special in that they are distinct, identifiable kinds, but they are not special in that they require a specific hardware host, or even an exactly equivalent program.
We might actually agree about some things. (I won't tell if you don't.) The second bolding (all mine, of course) comes very close to what I would say too. I think the special sauce comes from the supporting systems. I think it is something intrinsic to our organic heritage that gives us an affinity for certain taste and scent sensations, movements, music, and visual arts. Associations between understandings play a part, but they must be grounded in the 'flavor' the sensation has for us. That flavor is more than a list of the component chemicals; I don't need to check the label to know how to respond. I respond by grokking the sensation itself. Then associations and thoughts come in to thicken the soup of experience. I suspect our smart machines will always be puzzled by this.
(November 29, 2012 at 1:55 pm)apophenia Wrote: One observation I would make is that consciousness, popularly conceived, and as I experience it myself, does not exist in the physical world.
That's okay. I'm kind of bi- myself when it comes to dualism/monism. It's fine to call them like you see them and let the chips fall where they may.
[I'm not done with this post of yours but am still chewing.]
(November 30, 2012 at 8:38 am)DoktorZ Wrote: This boundary between intelligence and consciousness, as pointed out above by others, is a definitional problem.
Yes, but it isn't an arbitrary boundary. Intelligence is the possession of the relevant information together with the capacity to apply it appropriately to achieve desired outcomes. As "appropriately" approaches "optimally" we move from smart to smarter. But consciousness includes loads of special sauce. In addition to having the intelligence to perform smartly, to be conscious/sentient we should also care about those outcomes. So long as the programmer determines the preferred outcome ... [Free will alert, free will alert ... abort, abort.] That was close.
I appreciate hearing your thoughts on this and have more reaction to this post. But in the interest of having a life that is more than virtual, I need to get outside.