RE: Will AI ever = consciousness or sentience?
November 29, 2012 at 1:55 pm
(This post was last modified: November 29, 2012 at 2:10 pm by Angrboda.)
(November 29, 2012 at 7:07 am)jonb Wrote:
(November 29, 2012 at 4:44 am)Tiberius Wrote: We have consciousness only in the form that we are aware of our actions after being forced into performing them.
I heard on the radio a few months ago about a paper that seemed to support this, but I have been unable to find the links.
As I remember it, the paper showed that we only become conscious of our actions after the decision to make the action is already in place, consciousness being only a sort of way of justifying ourselves.
I believe the work you are referring to is that of Benjamin Libet. It's been discussed at length in the literature, as it's not at all recent, but you might want to pay particular attention to Daniel Dennett's responses on the subject.
Also of relevance here are the adaptive unconscious and experiments on split-brain subjects (pioneered by Sperry and Gazzaniga). (Jonathan Haidt's "wag the dog" idea echoes something similar if I'm recalling it correctly; I think I still have that paper here somewhere.)
First of all, let me say that I'm not suggesting (as has been done elsewhere, if not here) that if you have a sufficiently complex system of computation, perhaps performing a specific set of functions that allows it to map inputs to outputs (behaviors) in certain ways, then you will have what we call consciousness or sentience. It requires a complex computational machine, yes, but a complex computational machine with the right "program" (much as I hate to use that word in this context, as it misleads). This is the first fundamental distinction which needs to be made. Again, with caveats regarding the chosen metaphor, there are those who believe that consciousness is (largely) the result of the specific "software" that our brains run, that the hardware is effectively irrelevant, and that consciousness can therefore be duplicated on another, non-biological platform. Then there are those, like Searle, who argue that the specific nature of our biological hardware is essential to the genesis of consciousness (though in what way he does not say). There is a further camp which argues not only that the specific hardware is essential, but that consciousness depends on properties of that hardware and their functions which science does not yet understand or appreciate adequately (quantum consciousness, microtubules, and Crick's resonance hypothesis being examples).
I believe that consciousness results from a certain configuration of computational processes, but that there is nothing unique to the computational abilities of its putative host, the human brain, that makes it uniquely capable of carrying out these processes. Nonetheless, consciousness, in my view, is the result of specific processes and their supporting processes, and without them, or something functionally equivalent, you will not have a machine possessed of consciousness and sentience, just an intelligent machine. Consciousness and sentience are special in that they are distinct, identifiable kinds, but they are not special in that they require a specific hardware host, or even an exactly equivalent program.
Wikipedia Wrote: Unreliability of introspection
"[Introspection] does not provide a direct pipeline to nonconscious mental processes. Instead, it is best thought of as a process whereby people use the contents of consciousness to construct a personal narrative that may or may not correspond to their nonconscious states."
— Timothy D. Wilson and Elizabeth W. Dunn (2004)
A 1977 paper by psychologists Richard Nisbett and Timothy D. Wilson challenged the directness and reliability of introspection, thereby becoming one of the most cited papers in the science of consciousness. Nisbett and Wilson reported on experiments in which subjects verbally explained why they had a particular preference, or how they arrived at a particular idea. On the basis of these studies and existing attribution research, they concluded that reports on mental processes are confabulated. They wrote that subjects had "little or no introspective access to higher order cognitive processes". They distinguished between mental contents (such as feelings) and mental processes, arguing that while introspection gives us access to contents, processes remain hidden.
One observation I would make is that consciousness, popularly conceived, and as I experience it myself, does not exist in the physical world.
There are two properties of consciousness which I suspect we all share (though please do speak up if you don't). Using my consciousness as a model, I experience myself (my consciousness) as a thing existing somewhere behind my eyes, located here, in the present moment. This suggests that consciousness appears to be, a) spatially unified: it exists in one central, undistributed place; and, b) temporally unified: it is not spread out in time, but exists at an infinitesimally small "here" in time.
Neither of these can be true of any brain process we currently understand. Neurons and neuron assemblies take time to fire and to coordinate their activity, so if consciousness is "occurring," it is occurring spread out in time. Consciousness's perception of itself as existing in the here and now either is an illusion, depends on some property of brains not currently known or understood, or isn't happening in the physical brain at all. Personally, I'm skeptical of the latter two hypotheses, especially as I have a model of consciousness which accounts for all of consciousness's properties without needing to appeal to such unknowns. (I think, anyway; a lot of dots need connecting, and it would need to undergo rigorous criticism before I was truly satisfied with it.) The other property, that consciousness is unified and localized in one spot, whether behind the eyes or somewhere else, falls to the same criticism. (See Daniel Dennett's Consciousness Explained for more on the subject.)
In a nutshell, if consciousness is a physical process of the brain, based on known physical processes, then it cannot have the properties it thinks it has.