(September 19, 2016 at 6:35 pm)bennyboy Wrote:
(September 19, 2016 at 6:23 pm)Jörmungandr Wrote: I've never been a proponent of emergent properties. It has always seemed to me to be merely hand-waving aside the hard problem, as in: "enough complexity," poof, it magically becomes consciousness. The tools we have for studying the operation of the brain are currently rather crude and primitive. My hope is that with better tools we will someday be able to see consciousness as just another set of brain processes, like memory. (Not to say we understand everything about memory, but through animal models we have been able to probe its mysteries much further.) I don't find myself inclined to agree that the phenomena of consciousness will not yield to reduction. The processes in the brain are too large in scale, and too easily described in terms of classical mechanics, for the cogs of the machine not to be eventually elucidated.
Is there a non-arbitrary "critical mass" at which any mechanism or process represents the most fundamental thing that can still be called "consciousness"? It seems to me the end of the road is likely to be in the simplest states that could be said to represent information: the emission and absorption of photons, for example, or changes in the energetic state of electron orbits.
I don't believe there's a critical mass, per se. My belief is that consciousness is "built on top of" processes like perception and language, that it is a very specific way of tying these systems together, but I suspect that consciousness arose very early in the evolution of brains. I would suspect, for example, that fish have a rudimentary form of consciousness.

To my way of thinking, consciousness originates in problems of controlling the body and in predator-prey behaviors. The problem of control is one of integrating the various behaviors the organism's body can perform under an arbitrating mechanism that acts as the master selector of behavior. I suspect, in addition, that a sort of free will evolved very early. When a predator fish attempts to predict which way its prey is going to swim, it can't automatically assume one direction or the other; it must represent a range of possible future positions of the prey, granting the prey a degree of freedom, and then satisfice, settling on whichever of its own behaviors is most likely to intersect the prey's path. (Or, in actuality, a process of continually guiding itself toward the path that most closely matches the prey's next move. Think of a baseball player guiding a fly ball into his mitt: apparently this works by keeping the head at a certain angle and approaching or receding based upon the angle the head must make to center the ball. In the fish's predator-prey relationship, it would be somewhat similar. A vector to intercept the prey would be calculated based on where in the predator's sight the prey is located, and this left-right vector would live in a space in which the prey is seen as free to change its angle of incidence left or right. A rough sketch of this kind of interception heuristic follows below.)
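To make that interception idea a bit more concrete, here is a minimal Python sketch of a gaze-heuristic style of pursuit of the sort the baseball example gestures at. Everything in it (the dict fields, the gain, the step function) is invented for illustration; it is not drawn from any actual model of fish or outfielder behavior.

```python
import math

def bearing(px, py, tx, ty):
    """Angle from the pursuer at (px, py) to the target at (tx, ty), in radians."""
    return math.atan2(ty - py, tx - px)

def pursue_step(pursuer, target, speed=1.2, gain=2.0, dt=0.1):
    """One step of a gaze-heuristic style pursuit (illustrative only).

    The pursuer never predicts the prey's full trajectory; it simply turns so as
    to cancel any drift of the prey across its field of view (i.e., it tries to
    hold the bearing angle roughly constant), which produces an intercept-like
    course.
    """
    old = bearing(pursuer["x"], pursuer["y"], target["x"], target["y"])

    # The prey moves first; it retains the freedom to change direction.
    target["x"] += target["vx"] * dt
    target["y"] += target["vy"] * dt

    new = bearing(pursuer["x"], pursuer["y"], target["x"], target["y"])
    drift = math.atan2(math.sin(new - old), math.cos(new - old))  # wrapped bearing change

    # Steer to cancel the drift, then move forward at constant speed.
    heading = new + gain * drift
    pursuer["x"] += speed * math.cos(heading) * dt
    pursuer["y"] += speed * math.sin(heading) * dt

# Toy run: the prey swims straight; the pursuer closes on it step by step.
prey = {"x": 5.0, "y": 0.0, "vx": 0.0, "vy": 1.0}
hunter = {"x": 0.0, "y": 0.0}
for _ in range(200):
    pursue_step(hunter, prey)
```

The point is only that the pursuer never computes the prey's future trajectory outright; it keeps nulling the prey's drift across its field of view, and interception falls out of that.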
Anyway, I don't view consciousness as a thing which requires a certain amount to "turn on." Rather, a companion set of neural circuits evolved on top of the neural circuits controlling body behavior and perception, in order to tie our behavioral responses to our perceptions. This likely occurred very early in the evolution of highly mobile animals, and it is largely an all-or-nothing affair. In a fish, it would be used to make behavioral decisions based on perception of the environment. My suspicion is that an animal like a black fly doesn't have this extra layer of decision-making apparatus, and instead has more or less pre-programmed responses to light, shadow, smell, and sound in its environment; that a fly runs an algorithm, whereas a fish has true consciousness. But I could be wrong. Perhaps a fly has consciousness, too.
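One rough way to picture that fly/fish distinction is the difference between a fixed stimulus-response table and an arbiter that weighs candidate behaviors against the current percepts. The toy Python below is purely illustrative; the names, weights, and scoring scheme are made up, not taken from any neuroscience.

```python
# A toy contrast, not an actual model of insect or fish neurobiology:
# a hard-wired reflex table versus an arbitrating "master selector."

FLY_REFLEXES = {
    "shadow_overhead": "take_off",
    "light_gradient": "fly_toward_light",
    "food_odor": "land_and_probe",
}

def fly_behavior(stimulus):
    """The 'black fly': each stimulus maps directly to a pre-programmed response."""
    return FLY_REFLEXES.get(stimulus, "do_nothing")

def fish_behavior(percepts, candidates):
    """The 'fish': an arbiter scores every candidate behavior against the
    current perceptual state and hands control to the highest-scoring one.

    `percepts` is a dict of perceptual features; each candidate is a
    (name, scoring_function) pair. All names here are invented.
    """
    return max(candidates, key=lambda c: c[1](percepts))[0]

# Example arbitration: fleeing dominates when a predator looms, feeding otherwise.
candidates = [
    ("flee",   lambda p: 10.0 * p.get("predator_near", 0.0)),
    ("feed",   lambda p: 3.0 * p.get("food_near", 0.0)),
    ("school", lambda p: 1.0 * p.get("conspecifics_near", 0.0)),
]

print(fly_behavior("shadow_overhead"))                                      # take_off
print(fish_behavior({"predator_near": 0.9, "food_near": 0.5}, candidates))  # flee
```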