(January 20, 2022 at 7:46 pm)emjay Wrote:(January 20, 2022 at 4:19 pm)polymath257 Wrote: The computer analogy is bad mainly because a computer doesn't interact with the environment to meet conditions for survival.
Suppose instead that we have a robot that must supply itself with fuel found in the environment. It has to deal with challenges from that environment to do so and the computer
that is processing the data has to react to a wide variety of different situations.
Yes, at a certain level of complexity, I would say that robot is conscious. It has to get information from the environment, use that information to 'make decisions' and react appropriately.
When it comes across a piece of information that is relevant to its goal (getting fuel), that piece of information is *meaningful* to that robot.
This is in the same way that detection of a chemical is meaningful to a bacterium, which responds by moving closer to or farther away from it.
It seems to me that the line is crossed into phenomenological consciousness when an internal state is compared to incoming information in a continuous way.
Well, one of the reasons biological entities have 'goals' is that they are programmed for survival. That means they have to evaluate information from the environment and determine how it relates to survival (and reproduction). That is where meaning ultimately comes from initially, I think.
As for the 'what it is like'; no single molecule has a temperature. The concept of temperature only makes sense in systems of molecules. In the same way, only certain types of neural networks would have the feedback necessary to maintain an analysis of an internal state. The evaluation of the internal state is 'what it is like' to be in that state.
So, for example, it is pretty clear that a bacterium doesn't maintain a subsystem modeling its internal state and using that model to determine what it does next. But, for example, a dog clearly does. So the dog is conscious and the bacterium is not.
Again, it seems like an information processing issue and not something beyond what is physical.
And I respect your contributions here. It is always interesting to see alternative viewpoints in these matters.
Ultimately, it boils down to what I would consider to be an 'explanation' of consciousness. I would be satisfied by a translation process between neural correlates and conscious states that is predictive and reasonably universal. I'm OK with some flexibility around the edges (just like with temperature), and I admit there may be borderline cases (is a plant conscious? how about an earthworm?).
I guess if we have that, I see no need to postulate a dualistic metaphysics since all that we can observe is explained.
This seems like a natural stopping point for the conversation, if that's okay with you? In these long conversations I'm just not very good at either disengaging or pacing myself, so just in the space of writing in this thread over the last few days, I've managed to get myself totally out of sync and barely sleeping. I really need to disengage, relax, and chill... just go back to reading rather than partaking... til the next time.
But I think we're at the point where we understand each other now, and as I said, I appreciate your viewpoint as expressed, especially in this post, but also throughout our conversation. It's definitely food for thought, and I really mean that. So yeah, thanks for the chat and the insight.
And thank you. It was interesting.