(January 17, 2022 at 8:32 pm)vulcanlogician Wrote:
(January 17, 2022 at 7:44 pm)HappySkeptic Wrote: I just don't see this as being a "hard problem". Qualia are what the conscious mind experiences - full stop. The conscious mind must have some experience - it might as well be the qualia we know. It could be different qualia if we had different brains or different senses, but it has to be something.
Now, identity and a sense of self are a bit of a mystery, but I feel that is an illusion created by our mind. If we were part of the Borg collective, we wouldn't experience a singular identity. Our separateness and our memories create the sense of self.
It's not that easy, though. Just examine two competing theories of mind: functionalism and biological naturalism. Both theories are materialistic (i.e., physicalist). Both posit that conscious experiences are causally dependent on brain functioning. They agree there. But otherwise, they arrive at two different conclusions concerning what consciousness is.
One theory (functionalism) states that conscious states arise from the information feedback that happens during brain function. According to this theory, a computer could have conscious experiences if it somehow transmitted the same information your brain does when, say, eating a hamburger.
The biological naturalist disagrees. The biological naturalist says you can transmit that same information in a computer system and the computer still will not experience eating a burger. To the biological naturalist, consciousness is a product of the physical features of neurons. If you wanted to create an artificial consciousness, you'd need to create a physical object that does the same physical thing a neuron does when it fires. (A whole bunch of them, actually.) Then you'd need to get them to fire in one of the myriad patterns neurons fire in when a hamburger is being eaten.
Which of us can say which of these theories is correct? Each has its merits. Each has its problems.
So while, yes, our conscious experience has to be something ... exactly what that something is eludes us. Hence, questions about consciousness are worth exploring. And the problem is indeed hard.
***
As for identity, I tend to agree with you. It's a key assumption of many that "self" is an actual unified thing to begin with. I think Locke put together a fine explanation with continuity. Hume's thoughts are good too (self is illusory). But, at the end of the day, we don't want to dispense with the notion of THIS person or THAT person. I know I don't. And if we want to make such distinctions, we ought to be able to explain ourselves.
I'd guess that the differences between those two physicalist theories would have to be resolved by observation.
Do the particular neural events correlated with conscious states depend on extensive feedback mechanisms or not? If we find 'isomorphic' systems with other substrates, do they show the kinds of behavior we want to classify as 'conscious'? Sort of like deciding whether we want to classify Pluto as a planet.
No, we cannot resolve this question at this time, because we don't yet have those neural correlates. Having them would certainly help resolve it, though.
One of the problems in studying consciousness, I think, is that we can't seem to agree on classification. Is a bacterium conscious? How about an earthworm? How about a dog? How about a plant? A fungus? An atom?
Maybe we need to expand our vocabulary to encompass *and* distinguish all of these. Without an agreed-upon classification, no progress can be made in determining when each of the different processes is involved and how they relate.