(January 17, 2022 at 8:32 pm)vulcanlogician Wrote:(January 17, 2022 at 7:44 pm)HappySkeptic Wrote: I just don't see this as being a "hard problem". Qualia are what the conscious mind experiences - full stop. The conscious mind must have some experience - it might as well be the qualia we know. It could be different qualia if we had different brains or different senses, but it has to be something.
Now, identity and a sense of self is a bit of a mystery, but I feel that is an illusion created by our mind. If we were part of the Borg collective, we wouldn't experience a singular identity. Our separateness and our memories create the sense of self.
It's not that easy, though. Just examine two competing theories of mind: functionalism and biological naturalism. Both theories are materialistic (i.e., physicalist). Both posit that conscious experiences are causally dependent on brain functioning. They agree there. But otherwise, they arrive at two different conclusions concerning what consciousness is.
One theory (functionalism) states that conscious states arise from the information processing the brain performs. According to this theory, a computer could have conscious experiences if it somehow processed the same information your brain does when, say, eating a hamburger.
The biological naturalist disagrees. The biological naturalist says you can process that same information in a computer system and the computer still will not experience eating a burger. To the biological naturalist, consciousness is a product of the physical features of neurons. If you wanted to create an artificial consciousness, you'd need to create a physical object that does the same physical thing a neuron does when it fires. (A whole bunch of them, actually.) Then you'd need to get them to fire in one of the myriad patterns neurons produce when a hamburger is being eaten.
Which of us can say which of these theories is correct? Each has its merits. Each has its problems.
So while, yes, our conscious experience has to be something ... exactly what that something is eludes us. Hence, questions about consciousness are worth exploring. And the problem is indeed hard.
***
As for identity, I tend to agree with you. Many people simply assume that the "self" is a unified thing to begin with. I think Locke put together a fine explanation based on continuity. Hume's thoughts are good too (the self is illusory). But, at the end of the day, we don't want to dispense with the notion of THIS person or THAT person. I know I don't. And if we want to make such distinctions, we ought to be able to explain ourselves.
Functionalism is the correct view, but it may also be true that one can't create human-like consciousness without hardware that mimics some of the functionality of a real neural network. That isn't because I'm waffling on functionalism - it's because the function may be highly dependent on neural structure.
I am a fan of the work of Gerald Edelman, who believed that biological intelligence self-organizes from the structures of neuronal groups. If this is true, the nature of our intelligent conscious experience may be difficult to replicate without building a similarly self-organizing AI.