(January 17, 2022 at 8:32 pm)vulcanlogician Wrote:(January 17, 2022 at 7:44 pm)HappySkeptic Wrote: I just don't see this as being a "hard problem". Qualia are what the conscious mind experiences - full stop. The conscious mind must have some experience - it might as well be the qualia we know. It could be different qualia if we had different brains or different senses, but it has to be something.
Now, identity and a sense of self are a bit of a mystery, but I feel that is an illusion created by our mind. If we were part of the Borg collective, we wouldn't experience a singular identity. Our separateness and our memories create the sense of self.
It's not that easy, though. Just examine two competing theories of mind: functionalism and biological naturalism. Both theories are materialistic (i.e., physicalist). Both posit that conscious experiences are causally dependent on brain functioning. They agree on that much, but otherwise they arrive at two different conclusions about what consciousness is.
One theory (functionalism) states that conscious states arise from the information feedback that happens during brain function. According to this theory, a computer could have conscious experiences if it somehow transmitted the same information your brain does when, say, eating a hamburger.
The biological naturalist disagrees. The biological naturalist says you can transmit that information in a computer system and the computer will not experience eating a burger. To the biological naturalist, consciousness is a product of the physical features of neurons. If you wanted to create an artificial consciousness, you'd need to create a physical object that does the same physical thing a neuron does when it fires. (A whole bunch of them, actually.) Then you'd need to get them to fire in one of the myriad ways neurons can fire when a hamburger is being eaten.
Which of us can say which of these theories is correct? Each has its merits. Each has its problems.
So while, yes, our conscious experience has to be something ... exactly what that something is eludes us. Hence, questions about consciousness are worth exploring. And the problem is indeed hard.
Maybe I'm misunderstanding something here, but neither of these two theories attempts to explain how phenomenal consciousness comes about, only how it could be replicated in non-human entities. The hard problem is not about the latter concern, but the former.