RE: My thoughts on the Hard problem of consciousness
February 13, 2017 at 8:56 am
(This post was last modified: February 13, 2017 at 8:59 am by I_am_not_mafia.)
(February 13, 2017 at 7:26 am)emjay Wrote: I've always wondered, is your perspective just a pragmatic thing because you're a scientist?... in which case I get it and share it for the most part in that the continued and perfect (imo) correlation between neuroscience and consciousness indicates they are one and the same... but nonetheless do you never, even just fleetingly or irrationally, wonder about the phenomenal nature of consciousness?
I think I have a pragmatic view of it because I actually think in terms of how to build an artificial intelligence. Everything has a use. Take emotions for example. People assume that they are a burden, but we need them in order to function. We evolved them for a reason and an AI agent will need them too.
If you want to build an agent that can co-operate with other agents, then it will need some way of representing itself in its internal model in relation to others. In other words, a personal identity. Empathy allows animals to work together better as a pack and to learn from each other. That means processing visual stimuli and simulating what would happen if the same thing were applied to oneself. And then there is the other function of self-awareness: the check and balance of cognition running alongside emotion. That little voice that knows you are angry, for example, and asks whether you really want to be doing this. All this adds utility to the agent and allows it to adapt better.
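To make that concrete, here is a toy sketch in Python. It is not code from any agent of mine, and every name in it (Agent, the anger variable, the regret check) is purely illustrative; the only point is that an emotion, a self-representation and the "little voice" are ordinary state and ordinary functions with a behavioural payoff.

# Illustrative only: an agent whose internal model includes itself,
# whose emotion is plain state, and whose "self-awareness" is a
# check that can veto what the emotion alone would do.

class Agent:
    def __init__(self, name):
        self.name = name
        self.anger = 0.0                  # emotion as a scalar drive
        self.model = {name: "self"}       # the internal model contains the agent itself

    def perceive(self, other, hostile):
        # Empathy-as-simulation: note the stimulus and imagine it applied to oneself.
        self.model[other] = "hostile" if hostile else "friendly"
        if hostile:
            self.anger += 0.5

    def act(self):
        impulse = "retaliate" if self.anger > 0.6 else "cooperate"
        # The "little voice": cognition running alongside emotion,
        # asking whether the agent really wants to be doing this.
        if impulse == "retaliate" and self.would_regret():
            impulse = "withdraw"
        return impulse

    def would_regret(self):
        # Crude check: retaliating in front of packmates costs more than it gains.
        return any(v == "friendly" for k, v in self.model.items() if k != self.name)

None of this is meant as a serious design; it just shows that identity, empathy and the check on emotion can all be cashed out as utility for the agent.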
Asking yourself what it means to sense red is a waste of time. You'll never be able to answer it, and if you could, you wouldn't be able to demonstrate it, and it probably wouldn't tell you anything interesting anyway. But that's what people are doing, and that's why they think it's a hard problem. As a designer of AI, I know how my agents sense the colour red and what effect it has on them.
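Again purely as an illustration, and nothing like my actual agents (the functions and numbers here are made up): for an artificial agent, "sensing red" is just a sensor value being mapped onto internal state, which then biases behaviour. Once you can trace that chain, there is nothing mysterious left over.

# Illustrative sketch: "seeing red" as a causal chain from sensor to behaviour.

def sense(rgb):
    r, g, b = rgb
    # "Redness" here is just how strongly the red channel dominates.
    return max(0.0, r - (g + b) / 2) / 255.0

def effect_on_agent(redness, arousal):
    # The effect the stimulus has on the agent: it shifts an internal
    # arousal variable, which in turn changes what the agent does next.
    return min(1.0, arousal + 0.3 * redness)

arousal = 0.2
arousal = effect_on_agent(sense((220, 40, 30)), arousal)
print(arousal)   # stimulus -> internal state -> behaviour, and that's the whole story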
(February 13, 2017 at 8:25 am)bennyboy Wrote:
(February 13, 2017 at 5:51 am)Mathilda Wrote: I've never seen an adequate explanation for why consciousness should be considered a hard problem.
It's only hard for us because we're using our own brain to understand itself.
It's hard because no philosophical view easily reconciles the conflation of subject and object into a single framework. Idealism misses many of the simple truths that science arrives at; dualism has the problem of bridging; physicalism can't really explain subjective experience at all.
The equation of brain function with qualia, for example, is not so easy as it might seem at first.
This suggests to me, then, that the fault lies with the use of philosophy. You're using the wrong tool for the job.
You have to find the right paradigm to suit the problem.