(March 23, 2018 at 4:10 am)bennyboy Wrote: First of all, your definition of consciousness is going to be hotly challenged by anyone who's not already a material monist. With regard to a traditional dualist or other world view, you have to either:
1) redefine consciousness in physical terms rather than experiential ones: "Consciousness is the ability to process information from the environment and react to it."
2) make assumptions that beg the question-- for example, that the world is basically as it seems to be, but without explaining why it is so.
Both of these have serious philosophical problems. In the former case, I would say, "That's fine, but I'm not interested in robots. I'm interested in the experience of what it's like to be, and I don't think science has even the beginnings of a coherent theory of why I can do that." In the latter, I would argue that science itself very much undermines the assumptions upon which it largely rests. The world does not seem to us to be a collection of undefined wave functions, but that's what it is; seeming is overrated.
Yeah, and if that approach were likely to work, it would have done so by now; there have been thousands of years to figure out what consciousness is. You specifically use the example of robots when I mentioned AI, psychology, and neuroscience. By doing this you are implying that we don't need more data to figure out what consciousness is, or to test hypotheses with experimental models, when experience shows us that the more we learn, the better we can reason about a problem and eventually arrive at a conclusion.
What you are talking about is exactly the kind of 'philosophy' my thread is about. Philosophers are like economists in a way: they introduce the subject by saying how applicable it is, but when it comes down to it they mainly seem interested in building their own mental models that don't apply to the real world. You are essentially proving the point of my OP. This is the kind of philosophy that needs to die out.