(March 25, 2018 at 10:16 am)Hammy Wrote:
(March 25, 2018 at 9:30 am)polymath257 Wrote: OK, so *why* do you not think a thermometer is likely to have a conscious experience?
I don't think a thermometer has conscious experience for the same reason that you don't believe humans have souls. There's no reason to believe it.
The absurd conclusion that thermometers must have conscious experience on some level is just one example of the nonsensical conclusions that can be reached from the absurd position that consciousness is merely "information processing", when there's literally no reason to believe that it's merely information processing. Certainly information processing tends to be involved, but there's absolutely no reason to believe that it's necessary, or that consciousness doesn't precede it (or that it doesn't precede conscious experience in some cases but not in others).
I didn't say that thermometers *must* have consciousness. I asked why you think that they don't. And then, why do you think other people do. As far as I can see, it is because of some sort of *observation* of *objective* patterns of behavior that you make those distinctions.
Quote:Galen Strawson debunks Dennett's silly position on consciousness, while also praising the parts he gets right PERFECTLY, in THIS review of Dennett's book "Consciousness Explained":
Here is the review: http://www.academia.edu/411597/The_self_..._Explained
To quote just one small part from it:
Quote:
[...]Dennett suggests that we can give an evolutionary explanation of why conscious experience exists: it exists because it has survival value. It is, however, a notorious fact that it is not yet possible to give a direct evolutionary explanation of the existence of conscious experience. This may seem very implausible. It may seem obvious that vision, say, has survival value. But a creature could enjoy all the benefits of vision without having any actual, conscious visual experience. It could have light-sensitive organs that enabled it to register information about its environment without having any visual experience (machines that do this can be easily constructed). The same can be said about pain. Experience of pain seems obviously useful because it motivates one to avoid sources of damage. But the tendency to avoid sources of damage could evolve without involving pain. Damage-recognition mechanisms could trigger damage-source-avoidance behaviour without there having to be any actual feeling of pain, or any other sort of experience. Perhaps some actual organisms on earth are like this.
The rest of the article is no less brilliant.
I will give it a read. But I am already inclined to disagree. I think that consciousness is an *aspect* of the information processing. Reacting to potentially damaging aspects of the environment requires increasingly sophisticated processing, and, I think, consciousness arises out of exactly this type of information processing. It 'feels like' something *because* we are processing the information in a way that we can react to it.
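[Editorial illustration, not part of either poster's argument: the quoted Strawson passage claims a damage-recognition mechanism could drive avoidance behaviour with no experience of pain, and the reply above appeals to information processing about damaging input. A minimal sketch of such a "register and avoid" loop is below; all names (DamageSensor, Agent, DAMAGE_THRESHOLD) are illustrative assumptions, and nothing in the code settles whether such processing involves experience.]

Code:
import random


class DamageSensor:
    """Registers a damage-relevant signal from the environment (e.g. a heat level)."""

    def read(self) -> float:
        # Stand-in for a real transducer: returns a heat level between 0 and 100.
        return random.uniform(0, 100)


class Agent:
    """Avoids damage sources using only information processing; no 'feeling' is modelled."""

    DAMAGE_THRESHOLD = 70.0  # illustrative cutoff for "damaging" input

    def __init__(self, sensor: DamageSensor):
        self.sensor = sensor

    def step(self) -> str:
        reading = self.sensor.read()
        # Damage-recognition triggers damage-source-avoidance behaviour,
        # the structure Strawson says could evolve without any feeling of pain.
        if reading > self.DAMAGE_THRESHOLD:
            return "withdraw"
        return "continue"


if __name__ == "__main__":
    agent = Agent(DamageSensor())
    for _ in range(5):
        print(agent.step())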