RE: The Philosophy of Mind: Zombies, "radical emergence" and evidence of non-experiential
April 22, 2018 at 7:31 am
(This post was last modified: April 22, 2018 at 7:41 am by Edwardo Piet.)
(April 22, 2018 at 6:57 am)Khemikal Wrote: In any case, the employment of the term computational competence was to separate that from consciousness even if only as a matter of categorization. I think you're tilting at Dennett like a windmill, but, as I've said, your issues with Dennett are your issues with Dennett.
That's a cop-out when you keep merely asserting his view without supporting it.
Quote:Though it might be helpful for you to know that Dennett likes to employ the terms competence and comprehension to differentiate between what we do subconsciously and what we do consciously.
I know this already. I get the impression I'm a lot more familiar with his work than you are.
Quote: If you, for example..think that computational competence is not consciousness, well..you and Dennett agree.
Yes we agree on that and it's irrelevant.
Quote:
Our issue, the only thing I commented on in your op, is the notion of selective neutrality in the case of possession of consciousness. I don't think it's useless, and I don't agree that such a conclusion follows from a one line item...even if that one line item means a lot to us and is a subject of deep interest....even if that one line item was traditionally conceived of as the role of consciousness in the human organism.
Once again, you're being vague and unclear. Define what you mean by 'consciousness' before you pretend you're saying something about all forms of consciousness at once. Quit equivocating.
Consciousness as normally defined, as qualia, does indeed seem to be useless. Dennett isn't wrong per se to come up with an alternative definition, but to deny qualia itself is batshit crazy. And I've already explained the mistakes he is making, which you, again, haven't addressed.
Quote:It need not be -that-...to be useful, to confer advantage.
Again, this isn't helpful at all. You're being vague and unclear once again. What is this 'it' you speak of?
You appear to be merely asserting Dennett's view while at the same time saying my issue with Dennett is with Dennett. If you're attempting to support Dennett's view of consciousness, then be clear that that is what you are doing. Yes, there's a difference between consciousness and computational competence. That doesn't address anything I've said; I never said otherwise. And yes, I agree with Dennett on that point, but I wasn't talking about that and it has nothing to do with my actual argument. Are you going to address my argument, or are you going to keep being irrelevant and vague and strawmanning what I'm actually saying?
What do you mean by consciousness? Do you mean what Dennett means by it? Because if so: 1) you aren't addressing any of my arguments in the OP, and 2) you haven't been clear on whether you deny qualia like Dennett does.
You say my issues with Dennett are my issues with Dennett, but if you're going to vaguely speak of the same sort of thing without being clear about it, and if you're going to seem like you're supporting his view without being clear that that is what you are in fact doing, then 1) be clear about it, and 2) my issues with Dennett are relevant, because it's helpful to know where you depart from him.
Quote: It need not be the only way to achieve whatever it does to confer advantage, and it very certainly could have evolved as a side effect of some other thing (like..say, computational competence and general intelligence). Wings weren't initially a flight adaptation, either. Flight..at some point, becomes a "mere side effect" of a particular type of wing...but if we limit biological utility of wings to flight then all swimming and flightless birds have useless wings.
Again, this is all just an irrelevant digression. You still have no evidence of consciousness as normally understood, as qualia, being useful, and that's what my argument is about. I don't deny that parts of the brain that aren't consciousness can be useful and can be labelled as 'consciousness'.
You're very vague and unclear, and you're not actually addressing my arguments. You're just stating irrelevant facts and being unclear about what your actual view is, without even supporting it.
What do you mean by 'consciousness'? Do you mean qualia, do you mean subjective experience, or do you mean 'something else'?
Quote:That doesn't track with what we understand about selection or adaptation, at all. It's a function of tunnel vision, a narrow definition that excludes all other utility by fiat in favor of "control" or "free will". If all consciousness did was make you comparatively more fuckable..it would have evolutionary utility just like display feathers that can't fly. In order for the "science to be on your side" on the claim of selective neutrality, on the issue of evolutionary utility......consciousness could not, itself...whatever it is... be even partially responsible for any advantageous thing.
More irrelevant stuff. Again, all the evidence indicates that our brains can do these things without consciousness. Consciousness doesn't appear to be doing anything. The entire point of my Strawson quote was to point that out. There's no reason to believe that consciousness is actually doing anything... and in fact, the evidence points in the other direction. The burden of proof clearly lies with you.
Quote:I sometimes like to joke that, if consciousness were the "free willing" mechanism we thought it was for so long... it might actually -be- deleterious. It's a good thing I can't choose or decide to stop my heart, for example. Plenty of us would have done it out of incompetence, curiosity, sheer boredom, or just plain bad luck and fumbling mental button fingers....long, long ago. That said, the benefits of a truly free agency might override that specific (hypothetical) risk, anyway. Just as the benefits of a consciousness would appear to override the many inherent flaws in our perception thereof (or flaws in itself) and the many ways that a self-consciousness works counter-productively in human populations (and privately, within the human organism).
What free-willing mechanism? There is no such mechanism. Is this the part where you're going to equivocate again and start rambling irrelevantly about compatibilist free will, which we both already believe in? (Although it's clearly unhelpful and misleading to call that "free will", which is why I'm not a compatibilist.)
Quote:Science, as you used for an example..seems at least partially dependent on consciousness (at least for now..)..and the possession of tools that can create something like that would -seem- to be immensely useful in propagating the human genome. While consciousness may possess no computational utility (I doubt this as well..but I'm running with it to avoid truly useless disagreement), evolutionary utility is a whole different bag of worms, don't you think?
There's no evidence that consciousness actually does anything. It's not that science is dependent on consciousness... my point is that there's no evidence that science tests anything but consciousness. There's no evidence of an objective world outside of consciousness.
If everything is consciousness, then sure, consciousness does something, because everything 'does' something. My point is that consciousness, as we experience it as humans, doesn't appear to be required. The brain seems to do all the work without the stuff we're aware of. Being aware doesn't seem to do anything.
If the intrinsic nature of matter is consciousness, that doesn't mean the fact that it is conscious actually does anything... the intrinsic nature of matter could have been unconsciousness, and consciousness wouldn't exist anywhere... and it wouldn't appear to make a difference. If brains weren't conscious they would appear to be able to do exactly the same things they do now, just without the consciousness; that's my point. Creatures could react to being attacked and respond to danger without actually experiencing the feeling of pain. This is the entire point of the Strawson quote. Why are you missing it?

As to Jor's point about begging the question: on the contrary, the shoe is very much on the other foot. By saying that "consciousness just is how we see stuff" you're the one begging the question. The point is that our eyes could in principle do everything evolutionarily useful without the consciousness. My question is: what does consciousness actually do? It does indeed appear to be the intrinsic nature of the seeing, but it doesn't appear to actually be a function. It doesn't appear to be doing anything useful. Thoughts have no consequences. Literally the only consequence seems to be the fact that we are talking about it. But then, moths also kill themselves on lampshades.

The point is that consciousness just seems to be a by-product and doesn't perform any important function. You could in principle have eyes that are just as useful without the experience of seeing anything. So what if we do happen to see things? So what if we have happened to evolve this way? That entirely ignores my point. How things are in practice is an entirely different matter from how they could be in principle, and from whether consciousness is actually performing any useful function.