RE: The Philosophy of Mind: Zombies, "radical emergence" and evidence of non-experiential
April 28, 2018 at 4:03 pm
(April 28, 2018 at 10:03 am)LadyForCamus Wrote:
(April 27, 2018 at 11:53 pm)Hammy Wrote: Well, it's like I said to Khem when I pointed out that robots can still behave as if they like stuff without having the qualia we call "liking". The only part that appears to be missing is the actual liking of stuff. But the fact that a robot could behave exactly the same, or a philosophical zombie could behave exactly the same, without liking... just leads back to my same point about qualia being useless.
N.B. The trouble was that Khem appeared to be begging the question by simply building liking itself into the definition of qualia.
So, my point here is that it's the same with empathy. If we literally define empathy itself as requiring consciousness, then we win immediately and can simply declare victory by building consciousness into the definition of empathy... which begs the question and leads straight back to the philosophical zombie argument, only shifted from consciousness generally to empathetic conscious states specifically.
If a creature behaved in a completely empathetic way without feeling empathy, you might say "they're not really empathetic, because empathy requires feeling empathy". Well, sure... but then it wouldn't need to be conscious either. The feeling of empathy is a conscious state, so of course consciousness is required to feel it. But then my point about empathy is exactly the same as my point about consciousness: as an experiential state, as qualia, it doesn't appear to actually do anything. All the useful behavior you get from consciousness or empathy doesn't seem to require the qualia or the feeling. It's back to Strawson's point about how a creature could react as if in pain, have alarm systems, sense danger, and detect light, all without experiencing any of those things subjectively.
Hmmm. I agree with you that consciousness as a whole likely plays little to no role in our decision-making processes, and that most of the information processing that affects behavior occurs outside our awareness. But I do think that empathy plays a large role in driving those unconscious decisions, and you can't experience empathy without consciousness, because empathy is, by definition, the experience of a feeling. We behave in particular ways toward other humans largely because of empathy. That is empathy's evolutionary utility.
Any mental reaction (conscious or unconscious) to an empathetic feeling depends upon the individual's ability to experience that feeling in the first place. So, the way I see it, a non-conscious being without the ability to experience empathy would behave exactly like a sociopath. Sure, sociopaths can "fake it", so to speak, so you could argue that they can model empathy in their behavior even if they don't experience it, but that faking would itself require conscious manipulation.
Maybe this is more an argument for the utility of empathy than it is an argument for the utility of consciousness, but I would say that the utility of empathy is contingent upon the existence of consciousness.
And, holy shit, that was NOT concise, but I don’t have time to edit! Sorry!
*runs*
Still the same philosophical problem, though. You could, at least hypothetically, program a robot to act as empathetically as a human does.
Let me cut through some red tape here and assert (completely without proof or much evidence) that since all behaviors are physical, and since we judge the "mental state" of another agent by those physical behaviors, at some point a machine should be able to replicate all of those behaviors, and would therefore be taken as alive if judged on that basis.
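To make that concrete, here's a deliberately trivial sketch in Python. Everything in it, from the cue strings to the function name, is invented purely for illustration: an agent that produces empathetic-looking behavior by nothing more than a lookup from observed cues to canned responses. Judged only on its outputs, it "behaves empathetically", yet there is no feeling anywhere in it.

# Hypothetical toy example: behavioral "empathy" with no inner experience.
# The cue strings and responses below are made up for illustration only.
RESPONSES = {
    "crying": "Offers a tissue and says: 'I'm so sorry, that sounds really hard.'",
    "slumped posture": "Sits down beside them and says: 'Want to talk about it?'",
    "shaking hands": "Speaks softly: 'Take your time, there's no rush.'",
}

def respond(observed_cue: str) -> str:
    """Return an empathetic-looking behavior for an observed cue.
    No qualia involved: just a dictionary lookup, with a polite
    default for cues the agent has never seen."""
    return RESPONSES.get(observed_cue, "Nods sympathetically.")

if __name__ == "__main__":
    for cue in ["crying", "yawning"]:
        print(f"{cue} -> {respond(cue)}")

Obviously a convincing replica would need vastly more machinery than a lookup table, but nothing about scaling this up to human-level behavior obviously smuggles the feeling back in.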
Consciousness is required to appreciate a sunset. But a robot could seem to appreciate a sunset. Consciousness is required to truly enjoy Beethoven's 5th. But a robot could seem to enjoy Beethoven's 5th.
I think it's very possible that in the not-too-distant future, AI will be sufficiently convincing that bleeding hearts will start marching for robot rights. Maybe robots will get the vote, and so on. Maybe robots will wipe out humanity.
The question is this: will all that seeming mean that a new species has arisen with a new take on enjoyment? Or will the beauty of the sunset cease to exist in any meaningful way because there's nobody left with the capacity to actually experience it? I very strongly suspect the latter to be the case.