RE: The Philosophy of Mind: Zombies, "radical emergence" and evidence of non-experiential
April 28, 2018 at 8:39 pm
(This post was last modified: April 28, 2018 at 8:40 pm by LadyForCamus.)
(April 28, 2018 at 4:03 pm)bennyboy Wrote:
(April 28, 2018 at 10:03 am)LadyForCamus Wrote: Hmmm. I agree with you that consciousness as a whole likely plays little to no role in our decision-making processes, and that most of the information processing that affects behavior occurs absent our awareness. But I do think that empathy plays a large role in driving those unconscious decisions, and you can’t experience empathy without consciousness, as empathy, by its definition, is literally the experience of a feeling. We behave in particular ways toward other humans largely due to empathy. That is empathy’s evolutionary utility.
Any mental reaction (conscious or unconscious) to an empathetic feeling depends upon the individual’s ability to experience that feeling in the first place. And so, the way I see it, a non-conscious being without the ability to experience empathy would behave exactly like a sociopath. Sure, sociopaths can “fake it,” so to speak, so you could argue that they can model empathy in their behavior even if they don’t experience it, but that would also require conscious manipulation.
Maybe this is more an argument for the utility of empathy than it is an argument for the utility of consciousness, but I would say that the utility of empathy is contingent upon the existence of consciousness.
And, holy shit, that was NOT concise, but I don’t have time to edit! Sorry!
*runs*
Still the same philosophical problem, though. You could, at least hypothetically, program a robot to act as empathetically as a human does.
Let me cut through some red tape here and assert (completely without proof or much evidence) that since all behaviors are physical, and since we judge the "mental state" of another agent on those physical behaviors, at some point a machine should be able to replicate all behaviors, and therefore be taken as alive if judged on that basis.
Consciousness is required to appreciate a sunset. But a robot could seem to appreciate a sunset. Consciousness is required to truly enjoy Beethoven's 5th. But a robot could seem to enjoy Beethoven's 5th.
I think it's very possible that in the not-distant future, AI will be sufficiently convincing that bleeding hearts will start marching for robot rights. Maybe robots will get the vote, and so on. Maybe robots will wipe out humanity.
The question is this-- will all that seeming mean that a new species has arisen with a new take on enjoyment? Or will the beauty of the sunset cease to exist in any meaningful way because there's nobody left with the capacity to actually experience it? I very strongly suspect the latter.
But a human being without empathy would most likely behave (and often does behave) differently from humans who do experience empathy. So my response to Hammy’s idea (that consciousness is not useful to us in any way) is that it’s useful by way of allowing us to experience empathy, which drives outcomes in populations.
That IS how WE do it, isn’t it, Khem? 😛
As for AI...fuck that shit. I’m running for the hills!
https://youtu.be/rYLm8iMY5io
Nay_Sayer: “Nothing is impossible if you dream big enough, or in this case, nothing is impossible if you use a barrel of KY Jelly and a miniature horse.”
Wiser words were never spoken.