RE: Philosophical zombies
March 4, 2018 at 10:10 am
(This post was last modified: March 4, 2018 at 10:17 am by bennyboy.)
(March 3, 2018 at 8:42 pm)Khemikal Wrote:
(March 3, 2018 at 8:03 pm)bennyboy Wrote: There are a lot of differences between me and an Android.
Sure, but the p-zombie proposition is explicitly designed to refer to just one, regardless of its ability to cogently comment upon it. I propose that there are a lot of differences between you and me as well; between both of us and an android; and between both of us, an android, and some other species of biological intelligence. We likely agree on each item. Different things are, well... different. That's part of what makes the p-zombie prop a cognitive trap. It proposes a difference... with no difference.

It proposes no discernible difference. Unless we know what causes any material system to have subjective experience, we cannot say whether an indiscernible difference may be responsible for one system having a full experience of qualia, and another only seeming to.
Quote:You can either know that or not... but if you cannot know that, then you cannot know that your own consciousness isn't "just complex programming" to make so-and-so appear so-and-so, either. Because it's a problem for your criteria of knowledge, it is a problem for all relevant categories of knowledge equally, or none equally. I can appreciate where you're coming from, but I extend a certain criteria to human beings day in and day out. It would make very little sense for me to come up with some other criteria for extension just because the hand I'm shaking is made of carbon fiber. I doubt I'll see it in my lifetime; I hope my children do. It's a lonely universe, after all, eh?

No, it's a fair philosophical question. I suspect that in maybe 100 years, this question could be taken quite seriously.
(March 3, 2018 at 9:54 pm)LadyForCamus Wrote:
(March 3, 2018 at 8:03 pm)bennyboy Wrote: Absolutely not, unless you define "self-aware" in those terms.
But the problem is this: I have a particular type of self-awareness that allows me to know what it feels like to watch a sunset or to drink a cup of hot chocolate. I do not believe this to be the same as a robot that can determine the chemical composition of fluids it has taken in and then verbalize that composition.
Unless, that is, the Universe is panpsychic. Then all bets are off.
We aren’t talking about robots or machines, though. In this hypothetical we’re talking about humans; humans who are biologically identical, indiscernible from any other human walking the earth. I can say with a reasonable degree of certainty that I am able to recognize consciousness in a human being. If that human tells me about the awe he feels when watching a sunset, or how hot cocoa just isn’t the same without those tiny, smushy marshmallows, because that’s how his mom used to make it, then that person is conscious. Yeah, robots can mimic consciousness, but AI is not the subject of the p-zombie thought experiment. Am I missing something here?
If it can't experience subjectively, is it human at all?
I think what we need is a litmus test: by what criteria can a given system, human or otherwise, be determined to have the capacity for true subjective awareness rather than the mere mimicry of it?
Right now, I have solipsism and then some "pragmatic philosophical assumption" -- i.e. that I think other people are conscious without knowing why I think so. It is, bluntly, a feeling. But we are about to have our instincts completely tricked by machines of our own making: androids may soon make us laugh or cry, or have personality quirks that we find endearing. Where, then, are my pragmatic assumptions? Should I hold to them still, saying "I know that the Camustron-2100 model is real, simply because she seems so"?