(August 12, 2013 at 5:03 am)bennyboy Wrote: No we aren't. A descriptive definition works, too-- to explain what it IS LIKE to experience.
An accurate descriptive definition would require more than a description of awareness in terms of experience - given that experience is itself a form of awareness, such a definition would be circular.
(August 12, 2013 at 5:03 am)bennyboy Wrote: It just doesn't work for you, because such subjective language is too imprecise to take into a lab.
True enough. Your definition doesn't work for two specific reasons:
1. Adequacy - An adequate definition should be able to explain the concept or phenomenon to a person not already familiar with it. Do you think your definition can explain the meaning of awareness to an entity not capable of experience?
2. Operationalization - While studying the concept, you first need to operationalize it, i.e. specify the underlying principles, define its limits and so on - basically, make it good enough to be taken into the lab.
(August 12, 2013 at 5:03 am)bennyboy Wrote: All systems "process" all other systems to some degree, since they are all linked by gravity, and by the exchange of photons. Anyway, who's to say a particular collection of particles is a "system," and another is just a bunch of particles?
We are in the realm of ideology here, not one of objective reality.
Equivocation much? When we talk about data-processing systems, the "process" refers to specific actions taking place with specific results - it does not automatically apply to all systems. And we are to say which collections of particles constitute a system and which do not. And there is no such realm as ideology.
(August 12, 2013 at 5:03 am)bennyboy Wrote: The old "No, I'm not. You are!" defense. I'll be using that one liberally, as well.
I'm not begging the question-- I'm describing the world view that people generally have: that people have minds, and that machines do not. Now, it is possible that a machine CAN theoretically experience, just as I do. That's an exciting possibility, but I'm curious how you would confirm or disprove that possibility. I'm willing to extend the status of "sentient being" to people because I'm one, and I'm willing to assume that degree of similarity. But why would I extend it to a machine, regardless of how convincingly it mimics the behavior of actual humans?
Well, if you accept the theoretical possibility of machines being capable of experience, then you are not begging the question. However, if that possibility were shown to be true, you would also have to accept that phenomena such as mind/sentience/experience are not limited to dualism and have meaningful existence within a monist context as well.
As for the question of determining actual experience, I'd say not all aspects of human behavior can be mimicked. In fact, not even the behavior of your goldfish can be mimicked without there being the capacity for experience. Certain facts of your consciousness - such as your preferences, your tastes, your aversions etc. - develop and change over time as a result of your capacity to experience. Should the machine not be capable of experience, this development would not be seen in it.
(August 12, 2013 at 5:03 am)bennyboy Wrote: I wish we could see this happen in our lives. Do androids dream of electric sheep?
But let's say your Cyberboy 2000 takes in Beethoven's 5th, simulates the musculature of a "moved" human being, or possibly sheds a tear in response. Does this prove it's more than just an elaborate, but nevertheless unfeeling, machine? Should we give such machines rights? Should we allow them equal status in social programs, or allow them to govern themselves (or us)?
Now this is where things get complicated. We do not yet know which elements of consciousness define it, which elements are necessary consequences, which are biological extras, and how they all interact with each other. Would androids need to dream, or even sleep at all? Why would we create something capable of feeling pain or suffering or sadness? Maybe we could just get rid of those elements. Would we be able to simulate the emotion of sadness completely - not just the external signs of it? I mean, we have all faked emotions from time to time, so within that context we were acting like unfeeling machines. But if anyone was looking at our brains in those moments, they'd know whether or not we were actually experiencing those emotions.
Their status in society would be a different matter altogether. Currently, we don't grant rights based on sentience - that's why animals don't have the same rights as us. But if we foolishly keep trying to deny such machines their rights, the world of The Matrix might easily become a reality (though not the human-batteries part - that's just stupid).
(August 12, 2013 at 5:03 am)bennyboy Wrote: Because I have access to experience, and cannot prove it other than by insisting verbally that I have it. I cannot otherwise show that I actually experience, rather than seem to.
That's a rather defeatist attitude. Tell me then, why should I believe that you are an actual human being and not a philosophical zombie? Or try a simpler problem: how would you - without referring to mutually accessible visual data - convince me that you are not blind?