RE: Pleasure and Joy
September 1, 2013 at 5:22 pm
(This post was last modified: September 1, 2013 at 5:27 pm by bennyboy.)
(September 1, 2013 at 11:32 am)genkaus Wrote: And as I've indicated many, many times before, direct access is not required to establish existence. Not for black-holes, dinosaurs or a murder and not for subjective experience either. The only assumption here is that subjective experience, like any other existent entity, has a specific form of existence and provides specific evidence of its existence. That assumption is made for all the other objects as well. Your continuous repetition of "if you can't directly observe it, you can't know it exists" has been invalidated by science in multiple scenarios.

You can "indicate" whatever you want as often as you want, but the post count isn't evidence of truth. The behaviors you are talking about are sufficient to establish brain function, not actual experience. If you are trying to establish that a black hole has fairies inside it, you can't get there from establishing that light seems to be bending in ways that indicate a black hole.
Quote:If I prove that the behavior requires actual experience of processed information, then I have proven that it is capable of actual experience. And guess what, this is precisely what the scientists in the field of neuroscience have done.

You can't prove that behavior requires actual experience, since you are a physical monist. You can only prove that behavior requires brain function. It's simple: data in, processing, behavior out. No fairies required.
Quote:That is the test. How are you not getting this? If it walks like a duck and quacks like a duck, then it is a duck.

. . . or a duck-like mechanism, lacking ducky feelings.
Quote:Given, my argument is that subjective experience itself is a mechanical, deterministic form of data-processing, I don't see the point of this statement.

The point is that I don't accept that the Cyberboy 2000 actually experiences, and that allowing a world of Cyberboys to compete with humans for resources because they "experience" (read: behave like they experience) would mean the loss of that special quality of actual experience which I believe humans have, and which Cyberboys may not.
Quote:Actually, we do know that it has imagination. All we have to do is look at the Chess program.

I do not accept your definition of imagination. Imagination requires an image or vision of the problem, not simply behaving like one has those things.
Quote:Is it possible? That reality is not what it seems? We are not talking about errors in perception here. Science accounts for those errors and has measures in place for correction. What you are suggesting here is the possibility that perception itself is invalid. Got a way to justify that possibility?

Absolutely, I do. We could be in a BIJ, a BIV, the Matrix, or the Mind of God. We could be another living entity having a long dream. There are many scenarios in which the world as it seems, including all the experiences we have of science, could be non-representative of reality.
As you've pointed out, our perception is fundamentally flawed.
Quote:Do I really have to remind you that science does not deal in absolutes?

No, but reality might. Science is good at making bridges, but demanding that the universe conform to the physical monism required for science isn't making a new discovery about it: it's about as truth-seeking as religious institutions declaring astronomy heretical. Either you can prove that your view represents reality, or you cannot. Currently, you cannot.
Quote:The reason this argument fails is that the parent/child statements are not identical. One addresses the nature of perception, the other addresses the nature of mind.

Nope. You are using experience to prove the nature of experience.
The statement "mind perceives reality" (parent) is axiomatic - make no mistake about that. But it says nothing about the nature of mind itself. Any question regarding the nature of mind is a separate consideration.
Quote:AI ALWAYS produces behaviors which are not specifically programmed. That's what AI is: a programmed simulated evolutionary process, resulting in behaviors which aren't necessarily predictable to even the programmers.

(September 1, 2013 at 6:25 am)bennyboy Wrote: Well, the Cyberboy 2000 says, "Yum yum, I want the chocolate ice cream, not the yucky strawberry," rubs its belly, drools a little, and opens its eyes slightly wider to show that what it is looking at "pleases" it. It stamps its little cyber-feet if you tell it no, and makes annoying noises in the car on the way home. None of this means it's actually experiencing anything.
If we establish that the specific behavior is not present in the initial programming, then yes, it most certainly means that it is actually experiencing something.
I think the problem here is that you are flip-flopping between semantic sets: mind-existent words, and physical-monist words. Sure, you can call computer processing "imagination" if that word is useful in explaining a model you have in AI. However, that word already refers to my subjective experience of forming ideas, where abstract images flutter around a kind of mental canvas. You could call the Cyberboy's foot-stamping "experience," but that word already refers to my ability to see red as redness, not simply to the function of stopping at an intersection if I detect light of a particular frequency.
The problem comes when you try to conflate "imagination" and "experience" with my actual imagination, and my actual experience.