(November 19, 2013 at 6:38 pm)bennyboy Wrote: You are still flirting with solipsism, here. I don't have direct access to ANY event or object-- in all cases I interface with them through my qualia.
This means that when I infer a plane crash, I'm remembering WHAT IT'S LIKE to see a real crash, and believing that the image I see from the video matches that.
But in the case of qualia, when I describe it in its own terms, I'm talking about WHAT IT'S LIKE to feel WHAT THINGS ARE LIKE.
That's another level, or layer, of reality we're talking about. Think function : derivative.
Solipsism does not enter the equation here. Within the context of this ANALOGY, your observation of a plane crash is equivalent to direct access to your own qualia, and your inference about the event from its images is as much of an agnostic assumption as your inference about anyone else's qualia.
(November 19, 2013 at 6:38 pm)bennyboy Wrote: I never denied that they could subjectively experience. I denied that I'm willing to extend the same philosophical assumptions to robots that I make about humans, because robots are unlike me in important ways, while humans seem not to be.
The philosophical assumption you are unwilling to extend happens to be subjective experience itself, which amounts to denying their capacity for it.
(November 19, 2013 at 6:38 pm)bennyboy Wrote: I don't have an instinctive need to see them as conscious, and there's no pragmatic advantage to doing so. For my life to make sense, I don't have to believe that robots really experience.
That's the most basic mistake one can make in the pursuit of knowledge - accepting assumptions because of the need they fulfill or the advantage they provide.
(November 19, 2013 at 6:38 pm)bennyboy Wrote: Because the transmutation of energy can itself be seen as a kind of simple data processing.
How?
(November 19, 2013 at 6:38 pm)bennyboy Wrote: 1. Insufficient != Unnecessary
2. You'll have to describe how this is physically possible, or ever COULD be. The brain combines physical, chemical, and electrical interactions in ways so complex I don't think they can ever be modeled by anything less fantastically complex than a brain itself.
1. Did you miss the point? Function at the cellular level is insufficient to give rise to qualia - which is why the notion of qualia at that level is nonsensical. It is, however, necessary for qualia to arise at a higher level.
2. I'm inclined to agree. Any model of the brain that replicates its functions would have to be fantastically complex. A lot of scientists are still working on the "how", and they've made real progress - including the Artificial Neural Networks you yourself gave as examples.
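Since you brought up Artificial Neural Networks, here's roughly what their basic building block looks like - a minimal, purely illustrative sketch in Python (the inputs, weights, and sigmoid activation are arbitrary choices of mine, not a model from any actual research):

[code]
# A toy artificial neuron, the building block of the ANNs mentioned above.
# Illustrative only: the inputs, weights, and sigmoid activation are
# arbitrary choices, not taken from any actual brain model.
import math

def sigmoid(x):
    # Squash the weighted sum into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs plus a bias, passed through the activation.
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(total)

# Example: three input "signals" feeding a single unit.
print(neuron([0.5, 0.1, 0.9], [0.4, -0.6, 0.2], bias=0.1))
[/code]

Stack enough of these units into layers and train the weights, and you get the kind of networks those scientists use to replicate brain functions at a still fantastically simplified level.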