RE: Seeing red
January 23, 2016 at 2:31 pm
(This post was last modified: January 23, 2016 at 2:34 pm by bennyboy.)
(January 23, 2016 at 1:06 pm)Jörmungandr Wrote: Why do you ask, do you need to equivocate on a few terms?

Quite the contrary. I want a non-arbitrary definition.
Quote: I'm not talking about having or not having intention. I'm talking about defining it. What separates a system with "intention" from any other physical system? A brain is a collection of physical materials going through their individual processes, and so is a galaxy. Why do you say one is intentional, and one is just stuff happening?

(January 23, 2016 at 12:02 am)bennyboy Wrote: It still seems to me that we are using a lot of substance-dualist words here: intention, meaning, even behavior. But it is WE as thinking humans who see intention, meaning and behavior in some systems.
Or those systems really have it, and all your posturing about WE as thinking humans is just special pleading. "They have intentionality but it's not real intentionality, it's derived." Yes, I have the ability to describe the robot's behavior in terms of meaning and intention. Unless you're just begging the question, there is no difference between applying these terms to the robot and applying them to us humans. "The robots have aboutness but it's not the kind of aboutness that I have." How would you know? All you're doing is suggesting that we have richer systems for intentionality and meaning, in that we can use our concepts to ascribe meaning to the behavior of robots. That doesn't exclude the possibility that the robots' behaviors deserve such description.
Quote: In an experiment at the Max Planck Institute, dogs and chimpanzees were compared for their performance on a simple test. The test would put a small reward under one of two cups, and the experimenter would point to the cup that concealed the reward. The task was to find the reward. The dogs did well on the task, going straight to the cup with the reward, whereas the chimps were not helped by the pointing. Obviously the dogs understood that the pointing finger was 'about' something in a line from the end of the finger. They understood the intentionality of the gesture, and no amount of human reinterpreting can take that away from them.

It seems to me that what we're really talking about is how human-like certain physical structures are. This is fine when you have an intact human-like organism, but it sheds little light on my post a couple pages ago: on what level of physical organization does mind supervene?
Quote: A few additional points. First, it's clear that if dogs can possess intentionality, it doesn't take much hardware to implement it. Much as we love our dogs, they aren't exceptional as a species in terms of intelligence. Second, while the test was not performed with wolves, I suspect wolves would be just as stymied as the chimpanzees; and if not the wolves, at least a recent ancestor of theirs. This would show that it takes a relatively short time, 10,000 years, to evolve such intentional behavior. This suggests that the machinery for such behaviors already exists in species like the wolves or the chimpanzees, simply awaiting evolution to tease it out into a manifest behavior. Third, it points to the possibility that intentionality can evolve. Unless we are postulating doggy souls, some mechanistic change occurred in the brain over the course of those 10,000 years to bring out this intentionality. This suggests that the substance necessary for such behaviors is a mechanism, not some mysterious ectoplasm.

I don't think you got my drift. Whatever the organism that arbitrarily designates certain states as "information" and others as meaningless, the distinction is still arbitrary. Maybe dogs can do it, maybe chimpanzees can't. But does their lack of interest in any particular set of physical states really mean that those states don't represent information? Or is it just that those organisms are geared toward an interest in the states of certain kinds of systems?
So, no, it isn't only WE as thinking humans, but quite likely much of the animal kingdom.
This is important, because for the last few pages we've been talking about information processing as an indicator of mind, and I still haven't seen a non-arbitrary explanation of which physical states or systems do or don't represent information. So far as I can tell, information seems to be something like "useful or interesting physical states or processes" (which implies that observation by a subjective agent is required in order to call something information), whereas anything lacking interest is just stuff that happens. But this seems far too subjective to serve as the basis for a theory of mind.