RE: consciousness?
February 24, 2013 at 11:17 am
(February 23, 2013 at 10:37 pm)ChadWooters Wrote: What is at stake are the very things I believe make life worth living: purpose, personal freedom, and moral values, for a start. Because these are so important to human existence, I do not give them up lightly.
What makes you think that you'd have to give up on them if you accept that your mental existence was rooted in biology?
(February 23, 2013 at 10:37 pm)ChadWooters Wrote: Actually, I am not making the assumption of two distinct entities. All three could support a physical theory of mind. However, of the three given, only the last allows the possibility of two real and distinct things (having different categories of being?) or of two distinct aspects of reality.
That does not seem to be the case. You start your discussion by talking about brain-states and mind-states, and from there you go on to posit causal links between them, which means you are starting by considering them as two distinct entities.
(February 23, 2013 at 10:37 pm)ChadWooters Wrote: And why can’t they be mutually dependent and co-existent in a meaningful way?
Because the current evidence suggests otherwise.
(February 23, 2013 at 10:37 pm)ChadWooters Wrote: People used to compare the brain to a steam engine too. But I am familiar with the hardware/software analogy. Not to put words in your mouth, but you seem to be taking a functionalist position.
You're not putting words in my mouth. I think I have indicated before that that is my position.
(February 23, 2013 at 10:37 pm)ChadWooters Wrote: Functions are not contingent on the system in that way. Any given function can conceivably be performed on multiple physical systems – cogs and gears, brain tissue, electronic circuits, pneumatic tubes, telephone networks, etc. This undermines mind-brain dependence and in many ways makes a material/immaterial relationship more plausible.
Not quite. We cannot have this discussion on a wooden computer. I cannot replace all the water in my body with alcohol and expect it to work the same. The more complicated the performed function, the more specific the requirements for the underlying physical system.
But even if your position were true, how would replacing my biological brain with an electronic one lessen the dependence of my mind on a material medium?
(February 23, 2013 at 10:37 pm)ChadWooters Wrote: Secondly, functions are not inherent, they are assigned. All other known physical systems besides the human brain proceed without any awareness of function. It takes an already conscious being to recognize a function. A rectangular piece of fired clay has no function until someone comes along and says, “hey, there’s a brick I can use.”
Don't confuse recognition with assignment. Where functions are assigned, they depend on the conscious entity doing the assigning: the brick's assigned function could be building a wall, smashing a window or being thrown at someone's head. On the other hand, many systems do perform an inherent function determined by their nature. For example, the function of a tree's leaves is to use sunlight, water and carbon dioxide to make sugar and oxygen. Awareness of that function does not change its inherent nature.
(February 23, 2013 at 10:37 pm)ChadWooters Wrote: Thirdly (and this is mainly to show that the analogy is weak, not the overall argument), a computer does not generate software on its own; it must be uploaded. If you buy a computer without software installed, turning it on will not generate Microsoft Word or iTunes. You open the door to the unfair question of who installed the software. Presumably, evolution allowed the hardware and software to develop simultaneously. That may be, but it's highly speculative.
That doesn't weaken the analogy; it bolsters it.
Even at its most basic, when no software or operating system has been installed, a computer does perform certain functions dictated by its configuration: upon startup it boots, checks for disk drives, and so on. Even your pen drive would just be a paperweight if it did not have a basic form of software determined by its physical configuration. Similarly, the brain does have some inherent software, such as pain avoidance. In both cases, uploading additional software would not be possible if there wasn't some of it there to begin with.
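To make that concrete, here is a minimal sketch in C of the kind of built-in routine a machine runs before any operating system exists on it. This is purely illustrative, not real firmware; the drive-slot names and the "only slot 2 holds a drive" behavior are invented for the example:

```c
#include <stdio.h>
#include <stdbool.h>

#define NUM_DRIVE_SLOTS 4  /* hypothetical machine with four drive bays */

/* Stand-in for a hardware probe; real firmware would read device
   registers here. We pretend only slot 2 holds a drive. */
static bool drive_present(int slot) {
    return slot == 2;
}

int main(void) {
    /* "Boot up": run a self-test, then scan for disk drives. */
    printf("power-on self-test: ok\n");
    for (int slot = 0; slot < NUM_DRIVE_SLOTS; slot++) {
        if (drive_present(slot)) {
            printf("drive found in slot %d; would hand control to it\n", slot);
            return 0;
        }
    }
    /* No bootable device: the machine still behaved in a definite way,
       determined entirely by its built-in configuration. */
    printf("no bootable device found\n");
    return 1;
}
```

Even with nothing "uploaded", the routine above does something definite, and that fixed baseline is what makes installing further software possible at all.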
Secondly, the method of uploading software differs between the two cases. Whereas in a computer we can upload and install it directly, a brain has to be fed the necessary external inputs, in the form of sense data, to generate its software. Consider, for example, someone who has never had any sort of sensory input: not just the five senses, but all his other senses, such as pain, hunger and pleasure, have never been activated. Do you think such a person would have self-awareness? That he would be capable of self-generated action?
The software-uploading in humans takes place through a long and complicated process of education from birth onwards, in which parents help the child learn concepts through repeated exposure to certain sense data.
If anything, the main difference between a computer mind and a biological mind, as things currently stand, is that while the mind does seem to develop its own software from its input, gradually but surely, computers do not have the same capacity, because we haven't yet figured out how it is done. Once we do, behold the advent of AI.
(February 24, 2013 at 1:05 am)Rhythm Wrote: It may not generate Microsoft Word, but why would we expect it to? It does, however, execute a set of functions merely in being "turned on" (even when a PC is doing absolutely nothing, it's actually, more accurately, executing a mind-bogglingly vast number of NOP or null "functions"). Sure, they're not useful to us, but who cares?
Of course they are useful to us. Without that basic set of functions in place, it would not be possible to install the operating system.
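A toy illustration of Rhythm's point, again in plain C rather than anything a real CPU or OS actually runs: even a loop that does nothing useful is still executing instructions, and that baseline activity is what an operating system installer takes over. The step count is arbitrary.

```c
#include <stdio.h>

int main(void) {
    const long idle_steps = 1000000L;
    /* volatile keeps the compiler from optimising the "useless" work away */
    volatile long executed = 0;
    for (long i = 0; i < idle_steps; i++) {
        executed++;  /* each pass is real execution, just not useful to us */
    }
    printf("executed %ld do-nothing steps\n", executed);
    return 0;
}
```

From the machine's point of view, the "idle" passes are work like any other; "useless" is a judgment we make about them, not a property of the execution itself.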