Your mind is not a metaphysical wonderland. It's not atheists who refute your ideas, it's science. And science refutes them because they're silly. I know you have a powerful need to feel special and unique, but our brains are just neuron-processing hubs that have evolved into consciousness. We're slaves to neurochemistry. I know that's not what you want to hear, but I'm sure you'll ignore this and go on believing whatever you want. You just want to be right; you don't want to be correct. I'm bowing out of this "conversation" in favor of beating a wall down with my face. It'll be easier.
consciousness?
(February 22, 2013 at 2:56 pm)ChadWooters Wrote: Professor Plumb, you have only restated an argument already made, adding only that an authority shares your opinion. The tacit assumption of his concluding remarks is that no causal mechanism between brain-processes and mind-processes is required because one is the same as the other.

Try not to conclude any "tacit" assumptions when so many explicit ones are given that show that you are wrong. Sherman has concluded that brain produces consciousness - not that brain is consciousness.

(February 22, 2013 at 2:56 pm)ChadWooters Wrote: Meanwhile his next sentence injects a straw man into this debate:

Try not to confuse this discussion with the one that he was having. Mind creating matter was very much a part of his discussion, as shown by the arguments made and evidence presented by his opponent. That you may not be suggesting the same thing (though I suspect you are, but just won't state it outright), thereby making that portion of the discussion irrelevant, is - well - irrelevant.

(February 22, 2013 at 2:56 pm)ChadWooters Wrote: No one denies the intimate connection between minds and brains. The authority you cited does not actually address the full relationship between mental events and brain events.

Actually, he does. He says that the full relationship is not known and the problem is not solved, but the current evidence suggests the latter causes the former.

(February 22, 2013 at 2:56 pm)ChadWooters Wrote: He only looks for efficient causes and observes only third-party physical facts. Empirical study of the brain takes for granted the formal relationships, logical relations, and assigned values (all immaterial) that allow us to feel, think about, and will to act upon what we observe.

Not quite. Any empirical study is incomplete without taking those into consideration. That the said authority chose not to mention them in his argument does not suggest they are taken for granted.

(February 22, 2013 at 2:56 pm)ChadWooters Wrote: In the example you provided, physical events produce both physical and mental effects. Following the initial physical cause you get both a second physical event and a mental one. Now you have two potential causes, one mental and the other physical, for the next in line physical event and its associated mental one. This means one of the following:

And this is the crux of your problem. The only way to conclude that only these three choices exist is to assume a priori that there are two separate, real and distinct realities - one pertaining to the material and the other to the mental. Implicit within this is the idea that these two can exist independently. Further evidence of your straw-manning is given by your assumption that all atheists are materialists who'd refuse to even consider the third option.

As a matter of fact, the closest you come to my position is in the third option. I do not recognize the physical and the mental as two separate realities but as aspects of a single one - this reality. The fact that we can consider them separately (as in your example of reflecting upon our inner world) does not mean that they can or do exist independently. However, I do consider the so-called mental aspect to be ultimately dependent upon and originating from the physical.

Not to belabor a point, but once again, consider the computer analogy. If you consider your hardware to be the brain and software to be the mind, then you can see that there is a definite causal link.
While they can be considered separate and appear to work independently, there is a definite undercurrent whereby any changes in the software (or caused by it) are reflected in the hardware and vice versa. However, the software is still ultimately dependent upon the hardware, since it cannot exist without its specific configuration - while hardware has no such constraint upon its existence. Therefore, hardware produces software, and in the same way brain produces mind.
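To make the dependence claimed in that analogy concrete, here is a minimal sketch - my own illustration, not anything genkaus posted, with the names make_hardware and run invented for the example. The "hardware" is a tiny fixed instruction set; the "software" is just a list of opcodes that does nothing at all unless that particular hardware executes it.

```python
# Illustrative sketch of the hardware/software analogy used above.
# The "hardware" is a fixed instruction set; the "software" is inert data
# that only does anything when run on hardware with that configuration.

def make_hardware():
    """Return a tiny 'machine': a register plus the operations it supports."""
    state = {"acc": 0}
    instruction_set = {
        "INC": lambda: state.__setitem__("acc", state["acc"] + 1),
        "DEC": lambda: state.__setitem__("acc", state["acc"] - 1),
        "DBL": lambda: state.__setitem__("acc", state["acc"] * 2),
    }
    return state, instruction_set

def run(software, hardware):
    """'Software' is just a list of opcodes; it is meaningless without hardware."""
    state, instruction_set = hardware
    for opcode in software:
        if opcode not in instruction_set:
            raise ValueError(f"hardware cannot execute {opcode!r}")
        instruction_set[opcode]()
    return state["acc"]

program = ["INC", "INC", "DBL"]        # the 'mind-side' description: a procedure
print(run(program, make_hardware()))   # 4 -- but only because this hardware exists
```

The list called program can be written down, copied, or reasoned about on its own, yet it only "runs" in virtue of the hardware's configuration, which is the asymmetric dependence the post is arguing for.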
To Chad: I don't see why you object to seeing the biology of the brain as the ground of the mind's activity. If it turns out to be true that no functioning mental world exists apart from living brains, what difference does that make?
Doesn't it make sense for the mind to build a representation of the world that is 'world-like'? It enables us to consider relationships between objects and subjects in the world in the abstract, to consider the possibility and likelihood of avoiding dangers and exploiting opportunities. We can even use our mental representations to imagine utopian scenarios. The contribution of mental activity to our quality of life is unquestioned.

But what difference does it make if our mental life arises biologically? I'm trying to understand what you see as being at stake if it turns out that mental phenomena exist wholly in minds as a particular aspect of our biology. What do we have to give up under that view?
RE: consciousness?
February 23, 2013 at 10:37 pm
(This post was last modified: February 23, 2013 at 10:42 pm by Neo-Scholastic.)
(February 23, 2013 at 10:56 am)whateverist Wrote: …To Chad: I don't see why you object to seeing the biology of the brain as the ground of the mind's activity. If it turns out to be true that no functioning mental world exists apart from living brains, what difference does that make?...what difference does it make if our mental life arises biologically? I'm trying to understand what you see as being at stake if it turns out that mental phenomena exist wholly in minds as a particular aspect of our biology. What do we have to give up under that view?

What is at stake are the very things I believe make life worth living: purpose, personal freedom, and moral values, for a start. Because these are so important to human existence, I do not give them up lightly.

(February 23, 2013 at 9:51 am)genkaus Wrote: And this is the crux of your problem. The only way to conclude that only these three choices exist is to assume a priori that there are two separate, real and distinct realities - one pertaining to the material and the other to the mental. Implicit within this is the idea that these two can exist independently. Further evidence of your straw-manning is given by your assumption that all atheists are materialists who'd refuse to even consider the third option.

Actually, I am not making the assumption of two distinct entities. All three could support a physical theory of mind. However, of the three given, only the last allows the possibilities of two real and distinct things (having different categories of being?) or demonstrating two distinct aspects of reality.

(February 23, 2013 at 9:51 am)genkaus Wrote: …I do not recognize the physical and the mental as two separate realities but as aspects of a single one - this reality. The fact that we can consider them separately (as in your example of reflecting upon our inner world) does not mean that they can or do exist independently. However, I do consider the so-called mental aspect to be ultimately dependent upon and originating from the physical.

And why can't they be mutually dependent and co-existent in a meaningful way?

(February 23, 2013 at 9:51 am)genkaus Wrote: …again, consider the computer analogy.

People used to compare the brain to a steam engine too. But I am familiar with the hardware/software analogy. Not to put words in your mouth, but you seem to be taking a functionalist position. Functions are not contingent on the system in that way. Any given function can conceivably be performed on multiple physical systems – cogs and gears, brain tissue, electronic circuits, pneumatic tubes, telephonic networks, etc. This undermines mind-brain dependence and in many ways makes a material/immaterial relationship more plausible.

Secondly, functions are not inherent; they are assigned. All other known physical systems besides the human brain proceed without any awareness of function. It takes an already conscious being to recognize a function. A rectangular piece of fired clay has no function until someone comes along and says, "hey, there's a brick I can use."

Thirdly (and this mainly shows that the analogy is weak, not the overall argument), a computer does not just generate software; it must be uploaded. If you buy a computer without software installed, turning it on will not generate Microsoft Word or iTunes. You open the door to the unfair question of who installed the software. Presumably, evolution allowed the hardware and software to develop simultaneously. That may be, but it's highly speculative.
RE: consciousness?
February 24, 2013 at 1:05 am
(This post was last modified: February 24, 2013 at 1:08 am by The Grand Nudger.)
It may not generate Microsoft Word, but why would we expect it to? It does, however, execute a set of functions merely in being "turned on" (even when a PC is doing absolutely nothing it's actually, more accurately, executing a mind-bogglingly vast number of NOT or null "functions"). Sure they're not useful to us, but who cares? In what way are purpose, personal freedom, and moral values dependent upon your ghost in the machine?
RE: consciousness?
February 24, 2013 at 11:17 am
(This post was last modified: February 24, 2013 at 11:19 am by genkaus.)
(February 23, 2013 at 10:37 pm)ChadWooters Wrote: What is at stake are the very things I believe make life worth living: purpose, personal freedom, and moral values, for a start. Because these are so important to human existence, I do not give them up lightly.

What makes you think that you'd have to give up on them if you accept that your mental existence was rooted in biology?

(February 23, 2013 at 10:37 pm)ChadWooters Wrote: Actually, I am not making the assumption of two distinct entities. All three could support a physical theory of mind. However, of the three given, only the last allows the possibilities of two real and distinct things (having different categories of being?) or demonstrating two distinct aspects of reality.

That does not seem to be the case. You start your discussion talking about brain-states and mind-states and from there you go about positing causal links between them. Which means you are starting by considering them as two distinct entities.

(February 23, 2013 at 10:37 pm)ChadWooters Wrote: And why can't they be mutually dependent and co-existent in a meaningful way?

Because the current evidence suggests otherwise.

(February 23, 2013 at 10:37 pm)ChadWooters Wrote: People used to compare the brain to a steam engine too. But I am familiar with the hardware/software analogy. Not to put words in your mouth, but you seem to be taking a functionalist position.

You're not. I think I have indicated before that that is my position.

(February 23, 2013 at 10:37 pm)ChadWooters Wrote: Functions are not contingent on the system in that way. Any given function can conceivably be performed on multiple physical systems – cogs and gears, brain tissue, electronic circuits, pneumatic tubes, telephonic networks, etc. This undermines mind-brain dependence and in many ways makes a material/immaterial relationship more plausible.

Not quite. We cannot have this discussion on a wooden computer. I cannot replace all the water in my body with alcohol and expect it to work the same. The more complicated the performed function, the more specific the requirements for the underlying physical system. But even if your position were true, how would the fact that I could replace my biological brain with an electronic one lessen the dependence of my mind on the material medium?

(February 23, 2013 at 10:37 pm)ChadWooters Wrote: Secondly, functions are not inherent; they are assigned. All other known physical systems besides the human brain proceed without any awareness of function. It takes an already conscious being to recognize a function. A rectangular piece of fired clay has no function until someone comes along and says, "hey, there's a brick I can use."

Don't confuse recognition with assignment. Where the functions are assigned, they depend on the conscious entity doing the assigning. For example, for the brick, the assigned function could be making a building, smashing a window or throwing it at someone's head. On the other hand, a lot of systems do perform an inherent function determined by their nature. For example, the function of leaves in a tree is to absorb sunlight, water and carbon dioxide to make sugar and oxygen. Awareness of the said function does not change its inherent nature.

(February 23, 2013 at 10:37 pm)ChadWooters Wrote: Thirdly (and this mainly shows that the analogy is weak, not the overall argument), a computer does not just generate software; it must be uploaded. If you buy a computer without software installed, turning it on will not generate Microsoft Word or iTunes.
You open the door to the unfair question of who installed the software. Presumably, evolution allowed the hardware and software to develop simultaneously. That may be, but it's highly speculative.

That doesn't weaken the analogy, it bolsters it. Even at its very basic, even when no software or operating system has been uploaded, a computer does perform certain functions as dictated by its configuration. Upon startup, it boots up, looks for disk drives, etc. Even your pen drive would just be a paperweight if it did not have a basic form of software determined by its physical configuration. Similarly, the brain does have some inherent software, such as pain-avoidance. In both cases, uploading additional software would not be possible if there wasn't some of it there to begin with.

Secondly, the method of uploading software is different in both cases. Whereas in a computer we can upload and install it directly, a brain has to be fed the necessary external inputs in the form of sense-data to generate its software. Consider, for example, someone who has never had any sort of sensory input. Not just his five senses, but all his other senses, such as pain, hunger, pleasure etc., have never been activated. Do you think that such a person would have self-awareness? That he would be capable of self-generated action? The software-uploading in humans takes place through a long and complicated process of education from birth onwards, where the parents help the child learn concepts through repeated exposure to certain sense data. If anything, the main difference between a computer-mind and a biological-mind in the current state of affairs would be that while the mind does seem to develop its own software based on its input, gradually but surely, computers do not have the same capacity because we haven't figured out yet how it is done. Once we do, behold the advent of AI.

(February 24, 2013 at 1:05 am)Rhythm Wrote: It may not generate Microsoft Word, but why would we expect it to? It does, however, execute a set of functions merely in being "turned on" (even when a PC is doing absolutely nothing it's actually, more accurately, executing a mind-bogglingly vast number of NOT or null "functions"). Sure they're not useful to us, but who cares?

Of course they are useful to us. Without those basic functions in place, it would not be possible to install the operating system.
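As an aside, the point that a "blank" machine still does something hardwired, and that later software can only get in through that hardwired behaviour, can be sketched in code. This is my own illustration, not anything from the thread; the BootROM class and the drive layout are invented for the example.

```python
# Illustrative sketch (not from the thread): even a "blank" machine has a
# hardwired boot routine, and any further software only gets in through it.

class BootROM:
    """Hardwired behaviour the machine has before any software is 'uploaded'."""

    def __init__(self, attached_drives):
        self.attached_drives = attached_drives  # e.g. {"hdd0": ["INC", "DBL"]}

    def power_on(self):
        # These steps run no matter what; they are fixed by the configuration.
        print("self-test: memory OK")
        for name, contents in self.attached_drives.items():
            print(f"probing drive {name}: {'bootable' if contents else 'empty'}")
            if contents:
                return self.load(contents)
        print("no software found; machine idles (but is still doing this much)")
        return None

    def load(self, program):
        # The only path by which external software enters the machine.
        print(f"loading {len(program)} instructions via the built-in loader")
        return program

bare_machine = BootROM(attached_drives={"hdd0": []})
bare_machine.power_on()      # still runs its hardwired routine with nothing installed

loaded_machine = BootROM(attached_drives={"hdd0": ["INC", "DBL"]})
loaded_machine.power_on()    # additional software arrives only through the ROM
```

The sketch leans on the same point made above: the loader has to be there first, so "uploading" presupposes some function already fixed by the hardware.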
RE: consciousness?
February 24, 2013 at 11:42 am
(This post was last modified: February 24, 2013 at 12:44 pm by The Grand Nudger.)
I assure you I am capable of building a circuit that has absolutely no use to us but nevertheless performs a staggering array of functions. hehehe. The use of a function (as it relates to us) is something we invest that function with, not an inherent property of that function. We have to understand the gate, identify the problem, envision a way in which the gate can resolve the problem - or we could, if we wanted, just throw a billion random gates at a problem and see which one returns the correct result (then leverage that without ever caring to see -how- it solved the problem). We leverage the properties of functions to give those functions use. There are an uncountable number of functions operating out in the world - two rocks can function like a variety of gates, but they have no use to us, because we have not invested them with it - yet. As you mentioned, a complicated electronic device could easily be a paperweight - and with no paper, even that is useless. Designations of use or usefulness in circuits (and many things) begin by postulating or invoking an operator (input), then identifying the task (function), and finally describing the hypothetical or demonstrable resolution to that task in light of the operator and what we hope to achieve by it (output/register/memory/interface).
An operating system does leverage the relationship between gates in series, and those gates are required in that series for any specific OS to be installed or to function properly, but another amusing thought is that no particular series of gates is positively required to make a non-specific OS "do work" - the OS can be written for any configuration of gates one cares to cobble together, even if you "designed" it by throwing darts at a wall. It will have quirks and tics, it may not function "properly" (depending on your benchmark) in one area and it may function fantastically in another. It might not even be all that sturdy to talk about a "blank" system with no OS, as physics -is- an OS for the movement and pathing of electrons across a circuit board. It tells a signal, for example "if a resists, do b". A heady mix of efficiency and waste. Remind anyone of anything? (edit for more interesting shit- as far as we can tell our neurons are arranged, structured, and function in a way entirely indistinguishable from and/nand gates - which is telling - because and/nand gates in series are incredibly powerful tools for processing information and executing complex functions - all tasks we have identified can be accomplished with just this gate - we might envision a more elaborate or efficient gate for any particular task...but simplicity, duplication, and redundancy seem to be biological hallmarks. Why -and how- would our biology build such a gate when the and/nand handles that function- and many others already simply by duplication in series? To save space I suppose, but there's plenty of it to work with inside of our skulls - though we could conceive of a purpose built "brain" utilizing the same structural materials with more specific gates being orders of magnitude smaller while achieving the same level of function)
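Rhythm's aside about AND/NAND gates rests on a standard fact: NAND is functionally complete, so any boolean function can be built from it alone simply by composing copies in series. Here is a quick sketch of that claim; it is my own illustration rather than anything posted in the thread, and whether neurons really behave like NAND gates is Rhythm's speculation, not something the sketch establishes.

```python
# NAND is functionally complete: every other boolean gate can be built
# from it just by composing NANDs, which is the point being leaned on above.

def nand(a: bool, b: bool) -> bool:
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

def xor_(a: bool, b: bool) -> bool:
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))

# Verify the composed gates against Python's own operators over all inputs.
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
        assert xor_(a, b) == (a != b)
    assert not_(a) == (not a)
print("AND, OR, XOR and NOT all reproduced from NAND alone")
```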
(February 24, 2013 at 11:17 am)genkaus Wrote:
(February 23, 2013 at 10:37 pm)ChadWooters Wrote: What is at stake are the very things I believe make life worth living: purpose, personal freedom, and moral values, for a start. Because these are so important to human existence, I do not give them up lightly.

That is just what I was wondering.
RE: consciousness?
February 24, 2013 at 5:19 pm
(This post was last modified: February 24, 2013 at 5:21 pm by Neo-Scholastic.)
(February 24, 2013 at 11:17 am)genkaus Wrote: What makes you think that you'd have to give up on them if you accept that your mental existence was rooted in biology…You start your discussion talking about brain-states and mind-states and from there you go about positing causal links between them. Which means you are starting by considering them as two distinct entities.

You're reading too much into my posts. Descriptions of mind-states and descriptions of brain-states are clearly distinct. This says nothing about whether that means there are actually two different substantial entities, two entities of which one is material and the other immaterial, two distinct aspects of a single thing, or a single thing differently described.

(February 24, 2013 at 11:17 am)genkaus Wrote:
(February 23, 2013 at 10:37 pm)ChadWooters Wrote: And why can't they be mutually dependent and co-existent in a meaningful way?
Because the current evidence suggests otherwise.

You forget that one can interpret evidence to support multiple conclusions. Using neuro-physical evidence to support physical theories denies that qualitative experiences have any influence over the causal chain. External stimulation of the brain generating feelings of love does not automatically exclude the possibility that feelings of love influence physical changes in the brain. Clearly, reality consists of both observable third-person facts and first-person qualitative experiences. This allows us to describe reality in two distinct ways, such as:

Example A: Stimulation of the olfactory cortex activates the hippocampus, which stimulates the release of dopamine.
Example B: The smell of perfume sparks my memory of grandmother and I have feelings of love.
Example C: Burning pain is followed by the taste of vinegar, leading to a lust for power.

Now without mutual interaction, it doesn't matter what felt experiences accompany the physical events. The physical events of A could just as easily generate the subjective experience of C as those of B. That would be absurd, but no physical theory can exclude it. Moreover, if only physical processes are causal, then thinking, feeling and willing would have no significance and 'you' are just along for the ride, slavishly following a blind electro-chemical reaction that churns along deterministically. There can be no room for free will, moral values, true understanding, or meaningful contemplation. It makes much more sense to believe that mental properties really do have some influence over physical processes, even if the current scientific model cannot explain how.

The functionalist position, which you favor, already concedes a mutual dependence between material and immaterial properties. Translating every mental property into a 'function' only gives the appearance of reducing mental properties to physical ones. The difference between 'loving' and the 'function of loving' is purely semantic. Thus a causal chain that reads, "The function of remembering a beloved leads to the function of feeling loved" is merely a complicated way of saying, "Remembering a beloved leads to feeling loved." Nothing is added to the meaning of the description by adding the word 'function'. Changing the description does not alter whether or not the chain of events is composed of mental events versus purely physical ones.

(February 24, 2013 at 5:19 pm)ChadWooters Wrote: You're reading too much into my posts. Descriptions of mind-states and descriptions of brain-states are clearly distinct.
This says nothing about whether that means there are actually two different substantial entities, two entities of which one is material and the other immaterial, two distinct aspects of a single thing, or a single thing differently described.

This may not, but your arguments do.

(February 24, 2013 at 5:19 pm)ChadWooters Wrote: You forget that one can interpret evidence to support multiple conclusions.

Which is why it is a bad idea to go on just a single piece of evidence.

(February 24, 2013 at 5:19 pm)ChadWooters Wrote: Using neuro-physical evidence to support physical theories denies that qualitative experiences have any influence over the causal chain. External stimulation of the brain generating feelings of love does not automatically exclude the possibility that feelings of love influence physical changes in the brain. Clearly, reality consists of both observable third-person facts and first-person qualitative experiences. This allows us to describe reality in two distinct ways, such as:

No - not "just as easily". Your idea that the physical theory reduces all sensations to indistinguishable neural signals is about as ridiculous as saying that if only electrical signals are perceived to be flowing into the computer, it'd be impossible to tell whether porn or a virus is being downloaded. In fact, most of neuroscience is about finding that distinction.

(February 24, 2013 at 5:19 pm)ChadWooters Wrote: Moreover, if only physical processes are causal, then thinking, feeling and willing would have no significance and 'you' are just along for the ride, slavishly following a blind electro-chemical reaction that churns along deterministically. There can be no room for free will, moral values, true understanding, or meaningful contemplation. It makes much more sense to believe that mental properties really do have some influence over physical processes, even if the current scientific model cannot explain how.

This argument works only if you assume that 'you' are something other than the sum of those electro-chemical reactions. That would be an illusion not supported by the evidence we have. Given this idea of 'you', it's no wonder you see a schism between determinism and free-will/morality/understanding etc.

(February 24, 2013 at 5:19 pm)ChadWooters Wrote: The functionalist position, which you favor, already concedes a mutual dependence between material and immaterial properties. Translating every mental property into a 'function' only gives the appearance of reducing mental properties to physical ones. The difference between 'loving' and the 'function of loving' is purely semantic. Thus a causal chain that reads, "The function of remembering a beloved leads to the function of feeling loved" is merely a complicated way of saying, "Remembering a beloved leads to feeling loved." Nothing is added to the meaning of the description by adding the word 'function'. Changing the description does not alter whether or not the chain of events is composed of mental events versus purely physical ones.

No, it does not. What it does is explain how sensations are felt and how they interact with the causal chain.