RE: On naturalism and consciousness
September 5, 2014 at 12:02 pm
(This post was last modified: September 5, 2014 at 12:04 pm by The Grand Nudger.)
(September 4, 2014 at 6:48 pm)bennyboy Wrote: Let me start with sentience. As you know, I don't think sentience has any meaning if its determination requires arbitration: "X complexity means sentient, <X complexity means not sentient."

Any meaning as a concept - or any meaning as a description of what we assume to be our "exterior" circumstances/environment?
Quote: Therefore, the most primitive building block of sentience has to be rooted in a kind of atomic consciousness (by which I mean an indivisible minimal consciousness, not any relation to a physical atom, which is misnamed anyway)

Because the system has it in some instance of an "entity", that means all (or the vast majority of) the particles in the system must have "consciousness" as an attribute? How hard do you want me to unpack this?
Quote: That being said, there are certain ideas which are probably required to establish meaning at any given level of complexity.

Or certain architectures, physical structures doing observable work.
Quote: Just being minimally conscious, for example, wouldn't allow you to see meaning in people's behaviors. It's definitely possible, through reflection or drug use or meditation, to arrive at a mental state in which you can see light and hear sound, and perceive no deep meaning in any of it.

I'd wonder whether being "minimally conscious" really rises to whatever bar we set for "sentience". Is it possible that something could be minimally conscious, but not sentient?
Quote: On the other hand, I'd argue a computer could see "meaning" in Chad's lightswitches. For example, it could process an alarm as a trigger for an escape behavior, or a bathroom light as a trigger for a cleanliness inspection algorithm.

Whatever "meaning" either system has is a subjective experience, isn't it? The architecture of computers explains why this is so, and I would draw a similar analogy to the human brain and its relevant sensory apparatus. There's only so much either can do, and their limits are disturbingly similar.
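To make the trigger idea concrete, here is a minimal sketch in Python of the kind of switch-to-behavior table being described. The switch names and handler functions are my own illustrative assumptions, not anything from bennyboy's post:

# Minimal sketch: each switch is just a trigger bound to a behavior.
# Switch names and handlers are illustrative assumptions, not from the post.

def run_escape_routine():
    print("escape behavior triggered")

def run_cleanliness_inspection():
    print("cleanliness inspection triggered")

TRIGGERS = {
    "alarm": run_escape_routine,                    # alarm -> escape behavior
    "bathroom_light": run_cleanliness_inspection,   # light -> inspection routine
}

def on_switch(switch_name):
    handler = TRIGGERS.get(switch_name)
    if handler is not None:
        handler()

on_switch("alarm")
on_switch("bathroom_light")

Whatever "meaning" the program finds in a switch is exhausted by that table; there is nothing behind the assignment beyond the assignment itself.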
Quote: But without the sentient experience of a motivated mind, these kinds of meaning aren't very meaningful, in the sense that people have the experience of meaningfulness.

To you, or to us, yes - perhaps; but being a subjective experience, is this a problem? We would expect to require translation between different architectures. Or, to put it another way, we would expect their experience to be different from our own.
Quote: They are just assignations of outputs at one level to behaviors at a new level: the 0-->0, 1-->10, 2-->3 or whatever that I mentioned before. In this case, 0 means "do 0" and 1 means "do 10" and 2 means "do 3," and nothing more.

An algorithm. I agree with all of that; comp mind suggests that this is also how our "mind" is generated. It doesn't demand that the two be of identical construction - it asks whether they achieve a similar effect (because we can observe those effects) based upon consistent principles of computing.
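Here is the same point as code: a minimal sketch in Python of the 0-->0, 1-->10, 2-->3 assignment quoted above. The table values come straight from that example; the names are mine:

# Minimal sketch of "assignations of outputs at one level to behaviors at a new level".
# Table values (0 -> 0, 1 -> 10, 2 -> 3) are from the example above; names are illustrative.

BEHAVIOR_TABLE = {0: 0, 1: 10, 2: 3}

def next_behavior(output):
    # 0 means "do 0", 1 means "do 10", 2 means "do 3" - and nothing more.
    return BEHAVIOR_TABLE[output]

for out in (0, 1, 2):
    print(out, "->", next_behavior(out))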