RE: Quantum consciousness...
August 23, 2017 at 11:13 pm
(This post was last modified: August 23, 2017 at 11:15 pm by bennyboy.)
(August 22, 2017 at 8:04 pm)Khemikal Wrote: One might wonder, at that hypothetical minimum level of organization - what a hypothetical minimal incremental step "below" it looks like?

So you're saying that rather than having consciousness or a lack of it, we have a kind of smooth transition, like asking, "In the rainbow, when does red change to blue?" There's no actual cut line, so the choice of exactly where to say, "There... it's happened!" is arbitrary. Yet there is the red, and the blue, and the smooth transition between them. Am I reading you right?
In the binary proposition...some y is suddenly self-aware in a meaningful way, whereas the x directly before it is not. This isn't what would be expected in a comp mind framework.
Quote:We could certainly categorize, in the vein of "it couldn't be said," that below some level x a human consciousness is not present. OFC, things "below" that level of organization do seem to be conscious. If we're looking to set a minimum binary level for consciousness itself...rather than some specific representative thereof, where would you suggest we place it? Would you place it at the level of a single neuron, for example? That - as a hypothetical - if we had some way to pierce the veil, we would find consciousness in each and every neuron, and so anything with even one neuron - or any structure which achieved the same effect by any other means?

I recall we've had this conversation before. Even for a single neuron, I'd start pulling out proteins, molecules, or even QM particles, and strain that cut line as far as I could. I'm not sure there WOULD be any point at which you'd go from something to nothing: you'd probably have a statistical chance of failure as you took things too far-- maybe misfirings occurring more and more often, or whatever. Or, and this is my leaning, it may be that the relationships QM:chair and QM:mind are equal-- that all the fundamental principles necessary for mind are there right at the bottom.
I suppose that's really the question here. I can pull apart a chair, and it will become less and less chairlike, until at some point I'll be left with something so unlike a chair that nobody would use that term for it. Is consciousness like this? Is it just a semantic label for systems that behave in certain ways, such that the reality doesn't change as we manipulate the system, but the definitions we'd be willing to apply to that system do?