Okay, first things first: I'm not angry about anything. This is gonna be a post all about the mind and perception, particularly colour perception.
Animals, including humans, have brains, and it is highly probable (though not proven) that those brains produce the perceptions (i.e. qualia) that we experience. If brains did not produce perceptions it would be an extraordinary coincidence, and we'd be heading into the realms of 'idealism' and suchlike. Although I'm open to that possibility, for the sake of what I'm saying here I'll assume that the brain does indeed produce (or compute, or give rise to, or whatever) perceptions in one way or another.
Different animals have different sense organs and thus presumably very different perceptions of the world. It's probably beyond our imagination to conceive of what it's like for an animal very different from ourselves to perceive some aspects of the world - a bat or a dolphin with echolocation, say, a sense we're unfamiliar with. But it *might* be reasonable to assume that a dog, for instance, visually perceives the world much as we do, because it has eyes like ours. Eyes collect data about the light 'out there', that information is processed by the brain, and a perception is computed that is not exactly the same as the original input. For example, a 3D perception is created out of our two 2D input devices (the eyes). So in other words the perception that is created is the result of some sort of computation.
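To make that 'some sort of computation' slightly more concrete, here's a minimal sketch of the textbook stereo-vision idea: depth falls out of the horizontal disparity between the two eyes' images. The focal length, eye separation and disparity figures below are invented purely for illustration, and real vision is of course far messier than one formula.

```python
# Minimal sketch of stereo depth recovery: two 2D images in, one 3D estimate out.
# All numbers are made up for illustration - they are not measured values.

def depth_from_disparity(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole-stereo relation: depth = focal_length * baseline / disparity."""
    return focal_length_px * baseline_m / disparity_px

# A point whose image sits 12 pixels apart between the left and right views:
depth = depth_from_disparity(focal_length_px=800, baseline_m=0.065, disparity_px=12)
print(f"Estimated depth: {depth:.2f} m")  # roughly 4.33 m
```

The point isn't the formula itself, just that a third dimension which is present in neither input can be computed from the relationship between the two.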
These perceptions have no location in physical space and they are not 'made' of physical matter, yet they exist in some form because in our experience we can detect the difference between one thing and another. If I feel pain, I can feel how that changes or disappears entirely. Or I can see a blue pencil on top of a white piece of paper and can see that they are different. There are different states in these perceptions and if one thing is not the same as another then they can't both be nothing.
Perceptions are functionally useful. Colour perception makes perfect sense from a computational point of view as a means of labelling 'pixels' in parallel so that the right sorts of patterns can be detected in the world, and at the right level of abstraction. Compare trying to read the words in a text document versus trying to read them from the binary representation of that same document. Or compare looking for patterns in a 1D array of numbers versus a 2D array - a toy version of that last comparison is sketched below.
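Here's the toy version (my own made-up illustration, nothing to do with actual neural coding): the same values can hide or reveal a pattern depending on how they're presented.

```python
# Toy illustration: identical data, flat versus reshaped.
# In the flat 1D listing the structure is hard to spot; laid out as a
# 2D grid, a diagonal of 1s jumps straight out at you.

data = [1 if i % 6 == i // 6 else 0 for i in range(36)]

print(data)  # 1D: just a run of 0s and 1s

for row in range(6):
    print(" ".join(str(v) for v in data[row * 6:(row + 1) * 6]))  # 2D: the diagonal is obvious
```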
Whether the perception of colour is equivalent to the underlying brain activity - has a one-to-one correspondence with it and thus *is* it - or whether it is created by it, in addition as it were, in such a way that perception itself could feed back into the system (i.e. something like dualism), doesn't really matter to what I'm talking about here. Personally I think it's the former, but I can't rule out the latter. Or even a third possibility: that it is not connected to physical brain activity at all and we're living in some sort of 'idealist' world and/or God. But assuming it's the former, the evolutionary 'design' of a perception reflects what the brain is already doing behind the scenes... the brain is already discriminating the patterns we experience in consciousness, and somehow those discriminations find representation in conscious perception. In other words, colour is equivalent to what the brain is doing when it processes colour neurally and computationally.
So with the background to my thinking out of the way, that leads me on to my questions. I hope you and I can agree that a banana is yellow. We both know what a banana looks like and we both know what the colour yellow looks like. But how can we both be sure that our experience of the colour yellow is the same? I don't think we can. If every yellow 'pixel' in my visual field actually showed what you would experience as blue, and vice versa, then from common social experience of the world and the objects in it I'd simply have learned to attach the label 'yellow' to the one and 'blue' to the other. Then you and I, looking at the same banana, would both call it yellow but experience something different. But then someone could come along and ask me to compare colours. They might say 'yellow is lighter than blue'. At which point I would have to disagree, I think. The swap would only survive that test if the whole spectrum were inverted in lightness as well - black became white and everything in between - because then I would have learned to call colours that (to me) look darker 'lighter'. So for instance, when you said white is lighter than grey, I, in my inverted world with black as white and white as black, would describe the visual shift from grey towards black as 'lighter', even though within my experience it is darker.
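Just to put a number on that 'yellow is lighter than blue' constraint, here's a rough sketch using the standard Rec. 709 luminance weights (and ignoring gamma correction for simplicity - this is arithmetic, not proper colour science): a pure hue swap that leaves lightness alone is easy to catch.

```python
# Rough lightness comparison, assuming sRGB-style primaries and skipping
# gamma correction - just enough to show the asymmetry between the two hues.

def luminance(r: float, g: float, b: float) -> float:
    """Approximate relative luminance from RGB values in [0, 1] (Rec. 709 weights)."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

yellow = (1.0, 1.0, 0.0)
blue = (0.0, 0.0, 1.0)

print(f"yellow: {luminance(*yellow):.2f}")  # ~0.93 - a light colour
print(f"blue:   {luminance(*blue):.2f}")    # ~0.07 - a dark colour

# Swapping the two hues without also inverting lightness flips this ordering,
# which is exactly the kind of public, checkable difference the
# 'yellow is lighter than blue' question would expose.
```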
Who knows how many other constraints there are on how we can talk to each other about colour, and catch each other out, as it were, about our experience of it. And each difference we can talk about is something we must first be able to detect in our own visual fields. There's already colour blindness and so on, which identifies cases where two people's experience of colour is not the same, but these constraints, and the number of them, do suggest to me that we probably all experience colour roughly the same way - that if we did not, the various constraints would have singled out people with a fundamentally different experience of colour. So assuming that to be the case - that we all have roughly the same experience of colour - it becomes apparent that there are billions of brains out there all producing the same qualia in their perceptions. As an emergent property of a replicable brain that's all well and good and to be expected, but as for the 'palette', how does it come to be... how is it 'designed', as it were? I wonder whether the colour qualia we 'see' are the only way to represent the data that meets all the constraints of the system... that the palette we see emerges because it is the only way to differentiate, in the right ways, between the different states that are represented in the underlying neural hardware. That somehow an inverted colour world fails somewhere to meet the constraints of the actual brain-in-state and therefore does not, and cannot, appear. That therefore all perception, whatever type it is, 'presents' the data in the only way it can to fulfil its objectives.
Any thoughts are welcome on any aspect of this