(September 5, 2013 at 1:36 am)bennyboy Wrote: Hmmmm. . . programming it will take much work and time. I will do it because I think it will be fun. But start with youtube:
Great. Now, here are the two relevant points of this explanation as they relate to an entity's capacity to experience.
First of all, the neural network is described as having three distinct components - input nodes, hidden nodes and output nodes. A fair description, once the human neural network is simplified. What it doesn't acknowledge is the complexity created by nodes simultaneously fitting different categories. For example, output nodes may serve as input nodes for another neural network, and the chain of Input Node --> Hidden Node A --> Hidden Node B --> Output Node may itself serve as an input node for the same network.
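To make that chaining concrete, here's a minimal sketch - the layer sizes and names are purely illustrative, not a claim about any real architecture:

```python
import math
import random

random.seed(0)

def make_layer(n_in, n_out):
    # One dense layer with random weights and a sigmoid activation.
    w = [[random.gauss(0, 1) for _ in range(n_out)] for _ in range(n_in)]
    def layer(x):
        return [1.0 / (1.0 + math.exp(-sum(x[i] * w[i][j] for i in range(n_in))))
                for j in range(n_out)]
    return layer

def forward(net, x):
    for layer in net:
        x = layer(x)
    return x

# Network A: input -> hidden A -> hidden B -> output
net_a = [make_layer(4, 6), make_layer(6, 6), make_layer(6, 3)]
# Network B treats A's output nodes as its own input nodes.
net_b = [make_layer(3, 5), make_layer(5, 2)]

signal = [0.5, -1.0, 2.0, 0.1]
out_a = forward(net_a, signal)   # A's output nodes...
out_b = forward(net_b, out_a)    # ...serve as B's input nodes
```

The point is just that "output node" and "input node" are roles, not fixed categories: the same values play both roles depending on which network you're looking at.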
Secondly, the 'desirable' outputs have been externally imposed. As of now, based on our subjective experience, we regard certain outputs as 'desirable' and assign the weights and mathematical functions for back-processing accordingly. At this stage, the neural network itself does not have the layer of nodes required to process and back-process its own output nodes and the relevant functions, and thereby alter them.
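You can see that external imposition in a toy training step (the numbers are illustrative): the target is a constant handed to the network from outside, and the weight updates only ever chase it - nothing in the network can change what counts as 'desirable':

```python
import random

random.seed(1)
# One-layer linear network: 3 inputs -> 2 outputs.
w = [[random.gauss(0, 1) for _ in range(2)] for _ in range(3)]
x = [0.5, -1.0, 2.0]
target = [1.0, 0.0]   # the 'desirable' output, imposed from outside

for _ in range(300):
    y = [sum(x[i] * w[i][j] for i in range(3)) for j in range(2)]
    error = [y[j] - target[j] for j in range(2)]
    # Back-processing: nudge each weight toward the imposed target.
    for i in range(3):
        for j in range(2):
            w[i][j] -= 0.05 * x[i] * error[j]

final = [sum(x[i] * w[i][j] for i in range(3)) for j in range(2)]
```

The network converges on the target, but at no point did it select or revise the target itself - that choice lives entirely outside it.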
These two factors would be required to make a sentient Cyberboy 2000. It's that additional level of neural networking, present in humans and absent in the current generation of computers, that makes us capable of experience and not them. Should the Cyberboy have that network - as evidenced by its assigning and reassigning preference to different outputs - then it would be capable of experience.
Now, before you go off on a tangent about making assumptions, let's make this clear once more with a simpler explanation.
We have input nodes, a black box/hidden network that processes signals from the input nodes, and output nodes. We know that there are different categories of input and different categories of output, so the logical conclusion is that the black-box has subsections in it to process different inputs and provide different outputs. At this level, there is no evidence of subjective awareness or experience. Further, other black-boxes are completely inaccessible to this one: the internal functions of other black-boxes cannot serve as input to it - only their outputs can. Within the context of 'mind-existent words', this black-box is 'you', the input nodes are 'your senses' and the output nodes are 'your behavior' or 'physiological changes'.
As it happens, the internal working of the black-box is not a complete mystery to the black-box itself. This need not have been the case: the black-box could just as easily have functioned with a preset code of input-processing-output. But as it happens, there is a separate section within the black-box that treats its own internal working as input, processes it and gives a specific output. In 'mind-existent terms', we call this process subjective awareness, or sentience, or experience. Since this section cannot receive the internal working of another black-box as input, we cannot experience anyone else's subjective awareness. However, given this input, we can figure out the specific output of this section. And if we see the same output from other black-boxes, the reasonable conclusion is that they have a similar section within them. Your counter-argument that this is an assumption does not work if we've identified that section as necessary for that specific output.
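One way to picture that extra section, as a sketch (the wiring is a toy assumption, not a claim about real brains): a 'monitor' section whose input is not the senses but the hidden state of the main network itself:

```python
import math
import random

random.seed(3)

def rand_w(n_in, n_out):
    return [[random.gauss(0, 1) for _ in range(n_out)] for _ in range(n_in)]

def dense(x, w):
    # One sigmoid layer: x (len n_in) times w (n_in x n_out).
    return [1.0 / (1.0 + math.exp(-sum(x[i] * w[i][j] for i in range(len(x)))))
            for j in range(len(w[0]))]

w_in = rand_w(4, 6)    # senses -> hidden nodes
w_out = rand_w(6, 3)   # hidden nodes -> behaviour
w_mon = rand_w(6, 2)   # hidden nodes -> monitor section

senses = [0.5, -0.2, 0.9, -1.1]
hidden = dense(senses, w_in)       # the internal working of the black-box
behaviour = dense(hidden, w_out)   # ordinary output
# The extra section takes the *internal state* itself as its input:
report = dense(hidden, w_mon)
```

The ordinary path (senses -> hidden -> behaviour) would work perfectly well without `w_mon`; the monitor is an optional extra layer of processing over the network's own internals - which is the structural point being made here.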
Specific to the given example, the output of this section is the assignment of the 'desirable' attribute to other outputs, i.e. giving relative weightage to other outputs. In the given neural network, the desirable attribute - which results in changes to the neural network - is preset. There is no section of the network that assigns it as an output and can therefore alter it. Since it is preset, that 'preference' would be evident from the start, and the network won't be able to assign preference to any unconsidered outputs. But if it is the result of another section of the network, it would be alterable. And this distinction would be the basis for concluding whether or not an entity is capable of experience.
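The distinction can be made concrete in a sketch (every number and weight here is a toy assumption): in the first case the preference over outputs is a frozen constant; in the second it is itself the output of another section, so changing that section changes the preference:

```python
import math
import random

random.seed(4)
outputs = [0.2, 0.9, 0.4]          # candidate output activations

# Case 1: preset preference - frozen from the start, nothing can alter it.
preset_pref = [1.0, 0.0, 0.0]
choice_preset = max(range(3), key=lambda j: outputs[j] * preset_pref[j])

# Case 2: the preference is itself produced by another section of the network.
w_pref = [[random.gauss(0, 1) for _ in range(3)] for _ in range(3)]

def preference(w):
    logits = [sum(outputs[i] * w[i][j] for i in range(3)) for j in range(3)]
    exps = [math.exp(l) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]   # softmax weightage over the outputs

before = preference(w_pref)
w_pref[0][0] += 2.0                    # altering that section...
after = preference(w_pref)             # ...alters the assigned preference
```

In case 1 the 'preference' is evident from the start and can never move; in case 2 it is an alterable output of the network itself - the proposed test for whether an entity is capable of experience.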