Pinocchio syndrome, the Turing test
August 19, 2016 at 8:09 pm
(This post was last modified: August 19, 2016 at 8:58 pm by fdesilva.)
From the very inception of one's own consciousness, a human knows one's own consciousness and its associated experiences perfectly well.
All of a human's interactions with the universe are via its consciousness.
Now consider a child: it will initially think everything is conscious like itself. A baby will smile at a toy, so the toy starts off having passed the Turing test.
With more learning, the child will start to pass and fail different objects as to whether they are conscious or not.
Thus each person runs a Turing test on the objects they encounter, all their life, all the time.
In the past, people ran Turing tests on the sun, the stars, the weather and volcanoes, and most of the time these things passed. As such, they were worshipped as gods.
So the definition of Pinocchio syndrome is this: from childhood we have a tendency to assign consciousness to everything, and then we run a Turing test to check whether that assignment is correct.
All of us suffer from this syndrome, and we need to keep it in mind when it comes to strong AI. If the Turing test is weak, then belief in strong AI would be a worship of gods.
A better approach to this question would be the following.
1. Each human knows very well what it is to be conscious and what constitutes a conscious experience. As such, it should be possible to define a set of axioms: the axioms of consciousness.
2. As for computers, a computer is not a black box; humans know exactly how it works. As such, it should be possible to decide whether the workings of a computer can bring about the axioms of consciousness.
I made this post shorter