RE: Will AI ever = consciousness or sentience?
November 29, 2012 at 3:07 pm
(This post was last modified: November 29, 2012 at 3:07 pm by Angrboda.)
(November 29, 2012 at 2:55 pm)Ryantology Wrote: The definition of 'consciousness' is, of course, the trickiest part of that; how can you prove a machine is conscious? You never can, I think, because any test you devise to measure 'consciousness' is going to contain inherent biases of one kind or another; there is, after all, no objective way to measure it. I think, at the point where an AI tells us that it is conscious, and can convince a majority of people that its thought processes are independent and unique, we have to start giving them the benefit of the doubt (as we do naturally to every other person we encounter) and call them 'conscious'. And, I do believe that may happen in my lifetime.
There's actually another way.
If what we call consciousness is the effect of a set of brain processes that we can describe and characterize, and whose function we understand, then verifying consciousness in another entity would largely be a matter of verifying that a similar aggregate process is present in that entity.