(November 28, 2012 at 1:28 am)whateverist Wrote: Even so, is there any reason to think a program will ever experience subjective states or become self aware or have an identity crisis or be said to exhibit wisdom that does not directly reflect that of its programmer? I see that as a very different question than asking whether a machine could be programmed in such a way as to fool us into thinking these things are going on. I'm skeptical to the point of finding the idea absurd.
I don't think a program or a machine could ever be self-aware or conscious in the same way that we are. The level of self-referentiality in the human mind seems much deeper than anything found in machines or computer programs.
Interestingly, however, I've read in a few articles that a computer program can be thought to have consciousness - or a mind of its own, so to speak - depending on how you define the word "conscious". There are certain computational definitions of consciousness under which a program's behavior can qualify as "conscious" or "self-aware". You can see some of those definitions and their applications on page four of the link below:
Conscious Machines and Consciousness Oriented Programming