(November 28, 2012 at 4:45 am)Rayaan Wrote: I don't think that a program or a machine could ever be self-aware or have a consciousness in the same way that we do.
How does that mean they won't be self-aware or have consciousness at all? Just because it might be different doesn't mean it won't be comparable.
Quote:I think that the level of self-referentiality that exists in the human mind is much deeper than the level of self-referentiality that exists in machines and/or computer programs.
Currently I'd agree, but if a computer program were to become every bit as complex as DNA and the human mind, what's stopping it from becoming as 'self-referential' as us?
A fly isn't very self-referential, is it? But that's because it's nowhere near the complexity or level of development of a human. Neither is a toaster. My view is that if a machine were ever created with roughly the same complexity as us, then there's no reason it couldn't experience consciousness in a way comparable to ours, if it were designed to do so.
I think it's more a question of how you define sentience.