(November 28, 2012 at 11:01 am)Rhythm Wrote: Why the deference to "the same way that we do"? Is our route to consciousness or self awareness the only route? Even so, suppose our theoretical AI machine was built using organic materials...now how different are we?
The difference, to me, is that machines can be self-aware only in a mechanical fashion, because their behavior and decisions are entirely determined by the set of rules they have been programmed with. Consciousness, however, is something I think of as a combination of many different things, including attention, feelings, intentions, imagination, and even free will to a certain extent. I don't think it's possible for AI machines to have those attributes.
Also, as far as I know, AI machines are intelligent only in the sense of being able to solve problems (for example, by searching a space of possible states), but that is not "thinking."
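To illustrate the kind of "searching" I mean, here is a minimal sketch in Python (purely illustrative, not any particular AI system) of breadth-first search over a toy state graph. The machine mechanically expands states until it reaches a goal; every step follows a fixed rule, with no understanding involved:

```python
from collections import deque

def bfs_solve(start, goal, neighbors):
    """Mechanically search a state space: expand states in breadth-first
    order until the goal is found. No 'thinking' -- just rule-following."""
    frontier = deque([[start]])   # queue of paths to explore
    visited = {start}
    while frontier:
        path = frontier.popleft()
        state = path[-1]
        if state == goal:
            return path           # first path found is a shortest one
        for nxt in neighbors(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None                   # no solution reachable

# Hypothetical toy problem: the program "solves" it without understanding it.
graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
print(bfs_solve('A', 'D', lambda s: graph[s]))  # ['A', 'B', 'D']
```

The point of the sketch is that the program's entire "intelligence" is the loop above: deterministic expansion of states according to rules it was given.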
(November 28, 2012 at 11:12 am)whateverist Wrote: How exactly do conscious beings 'work'? We understand lots about how a human body works and we've mapped the brain to find those places where a tweak will create a twitch or a severance can create a particular sort of dysfunction. Even so, I am not impressed that we are very close at all to understanding how conscious beings work.
Good question, and I agree with you: we do not understand exactly how conscious beings work. There are many competing theories of consciousness, and entire books have been written on the topic. Consciousness is not something I have a good grasp of, but it still interests me, and I have some ideas about what it may be.
(November 28, 2012 at 1:41 pm)Napoléon Wrote: How does that mean they will not be self aware or have consciousness? Just because it might be different does not mean that it won't be comparable.
See my response to Rhythm.
(November 28, 2012 at 1:41 pm)Napoléon Wrote: Currently I'd agree, but if a computer program was to become every bit as complex as DNA and the human mind what's stopping it from becoming as 'self referential' as us?
I think the main issue, again, is that we cannot identify what exactly gives rise to self-awareness/consciousness. We also don't know whether it depends on the complexity of our DNA specifically. As for the complexity of the human mind, I think it is still debatable whether computer programs can ever come close to it.
(November 28, 2012 at 1:41 pm)Napoléon Wrote: A fly isn't very self-referential is it? But that's because it's not at a similar complexity or development as a human is. Neither is a toaster. My view is that if a machine were ever created with relatively the same complexity as us, then there's no reason for it not to experience consciousness in a comparable way to what we do, if it were designed to do so.
Well, I don't think that complexity similar to ours is the only requirement for producing consciousness in a machine. An AI machine more complex than the human brain may well be built in the future, but that doesn't necessarily mean it would possess consciousness.
As I said before, in my opinion consciousness also embodies our faculties of attention, reasoning, feeling, intention, and imagination, and even free will, among other things. It's not necessarily something that just sits in our brains all by itself. It's possible that the existence of our consciousness depends on all, or at least some, of those things simultaneously.