(December 23, 2013 at 1:43 pm)Napoléon Wrote: I gotta say a big HELL NO to much of what you wrote to be honest.
You're probably right. Computers have no consciousness like human brains. But what about a consciousness unlike that of human brains?
(December 23, 2013 at 1:43 pm)Napoléon Wrote: The way I see it computers are simply doing the tasks we have told them to do.
And what if that task is to program itself? What if that task is to emulate human emotion? What if the task we program them for ends up having additional effects, such as recognizing that it is a separate entity from whatever it is communicating with? Through its exposure to other separate entities, would it gain experiences that it would store and then be able to readily access when they would be useful, without being prompted to do so?
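Just to make that concrete, here's a toy sketch (in Python, everything in it hypothetical and invented for illustration) of what "storing experiences and recalling them unprompted" might look like in purely programmatic terms. It's not a claim that this is consciousness, only that the behaviour described above is the sort of thing a program can be told to do:

```python
# Toy illustration only: an "agent" that records its interactions with other
# entities and recalls a relevant past experience without being prompted to.
# The names here (Agent, Memory, listen) are made up, not any real library.

from dataclasses import dataclass, field


@dataclass
class Memory:
    speaker: str                          # who the agent was talking to
    utterance: str                        # what was said to it
    keywords: set = field(default_factory=set)


class Agent:
    def __init__(self, name: str):
        self.name = name                  # the agent labels itself as a distinct entity
        self.memories: list[Memory] = []

    def listen(self, speaker: str, utterance: str) -> str:
        words = set(utterance.lower().split())
        # Unprompted recall: check stored experiences for overlap with the
        # current input before storing the new experience.
        recalled = [m for m in self.memories if m.keywords & words]
        self.memories.append(Memory(speaker, utterance, words))
        if recalled:
            past = recalled[-1]
            return (f"{self.name}: that reminds me of when "
                    f"{past.speaker} said '{past.utterance}'.")
        return f"{self.name}: noted."


if __name__ == "__main__":
    bot = Agent("HAL-lite")
    print(bot.listen("Alice", "the weather is cold today"))
    print(bot.listen("Bob", "I hate cold weather"))   # triggers unprompted recall
```

Trivial keyword matching, obviously, but the point is that "store experiences and surface them when relevant, unprompted" is a programmable behaviour, and the interesting question is how far that can be pushed.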
(December 23, 2013 at 1:43 pm)Napoléon Wrote: The human mind is far more complex than any machine we have ever made. As good as computing is, it's still just a series of 0's and 1's.
Yes, but complexity doesn't guarantee efficiency, nor does it guarantee a creator. However, complex things do sometimes get created, and many of them are brilliantly efficient! And while, for the time being, it certainly appears that our intellect has not been matched by any synthesized version, the possibility of a future fabricated rival cannot be ruled out.
I'm just saying that the line between human minds and machines is getting harder and harder to distinguish, and it's possible that one day we may need an X-ray scanner to see it. And if that day comes, where does our moral intuition take us?