(December 23, 2013 at 2:30 pm)The Reality Salesman Wrote: You're probably right. But what about a consciousness unlike that of a human brain?
They're still analogue, organic minds. See the first point in the link I posted earlier.
Quote:and what if that task is to program itself? What if that task is to emulate human emotion? What if the task we program them for ends up having additional effects? Such as recognizing that it is a separate entity from that which it is communicating with? Through its exposure to other separate entities, would it gain experiences that it would store and then readily access when useful, without being prompted to do so?
Go ahead and program it then. It's not as simple as you think, which is why it hasn't been done yet.
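To make that concrete, here's a toy sketch (entirely my own illustration, ELIZA-style keyword matching; every name in it is made up). A handful of lines can fake an "emotional" response, and it should be obvious that nothing in it is aware of anything:

Code:
# A toy "emotion emulator": keyword lookup, nothing more.
# It produces emotional-sounding replies with zero understanding,
# which is why surface behaviour alone proves nothing about consciousness.

RESPONSES = {
    "dead": "I'm so sorry. That must be painful.",
    "happy": "That's wonderful! I'm glad to hear it.",
    "hate": "That sounds frustrating. Why do you feel that way?",
}

def respond(message: str) -> str:
    for keyword, reply in RESPONSES.items():
        if keyword in message.lower():
            return reply
    return "Tell me more."  # default deflection, ELIZA-style

print(respond("My dog is dead and I feel awful"))  # looks empathetic
print(respond("blah blah"))                        # same machinery, no comprehension

Scale that up as far as you like and it's still a lookup table. The hard parts your quote gestures at (self-modification, genuine experience, recognizing itself as a separate entity) are not a bigger dictionary.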
Quote:Yes, but complexity doesn't guarantee efficiency, nor does it guarantee a creator. However, sometimes complex things are created, and of those things, many of them are brilliantly efficient! And while for the time being, it certainly appears as though our intellect has not been matched by any synthesized version, the possibility for a future fabricated rival cannot be ruled out.
What does consciousness have to do with efficiency or having a creator?
Quote:I'm just saying that the line is getting harder and harder to distinguish, and it's possible that one day we may need an X-ray scanner to see it. And if that day comes, where does our moral intuition take us?
No, it's not difficult to distinguish at the moment. People who don't know much about computing can be fooled into thinking machines have consciousness (take Asimo, for example), but to anyone who understands even the most basic programming and how computers work, it's obvious that the level of complexity required to simulate a human brain is pretty fucking incomprehensible right now.
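To put rough numbers on "incomprehensible", here's a back-of-envelope sketch. The neuron and synapse counts are the usual textbook estimates; the update rate and ops-per-synapse are my own assumptions:

Code:
# Back-of-envelope cost of naively simulating a human brain.
# All figures are rough estimates, not measurements.

NEURONS         = 86e9   # ~86 billion neurons (commonly cited estimate)
SYNAPSES        = 1e14   # ~100 trillion synapses (commonly cited estimate)
UPDATE_HZ       = 1000   # assumed 1 kHz simulation timestep
OPS_PER_SYNAPSE = 10     # assumed floating-point ops per synapse per step

ops_per_second = SYNAPSES * UPDATE_HZ * OPS_PER_SYNAPSE
print(f"{ops_per_second:.0e} ops/sec")  # 1e+18: an exaflop

For scale, the fastest supercomputer in the world right now (Tianhe-2, roughly 34 petaflops sustained) falls short of that by a factor of about thirty, and that's before you ask whether a synapse is even reducible to ten multiply-adds.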
Again, that's not to say we won't or can't ever design a machine that thinks like we do, but we haven't, and we can't right now.