(December 23, 2013 at 11:30 pm)MindForgedManacle Wrote: As another computer science major - and now also philosophy major - I have to say that I think what you said is bullshit. For starters, we don't even know what consciousness is. If I recall correctly, one of the things we do know is that the current evidence is against consciousness being, at base, a purely algorithmic process, which if true would seem to nix the possibility of achieving our sort of consciousness by way of computation as we currently do it.
I don't think it really matters whether we ever nail down a complete definition of 'consciousness', because that may end up being a largely (if not entirely) subjective qualifier. Some people will believe a computer is conscious if it reliably displays the traits we associate with human consciousness, including many of the subtleties we expect. Others will never accept that any artificial machine could be conscious, no matter what it does. Any definition of 'consciousness' we settle on will carry certain biases with it.
What appears certain, to me, is that consciousness is not magic. There must be some process underlying and driving it, and there's no reason we can't eventually understand that process and duplicate it, to a greater or lesser extent, given enough time.