I agree with Alex, bearing in mind that even morality can be coded for.
In this hypothetical, will all AIs choose the same option every time, such as in the trolley dilemma?
But will it be the right thing to do?
Will they ever understand revenge, compassion or the importance of the life of one specific person over another?
Will they understand loyalty? Will they save 10 other AIs or a very old homeless man in the trolley dilemma?
Is our ultimate goal to make them exactly like humans? Or better?
No God, No fear.
Know God, Know fear.