(March 9, 2018 at 8:02 am)SteveII Wrote: Doesn't this entire conversation presuppose free will? But I know some of you don't believe in free will. How does a determinist justify a belief in moral values and duties?
I don't think you need free will for this. I usually think of myself as a learning AI: you've got some goals programmed in, and the question is how you go about achieving them. I think "moral values" is probably a sloppy term for what's actually going on, but behaviors and actions chosen to reach a preferred state seem like a reasonable description. A rough sketch of what I mean is below.
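Purely as a toy illustration of that analogy (every name here is made up, not any real AI system): a fully deterministic agent with a hard-coded preferred state can still pick "good" actions by simple comparison, no free will required.

```python
# Toy sketch of the "learning AI" analogy: a deterministic agent with a
# programmed-in goal picks whichever action moves it closer to that goal.
# PREFERRED, ACTIONS, and the helper names are hypothetical, for illustration.

PREFERRED = 10          # the "programmed-in" preferred state
ACTIONS = {"increment": +1, "decrement": -1, "wait": 0}

def distance(state: int) -> int:
    """How far the current state is from the preferred state."""
    return abs(PREFERRED - state)

def choose_action(state: int) -> str:
    """Deterministically pick the action that best reduces the distance."""
    return min(ACTIONS, key=lambda a: distance(state + ACTIONS[a]))

state = 3
while state != PREFERRED:
    action = choose_action(state)
    state += ACTIONS[action]
    print(f"state={state} after {action}")
```

Nothing in that loop chooses freely; the "right" action falls out of the goal and the current state, which is roughly the point of the analogy.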