(May 26, 2019 at 4:08 pm)BrianSoddingBoru4 Wrote:(May 26, 2019 at 3:37 pm)wyzas Wrote: Once the bots know it's a suicide mission and tell us they don't want to go. Until then, it's just a mission, HAL.
Side note: Just watched a vid where a dog owner died (recently) and her will stipulated that her pet dog be cremated and placed/planted with her. They followed the will, since dogs are still considered property. I'm not sure we've advanced as much as people like to think.
Here is the story (not what I watched): http://vt.co/news/us/healthy-dog-is-euth...ead-owner/
But would the machine even have the right to refuse the mission? We're in the habit of compelling human beings to perform dangerous, life-threatening tasks, under threat of imprisonment if they refuse to comply.
Boru
Doubtful. We have a long history of placing our self-worth above what we consider lower life forms. (Yeah, yeah, insert race card here.)
Being told you're delusional does not necessarily mean you're mental.