(March 26, 2016 at 11:03 pm)IATIA Wrote: I have not seen this addressed. It could be important, because after the fact it will be a moot issue.
Isaac Asimov's "Three Laws of Robotics"
[url=http://www.auburn.edu/~vestmon/robotics.html]http://www.auburn.edu/~vestmon/robotics.html[/url]
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
How do you implement them?