RE: Can we build AI without losing control over it? | Sam Harris
November 4, 2016 at 6:27 pm
(This post was last modified: November 4, 2016 at 6:33 pm by abaris.)
(November 4, 2016 at 6:26 pm)Excited Penguin Wrote: It would evolve. It would invent.
One simple question: HOW?
(November 4, 2016 at 6:27 pm)Rhythm Wrote: -because, again...we're nowhere near as fast as machine intelligence, we're -already- smoked on that count. Even so, the idea that we haven't been improving ourselves exponentially is false on the face of it. It took a hell of a long time to discover fire; from Kitty Hawk to the Moon was a relative flash in the pan. Knowledge builds on knowledge.
We're talking about machines we built. We provide the materials, we put them together, we create the technology. How would it create a society of its own to threaten us? With our raw materials, our energy? With us watching, no less?
(November 4, 2016 at 6:27 pm)Alasdair Ham Wrote: Just remember that what Harris is saying is purely hypothetical, based on three assumptions. He's not talking about how it could happen or about anything else in practice. He's saying that, in his opinion and for the reasons he gives, it's probable that A.I. could be dangerous in the future, if you accept the three assumptions he mentioned.
So, yes, without addressing the basics, he's painting a scenario that doesn't bother going into even one of the pesky details. Love it or leave it? Is that what you're trying to say?