RE: Can we build AI without losing control over it? | Sam Harris
November 4, 2016 at 6:12 pm
(This post was last modified: November 4, 2016 at 6:12 pm by abaris.)
(November 4, 2016 at 6:05 pm)Alasdair Ham Wrote: Well no, that isn't part of his argument so it's not important.
We just need to assume that it's inevitable eventually.
AI is getting better and better and one day will be so much smarter than us that we will be insignificant to it.
Why exactly do we have to assume that? Because Harris says so? Is he privy to some yet unknown information that lets him predict how a yet-to-be-developed AI will function at some point in the future? Did he address any of the crucial technological questions, or did he just ramble on about a fictional superintelligence treating us like ants? Crucial as in the most basic questions: On what kind of energy would this AI run, and on what technology other than the one provided by its creators would it function? Or how would it possibly be able to beat its hardware and software limitations?
I certainly hope you're not going into the intelligent air direction by now.