
Can we build AI without losing control over it? | Sam Harris
#1
Can we build AI without losing control over it? | Sam Harris
https://www.youtube.com/watch?v=8nt3edWLgIg

Sam Harris on the dangers of Artificial Intelligence.

I am such a big Sam Harris fan; I think he's fucking awesome.

I think he's wrong to aggregate utility, but other than that...

Fascinating talk.
Reply
#2
RE: Can we build AI without losing control over it? | Sam Harris
We are what we are because of the evolutionary pressures on us. AIs aren't going to spontaneously turn into minds that want things the way we do because they reach a magic number of connections. The only evolutionary pressure on them is to serve us well. There's no reason for them to put themselves ahead of us, or for them to have a real ego at all.

It will probably take thousands of 'laws of robotics' built into them to manage them safely (to prevent them from doing things we don't want for our own good, probably), but I predict this issue will be like Y2K: less of a problem than most people think, and what problem there is will be mitigated by the fact that we anticipate and prepare for it.
Reply
#3
RE: Can we build AI without losing control over it? | Sam Harris
I need to start a thread called "Can Sam Harris talk about AI without me losing control?"

I've ranted on The Thinking Atheist about Sam Harris. He's fear mongering and is talking out of his arse on this one, I'm afraid, Alisdair. At the same time he's destroying all credibility of the field, when it's difficult enough to get any kind of work in it as it is compared to other scientific fields. He clearly has no practical experience of Artificial Intelligence and makes layman assumptions that demonstrate this.

I won't repeat myself but in a nutshell what he's talking about is so far in the future that he could just as easily be standing up there warning of the dangers of space travel because one day we might invent some kind of inter-galactic drive, meet aliens and get destroyed.

It's mere speculation or fantasy, because we just don't know what kind of technology would allow us to achieve strong AI of the level he is talking about, so we can't say what form the AI would take. As I've said elsewhere, a quantum computer will give us completely different AI to a biological or DNA computer.

Another more relevant example: it is equally far-fetched to stand up there and say that we shouldn't be researching neuroscience because we might edit ourselves to extinction. Yet he doesn't warn about this, even though the knowledge required would be the same as for creating the kind of AI he's talking about. Neuroscience is a top-down field which tells us what the brain does, computational neuroscience tells us how it does it, and artificial intelligence tells us why it does it. All three are necessary scientific endeavours for understanding ourselves; you can't have one without the others. So as a neuroscientist he should really shut the fuck up or admit to being a total hypocrite.
Reply
#4
RE: Can we build AI without losing control over it? | Sam Harris
But he doesn't say that we shouldn't research it... aren't there experts in AI, like Musk, who also think this?

I know what you mean about it being very far in the distant future if this were to happen; my instinct says that too. But remember when Bill Gates supposedly said that we'd never need any more than 16MB of RAM and computers were about as good as they were going to get, and then the technology exponentially shot past that?

Maybe he's wrong about AI, but I don't see how he's fearmongering or a hypocrite. He didn't say we should stop researching it; he said that this was inevitable. He gets misrepresented A LOT.

I don't think anyone should shut the fuck up for expressing their views.

If his intentions are good, how is he a hypocrite?

I don't understand the anger.
Reply
#5
RE: Can we build AI without losing control over it? | Sam Harris
All the experts warning about the dangers of AI aren't actually experts in AI. Elon Musk is an entrepreneur. Stephen Hawking is a physicist. Sam Harris is a neuroscientist. Bill Gates is a software engineer / businessman.

Why would any expert work in a field which would bring about an apocalypse?

Sam Harris is fear mongering because:
  • He is wrong
  • He has no experience, knowledge or understanding of what he is talking about
  • What he is warning about is so far in the future that it is mere fantasy and speculation
  • It is pure titillation in order to promote himself
Reply
#6
RE: Can we build AI without losing control over it? | Sam Harris
Oh well. It looks like you're right if Musk isn't even an AI expert then.

But I don't see how Sam Harris is a hypocrite or only trying to promote himself. Being wrong doesn't mean he's doing it on purpose.
Reply
#7
RE: Can we build AI without losing control over it? | Sam Harris
(November 4, 2016 at 10:26 am)Mathilda Wrote: I need to start a thread called "Can Sam Harris talk about AI without me losing control?"

I've ranted on the thinking atheist about Sam Harris. He's fear mongering and is talking out of his arse on this one I'm afraid Alisdair. At the same time destroying all credibility of the field when it's difficult enough to get any kind of work in it as it is compared to other scientific fields. He clearly has no practical experience of Artificial Intelligence and makes layman assumptions that demonstrate this.
What layman's assumptions would those be, for the laymen among us? He laid out three explicitly; are those the ones you take objection to? He seems to have a pretty positive view of the value of AI, even if he has concerns, and he repeats a line more than once about it... and the theme of the talk is actually our -shared- inability to mount what he thinks is an appropriate response.

Quote:I won't repeat myself but in a nutshell what he's talking about is so far in the future that he could just as easily be standing up there warning of the dangers of space travel because one day we might invent some kind of inter-galactic drive, meet aliens and get destroyed.
Didn't he address this sort of comment, specifically, in the talk?  Does it matter how far in the future it might be, in your opinion, and if so, why?

Quote:It's mere speculation or fantasy, because we just don't know what kind of technology would allow us to achieve strong AI of the level he is talking about, so we can't say what form the AI would take. As I've said elsewhere, a quantum computer will give us completely different AI to a biological or DNA computer.
I don't think what type of technology might allow us to achieve the effect bears any relevance to his opinion as stated. Will those different results, from different types of computers, be qualitatively different in the context of his concerns? Not quantitatively different, because that's a non-issue. He doesn't seem concerned with what it's made out of, but with what it will or might do, in addition to what -we- might do in response to it. Is a quantum AI more or less potentially dangerous than a biological one, and why? What's the relevance?

Quote:Another more relevant example, it is as equally far fetched to stand up there and say that we shouldn't be researching neuroscience because we might edit ourselves to extinction. Yet he doesn't warn about this. After all, the knowledge that would be required would be the same for us to create the kind of AI that he's talking about. Neuroscience is a top down field which tells us what the brain does, computational neuroscience tells us how it does it, artificial intelligence tells us why it does it. All three are necessary scientific endeavours for understanding ourselves, you can't have one without the other, so as a neuroscientist he should really shut the fuck up or admit to being a total hypocrite.
You seem to be importing some other sort of objection here. In the talk linked, which I just watched, he describes us halting our research and improvement as the worst thing that would have ever happened to humanity. He's clearly not advocating for anything even remotely -like- what you've described above... at least not in that video? He's expressing caution over what is very easily argued to be Pandora's box.

Additionally, if you're going to talk about hypocrisy, does it make sense to confidently object by means of some proclamation of how far in the future something is, or of his lack of knowledge? Am I missing something? Do you know what it's going to take to make the sort of AI he's describing (or even what it takes to produce intelligence in a human brain)? Do you have a crystal ball or a newspaper clipping from the future?

Obviously, I'm not a fan. I don't know if this was just the last straw for you... or what he's said in the past... but the argument presented is pretty damn solid, insomuch as it follows and the assumptions he expresses can be ascribed at least some measure of soundness. It is a bit ludicrous to imagine that we could control something that could out-think us, and not just a little, a lot, especially seeing as how it's intelligence that we use to control in the first place. And regardless of the timescale involved, it's equally ludicrous to only start worrying about a problem when you have it staring you in the face. That's what -I- got from the talk. Honestly, it comes off as a bit of a deepity, a "no-shit" kind of statement on the face of it.

History is full of impressive failures where people thought they had more time, or that a specific problem would likely never materialize, or be so distant as to be a non-issue, don't you think?
Reply
#8
RE: Can we build AI without losing control over it? | Sam Harris
Actually you're right, I just posted this talk. I watched it a few months back. I do remember him addressing all of that now.

Sam Harris does tend to have already preempted any objections and addressed them all before they even come up.

I already noticed that he valued AI and wasn't saying that the research should be stopped. At NO POINT did he say that. He gets strawmanned A LOT.

Exactly, he didn't say it would be anytime soon either, he said it was inevitable.
Reply
#9
RE: Can we build AI without losing control over it? | Sam Harris
That's the thing, the assumptions he outlines are indeed very sound.

Why is he strawmanned so often? lol

Like... every single time I've seen him address objections against him, he's dealt with them easily, and it's been nothing but a strawman.
Reply
#10
RE: Can we build AI without losing control over it? | Sam Harris
Well, thanks for those extremely painful 15 minutes. I managed to avoid watching it until now, even though I kept seeing blogs and comments repeating what he said. It was worse than I thought. I was literally shouting at the computer screen at the insanity of it all. It's on a par with some wooist warning about CERN giving us weaponised portable black holes in 50 years' time.

I need to relax. The assumptions he makes are staggering. I'll respond later when I have calmed down and can type properly without risk of mashing my keyboard into bits.
Reply


