Can we build AI without losing control over it? | Sam Harris
#81
RE: Can we build AI without losing control over it? | Sam Harris
(November 4, 2016 at 5:43 pm)Mathilda Wrote:
(November 4, 2016 at 5:22 pm)Alasdair Ham Wrote: No not faulty. Like why do people kill ants? Because most people have more important things on their minds and don't give a shit. They're pests.

We're talking about a distant future where AI is far smarter than us, has a mind of its own and has its own interests.

No one worried about Einstein going on a killing spree, and he had a working body. Certainly no one contemplates the possibility of Stephen Hawking doing it. He's a brain in a wheelchair. Or to use Sam Harris's example, John von Neumann. We each have our own interests, our own mind, and some of us are smarter than others. So? What's going to happen?

Our intellect compared to ants is a LOT smarter than Einstein's or Hawking's compared to us.

We're talking about an AI so intelligent that it would have no reason to give a shit about us and see us as pests just like a lot of people do with ants.

Why would our ambitions and existence be even remotely important to an intelligence so beyond us that it considered us nothing but a pest?

We're not talking about anything going on a killing spree. We're not talking about robots remember.
#82
(November 4, 2016 at 5:47 pm)Alasdair Ham Wrote: The premise is already that AI is getting better and better... I don't see how "how it's going to happen" is relevant. There are numerous ways and numerous resources.

To make it even more simple and to repeat what I said to Mathilda.

We're not talking about a species but a piece of technology: technology of the time of its creation. There's a ceiling on what that technology can or can't do. The hardware as well as the software required to run that thing won't evolve. They will be frozen in time together with their limitations.

Apart from that, I fail to understand why you don't consider energy requirements an important question. We, as a real species, need energy to keep going. It's called food and drink. If you unplug your computer it won't do very much, even if you have the fanciest CPU installed.

But if there are so many ways and resources involved, name a few. And please, you opened that can: "it's the future" doesn't count as an answer.
#83
No this isn't comparable to a piece of technology... this is more comparable to a superintelligent mind. I mean, that's the relevant part. Who cares what it is made out of or if it is conscious or not.
#84
If you accept the premises and the conclusion, and only disagree with a strawman of things he didn't say, like claiming he was assuming exponentiality when he wasn't, then you're not even disagreeing with him.

I think emotions get in the way sometimes. First you said his assumptions weren't sound. Then you said you didn't have a problem with the assumptions. Then you said he was assuming exponentiality. Then I pointed out that he said the exact opposite. It's getting thinner and thinner: now you agree with everything he said in principle and are simply asking how, in practice, it's going to happen, and you've moved the goalposts to doubting that a superintelligence could become dangerous. I find it rather analogous to the God of the gaps getting smaller and smaller... until eventually it will be "Okay, I agree in principle with what he said in this video, but I disagree with what he said elsewhere", hehe.
#85
(November 4, 2016 at 5:36 pm)Alasdair Ham Wrote:
(November 4, 2016 at 5:31 pm)abaris Wrote: How can a thing, designed by human intellect, assembled by human ingenuity, possibly surpass human intellect and ingenuity?

How couldn't it?

Isn't that rather like asking "how could intelligent humans evolve from single-celled organisms?"

Seems like the genetic fallacy to me. Why would intelligent AI in the future be limited by the intelligence of those who created it?

Because it takes time. Lots of time. It can't know in advance what is a good or bad solution for whatever it is trying to do without actually testing it. This means applying it to the real world. And as I said before, any recursively self-improving function that overwrites a part of itself is limited by the bit that doesn't get overwritten.
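That "bit that doesn't get overwritten" point can be sketched as a toy program. Everything here is hypothetical and purely illustrative (the function names, the noise model, the numbers): a self-modifying optimizer is free to rewrite its strategy, but every proposed rewrite still has to be scored by a fixed outer loop against a slow, noisy world, and that fixed loop is the part it can never improve.

```python
import random

random.seed(0)  # reproducible toy run

def evaluate(strategy):
    """Stand-in for testing a candidate in the real world: noisy and slow."""
    return strategy - random.random()

def improve(strategy):
    """The rewritable part: proposes a tweaked version of the current strategy."""
    return strategy + random.uniform(-0.1, 0.2)

strategy = 0.0
for _ in range(100):                    # the frozen kernel: never overwritten
    candidate = improve(strategy)
    if evaluate(candidate) > evaluate(strategy):
        strategy = candidate            # keep only improvements that tested well
print(strategy)
```

However clever `improve` becomes, progress is gated by how fast and how accurately the frozen evaluation loop can test candidates against reality.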



(November 4, 2016 at 5:36 pm)Alasdair Ham Wrote:
(November 4, 2016 at 5:31 pm)abaris Wrote: It could process faster, given the right amount of energy and processing power. Computers do that for us already, but it can't actually surpass the intellect available when it's assembled.

Why not?

We're talking about future intelligences where the AIs can think for themselves like humans... only much more intelligent.

Which is far, far in the realms of sci-fi. So what form does this AI take? Is it an android? A disembodied AI on a server farm? What? What dangers could this pose?


(November 4, 2016 at 5:36 pm)Alasdair Ham Wrote:
(November 4, 2016 at 5:31 pm)abaris Wrote: How likely is all of this coming together anytime?

Like anytime soon? Like he said.... all it takes is us to keep going, it doesn't have to be anytime soon.

Let's assume for the sake of the argument that it's definitely going to happen and can't be stopped, how quickly is it going to happen? Surely that's important here. If it happens slowly then we have time to adapt and evaluate its progress.
#86
(November 4, 2016 at 5:53 pm)Alasdair Ham Wrote: No this isn't comparable to a piece of technology... this is more comparable to a superintelligent mind. I mean, that's the relevant part. Who cares what it is made out of or if it is conscious or not.

Who cares?!!!?

Seriously? This is the most important question of all.

And it wouldn't be an AI if it wasn't conscious in one way or the other.
#87
(November 4, 2016 at 5:45 pm)Rhythm Wrote: I see you've decided to go the batshit christer route of deciding which questions are proper.  Good for you, lol.

What the fuck is that supposed to mean?
#88
Well no, that isn't part of his argument so it's not important.

We just need to assume that it's inevitable eventually.

AI is getting better and better and one day will be so much smarter than us that we will be insignificant to it.
#89
(November 4, 2016 at 6:03 pm)Mathilda Wrote:
(November 4, 2016 at 5:45 pm)Rhythm Wrote: I see you've decided to go the batshit christer route of deciding which questions are proper.  Good for you, lol.

What the fuck is that supposed to mean?

I think it was a joke.

I hope you're not offended.

Like... I think he was making fun of you deciding what questions were important, maybe he felt it was a little condescending, but I don't think he meant any harm by it.
#90
(November 4, 2016 at 5:46 pm)Mathilda Wrote:
(November 4, 2016 at 5:30 pm)Excited Penguin Wrote: No, the A. G. I. begins improving itself at a rate much faster than a human and we get an extremely smart mind in a box. If you think its lack of limbs is going to be a problem... I have no idea why you would think that. If I were 10,000 times smarter than you, you wouldn't fear me because of my body.

And how would AGI improve itself exponentially faster than a human?

If you were 10,000 times smarter than me, why would I have to fear you?

By working much faster than a biological brain. 

This machine, if you are to grant we can make one as intelligent as a human with all of the human range of mental skills at its disposal, would surely be able to interact with humans if nothing else. We could feed it information just like we do when we educate a human, except it would be able to process much more at once, naturally.

This might be hard for you to imagine, but I can't see why. If you give me a million years to study all the assembled knowledge in the world and then some billions of humans to do my bidding, yes, I will still be under the influence or in the shadow of my original biological drives, but we all know how distortedly we've come to fulfill those just by being aware of them and by living in an ever more complex environment. 

Now take me and imagine me as the A. I. instead. It won't take me a million years to absorb all that information, solve problems, theorize, plan, think up scenarios and so on. It will take me much less simply because I'm faster in my electrical setup. My neuron equivalents move at the speed of light.

I don't see why lacking a personality or a body would be a problem. And even if it were, we could provide me with those things fairly easily. Work in that area is underway and looks promising, last time I checked.
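The speed argument above comes down to simple arithmetic. Here is a back-of-envelope sketch; the figures are illustrative assumptions, not claims from the thread (biological neurons signal on the order of a couple of hundred times per second, while even a modest commodity chip switches billions of times per second):

```python
# Back-of-envelope comparison of biological vs. electronic signalling rates.
# All numbers below are rough, illustrative assumptions.

neuron_rate_hz = 200              # approximate upper bound on neuron firing rate
transistor_rate_hz = 2e9          # a modest 2 GHz clock

speedup = transistor_rate_hz / neuron_rate_hz       # raw speed ratio
subjective_years = 1_000_000                        # the post's "million years" of study
wall_clock_days = subjective_years * 365 / speedup  # same thinking, compressed

print(speedup, wall_clock_days)
```

Under these assumptions the raw ratio is about ten million to one, so a million subjective years of study would compress into a matter of weeks of wall-clock time. Raw switching speed is of course not the same thing as intelligence, but it is the kernel of the "faster substrate" argument being made here.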