Posts: 7392
Threads: 53
Joined: January 15, 2015
Reputation:
88
RE: Can we build AI without losing control over it? | Sam Harris
November 4, 2016 at 6:25 pm
(November 4, 2016 at 5:52 pm)abaris Wrote: Apart from failing to understand why you don't consider energy requirements an important question. We, as a real species, need energy to keep going. It's called food and drink. If you unplug your computer it won't do very much, even if you have the fanciest CPU installed.
Exactly. A huge consideration with super computers is the power they require to run.
https://en.wikipedia.org/wiki/Supercompu...management
Quote:A typical supercomputer consumes large amounts of electrical power, almost all of which is converted into heat, requiring cooling. For example, Tianhe-1A consumes 4.04 megawatts (MW) of electricity.[52] The cost to power and cool the system can be significant, e.g. 4 MW at $0.10/kWh is $400 an hour or about $3.5 million per year.
A quantum computer needs to be cooled to a fraction of a degree above absolute zero.
What's an AI going to do stuck in a very large building?
Compare this to a human:
http://hypertextbook.com/facts/2001/Jacq...Ling.shtml
Quote:The average power consumption of a typical adult is 100 Watts and the brain consumes 20% of this making the power of the brain 20 W.
This allows a human to have a body, to walk around and interact with the world and to learn from it. Yet we don't find humans increasing their intelligence at exponential rates, except perhaps certain theist trolls on here who finally understand what we're telling them.
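For scale, here's a quick back-of-the-envelope sketch in Python, just plugging in the figures quoted above (the Tianhe-1A draw, the $0.10/kWh price, and the brain's ~20 W); the variable names are mine and purely illustrative:
Code:
# Back-of-the-envelope comparison using the figures cited above.
SUPERCOMPUTER_POWER_MW = 4.04      # Tianhe-1A electrical draw (from the Wikipedia quote)
ELECTRICITY_PRICE_PER_KWH = 0.10   # USD per kWh, as in the quote
BRAIN_POWER_W = 20.0               # ~20% of a 100 W human

power_kw = SUPERCOMPUTER_POWER_MW * 1_000              # MW -> kW
cost_per_hour = power_kw * ELECTRICITY_PRICE_PER_KWH   # ~$400/hour
cost_per_year = cost_per_hour * 24 * 365               # ~$3.5 million/year

brain_ratio = (power_kw * 1_000) / BRAIN_POWER_W       # supercomputer watts vs. brain watts

print(f"~${cost_per_hour:,.0f}/hour, ~${cost_per_year / 1e6:.1f} million/year")
print(f"That's roughly {brain_ratio:,.0f} times the power budget of a human brain.")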
Posts: 9479
Threads: 116
Joined: July 5, 2015
Reputation:
22
RE: Can we build AI without losing control over it? | Sam Harris
November 4, 2016 at 6:26 pm
(November 4, 2016 at 6:16 pm)Mathilda Wrote: (November 4, 2016 at 5:50 pm)Alasdair Ham Wrote: Our intellect compared to ants is a LOT smarter than Einstein's or Hawking's compared to us.
We're talking about an AI so intelligent that it would have no reason to give a shit about us and would see us as pests, just like a lot of people do with ants.
Why would our ambitions and existence be even remotely important to an intelligence so beyond us that it considered us nothing but a pest?
We're not talking about anything going on a killing spree. We're not talking about robots, remember.
Right. OK. And what's it going to do? How would it do it? Why would we give it the means to do it?
If we're not talking about robots, then what are we talking about? What can it do?
It could do anything we do. And I don't mean individually, either. Collectively, as a species. It would evolve. It would invent. It would be fairly easy for it to manipulate the humans controlling it into doing its bidding, even unknowingly, whether that's providing it with the raw materials and help to build something that could do away with us, or letting it escape from the box.
Posts: 43162
Threads: 720
Joined: September 21, 2008
Reputation:
132
RE: Can we build AI without losing control over it? | Sam Harris
November 4, 2016 at 6:27 pm
(November 4, 2016 at 6:19 pm)abaris Wrote: (November 4, 2016 at 6:15 pm)Alasdair Ham Wrote: @ Abaris
I'm not saying we intrinsically have to assume his premises. I'm saying we have to assume his premises in order for his conclusion to follow from his premises.
Sure, but why should we do that without reflecting on the scenario?
Just remember that what Harris is saying is purely hypothetical based on 3 assumptions. He's not talking about how it could happen or about anything else in practice. He's talking about how in his opinion it is probable, for the reasons he gives, that in the future A.I. could be dangerous if you accept the 3 assumptions he mentioned.
So I will make 3 submissions now:
1. I submit that if you don't see how a superintelligence can be dangerous... then I don't know what to say.
2. I submit that if you don't see how claiming he's assuming exponential growth misrepresents what he actually said, when he specifically said he's not assuming that... then I don't know what to say.
3. I submit that if you don't see how not knowing how A.I. is going to keep getting better is irrelevant once you've already accepted the premise that it is going to keep getting better... then I don't know what to say.
Posts: 7392
Threads: 53
Joined: January 15, 2015
Reputation:
88
RE: Can we build AI without losing control over it? | Sam Harris
November 4, 2016 at 6:27 pm
(November 4, 2016 at 5:53 pm)Alasdair Ham Wrote: No this isn't comparable to a piece of technology... this is more comparable to a superintelligent mind. I mean, that's the relevant part. Who cares what it is made out of or if it is conscious or not.
A mind always has to be embodied. How it is embodied determines how it interacts with its environment and what it can learn from it.
Otherwise it's like having a computer with no internet connection, keyboard, or CD drive, and just expecting it to get more intelligent by itself.
Posts: 67390
Threads: 140
Joined: June 28, 2011
Reputation:
161
RE: Can we build AI without losing control over it? | Sam Harris
November 4, 2016 at 6:27 pm
(This post was last modified: November 4, 2016 at 6:31 pm by The Grand Nudger.)
-because, again...we're nowhere near as fast as machine intelligence, we're -already- smoked on that count. Even so, the idea that we haven't been improving ourselves exponentially is false on the face of it. It took a hell of a long time to discover fire; from Kitty Hawk to the Moon was a relative flash in the pan. Knowledge builds on knowledge.
Again, I'll ask you if there's some reason that AI can't be "embodied"... if that's a concern?
It doesn't seem, to me, that your expertise in the field is the problem here... or could be the solution or defeater of Harris' argument. You have a problem of inference. Good at AI, bad at reasoning. Blindsided by your lack of proficiency in the latter, by your assumption that the former grants you insight in this regard.
Posts: 13122
Threads: 130
Joined: October 18, 2014
Reputation:
55
RE: Can we build AI without losing control over it? | Sam Harris
November 4, 2016 at 6:27 pm
(This post was last modified: November 4, 2016 at 6:33 pm by abaris.)
(November 4, 2016 at 6:26 pm)Excited Penguin Wrote: It would evolve. It would invent.
One simple question: HOW?
(November 4, 2016 at 6:27 pm)Rhythm Wrote: -because, again...we're nowhere near as fast as machine intelligence, we're -already- smoked on that count. Even so, the idea that we haven't been improving ourselves exponentially is false on the face of it. It took a hell of a long time to discover fire; from Kitty Hawk to the Moon was a relative flash in the pan. Knowledge builds on knowledge.
We're talking about machines we built. We provide the materials, we put them together, we create the technology. How would it create a society of its own to threaten us? With our raw materials, our energy? With us watching, no less?
(November 4, 2016 at 6:27 pm)Alasdair Ham Wrote: Just remember that what Harris is saying is purely hypothetical based on 3 assumptions. He's not talking about how it could happen or about anything else in practice. He's talking about how in his opinion it is probable, for the reasons he gives, that in the future A.I. could be dangerous if you accept the 3 assumptions he mentioned.
So, yes, without addressing the basics, painting a scenario that doesn't bother going into even one of the pesky details. Love it or leave it? Is that what you're trying to say?
Posts: 43162
Threads: 720
Joined: September 21, 2008
Reputation:
132
RE: Can we build AI without losing control over it? | Sam Harris
November 4, 2016 at 6:28 pm
All these questions about how it's going to happen in practice are completely irrelevant to whether you accept his 3 premises in principle or not. One minute you accept them, the next minute you don't. Which is it?
Posts: 43162
Threads: 720
Joined: September 21, 2008
Reputation:
132
RE: Can we build AI without losing control over it? | Sam Harris
November 4, 2016 at 6:31 pm
(November 4, 2016 at 6:27 pm)Mathilda Wrote: (November 4, 2016 at 5:53 pm)Alasdair Ham Wrote: No this isn't comparable to a piece of technology... this is more comparable to a superintelligent mind. I mean, that's the relevant part. Who cares what it is made out of or if it is conscious or not.
A mind always has to be embodied. How it is embodied determines how it interacts with its environment and what it can learn from it.
Computers are bodies. How is biology relevant? Machines may even be able to become conscious. There's nothing magical about DNA.
Quote:Otherwise it's like having a computer with no internet connection or keyboard, or CD drive and just expecting it to get more intelligent by itself.
Not really, I don't see how that's relevant at all. A stationary body is still a body. A computer is already a body; it's just not biological or in motion.
Posts: 7392
Threads: 53
Joined: January 15, 2015
Reputation:
88
RE: Can we build AI without losing control over it? | Sam Harris
November 4, 2016 at 6:31 pm
(November 4, 2016 at 5:59 pm)Alasdair Ham Wrote: I think emotions get in the way sometimes. First you said his assumptions weren't sound. Then you said you didn't have a problem with the assumptions.
I said I didn't have a problem with the first assumption he stated. The second and third I did have problems with. But that wasn't even what I was referring to when I said that he was making assumptions. His whole talk was laden with unstated assumptions that he didn't even realise he was making.
(November 4, 2016 at 5:59 pm)Alasdair Ham Wrote: Then you said he was assuming exponentiality. Then I pointed out that he said the exact opposite.
But he was still relying on exponentiality despite claiming not to. He made reference to the singularity. He didn't call it that specifically, but that's what he was referring to when he talked about replacing a room full of people with a computer to do research.
Posts: 9479
Threads: 116
Joined: July 5, 2015
Reputation:
22
RE: Can we build AI without losing control over it? | Sam Harris
November 4, 2016 at 6:31 pm
(November 4, 2016 at 6:27 pm)abaris Wrote: (November 4, 2016 at 6:26 pm)Excited Penguin Wrote: It would evolve. It would invent.
One simple question: HOW?
(November 4, 2016 at 6:27 pm)Rhythm Wrote: -because, again...we're nowhere near as fast as machine intelligence, we're -already- smoked on that count. Even so, the idea that we haven't been improving ourselves exponentially is false on the face of it. It took a hell of a long time to discover fire; from Kitty Hawk to the Moon was a relative flash in the pan. Knowledge builds on knowledge.
We're talking about machines we built. We provide the materials, we put them together, we create the technology. How would it create a society of its own to threaten us? With our raw materials, our energy? With us watching, no less?
By having a human-level intelligence.
|