Can we build AI without losing control over it? | Sam Harris
RE: Can we build AI without losing control over it? | Sam Harris
(November 4, 2016 at 7:14 pm)Mathilda Wrote:
(November 4, 2016 at 6:12 pm)Rhythm Wrote: and the means by which ai would improve itself faster than humans has been discussed not only in the video but many times in the thread - on the basis of speed alone.

And is a wholly unfounded assumption.

We have no reason to believe that an AI would improve itself faster than humans. How could it? No one has explained this yet or given any reason to assume that it would. Why would an AI be able to do this and not a natural agent?

People see that computers are fast at certain tasks and assume that they are always faster than brains. This is totally wrong. Brains are actually faster than computers at a lot of things. For example, if I throw an apple at you, or it's partially hidden, or it's lifted up, sideways, rotated, coloured differently, you still instantly recognise it as an apple. You also have certain connotations and memories of apples that come immediately to mind. The processing power for this is absolutely immense, but we don't even think about it. It just happens.
Again you ask the very same question as though it hadn't already been answered, in the video, even in the post -you quoted- in response. No one is assuming that computers are faster at crunching numbers or more reliable at making logical inference. They objectively and demonstrably are. No one's assuming it's better at everything; some of us are expressing the fact that it -is- better at -this- thing. Again, no one even requires that it be better... it can be equivalent and it's still -faster-. No one says it isn't immense, and obviously we'd have to think about it to effect it, to make that sort of improvement... but if 1) is true, that intelligence is a matter of information processing in physical systems... and pointing to our very own brains as an example... it's obviously not -impossible- to achieve, regardless of how difficult it is. 

Dunning-Kruger is moving at full warp, at this point.......



Quote:It's precisely because I use machine intelligence for my tools that I know how slow it is. I can spend weeks or months evolving a simple agent controller that will do something very simple but intelligent. I won't know how it works unless I spend weeks or months of my life analysing it.
You obviously have shitty machines, or you actually -don't- know what you're talking about after all. You let a computer spend many more of those months doing that analysis. Please, for the love of christ, don't pretend that you're this obtuse, or that you're doing the brunt of that work. You aren't, you should know you aren't, and your very research wouldn't be feasible without that assistance. And we're talking about a -much- more powerful system if we're talking hypothetical AI, 50 years, 500 years, or 5000 years from now... unless, for no stated reason, we make absolutely no progress in all that time. 


Quote:Which again makes assumptions. How does a machine do this? It can't know in advance what will and will not work. Each solution has to be evaluated. This takes an extremely long time. You are essentially assuming that the whole course of evolution will happen in an instant.
The same way we do it. It makes precisely -3- assumptions, all of which have been listed. How do we evaluate those solutions today? With the help of computers. Dumb ones, relative doorknobs compared to a general AI.  
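For what it's worth, the shape of that evaluation loop is easy to sketch. Here is a minimal, illustrative evolutionary search in Python (toy fitness function, made-up parameters, not anyone's actual research code): the point both posters are circling is that total runtime is roughly generations × population × per-evaluation cost, and that loop is identical whether a human or a much faster machine is the one waiting on it.

```python
import random

def evaluate(candidate):
    # Stand-in fitness: in real work (e.g. simulating an evolved agent
    # controller) this single call is what consumes the weeks of compute.
    return -sum((gene - 0.5) ** 2 for gene in candidate)

def evolve(pop_size=20, genome_len=8, generations=50):
    pop = [[random.random() for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        # Every candidate must actually be evaluated before selection.
        parents = sorted(pop, key=evaluate, reverse=True)[: pop_size // 2]
        children = [
            [gene + random.gauss(0, 0.05) for gene in random.choice(parents)]
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return max(pop, key=evaluate)

# Total cost ≈ generations × pop_size fitness evaluations, no matter
# how clever the selection scheme is.
best = evolve()
```

Whether that makes self-improvement fast or slow then hinges entirely on the per-evaluation cost, which is the crux of the disagreement here.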

Quote:No. Energy requirements are critical. The lower the energy consumption, the more processing you can do. You can't have an android or drone walking or flying around that requires several megawatts to run. Nor can you offload the processing to a remote server either because it has to react in real time.

The less energy that is required, the more sustainable it is.
Oh for fuck's sake, you're just shifting shit around and pretending to respond to my comments. Of course energy requirements are critical, no juice no play... but we have juice, and can get more juice. We have every reason -to- get more juice and to dump it into a machine with capabilities that so much as reach "human level" intelligence, or even far less, no matter what the cost - and we already do. 20k years in a week. Do a cost-benefit analysis on that and tell me you don't think it's worth it. That's the majority of human technological achievement in the same amount of time it takes me to plant 5 acres of fucking beans by hand. Yes please. Similarly, unless being a drone that walks around and flies is somehow necessary, it's pointless to reference as though it were informative in context... which it isn't.
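The "20k years in a week" figure is just Harris's speedup arithmetic from the talk, sketched here for clarity. The millionfold speedup is his illustrative figure, not a measurement: electronic circuits switch roughly a million times faster than biological neurons.

```python
# Harris's speed argument as arithmetic (illustrative numbers, not data):
speedup = 1_000_000          # assumed circuit-vs-neuron speed ratio
weeks_of_machine_time = 1
weeks_per_year = 52
human_equivalent_years = weeks_of_machine_time * speedup / weeks_per_year
print(round(human_equivalent_years))  # ≈ 19,231 — the "20k years in a week"
```

Even if the true speedup were an order of magnitude smaller, the same arithmetic still yields centuries of human-equivalent thought per week, which is the cost-benefit point being made above.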

Quote:How would the AI understand the implications of what it is researching if it is not embodied in the real world? How would an energy inefficient AI make use of that information if all it is is a computer with no actuators?
Again, and for the final time: is there some reason that it -couldn't- be embodied in the real world? Because unless there is, this is a meaningless and illogical objection. Meanwhile, apparently "disembodied" current computers already have vast utility that far exceeds human ability... so wtf are you even going on about? Our own "embodied" intelligence relies on simulation. That we have bodies is, to us, no different from simply believing that we do, or having them simulated. Our entire idea -of- a body is a simulation, even if we -also- have them. You tell me what the important component is, in that?

Let's cut through all this shit right now, because I don't have endless patience. You didn't watch that video, you don't intend to address Harris's arguments, and you aren't actually aware of their contents, are you? Further, you have no intention of doing anything other than positing an endless litany of bad logic hefted up, chiefly, by the notion that you're an expert and that somehow makes you good at reasoning -about- your area of expertise. Which is, itself, unfortunately, bad reasoning. You don't know how long it will take for us to achieve AI, you don't know what it would take to recreate a human-level intelligence or fabricate a superintelligent AI... you don't have a crystal ball, and you're incapable of forming a rational explanation for your beliefs about all the things previously mentioned. You have frustration, you have a neverending stream of irrelevant "it's super duper difficult"s... yeah, well, no shit, and?
RE: Can we build AI without losing control over it? | Sam Harris
It's hard, but we don't know that it's impossible. And as Bostrom explains, even if the chance of it happening is something like 0.00000001, it's still worrisome. When you think of the potential trillions who may never be born into a life (of wealth, at that) because of our negligence, you start looking at existential risks a little differently. And this is one almost no one else talks about in these terms. They either go with Kurzweil's happy singularity scenario or put it off as too far in the future to worry about.
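The Bostrom-style point here is just expected value. A quick sketch with illustrative numbers (the probability is the post's "0.00000001"; the count of potential future people is a made-up stand-in for "trillions", not Bostrom's actual estimate):

```python
# Expected-value sketch of the existential-risk argument (toy numbers):
p_catastrophe = 1e-8              # the post's tiny probability
potential_future_lives = 10 ** 13  # illustrative stand-in for "trillions"
expected_lives_lost = p_catastrophe * potential_future_lives
print(expected_lives_lost)  # even a minuscule probability leaves ~100,000 expected lives at stake
```

The stakes side of the product is so large that the probability side has to be essentially zero before the expected loss stops mattering, which is why the argument survives very low probability estimates.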
RE: Can we build AI without losing control over it? | Sam Harris
(November 4, 2016 at 7:26 pm)Alasdair Ham Wrote:
(November 4, 2016 at 7:21 pm)Mathilda Wrote: Actually, yes it is. It is precisely what it is about. It is about adapting to an unknown environment. An environment that needs to be sensed. Otherwise you might as well use a look up table.

You think that's what intelligence is?

Sounds to me like you're talking about skill in general.

Intelligence is about speed and depth of comprehension (or in A.I. an artificial simulation of it).


That's a lay person's understanding of intelligence and is a very small proportion of what the brain does.
RE: Can we build AI without losing control over it? | Sam Harris
(November 4, 2016 at 7:25 pm)Mathilda Wrote:
(November 4, 2016 at 7:22 pm)Alasdair Ham Wrote: Yeah, and thanks for the exchange. I'm staying up tonight... I think I have to because I'm gonna struggle to sleep... completely irrelevant to this thread and conversation but, due to RL reasons, I'm not feeling so great, so I feel rather burnt out at this point.

It's been good discussing with you Mathilda.

I'm still only at page 11 and I think I will have to call it a night. I'm starting to repeat myself now.

No offence intended if I came across a bit brusque.

Yeah it's always good to discuss with you. You're super smart, one of my favorite posters here and I like and care about you.

No offence taken. I was more worried about offending you actually. I can come across as rather annoying in a debate even when it's never personal with me.

Sleep well :)
RE: Can we build AI without losing control over it? | Sam Harris
(November 4, 2016 at 7:27 pm)Mathilda Wrote:
(November 4, 2016 at 7:26 pm)Alasdair Ham Wrote: You think that's what intelligence is?

Sounds to me like you're talking about skill in general.

Intelligence is about speed and depth of comprehension (or in A.I. an artificial simulation of it).


That's a lay person's understanding of intelligence and is a very small proportion of what the brain does.

Well, you may have a greater understanding of something like intelligence, but if the definition of intelligence is about comprehension and you're talking about things that have nothing to do with comprehension, then we're not actually talking about intelligence. It's much like a compatibilist who claims they have a more complex understanding of free will when all they've done is redefined it.

How well you can adapt in an environment says nothing about your intelligence if you don't actually understand or comprehend anything in that environment. The fact that dogs can smell better than us, or that eagles can see further than us, doesn't make them smarter in their environment; it merely makes them more skilled in their environment. They are competent without comprehension.

You're talking about competence in an environment without comprehension in an environment, and without the latter it's not intelligence, it's merely skillfulness.
RE: Can we build AI without losing control over it? | Sam Harris
(November 4, 2016 at 7:27 pm)Mathilda Wrote:
(November 4, 2016 at 7:26 pm)Alasdair Ham Wrote: You think that's what intelligence is?

Sounds to me like you're talking about skill in general.

Intelligence is about speed and depth of comprehension (or in A.I. an artificial simulation of it).


That's a lay person's understanding of intelligence and is a very small proportion of what the brain does.

And only a very small portion of what the brain does is actually relevant to intelligence.
RE: Can we build AI without losing control over it? | Sam Harris
(November 4, 2016 at 7:26 pm)Rhythm Wrote: Dunning kruger is moving at full warp, at this point.......


(November 4, 2016 at 7:26 pm)Rhythm Wrote: You obviously have shitty machines, or you actually -don't- know what you're talking about after all.  You actually let a computer spend many more of those months doing that analysis, please, for the love of christ, please don't pretend that you're this obtuse.  


Really Rhythm, if you're going to be insulting then just go fuck yourself. I've had it with you. Fuck off.

Dunning Kruger? Don't know what I am talking about? I have a doctorate in this area and post doctoral experience. I am an active peer reviewed published researcher. I've been doing this kind of work since 1996.

I was told during my PhD that it was very ambitious. I pulled it off. Apparently my boss sang my praises to everyone before I started my research fellowship. I've strived to tackle the hardest problems rather than to concentrate on publishing as many papers as possible. I've tried to break new ground. I've succeeded. This is why I only work with self organising systems. This is why I have a good understanding of just how difficult it is. I've worked in most areas of AI and R&D in industry as well as academia.

And this is far more experience, qualifications and knowledge than either you or Sam Harris has. And I am the one with Dunning Kruger?
RE: Can we build AI without losing control over it? | Sam Harris
(November 4, 2016 at 7:21 pm)Mathilda Wrote:
(November 4, 2016 at 6:18 pm)Alasdair Ham Wrote: Yes but sense of smell and ability to survive isn't what intelligence is.

Actually, yes it is. It is precisely what it is about. It is about adapting to an unknown environment. An environment that needs to be sensed. Otherwise you might as well use a look-up table.

That's anthropomorphizing, at the very least. 

What environment, anyway? We're not in the jungle anymore, for the most part. We live in an artificial world as it is. We are surrounded by technology. What is the difference between me browsing the Internet and a machine doing the same, hypothetically speaking? Do the senses and the mobility add anything to my strictly analytical powers? 

Watson, anyone ?
RE: Can we build AI without losing control over it? | Sam Harris
(November 4, 2016 at 7:19 pm)Mathilda Wrote: As mentioned before, we are the limits. Our ability to understand and to engineer. This is what people don't appreciate, just how fiendishly difficult AI actually is. It's getting late so I will save it for another post but things that we take for granted as intelligent beings, we have no idea how to even go about implementing in an AI.

Mathilda's point here is really worth considering. We're nowhere near "true AI". We can just about program cars to stay in the correct lanes, and they are still confused by white vans. The "AI" in that case is not intelligence in the same sense that humans have it. The computer doesn't know what cars are, it has no concept of driving, or staying between two lines, etc. It's been told that it needs to monitor sensors and perform certain actions if the sensors produce certain outputs.

To get to a state where computers would be "more intelligent" than humans in a human intelligence sense, we'd first actually have to figure out how to do that, and even then, it's not going to be the same thing. At a basic level, any computer AI will operate via code which we can put limits / constraints on. In addition to that, we can put physical limitations (hardware) on AI.

As for self-improvement, well, that's a whole other ball game. We can't even get computers at the moment to self-write advanced computer programs, so to think that once we crack "true AI" we'll just be able to teach this new computer how it works, and then get it to update its own code, well, I find that rather far-fetched, to say the least.
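The point about lane-keeping "AI" above is easy to make concrete. A toy sketch of that kind of sensor-to-action mapping in Python (hypothetical thresholds and names, nothing from any real driving stack): there is no concept of "car" or "lane" anywhere in it, just readings mapped to outputs.

```python
def lane_keeping_action(lane_offset_m, obstacle_ahead):
    # Hypothetical sensor-to-action mapping. Nothing here "knows" what a
    # car or a lane is; readings go in, one of four actions comes out.
    if obstacle_ahead:
        return "brake"
    if lane_offset_m > 0.3:    # drifted right of the lane centre
        return "steer_left"
    if lane_offset_m < -0.3:   # drifted left of the lane centre
        return "steer_right"
    return "hold_course"
```

Real systems replace the hand-written thresholds with learned models, but the competence-without-comprehension structure is the same: a mapping from sensor outputs to actions, with no understanding behind it.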


