Can we build AI without losing control over it? | Sam Harris
#19
RE: Can we build AI without losing control over it? | Sam Harris
OK, he is assuming that processing speed will increase exponentially. I know that he mentions Moore's Law coming to an end as an objection, but he doesn't actually address it. He just treats processing speed as equal to better intelligence.

As a neuroscientist, he really, really should know better. He knows how many neurons there are, how many synapses, how the neurons are connected, and how much computation goes on within a single dendritic tree. All of this performs computation. The brain is massively parallel. There are many problems that we cannot ever hope to compute within the lifetime of the universe on processor speed alone. Yet parallel processing is difficult for conventional computing platforms, even with just a handful of processors or threads.
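
To put rough numbers on the "lifetime of the universe" point, here's a quick back-of-the-envelope sketch. The figures (a hypothetical 10^18 operations-per-second serial machine, and a 30- or 35-item brute-force search) are my own illustrative assumptions, not anything Harris quotes:

```python
import math

# Back-of-the-envelope sketch (illustrative assumptions only):
# how many operations could a very fast *serial* machine perform over the
# entire age of the universe, and how does that compare with brute-forcing
# even a modest combinatorial search space?

OPS_PER_SECOND = 1e18           # hypothetical, generously fast serial machine
AGE_OF_UNIVERSE_S = 4.35e17     # ~13.8 billion years, in seconds

total_ops = OPS_PER_SECOND * AGE_OF_UNIVERSE_S      # ~4.35e35 operations

# Exhaustively checking every ordering of n items (e.g. an n-city tour)
# takes on the order of n! evaluations.
for n in (30, 35):
    search_space = math.factorial(n)
    print(f"{n}! = {search_space:.2e}   "
          f"feasible serially: {search_space < total_ops}")

# 30! (~2.7e32) still fits inside the budget; 35! (~1.0e40) never will.
# Raw serial speed alone cannot keep up with combinatorial growth,
# which is why the brain's massive parallelism matters.
```

The exact numbers don't matter; the point is that the factorial wall arrives after a handful of extra items, no matter how fast the single processor gets.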

He is assuming that processing speed will always continue to increase. Why make this assumption? Why extrapolate the curve? As I said, exponential functions in nature saturate into sigmoid functions. It's such a classic mistake to make. It's why you get asset bubbles, and why people think that house prices will go up forever. He completely forgets about the economy and the resources it needs in order to keep expanding. Computers need rare earth minerals. They need energy to run. They need to be cooled, which also requires energy. They need space.
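
To make the exponential-versus-sigmoid point concrete, here's a minimal sketch (the parameters are arbitrary, chosen purely for illustration): a logistic curve is essentially indistinguishable from an exponential at the start, then flattens out at whatever limit the resources impose.

```python
import math

# Minimal sketch, arbitrary parameters: logistic growth looks exponential
# early on, then saturates at a carrying capacity K (the resource limit).

K = 1000.0      # carrying capacity imposed by energy, materials, space, ...
r = 0.5         # growth rate
x0 = 1.0        # starting value

def exponential(t):
    return x0 * math.exp(r * t)

def logistic(t):
    # Solution of dx/dt = r * x * (1 - x / K) with x(0) = x0
    return K / (1.0 + (K / x0 - 1.0) * math.exp(-r * t))

for t in range(0, 31, 5):
    print(f"t={t:2d}   exponential={exponential(t):12.1f}   logistic={logistic(t):8.1f}")

# For small t the two columns track each other closely, so extrapolating the
# early data as "exponential" looks perfectly reasonable -- right up until it
# isn't: by t=30 the exponential is past three million while the logistic has
# flattened out just under K = 1000.
```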

He is making assumptions about the economy and society. Funding for research is not automatic, because funding is limited. Every discovery opens up a new area of the search space, and that is where the funding goes. But the economy does not fund other areas which may be just as lucrative, because people don't even realise or appreciate what could be done there. For example, Big Data is all the rage right now, not embodied robotics, but that is only because we are currently drowning in data from the Internet.

This is the thing about the progress of technology: even just a few years out it can be next to impossible to predict, because of the myriad ways in which both the economy and society change.

The AI solutions we have now, built on a data-intensive economy, are in no way adequate for creating the kind of AI he is talking about. A strong AI needs to be embodied for it to understand anything; otherwise you run into Searle's Chinese Room problem. Or the example I like to use: imagine putting a baby into a sensory deprivation chamber, sticking tubes into it and letting it grow until it's 20 years old. You have the wetware available and functioning, but it could not then understand anything about the outside world, because it never lived in it. You can only ever be as intelligent as your environment allowed you to be. And with an embodied agent you then have other limitations, such as materials research and how you power it.

Then there's the way Harris just says: imagine replacing a room full of 20 Yale graduates with a super Artificial Intelligence, and it will just continue developing, etc. How? How will that AI work? We don't know. So how can we assume that it will increase exponentially in intelligence? Maybe it will be a service bot that gets joy from vacuuming the carpet.

The less said about America, China and Russia deploying Artificial Intelligences that could start world wars, the better.

He talks about the AI being an extension of ourselves, plugging into our brains, and sharing our values. He is making assumptions about the form the AI will take. Is it going to be like an embodied animal with drives and needs? Is it going to be part of a data farm, trawling through data from the Internet without actually understanding it? He just glosses over what he means by the AI's values. The fact is we can't know, because we just don't know what kind of AI is possible.

The limiting factor in all of this is our ability, as humans, to understand. It just won't ever be that fast. We can't simply create Artificial Intelligence from scratch. We are limited by our ability to measure the brain and to process that data. Just compare it to the effort that has gone into understanding the genome, and how many papers have to be written about a handful of genes without really even saying much.

Believe me, there are a whole load of problems in AI that we don't even know how to begin solving.

I really want to go into this in more detail, but there's just so much wrong with what Harris has said that it's too much for a single post.