RE: Artificial Intelligence
July 23, 2015 at 11:04 am
(This post was last modified: July 23, 2015 at 11:07 am by I_am_not_mafia.)
I was genuinely trying to explain rather than debate, but it seems you have been ascribing to me a position I don't actually hold, which is probably why what I said did not make sense to you.
(July 23, 2015 at 9:41 am)Rhythm Wrote: -You're aiming for a number "x", a measure of processing power whereby you feel that strong AI would be possible.
No, I am not; I never said that. I was trying to demonstrate that we don't have enough processing power, and that the more generalisable the adaptive solution, the more complex it will be and the more processing power it will require. I don't know how much processing power is required, but it's more than most people think. Not only that, we will also need to evolve our solutions (something I tried to demonstrate above), and the more complex our adaptive solutions are, the more processing is required to configure them.
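To make that concrete, here's a toy sketch (my own illustration, not anyone's actual proposal) of evolving a configuration. The "genome" stands in for the adaptive solution; the fitness function, population size and all the numbers are made-up assumptions chosen only to make the trend visible: as the solution gets bigger, the cost of configuring it by evolution grows with it.

```python
# Toy evolutionary search. Everything here (fitness, sizes, rates) is
# an illustrative assumption, not a claim about real adaptive systems.
import random
import time

def evaluate(genome):
    # Toy fitness: closeness to an arbitrary target vector. A real
    # adaptive system would need a far costlier evaluation than this.
    return -sum((g - 0.5) ** 2 for g in genome)

def evolve(genome_length, population=50, generations=100):
    pop = [[random.random() for _ in range(genome_length)]
           for _ in range(population)]
    for _ in range(generations):
        pop.sort(key=evaluate, reverse=True)   # best solutions first
        parents = pop[: population // 5]       # keep the top 20%
        # Offspring: mutated copies of randomly chosen parents.
        pop = [[g + random.gauss(0, 0.05) for g in random.choice(parents)]
               for _ in range(population)]
    return pop[0]

# Bigger genomes (a crude stand-in for more generalisable, more complex
# solutions) make every evaluation and every mutation more expensive --
# and in practice they'd also need more generations to converge.
for length in (10, 100, 1000):
    start = time.perf_counter()
    evolve(length)
    print(f"genome length {length:5d}: {time.perf_counter() - start:.2f}s")
```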
On top of that, we also need to understand what's going on, and that might be the greatest limiting factor. Understanding takes time, effort, and the ability to measure at ever finer detail; no amount of processing power will do that for us.
I am not saying that strong AI requires a certain amount of processing power, or that it needs to model what we see in the brain. I do personally believe that it needs to be wholly self-organising, though (I won't explain why, but there's a small sketch below of what I mean by the term). I was railing against a top-down approach where people point at something they have simulated and claim they have reproduced it. It's like drawing a picture of a house and pretending you have built a shelter: the function of each is completely different.
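Here's a minimal illustrative sketch of what "self-organising" can mean, as opposed to drawing the picture top-down. It uses Oja's Hebbian learning rule; the data and learning rate are assumptions I've picked just for the demo. Nothing in the update encodes the answer, yet the weights organise themselves around the dominant structure in the input.

```python
# Oja's rule: a purely local, self-organising update. The structure it
# finds (the data's principal direction) is discovered, not specified.
import random

random.seed(0)

# Correlated 2-D data: points scattered around the line y = 0.5 * x.
data = [(x, 0.5 * x + random.gauss(0, 0.2))
        for x in (random.gauss(0, 1) for _ in range(2000))]

w = [random.random(), random.random()]   # random initial weights
lr = 0.01                                # assumed learning rate

for x1, x2 in data:
    y = w[0] * x1 + w[1] * x2            # the neuron's output
    # Hebbian growth plus a local normalising term (Oja's rule).
    w[0] += lr * y * (x1 - y * w[0])
    w[1] += lr * y * (x2 - y * w[1])

# w converges toward the first principal component of the data --
# organisation that emerged from local rules, not a top-down script.
print(w, "slope:", w[1] / w[0])
```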
This all started with us disagreeing over whether the brain is inefficient and slow, and maybe this just comes down to semantics. My position all along has been that we don't actually know that for sure, and it's a difficult claim to substantiate. We may suspect it to be the case because of what we understand about evolution. We know that the brain is not optimal precisely because we are able to generalise and adapt, but that's not the same as it being inefficient and slow. Nor am I denying that there is some scope for redundancy. This is getting into the realms of complexity theory, though.