Sorry I missed your reply. I'll split it over two posts, one for each subject.
(July 14, 2015 at 9:26 am)Rhythm Wrote: Human intelligence doesn't -require- the scale of architecture we're born with, we can do a great deal with much less (and a great deal less than that would satisfy any useful definition of intelligence). We -know- that biological implementations are inefficient, wasteful, and relatively slow.
(July 17, 2015 at 3:16 pm)Rhythm Wrote:
(July 17, 2015 at 2:47 pm)I_am_not_mafia Wrote: We don't know that.
Yes, we do. If you -purpose built- a human being, you would leave out vestigials, and you would scale the human being appropriately - you might rearrange few bits while you were at it. Because we are -evolved-...rather than designed, no such consideration was made. That it takes longer to evolve a brain that it does to build a better chipset is -fairly- well evidenced.......
In terms of wattage the brain's processing power is extremely efficient: an equivalent supercomputer would require megawatts of electricity, whereas the brain runs on roughly 20 watts, a difference of four to five orders of magnitude.
One problem here is equivocation over the term 'efficiency'. One of the first things drilled into my head when I started studying computer science is that you never call something 'efficient' on its own; you always say what it is efficient with respect to. Efficient in terms of speed? Memory usage? Power consumption?
When writing even a simple computer program you have to decide which kind of efficiency to prioritise. Take, for example, a program that needs to return a random prime number from a pool of 50 million of them. If you are prioritising speed (efficiency over time), you would calculate all of them up front and store them in memory, so each request is just a lookup. If memory were the most precious resource and you had plenty of time to spare, you would instead pick a random number between 1 and 50 million and calculate primes until you had generated that many, as in the rough sketch below.
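Here is a minimal Python sketch of that trade-off, just to make it concrete. The 50 million figure is from above; the function names and the use of sympy's nextprime are purely illustrative choices on my part.

```python
# Toy sketch of the two ways of prioritising efficiency described above.
import random
from sympy import nextprime  # any prime generator would do here

POOL_SIZE = 50_000_000  # number of primes in the pool

# Efficient over time: pay the memory cost once up front, then every
# request is a constant-time lookup into the stored pool.
def build_prime_pool(n=POOL_SIZE):
    primes, p = [], 1
    for _ in range(n):
        p = nextprime(p)
        primes.append(p)
    return primes  # tens of millions of integers held in memory

def random_prime_fast(pool):
    return random.choice(pool)

# Efficient over memory: store almost nothing, but pay in time by
# generating primes from scratch until we reach a random index.
def random_prime_low_memory(n=POOL_SIZE):
    target = random.randint(1, n)
    p = 1
    for _ in range(target):
        p = nextprime(p)
    return p
```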
I get your point about evolution. There is always 'stuff' left over that gets labelled junk, except that, understood within a wider context, it rarely is. Take 'junk DNA'. Setting aside for the moment the dispute over whether it really is junk, at the very least it widens the search space during an evolutionary run (see the neutral theory of molecular evolution on this point).
Yes, of course the brain could be made more efficient at what it currently does, but there are two things to remember. First, that is not the same as being "inefficient, wasteful, and relatively slow" (relatively slow compared to what?). Second, if you could hack the brain and make it more efficient, you might well lose generality. Where there is a cost, evolution either re-uses what it already has or gets rid of it. Take glial cells, for example. They provide structure and maintain neurons, but over the last few decades neuroscience has gone through repeated cycles of theorising about whether they do processing themselves. Just because they provide structure to the brain does not mean they cannot perform other functions as well.
(July 14, 2015 at 9:26 am)Rhythm Wrote: Nevertheless, energy must be supplied (and in the case of our brains a fairly robust amount of chemical inputs in addition - though we could boil that back down to energy as well, sure), and living organisms have shit conversion....just atrocious.
(July 14, 2015 at 9:26 am)Rhythm Wrote: That's why we use machines in the first place. More work over less time for a smaller amount of energy put in. Imagine how many bowls of rice it would take, for example, if the internet where a room full of chinese people doing all of this on pen and paper (and supposing we could actually get that much work out of them.
But you aren't comparing like for like. Those Chinese workers are capable of far more than the calculations you can run on your computer. If we took a single brain, broke it apart, and could engineer all those neurons and axons to perform specific functions with the same ease with which we can write computer programs, there would be more than enough material to work with, and it would be more power-efficient. A single living neuron has more computational power than an entire classical artificial neural network.
What you find again and again in Artificial Intelligence is that you can make serious progress with very little effort, but getting it to scale becomes exponentially harder the more generic you want it to be. This is because the number of situations any AI has to deal with grows exponentially, and it is also why the field of classical AI failed. The toy example below shows how quickly that growth gets out of hand.
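As a rough illustration of my own (not something from the thread): treat the world as described by some number of independent binary features, and count how many distinct situations an agent might have to handle.

```python
# Toy illustration (my own example): if an agent's world is described by
# n independent binary features, the number of distinct situations it may
# need to handle doubles with every feature you add.
for n_features in (10, 20, 30, 40):
    print(f"{n_features} features -> {2 ** n_features:,} possible situations")
# 40 binary features already give over a trillion distinct situations.
```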
Your point about energy conversion applies to other methods of generating electricity too. How much wind energy is lost in a turbine? How much heat is wasted when burning coal? Nuclear power is probably the most efficient, I suppose, but now we're getting off subject. This is a thread about artificial general intelligence, which by definition is about how generalisable an adaptive system is.
(July 14, 2015 at 9:26 am)Rhythm Wrote: NS won't actually weed out a big brain just because it isn't fully leveraged. It would only weed out a big brain (in favor of a smaller one, for example) if that big brain was failing to deliver.
Also because there is a cost to having a big brain: greater metabolic cost and higher maternal mortality, for example. And there is an evolutionary advantage to making better use of what you already have. Because there is no end point with evolution there is no such thing as an optimal solution, but I disagree that we know the brain is inefficient. We may of course suspect that it is, and it may indeed be so, but we don't actually know until we have come up with a better solution ourselves. That is actually what I am trying to do. In my own research I am developing a wholly new architecture as an alternative to biologically plausible neural networks. I hoped to get some kind of efficiency boost out of it, but the main motivation was that the thing most limiting progress is our understanding of how to engineer neural networks. As it turns out, the gains so far have been in memory; I haven't yet compared the two approaches in terms of speed.