RE: Consciousness causes higher entropy compared to unconscious states in the human brain
January 31, 2018 at 5:07 pm
(This post was last modified: January 31, 2018 at 5:12 pm by uncool.)
(January 31, 2018 at 4:51 pm)polymath257 Wrote: Once again, using entropy for this model is just silly. You have an N value of 10,296 and a p value that is somewhere between 0 and 10,296. The computed entropy *only* depends on N and p, not on anything else. Computing the binomial coefficient at all (let alone taking its logarithm as a surrogate) is unnecessary. Just use the value p. Computing the entropy is silly.
The best approximation to the binomial distribution in this case is a normal distribution whose mean and standard deviation are given by the standard binomial formulas. This negates the whole reason they used entropy.
And what were the results? Ultimately that being awake means more active pairs. The connection to entropy of any sort (Shannon or thermodynamic) is a red herring.
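For concreteness, here is a minimal sketch (Python, my own illustration, not code from the paper; N = 10,296 is the pair count quoted above, everything else is assumed) of the point being made: log2 C(N, p) is computed from N and p alone, and for p ≤ N/2 it rises monotonically with p.

```python
# Minimal sketch (my illustration, not the paper's code): with N possible pairs
# and p "active" pairs, the quoted entropy log2(C(N, p)) depends on N and p
# alone, and for p <= N/2 it rises monotonically with p.
from math import lgamma, log, log2

N = 10_296  # number of possible channel pairs, taken from the quote above

def log2_binom(n: int, k: int) -> float:
    """Exact log2 of the binomial coefficient C(n, k), via log-gamma."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)) / log(2)

def entropy_bound(n: int, k: int) -> float:
    """Stirling-style upper bound: n * H2(k/n), the binary-entropy bound."""
    q = k / n
    return n * (-q * log2(q) - (1 - q) * log2(1 - q))

for p in (1000, 3000, 5148, 7000):
    print(p, round(log2_binom(N, p), 1), round(entropy_bound(N, p), 1))
```

Under these assumptions, ranking two recordings by this entropy is equivalent to ranking them by p (for p below N/2), which is the point the quote is making.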
- I don't see how it is "silly", given that Shannon entropy already provides a compressed measure of the data, and they wish to compress the problem space.
- Wikipedia/Entropy: "Entropy is one of several ways to measure diversity. Specifically, Shannon entropy is the logarithm of ¹D, the index with parameter equal to 1." (See the sketch after these replies.)
- And in the OP, I already mentioned that there are more "neuronal interactions" (aka active pairs) during the conscious state, so I don't know why you repeated that above!
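To make the diversity-index point concrete, a minimal sketch (Python, my own illustration; the example distribution is arbitrary) of the relation quoted from Wikipedia: the order-1 diversity ¹D is the exponential of the Shannon entropy, so reporting one is reporting the other.

```python
# Minimal sketch (my illustration) of the Wikipedia line quoted above:
# Shannon entropy H is the logarithm of the order-1 diversity index ¹D,
# i.e. ¹D = exp(H) when H is measured in nats.
from math import exp, log

def shannon_entropy(probs):
    """Shannon entropy in nats: H = -sum(p * ln p)."""
    return -sum(p * log(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]  # an arbitrary example distribution
H = shannon_entropy(probs)
D1 = exp(H)  # the "effective number" of equally likely states
print(f"H = {H:.3f} nats, 1D = exp(H) = {D1:.3f}")
```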
(January 31, 2018 at 4:51 pm)polymath257 Wrote: Your picture should look more like a Bell curve than a parabola. Think about why.
The pictures (and the spoiler) already looked like bell curves. What do you mean by your statement above?
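On the bell-curve-versus-parabola question, a minimal sketch (Python, my own illustration; N = 10,296 is taken from the earlier quote) of why the two shapes get conflated: the binomial coefficient C(N, p) is approximately bell-shaped in p, while its logarithm, which is what an entropy plot shows, is very nearly an inverted parabola around p = N/2.

```python
# Minimal sketch (my illustration): C(N, p) is approximately Gaussian in p
# (bell curve, sigma^2 = N/4), so its logarithm -- the plotted "entropy" --
# is, near the peak, an inverted parabola.
from math import lgamma, log

N = 10_296
LN2 = log(2)

def log2_binom(n, k):
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)) / LN2

peak = log2_binom(N, N // 2)
for p in (4148, 4648, 5148, 5648, 6148):
    exact = log2_binom(N, p)
    parabola = peak - 2 * (p - N / 2) ** 2 / (N * LN2)  # log2 of Gaussian fit
    print(p, round(exact, 1), round(parabola, 1))
```

Whether a given figure looks like a bell or a parabola therefore depends largely on whether the distribution itself or its logarithm is on the vertical axis.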
(January 31, 2018 at 6:09 am)Succubus Wrote: By brain power do you mean processing power? The machines ran off with that award many years ago.
Wrong. The human brain is still the most powerful and most energy-efficient cognitive machine on the planet.