RE: Consciousness causes higher entropy compared to unconscious states in the human brain
January 31, 2018 at 4:29 pm
(This post was last modified: January 31, 2018 at 4:50 pm by uncool.)
(January 31, 2018 at 4:11 pm)polymath257 Wrote: What you quoted *doesn't* give the threshold for determining the matrix B, which replaces values below the threshold by 0 and above by 1 (which is, in and of itself, a very suspect practice). What value for the synchrony index was chosen for the threshold? They never say.
Sorry, but a sample size of 9 people doesn't demonstrate anything. Too much variation. It doesn't invalidate the results, but it doesn't lend them any support either. In other words, the paper is a null.
I didn't say that Shannon entropy prevents anything. It is simply irrelevant given the naiveté of the model used. This is a one-parameter model. The only thing relevant is the number of activated pairs (and there need be no actual connection to show up as 'connected': only correlation is used). The entropy is simply related to that number. But so are many, many other values, like the mean value.
The compression rationale only shows that the authors don't understand what they are doing. A binomial-distribution assumption (their C(N, p)) will have increasing 'entropy' until p = N/2 and decreasing entropy for larger values of p. This is trivially basic. Their obfuscation is noted.
Stirling's formula is reasonable to use in statistical mechanics, where N is on the order of Avogadro's number and the number of different states is similarly large. Here, the number of pairs is only 10,296 and the p value used is just the number of activated pairs. A simple normal distribution is both easier to use and more informative.
Yes, I am very aware of Stirling's formula and how it is used in statistical mechanics and information theory. In essence, the model in the paper is a two-state model (on, off) on the pairs.
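As an aside, the "one-parameter" point above is easy to make concrete (a hypothetical sketch, not code from the paper): once the synchrony matrix is thresholded to 0/1, both the entropy ln C(N, p) and the mean of the matrix entries are functions of p alone, the number of activated pairs.

```python
import math

# Hypothetical sketch of the thresholded two-state pair model under discussion.
# With N possible pairs and p of them 'on', the microstate count is C(N, p),
# so the entropy ln C(N, p), like the mean, depends only on p, not on which
# particular pairs are active.
N = 10296        # pairwise channels, C(144, 2)
p = 2000         # example count of activated pairs

# lgamma keeps everything in log space (the same job Stirling's formula does).
entropy = math.lgamma(N + 1) - math.lgamma(p + 1) - math.lgamma(N - p + 1)
mean = p / N     # mean of the 0/1 matrix entries
print(entropy, mean)
```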
- The "10 phase-randomised surrogates per original channel/signal" thresholding paradigm still yielded larger entropy values for conscious states than for unconscious states.
- Reference-A: "It turns out that the value of the magnitude of synchrony of the surrogates is close to the one for the aforementioned baseline chosen, so the results do not vary; nevertheless this new method still assigns the largest entropy to the random signals (surrogates), so there is still the assumption that the average synchrony of the stochastic signals is a good approximation to define connections among brain networks."
- The sample size of nine people reasonably supports their work because, as far as I detect (wrt the evidence), biological brains don't vary much in the overall number of connections that occur. This means the results are likely generalizable.
- You did indeed claim that the use of Shannon entropy (instead of thermodynamic entropy), together with a simplistic model, meant the result did not obtain as their paper claims. The whole point of the Stirling approximation was to simplify their model beyond the initial scope of dense input points, so it's no surprise that the model turned out to be "simplistic".
- Reference-B: "However, the estimation of C (the combinations of connections between diverse signals), is not feasible due to the large number of sensors; for example, for 144 sensors, the total possible number of pairwise connections is C(144, 2) = 10296, then if we find in the experiment that, say, 2000 pairs are connected, the computation of C(10296, 2000) has too large numbers for numerical manipulations, as they cannot be represented as conventional floating point values in, for instance, MATLAB."
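The overflow claim in that quote is straightforward to check (a hypothetical illustration; the paper reportedly used MATLAB, but Python makes the same point): the coefficient C(10296, 2000) is an integer with roughly 2,200 digits, far beyond the ~1.8e308 ceiling of IEEE-754 doubles, while its logarithm, which is all the entropy formula actually needs, is an ordinary float.

```python
import math

N, k = 10296, 2000
C = math.comb(N, k)           # exact, thanks to Python's big integers (3.8+)
print(len(str(C)))            # roughly 2200 digits: no double can hold this

# Working in log space (Stirling's formula, or lgamma) avoids the overflow:
log_C = math.lgamma(N + 1) - math.lgamma(k + 1) - math.lgamma(N - k + 1)
print(log_C)                  # an ordinary float

try:
    float(C)                  # this is the 'numerical manipulation' that fails
except OverflowError:
    print("C(10296, 2000) overflows a float")
```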
- Furthermore, the use of Shannon entropy does not inhibit comparing entropy between conscious and unconscious states. You claimed the paper failed to make that comparison, supposedly because of its use of Shannon entropy and the model's simplicity, yet the entire point of the Shannon-entropy formalism was to simplify the initial input space.
- I detect that the following addresses your concern regarding supposed obfuscation: "...the top of the curve representing more possible combinations to handle information/energy exchanges. On the other hand, in the extremes of this curve we find fewer microstates, thus these are not optimal situations to process the many microstates in the environment. The key then is not to reach the maximum number of units interacting (which would be all-to-all connections and thus only one possible microstate), but rather the largest possible number of configurations allowed by the constraints."
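That rationale is also easy to verify numerically (again a hypothetical sketch, not the authors' code): with all-to-all connections there is exactly one microstate, C(N, N) = 1, so the entropy is zero, whereas half-active connectivity maximizes ln C(N, p).

```python
import math

def log_microstates(N: int, p: int) -> float:
    """Natural log of C(N, p), the microstate count with p of N pairs active."""
    return math.lgamma(N + 1) - math.lgamma(p + 1) - math.lgamma(N - p + 1)

N = 10296
print(log_microstates(N, N))        # all-to-all: ln C(N, N) = ln 1 = 0
print(log_microstates(N, N // 2))   # the largest value, at p = N/2
peak = max(range(N + 1), key=lambda p: log_microstates(N, p))
print(peak == N // 2)               # True: entropy peaks at half connectivity
```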