RE: Consciousness causes higher entropy compared to unconscious states in the human brain
January 31, 2018 at 4:11 pm
What you quoted *doesn't* give the threshold used to determine the matrix B, which replaces values below the threshold with 0 and values above it with 1 (in and of itself a very suspect practice). What value of the synchrony index was chosen as the threshold? They never say.
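To illustrate why the unreported threshold matters, here is a minimal sketch (the variable names, the uniform synchrony values, and the cutoffs are all my own placeholders; 144 channels is what gives their 10,296 pairs):

```python
import numpy as np

# Hypothetical sketch of the binarization step; the synchrony values,
# names, and cutoffs are my placeholders -- the paper never reports
# the threshold it actually used.
rng = np.random.default_rng(0)
n_channels = 144                        # C(144, 2) = 10,296 pairs
R = rng.uniform(size=(n_channels, n_channels))
R = (R + R.T) / 2                       # a synchrony index is symmetric

iu = np.triu_indices(n_channels, k=1)   # the 10,296 distinct pairs
for threshold in (0.3, 0.5, 0.7):
    B = (R > threshold).astype(int)     # below -> 0, above -> 1
    print(f"threshold={threshold}: activated pairs = {B[iu].sum()} / {iu[0].size}")
```

Everything downstream depends only on the count of 'activated' pairs, and that count moves directly with wherever the cutoff happens to sit.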
Sorry, but a sample size of 9 people doesn't demonstrate anything. Too much variation. It doesn't invalidate the results, but it doesn't lend them any support either. In other words, the paper is a null.
I didn't say that Shannon entropy prevents anything. It is simply irrelevant given the naiveté of the model used. This is a one-parameter model: the only thing that matters is the number of activated pairs (and there need be no actual connection for a pair to show up as 'connected'; only correlation is used). The entropy is simply a function of that number. But so are many, many other statistics, like the mean.
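To make that concrete: with $N$ pairs of which $p$ are 'on', the model's entropy is just the log of a binomial coefficient, so it is pinned down by $p$ alone, exactly like the mean:

$$ S = \ln \binom{N}{p}, \qquad \bar{x} = \frac{p}{N}. $$

Neither carries any information that the raw count $p$ does not.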
The compression rationale only shows that the authors don't understand what they are doing. A binomial-coefficient model (their $\binom{N}{p}$) has increasing 'entropy' up to p = N/2 and decreasing entropy for larger values of p. This is trivially basic. Their obfuscation is noted.
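The peak follows in one line from Stirling's approximation itself:

$$ \ln \binom{N}{p} \approx N\,H\!\left(\tfrac{p}{N}\right), \qquad H(q) = -q\ln q - (1-q)\ln(1-q), $$

and the binary entropy $H$ is maximized at $q = 1/2$, i.e. at $p = N/2$, falling off symmetrically on either side.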
Stirling's formula is reasonable to use in statistical mechanics, where N is on the order of Avogadro's number and the number of distinct states is similarly large. Here the number of pairs is only 10,296, and the p value is just the number of activated pairs. A simple normal distribution is both easier to use and more informative.
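In fact, at N = 10,296 Stirling isn't even needed: the exact value is one `lgamma` call away. A quick sketch, using only the standard library, of how little the approximation buys at this scale:

```python
import math

# Exact log binomial coefficient via log-gamma: cheap and accurate
# to machine precision at N = 10,296, so no approximation is needed.
def log_binom_exact(N, p):
    return math.lgamma(N + 1) - math.lgamma(p + 1) - math.lgamma(N - p + 1)

# Stirling-based approximation: ln C(N, p) ~ N * H(p/N).
def log_binom_stirling(N, p):
    q = p / N
    return -N * (q * math.log(q) + (1 - q) * math.log(1 - q))

N = 10_296                  # number of channel pairs in the paper
for p in (500, 2_000, 5_148):
    exact = log_binom_exact(N, p)
    approx = log_binom_stirling(N, p)
    print(f"p={p:>5}: exact={exact:10.2f}  stirling={approx:10.2f}  "
          f"rel. err={abs(approx - exact) / exact:.2e}")
```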
Yes, I am very aware of Stirling's formula and how it is used in statistical mechanics and information theory. In essence, the model in the paper is a two-state model (on, off) on the pairs.