Consciousness causes higher entropy compared to unconscious states in the human brain
#11
RE: Consciousness causes higher entropy compared to unconscious states in the human brain
(January 31, 2018 at 10:55 am)polymath257 Wrote: OK, I looked at the original article (in arxiv.org) and there is even less there than I expected. MANY basic problems in this article.

First, there were only *9* people tested. There is NOTHING that can be done statistically with a study of 9 people. I could stop there, but the problems keep going.

They determined if different channels on EEGs were 'connected' by whether the correlations met a certain 'threshold', but that threshold was never given explicitly. Then, if the values exceeded that threshold, they were set to 1 and otherwise set to 0.

Next, they use an *incredibly* simplistic model for the 'complexity', essentially that of a binomial distribution. The problem is that such a distribution has only one parameter (in this case the number of correlated channels) and the characteristics are such that 'more entropy' simply means 'more connections active' (unless more than half of the channels are correlated).

So, their ultimate 'result' is that there are more active connections when someone is awake than when they are asleep or in a coma.

Since they use Shannon entropy instead of thermodynamic entropy, and since their actual model is so simplistic, the claim that entropy 'causes higher consciousness' is just not supported by this study.

TL;DR: Their experiment uses too few people and is based on a model that simply says that people who are awake have more active brains.

The connection to entropy is, truthfully, completely bogus.
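To make the criticism above concrete, here is a minimal Python sketch (not the paper's code) of the pipeline being described: correlate channels pairwise, binarise at a threshold, count the "connected" pairs, and take the log of a binomial coefficient as the "entropy". The 16 channels of random data and the 0.5 threshold are illustrative assumptions only, since, as noted above, the paper never states its threshold.

[code]
# Illustrative sketch of the criticised pipeline (made-up data, assumed threshold).
import math
import numpy as np

rng = np.random.default_rng(0)
channels = rng.standard_normal((16, 1000))   # 16 hypothetical EEG channels, 1000 samples each

corr = np.abs(np.corrcoef(channels))         # pairwise correlation magnitudes
threshold = 0.5                              # assumed value; the paper never gives one
iu = np.triu_indices_from(corr, k=1)         # each channel pair counted once
B = (corr[iu] > threshold).astype(int)       # binarised "connectivity": 1 above threshold, else 0

N = B.size                                   # total number of pairs: C(16, 2) = 120
p = int(B.sum())                             # number of "connected" pairs
entropy_like = math.log(math.comb(N, p))     # ln C(N, p) -- depends only on N and p

print(N, p, entropy_like)
[/code]

Whatever the data, the final number depends only on N and the count p, which is exactly the one-parameter point being made above.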

Y'know, I'm glad you're around, Polymath. Otherwise we couldn't tell a legitimate scientific study from the babbling of nine undergrads with an EEG. A completely naturalistic explanation for consciousness/qualia is such a holy grail that I can't help entertaining any proposed explanation of it.
#12
RE: Consciousness causes higher entropy compared to unconscious states in the human brain
(January 31, 2018 at 10:55 am)polymath257 Wrote: OK, I looked at the original article (in arxiv.org) and there is even less there than I expected. MANY basic problems in this article.

First, there were only *9* people tested. There is NOTHING that can be done statistically with a study of 9 people. I could stop there, but the problems keep going.

They determined if different channels on EEGs were 'connected' by whether the correlations met a certain 'threshold', but that threshold was never given explicitly. Then, if the values exceeded that threshold, they were set to 1 and otherwise set to 0.

Next, they use an *incredibly* simplistic model for the 'complexity', essentially that of a binomial distribution. The problem is that such a distribution has only one parameter (in this case the number of correlated channels) and the characteristics are such that 'more entropy' simply means 'more connections active' (unless more than half of the channels are correlated).

So, their ultimate 'result' is that there are more active connections when someone is awake than when they are asleep or in a coma.

Since they use Shannon entropy instead of thermodynamic entropy, and since their actual model is so simplistic, the claim that entropy 'causes higher consciousness' is just not supported by this study.

TL;DR: Their experiment uses too few people and is based on a model that simply says that people who are awake have more active brains.

The connection to entropy is, truthfully, completely bogus.

Your criticism is quite easily shown to be bunk, for the following reasons:
  • Contrary to your claim, the paper never says that "entropy causes higher consciousness". In fact, the word "cause" can't be found in the paper!
  • Additionally, contrary to your claim, the threshold was explicitly mentioned.

  • Reference-A, Quote from paper, describing threshold: "We tried another less prejudiced method, using surrogates of the original signals, and then computing the average synchrony index among the surrogate population (10 phase-randomised surrogates per original channel/signal)."
  • Biological human brains are generally expected to behave in quite similar ways, so the fact that only 9 people were examined does not suddenly invalidate the results.
  • Shannon entropy does not prevent the measurement of the difference between conscious and unconscious states. (As indicated by the authors, Shannon entropy was used to circumvent the enormous values in the EEG results. It is typical in programming to use approximations or to compress the input space!)
  • Reference-B, Quote from paper, describing the compression rationale (see the sketch at the end of this post): "However, the estimation of C (the combinations of connections between diverse signals), is not feasible due to the large number of sensors; for example, for 35 sensors, the total possible number of pairwise connections is C(144, 2) = 10296, then if we find in the experiment that, say, 2000 pairs are connected, the computation of C(10296, 2000) has too large numbers for numerical manipulations, as they cannot be represented as conventional floating point values in, for instance, MATLAB. To overcome this difficulty, we used the well-known Stirling approximation for large n: ln(n!) = n ln(n) - n".
  • Reference-C, Showing that Shannon Entropy (i.e. a compressed measurement) does not prevent comparison of entropy in systems: https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory

My question to you is:
  • Why do you personally feel (contrary to the evidence) that the Shannon entropy measure prevents the measurement of conscious vs. unconscious states in the brain? And don't you realize that it is typical in programming to encode approximations of dense input spaces?
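For what it's worth, here is a small Python sketch (not the paper's MATLAB code) of the Stirling step quoted in Reference-B: ln(n!) ≈ n ln(n) - n lets you evaluate ln C(10296, 2000) even though C(10296, 2000) itself is far too large for ordinary floating point.

[code]
# Sketch of the Stirling approximation quoted in Reference-B (illustrative only).
import math

def ln_binom_exact(N, k):
    # ln C(N, k) via the log-gamma function, which never overflows
    return math.lgamma(N + 1) - math.lgamma(k + 1) - math.lgamma(N - k + 1)

def ln_binom_stirling(N, k):
    # ln C(N, k) using ln(n!) ~ n*ln(n) - n for each factorial
    return N * math.log(N) - k * math.log(k) - (N - k) * math.log(N - k)

N, k = 10296, 2000                 # the numbers quoted from the paper
print(ln_binom_exact(N, k))        # exact log-binomial
print(ln_binom_stirling(N, k))     # Stirling estimate; differs by well under 1% here
[/code]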
#13
RE: Consciousness causes higher entropy compared to unconscious states in the human brain
What you quoted *doesn't* give the threshold for determining the matrix B, which replaces values below the threshold by 0 and above by 1 (which is, in and of itself, a very suspect practice). What value for the synchrony index was chosen for the threshold? They never say.

Sorry, but a sample size of 9 people doesn't demonstrate anything. Too much variation. It doesn't invalidate the results, but it doesn't lend them any support either. In other words, the paper is a null.

I didn't say that Shannon entropy prevents anything. It is simply irrelevant given the naiveté of the model used. This is a one-parameter model. The only thing relevant is the number of activated pairs (and there need be no actual connection to show up as 'connected': only correlation is used). The entropy is simply related to that number. But so are many, many other values, like the mean value.

The compression rationale only shows that the authors don't understand what they are doing. A binomial distribution assumption (their C(N, p)) will have increasing 'entropy' until p = N/2 and decreasing entropy for larger values of p. This is trivially basic. Their obfuscation is noted.

Stirling's formula is reasonable to use in statistical mechanics, where the N value is on the order of Avogadro's number and the number of different states is similarly large. Here, the number of pairs is only 10,296 and the p value used is just the number of activated pairs. A simple normal distribution is both easier to use and more informative.

Yes, I am very aware of Stirling's formula and how it is used in statistical mechanics and information theory. In essence, the model in the paper is a two-state model (on, off) on the pairs.
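As a quick numerical check of the two-state point (a Python sketch, not anything from the paper): with N = 10296 pairs, ln C(N, p) rises monotonically as the number of active pairs p grows, right up to p = N/2. So, below that point, ranking recordings by this "entropy" is exactly the same as ranking them by p.

[code]
# Check that ln C(N, p) is monotone increasing in p up to p = N/2 (illustrative).
import math

N = 10296
ln_C = [math.lgamma(N + 1) - math.lgamma(p + 1) - math.lgamma(N - p + 1)
        for p in range(N + 1)]

peak = max(range(N + 1), key=lambda p: ln_C[p])
print(peak)                                             # N // 2 = 5148
print(all(ln_C[p] < ln_C[p + 1] for p in range(peak)))  # True: strictly increasing up to the peak
[/code]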
#14
RE: Consciousness causes higher entropy compared to unconscious states in the human brain
(January 31, 2018 at 4:11 pm)polymath257 Wrote: What you quoted *doesn't* give the threshold for determining the matrix B, which replaces values below the threshold by 0 and above by 1 (which is, in and of itself, a very suspect practice). What value for the synchrony index was chosen for the threshold? They never say.

Sorry, but a sample size of 9 people doesn't demonstrate anything. Too much variation. It doesn't invalidate the results, but it doesn't lend them any support either. In other words, the paper is a null.

I didn't say that Shannon entropy prevents anything. It is simply irrelevant given the naiveté of the model used. This is a one-parameter model. The only thing relevant is the number of activated pairs (and there need be no actual connection to show up as 'connected': only correlation is used). The entropy is simply related to that number. But so are many, many other values, like the mean value.

The compression rationale only shows that the authors don't understand what they are doing. A binomial distribution assumption (their C(N, p)) will have increasing 'entropy' until p = N/2 and decreasing entropy for larger values of p. This is trivially basic. Their obfuscation is noted.

Stirling's formula is reasonable to use in statistical mechanics, where the N value is on the order of Avogadro's number and the number of different states is similarly large. Here, the number of pairs is only 10,296 and the p value used is just the number of activated pairs. A simple normal distribution is both easier to use and more informative.

Yes, I am very aware of Stirling's formula and how it is used in statistical mechanics and information theory. In essence, the model in the paper is a two-state model (on, off) on the pairs.
  • The "10 phase-randomised surrogates per original channel/signal" threshold paradigm still yielded larger values of entropy for conscious states compared to unconscious states.
  • Reference-A: "It turns out that the value of the magnitude of synchrony of the surrogates is close to the one for the aforementioned baseline chosen, so the results do not vary; nevertheless this new method still assigns the largest entropy to the random signals (surrogates), so there is still the assumption that the average synchrony of the stochastic signals is a good approximation to define connections among brain networks."
  • The sample size of nine people reasonably supports their work because, as far as I can detect from the evidence, biological brains don't vary much in the general number of connections that occur. This means the results are likely generalizable.
  • You did indeed say that the use of Shannon entropy (instead of thermodynamic entropy), together with a simplistic model, means the paper's claim does not hold. The whole point of the Stirling approximation was to simplify their model beyond the initial scope of dense input points, so it's no surprise that their model turned out to be "simplistic".
  • Reference-B: "However, the estimation of C (the combinations of connections between diverse signals), is not feasible due to the large number of sensors; for example, for 35 sensors, the total possible number of pairwise connections is C(144, 2) = 10296, then if we find in the experiment that, say, 2000 pairs are connected, the computation of C(10296, 2000) has too large numbers for numerical manipulations, as they cannot be represented as conventional floating point values in, for instance, MATLAB."
  • Furthermore, the usage of Shannon entropy does not inhibit the comparison of entropy between conscious and unconscious states. You claimed the paper fails to support that comparison, supposedly because it uses Shannon entropy and a simplistic model, even though the entire point of the Shannon entropy formalism was to simplify the initial input space.
  • I detect that the following addresses your concern regarding supposed obfuscation: "...the top of the curve representing more possible combinations to handle information/energy exchanges. On the other hand, in the extremes of this curve we find fewer microstates, thus these are not optimal situations to process the many microstates in the environment. The key then is not to reach the maximum number of units interacting (which would be all-to-all connections and thus only one possible microstate), but rather the largest possible number of configurations allowed by the constraints."
  • [Image: vAxbUnc.png]


#15
RE: Consciousness causes higher entropy compared to unconscious states in the human brain
Popcorn
#16
RE: Consciousness causes higher entropy compared to unconscious states in the human brain
Once again, using entropy for this model is just silly. You have an N value of 10,296 and a p value that is somewhere between 0 and 10,296. The computed entropy *only* depends on N and p, not on anything else. There is no point in computing the binomial coefficient at all, let alone taking its logarithm: just use the value p. Computing the entropy is silly.

The best approximation to the binomial distribution in this case is a normal distribution, with mean and standard deviation given by the standard formulas for binomial distributions. This negates the whole reason they used entropy.

And what were the results? Ultimately that being awake means more active pairs. The connection to entropy of any sort (Shannon or thermodynamic) is a red herring.

Your picture should look more like a Bell curve than a parabola. Think about why.
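To illustrate the normal-approximation remark (again a Python sketch, not the paper's code): by the de Moivre-Laplace theorem, the microstate count C(N, p), viewed as a function of p, is matched very closely near its peak by 2^N times a Gaussian with mean N/2 and standard deviation sqrt(N)/2, i.e. a sharply peaked bell curve rather than a parabola.

[code]
# Compare ln C(N, p) with the log of 2^N * Normal(N/2, sqrt(N)/2).pdf(p) near the peak.
import math

N = 10296
mu = N / 2
sigma = math.sqrt(N) / 2

def ln_C(p):
    return math.lgamma(N + 1) - math.lgamma(p + 1) - math.lgamma(N - p + 1)

def ln_gauss(p):
    return (N * math.log(2)
            - 0.5 * math.log(2 * math.pi * sigma ** 2)
            - (p - mu) ** 2 / (2 * sigma ** 2))

for x in (0, 1, 2):                      # at the peak, then one and two standard deviations out
    p = int(mu + x * sigma)
    print(p, ln_C(p), ln_gauss(p))       # the two columns agree closely
[/code]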
#17
RE: Consciousness causes higher entropy compared to unconscious states in the human brain
(January 31, 2018 at 4:51 pm)polymath257 Wrote: Once again, using entropy for this model is just silly. You have an N value of 10,296 and a p value that is somewhere between 0 and 10,296. The computed entropy *only* depends on N and p, not on anything else. There is no point in computing the binomial coefficient at all, let alone taking its logarithm: just use the value p. Computing the entropy is silly.

The best approximation to the binomial distribution in this case is a normal distribution, with mean and standard deviation given by the standard formulas for binomial distributions. This negates the whole reason they used entropy.

And what were the results? Ultimately that being awake means more active pairs. The connection to entropy of any sort (Shannon or thermodynamic) is a red herring.
  • I don't detect it is "silly", given that Shannon entropy already affords compressed measurements, and they wish to compress the problem space.
  • Wikipedia/Entropy: "Entropy is one of several ways to measure diversity. Specifically, Shannon entropy is the logarithm of ¹D, the true diversity index with parameter equal to 1." (See the sketch below.)
  • And in the OP, I already mentioned that there are more "neuronal interactions" (aka active pairs) during conscious state, so I don't know why you repeated that above!
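(As a tiny illustration of the Wikipedia line quoted above, with made-up probabilities: the order-1 diversity index is ¹D = exp(H), so the Shannon entropy H is literally the logarithm of ¹D.)

[code]
# Shannon entropy H is the log of the order-1 diversity (Hill) number, ^1D = exp(H).
import math

probs = [0.5, 0.25, 0.125, 0.125]            # hypothetical distribution
H = -sum(q * math.log(q) for q in probs)     # Shannon entropy in nats
D1 = math.exp(H)                             # order-1 diversity index
print(H, D1, math.isclose(math.log(D1), H))  # True: H equals ln(^1D)
[/code]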

(January 31, 2018 at 4:51 pm)polymath257 Wrote: Your picture should look more like a Bell curve than a parabola. Think about why.

The pictures (and the spoiler) already looked like bell curves. What do you mean by your statement above?

(January 31, 2018 at 6:09 am)Succubus Wrote: By brain power do you mean processing power? The machines ran off with that award many years ago. 

Wrong. The human brain is currently the most powerful, most efficient cognitive machine on the planet.
#18
RE: Consciousness causes higher entropy compared to unconscious states in the human brain
The 'problem space' needs no compression. 0<=p<=N. That's all. No approximations. No entropy needed. Nothing.
#19
RE: Consciousness causes higher entropy compared to unconscious states in the human brain
(January 31, 2018 at 5:07 pm)uncool Wrote:
(January 31, 2018 at 6:09 am)Succubus Wrote: By brain power do you mean processing power? The machines ran off with that award many years ago. 

Wrong. The human brain is currently the most powerful, most efficient cognitive machine on the planet.

Thank you for emphasizing my point.
It's amazing 'science' always seems to 'find' whatever it is funded for, and never the opposite. Drich.
#20
RE: Consciousness causes higher entropy compared to unconscious states in the human brain
(January 31, 2018 at 7:03 pm)Succubus Wrote: Thank you for emphasizing my point.

No, I showed your point to be irrelevant. Recall that my initial words were that the AGI (which is not yet here) will be more powerful than the human mind.

This is why your point was irrelevant: it doesn't alter the validity of the words underlined in the OP, nor contribute anything novel to the topic at hand.

(January 31, 2018 at 6:09 pm)polymath257 Wrote: The 'problem space' needs no compression. 0<=p<=N. That's all. No approximations. No entropy needed. Nothing.

I don't know what your problem is with the OP, especially given the following:
  • First, the paper deals with the Stirling approximation, which concerns the Shannon entropy measure.
  • Furthermore, we know this about the Shannon entropy measure from Wikipedia/Entropy: "Shannon entropy is the logarithm of ¹D, the true diversity index with parameter equal to 1."
  • In other words, the paper and the Shannon entropy measure are compatible; you have said twice now that the paper uses a "one-parameter model", while overlooking that Shannon entropy, as the Wikipedia line above underlines, is itself tied to a single parameter. (This is likely why you ironically claim the paper and Shannon entropy have no business being expressed together!)
People may not want to "kudos" my response, and may instead kudos your false responses, but we also know scientific evidence doesn't care about people's feelings!