RE: Consciousness causes higher entropy compared to unconscious states in the human brain
January 31, 2018 at 8:08 pm
(This post was last modified: January 31, 2018 at 8:30 pm by uncool.)
(January 31, 2018 at 7:03 pm)Succubus Wrote: Thank you for emphasizing my point.
No, I showed your point to be irrelevant. Recall that my initial words were that the AGI (which is not yet here) will be more powerful than the human mind.
This is why your point was irrelevant: it doesn't alter the validity of the words underlined in the OP, nor does it contribute anything novel to the topic at hand.
(January 31, 2018 at 6:09 pm)polymath257 Wrote: The 'problem space' needs no compression. 0<=p<=N. That's all. No approximations. No entropy needed. Nothing.
I don't know what your problem is with the OP, especially given the following:
- Your words about the channel-paradigm from the paper in reply 9: "The problem is that such a distribution has only one parameter (in this case the number of correlated channels) ...".
- Your words about the channel-paradigm from the paper in reply 13: "This is a one-parameter model".
- Additionally, the paper relies on the Stirling approximation, which is used in deriving the Shannon entropy measure.
- Furthermore, we know this about the Shannon entropy measure from Wikipedia/entropy...: "Shannon entropy is the logarithm of ¹D, the index with parameter equal to 1."
- In other words, the paper and the Shannon entropy measure are compatible; yet you have now admitted twice that the paper presents a "one parameter model", while overlooking that Shannon entropy is, as Wikipedia basically underlines, itself one-parameter aligned (the parameter-equal-to-1 case). (This is likely why you, ironically, claim the paper and Shannon entropy have no business being expressed together!) A short numerical sketch of both points follows below.
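To make the last two points concrete, here is a minimal sketch of my own (not taken from the paper; the values of N, n and the example distribution are arbitrary illustrations). It checks numerically that (1) Stirling's approximation ties the log of a binomial count to the Shannon entropy H(p), and (2) Shannon entropy is the natural log of the order-1 diversity index ¹D, i.e. the "parameter equal to 1" case.

[code]
# Minimal numerical sketch (my own illustration; N, n and the example
# distribution below are arbitrary, not values from the paper).
import math

# (1) Stirling's approximation: ln C(N, n) ~ N * H(n/N) for large N,
#     where H is the binary Shannon entropy in nats.
N, n = 1000, 300
p = n / N
H = -(p * math.log(p) + (1 - p) * math.log(1 - p))
ln_binom = math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)
print(ln_binom, N * H)   # agree to leading order; the gap is only O(log N)

# (2) Shannon entropy is the logarithm of the order-1 diversity index:
#     ¹D = exp(H), hence H = ln(¹D) -- the "parameter equal to 1" case.
probs = [0.5, 0.25, 0.125, 0.125]
H1 = -sum(q * math.log(q) for q in probs)
D1 = math.exp(H1)
print(H1, math.log(D1))  # identical by construction: H = ln(¹D)
[/code]

The first comparison agrees to leading order (the remaining gap is only a logarithmic correction), while the second identity is exact by the definition of ¹D.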