(March 29, 2016 at 11:56 am)Esquilax Wrote: "information" is not some inherent property or quantity of an object; it's a post hoc meaning applied after a string of potentially unguided characters or processes is interpreted by a mind and scanned for repeating patterns and so on.

Information is an inherent property of DNA. In fact, the information stored in DNA is enormous. It makes no difference whether a third party is aware of it or not; that is irrelevant. The ability of DNA to self-diagnose a fault and then apply a correction is certainly evidence of inherent information.
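To put a rough number on "enormous", here is a back-of-the-envelope sketch in Python. The figures are my own assumptions, not from either poster: the commonly cited ~3.2 billion base pairs for the human genome, and the theoretical maximum of two bits per base. Actual information content is lower, since genome sequences are not random strings.

```python
import math

# Each DNA base is one of four symbols (A, C, G, T), so it can carry
# at most log2(4) = 2 bits of information.
BITS_PER_BASE = math.log2(4)

# Commonly cited size of the human genome (~3.2 billion base pairs).
GENOME_BASE_PAIRS = 3.2e9

total_bits = GENOME_BASE_PAIRS * BITS_PER_BASE
total_bytes = total_bits / 8

print(f"Upper bound: {total_bits:.2e} bits (~{total_bytes / 1e6:.0f} MB)")
# -> Upper bound: 6.40e+09 bits (~800 MB)
```

Note this is only a storage-capacity bound; it says nothing by itself about how the sequence arose.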
(March 29, 2016 at 11:56 am)Esquilax Wrote:

Quote: Given that living systems with DNA have shown a progressive and very large increase in information, we can say that evolution requires entropy to be reversed in a step-by-step fashion over millions of years.

... Except that now we're talking about a definition of entropy that is not physically impossible, and is hence not an argument against evolution. In fact, I'm beginning to suspect you have no idea what either definition of entropy means, because the fucking Wikipedia article provides an absolutely mundane example of a decrease in informational entropy right there in its introduction.

But we aren't talking about a small reversal in disorder when it comes to DNA. We are talking about going from nothing living to the human genetic code, which is roughly 3 billion base pairs (on the order of 6 billion bits) long. This represents a massive reversal of disorder, against all odds.
(March 29, 2016 at 11:56 am)Esquilax Wrote: Entropy in information theory refers to uncertainty in deriving predictions about information. It fully accepts as a foundational principle that information can be naturally occurring, and that the entropy inherent in that information can rise and fall depending on numerous factors influencing the results: the person who coined the usage and came up with its measurements used coin tosses as the primary unit of measurement (are you saying that the outcomes of coin tosses are intelligently designed, because in this context they have information?) and demonstrated how entropy within that setup can both increase and decrease. In the case of a coin toss, making entropy decrease is as simple as weighting the coin such that one outcome is more probable than the other, decreasing the level of uncertainty in predicting the outcome of any given toss.

But this requires intelligent input, doesn't it? In evolution there is no intelligent input, because that would indicate design, and we can't have that, can we?
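The coin-toss point is easy to check numerically. Below is a minimal Python sketch (my illustration, not from the thread) of Shannon's entropy formula for a single toss; as the coin is weighted further from fair, the entropy per toss falls, exactly as described.

```python
import math

def coin_entropy(p_heads: float) -> float:
    """Shannon entropy (in bits) of one toss of a coin that
    lands heads with probability p_heads."""
    entropy = 0.0
    for p in (p_heads, 1.0 - p_heads):
        if p > 0.0:  # by convention, 0 * log2(0) = 0
            entropy -= p * math.log2(p)
    return entropy

print(coin_entropy(0.5))   # fair coin: 1.0 bit per toss (maximum uncertainty)
print(coin_entropy(0.9))   # weighted coin: ~0.469 bits per toss
print(coin_entropy(0.99))  # heavily weighted: ~0.081 bits per toss
```

The formula itself is agnostic about where the bias comes from; whether the biasing process must count as "intelligent input" is the point the two posters dispute.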
Quote: Decreases in entropy are not precluded in information theory and are, in fact, trivially easy to come across: the lynchpin of your argument falls away if you're using this definition, and if you switch to thermodynamic entropy then my observations about cosmic rays are perfectly apt.

I've no problem with isolated reversals in entropy; that isn't the problem. It is the scale of the reversals required for life that is the problem.