(March 28, 2016 at 9:04 pm)AJW333 Wrote: If we look at entropy from an informational point of view we have;
"in data transmission and information theory, a measure of the loss of information in a transmitted signal or message."
If you're not talking about thermodynamic entropy, then there's no sense in which an increase in entropy is impossible, nor even a firm demonstration that the terms of your definition apply to genetics. To be clear, "information" is not some inherent property or quantity of an object; it's a post hoc meaning applied after a string of potentially unguided characters or processes is interpreted by a mind and scanned for repeating patterns and so on. There's information in a rock sitting on the ground, in terms of positional data, chemical composition, etc. You've mixed up your definitions and are no longer discussing anything that even hints at design.
Quote:Given that living systems with DNA have shown a progressive and very large increase in information, we can say that evolution requires entropy to be reversed in a step by step fashion over millions of years.
... Except that now we're talking about a definition of entropy under which a decrease is not physically impossible, and which is hence not an argument against evolution. In fact, I'm beginning to suspect you have no idea what either definition of entropy means, because the fucking Wikipedia article provides an absolutely mundane example of a decrease in informational entropy right there in its introduction.
I mean, strictly speaking, entropy in information theory is a measure of unpredictability within transmitted information content, which is why I would suggest that next time, before you make a statement, you do more than look up the definition on dictionary.com; like, say, actually do a little research, because information in this context is entirely different from what you're making it out to be. Entropy in information theory refers to uncertainty in deriving predictions about information. It fully accepts as a foundational principle that information can be naturally occurring, and that the entropy inherent in that information can rise and fall depending on numerous factors influencing the results. The person who coined the usage and came up with its measurements used coin tosses as his primary example, the fair toss defining one bit of entropy (are you saying that the outcomes of coin tosses are intelligently designed, because in this context they have information?), and showed how entropy within that setup can both increase and decrease; see the sketch below. In the case of a coin toss, making entropy decrease is as simple as weighting the coin so that one outcome is more probable than the other, which lowers the uncertainty in predicting the outcome of any given toss.
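To make that concrete, here's a minimal sketch of the calculation (plain Python, not taken from any particular source; the function name and the 0.9 weighting are just mine for illustration). A fair coin is maximally unpredictable at one bit per toss; weight it toward heads and the entropy of each toss drops, which is exactly the kind of decrease in informational entropy I'm talking about.

```python
import math

def coin_entropy(p_heads: float) -> float:
    """Shannon entropy, in bits, of one toss of a coin that lands heads
    with probability p_heads."""
    entropy = 0.0
    for p in (p_heads, 1.0 - p_heads):
        if p > 0.0:               # 0 * log(0) is treated as 0
            entropy -= p * math.log2(p)
    return entropy

print(coin_entropy(0.5))  # fair coin     -> 1.0 bit (maximum uncertainty)
print(coin_entropy(0.9))  # weighted coin -> ~0.469 bits (entropy has decreased)
```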
This is a fundamental failure of research on your part, man. You're using a definition that's entirely irrelevant to what you're attempting to prove, you're even getting that definition wrong, and frankly, I'm willing to bet that this is due to a lack of research beyond the basic dictionary definition. You can read the Wikipedia article if you like, or any number of other educational outlets that can explain how this aspect of information theory works, because as of right now you clearly have no idea at all what this entails.
Quote: If you want to deny that this is the result of design, you can cry "open system," but how does the application of heat and cosmic rays, plus the occasional space rock add to the pool of information within the DNA?
See, I was being charitable and assuming that you were discussing entropy in its common usage, where it's at least vaguely connected to the subject of evolution. It turns out that instead you're misusing an arcane and irrelevant technical definition, but I don't think I can be blamed for not predicting your abject, random failure of understanding here. Decreases in entropy are not precluded in information theory and are, in fact, trivially easy to come across. So the linchpin of your argument falls away if you're using this definition, and if you switch to thermodynamic entropy, then my observations about cosmic rays are perfectly apt.
"YOU take the hard look in the mirror. You are everything that is wrong with this world. The only thing important to you, is you." - ronedee
Want to see more of my writing? Check out my (safe for work!) site, Unprotected Sects!