
Entropy/lose the math, need laypersons def..
#21
RE: Entropy/lose the math, need laypersons def..
(December 16, 2017 at 4:52 pm)Brian37 Wrote: Is it the idea of going from more order to less order? 

If I read it right, and based on my watching videos of the singularity, scientists are saying that back then, it was more ordered, and now what we observe is less ordered? 

If that is the case, then why do we see now ordered galaxies and solar systems? Or is that a result of expansion creating so much random distance between bodies because of temp differences and cooling?

Whatever you respond with, PLEASE DUMB IT DOWN to a children's party animal balloon metaphor if you can.

Seth Lloyd probably expresses it best: "Entropy is a measure of information".



#22
RE: Entropy/lose the math, need laypersons def..
As others have pointed out, entropy isn't exactly a measure of disorder. It is *actually* a description of how much information is lost when we take a 'large scale' view as opposed to a 'microscopic' view.

So, for example, at the microscopic level, the air in a room is a *lot* of molecules of oxygen, nitrogen, water, carbon dioxide, etc. These molecules all have their own direction and speed of motion. So a microscopic account of the air in the room would require *at least* the position, direction of motion, speed, and type of each molecule in the room.

The *large scale* description, on the other hand, would have the temperature, the pressure of each type of gas, and the volume of the room. So instead of septillions of pieces of information, it only requires a few. That loss of information in the large scale description is the entropy.
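To make the "lost information" idea concrete, here is a toy sketch (not from the post itself, and enormously simplified): imagine the "room" is just 20 molecules, each of which can be in the left or right half. The large scale description records only *how many* are on the left; the microscopic description records *which ones*. The entropy of a macrostate is then the information (in bits) you would need to pin down the exact microstate.

```python
from math import comb, log2

# Toy model: N molecules, each in the left or right half of a room.
# Macrostate = how many are on the left (k).
# Microstates = the specific ways that count can arise: C(N, k).
N = 20

for k in (20, 15, 10):
    microstates = comb(N, k)          # ways this macrostate can arise
    entropy_bits = log2(microstates)  # information lost going large-scale
    print(f"{k} of {N} on the left: {microstates} microstates, "
          f"entropy = {entropy_bits:.2f} bits")
```

Notice that "all 20 on the left" has exactly one microstate and zero entropy, while the even "10 left, 10 right" split has by far the most microstates and the highest entropy, which is why an undisturbed gas drifts toward the evenly-mixed state.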

Now, temperature is an average of the energy of motion for all those molecules. Pressure is how hard they hit the walls, etc. But each large scale description has many, many small scale ways it could arise.
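The "temperature is an average" point can also be sketched numerically (this is my own illustration, with made-up molecule energies, using the standard ideal-gas relation that mean kinetic energy equals (3/2)·k_B·T for a monatomic gas):

```python
import random

random.seed(1)
K_B = 1.380649e-23  # J/K, Boltzmann constant

# A toy "gas": random kinetic energies for many molecules (fabricated numbers,
# chosen only so the average lands near room-temperature energies).
energies = [random.uniform(0.0, 1.2e-20) for _ in range(100_000)]  # joules

# Temperature is just a rescaled average of the energy of motion:
# mean KE = (3/2) * k_B * T  for an ideal monatomic gas.
mean_ke = sum(energies) / len(energies)
T = mean_ke / (1.5 * K_B)
print(f"effective temperature ~ {T:.0f} K")
```

The point is that temperature says nothing about any individual molecule; it is purely a large scale summary of the whole swarm.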

Now, to say entropy always increases means that the large scale description will inevitably lose information about the true state of things over time.

The connection to disorder is fairly easy: a disordered mess has an enormous number of microscopic arrangements all consistent with the same bland large scale description ("it's a random mess"), so that description loses a great deal of information. An *ordered* situation, by contrast, requires far fewer small scale pieces of information. Think of a regular array of atoms in a crystal, each in a very specific place: to describe the whole crystal microscopically only requires the *pattern* of the crystal and a few atoms to orient things. So the loss of information in going to the large scale is smaller: less entropy.
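One way to see "ordered means a short microscopic description" is with a compression analogy (my own illustration, not in the original post, and only an analogy: compressed length is a stand-in for description length, not a literal entropy):

```python
import random
import zlib

random.seed(0)

# A "crystal": the same two-byte pattern repeated over and over.
ordered = b"AB" * 5000                                        # 10,000 bytes
# A "gas": the same number of bytes, but random.
messy = bytes(random.getrandbits(8) for _ in range(10_000))   # 10,000 bytes

# The crystal compresses to almost nothing (just "the pattern, 5000 times");
# the random data barely compresses at all.
print("crystal compressed size:", len(zlib.compress(ordered)))
print("gas compressed size:    ", len(zlib.compress(messy)))
```

The repeating pattern needs only a handful of bytes to specify, just as the crystal needs only its lattice pattern, while the random bytes need to be listed in full, just as the gas needs every molecule's details.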

Now, when we heat something up, it turns out that there are more microscopic possibilities, so the small scale description requires a lot more specifics and the large scale description loses more information: a higher entropy. This is why heat is a wonderful way to increase the entropy of a system.

Now, some specifics: when water freezes, the ice is a crystal and has much lower entropy than the more random liquid water. But to freeze water, you have to remove heat from it, which warms the surroundings, and that increases the entropy of the surroundings. The trade-off between the two is what determines whether the water will turn into ice or not. If the temperature is low, the entropy lost by the freezing water is outweighed by the entropy the released heat adds to the surroundings, so water freezes at low temperatures.
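That trade-off can be checked with a few lines of arithmetic. This is a back-of-envelope sketch (my own, using the textbook latent heat of fusion of water, about 6010 J/mol, and treating both heat flows as reversible):

```python
# Entropy bookkeeping for freezing one mole of water.
H_FUS = 6010.0      # J/mol, heat released when one mole of water freezes
T_MELT = 273.15     # K, the freezing point

def total_entropy_change(t_surr):
    """Entropy change of water + surroundings when one mole freezes."""
    ds_water = -H_FUS / T_MELT    # ice is more ordered: the water loses entropy
    ds_surr = H_FUS / t_surr      # released heat raises the surroundings' entropy
    return ds_water + ds_surr     # J/K; freezing happens only if this is >= 0

for t in (263.15, 273.15, 283.15):   # surroundings at -10 C, 0 C, +10 C
    print(f"T_surr = {t:.2f} K: dS_total = {total_entropy_change(t):+.3f} J/K")
```

Below 0 C the total is positive, so freezing increases overall entropy and happens; above 0 C it is negative, so ice melts instead; exactly at 0 C the two terms cancel and ice and water coexist.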

I hope this helps.


