RE: Entropy/lose the math, need laypersons def..
January 13, 2018 at 11:20 pm
As others have pointed out, entropy isn't exactly a measure of disorder. It is *actually* a description of how much information is lost when we take a 'large scale' view as opposed to a 'microscopic view'.
So, for example, at the microscopic level, the air in a room is a *lot* of molecules of oxygen, nitrogen, water, carbon dioxide, etc. These molecules all have their own direction and speed of motion. So a microscopic account of the air in the room would require *at least* the position, direction of motion, speed, and type of each molecule in the room.
The *large scale* description, on the other hand, would have the temperature, the pressure of each type of gas, and the volume of the room. So instead of septillions of pieces of information, it only requires a few. That loss of information in going to the large scale description is the entropy.
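To make the contrast concrete, here's a rough sketch in Python of what the two kinds of record look like. The names and numbers are mine and purely illustrative, not anything standard:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MoleculeState:
    species: str                            # e.g. "N2", "O2", "H2O", "CO2"
    position: Tuple[float, float, float]    # metres
    velocity: Tuple[float, float, float]    # metres/second (direction + speed)

# The microscopic description: one record per molecule.
# For a real room this list would have septillions of entries.
Microstate = List[MoleculeState]

@dataclass
class Macrostate:
    temperature_K: float        # one number for the whole room
    partial_pressures_Pa: dict  # one number per gas species
    volume_m3: float            # one number for the room

# Example macrostate for ordinary room air (illustrative values):
room = Macrostate(
    temperature_K=293.0,
    partial_pressures_Pa={"N2": 79e3, "O2": 21e3, "H2O": 1.2e3, "CO2": 40.0},
    volume_m3=40.0,
)
# A handful of numbers versus septillions -- that gap is what the entropy measures.
```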
Now, temperature is an average of the energy of motion for all those molecules. Pressure is how hard they hit the walls, etc. But each large scale description has many, many small scale ways it could arise.
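Here's a toy way to see the "many, many small scale ways" point, using a deliberately simplified model of my own: N molecules that can each sit in the left or right half of the room, where the large scale description records only how many are on the left. The entropy of a macrostate is Boltzmann's k ln W, where W counts the compatible microstates (I set k = 1 to keep the numbers readable):

```python
import math

N = 100  # number of molecules (tiny compared to a real room, but enough to see the point)

def microstate_count(n_left: int) -> int:
    """Number of microscopic arrangements with exactly n_left molecules
    in the left half of the room (the rest are on the right)."""
    return math.comb(N, n_left)

def entropy(n_left: int) -> float:
    """Boltzmann entropy S = ln W, in units where k_B = 1."""
    return math.log(microstate_count(n_left))

# Macrostate "all molecules on the left": exactly one microstate, zero entropy.
print(microstate_count(N), entropy(N))   # 1  0.0

# Macrostate "half on each side": an astronomical number of microstates.
print(microstate_count(N // 2))          # ~1.0e29
print(entropy(N // 2))                   # ~66.8
```

The large scale statement "half on each side" is compatible with about 10^29 different microscopic arrangements, and the information about *which one* is exactly what the coarse description throws away.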
Now, to say entropy always increases means that the large scale description will inevitably lose information about the true state of things over time.
The connection to disorder is fairly easy to see. A disordered mess is easy to give a large scale description for ("it's a random mess"), but that description is compatible with a huge number of microscopic arrangements. An *ordered* situation, by contrast, requires far fewer small scale pieces of information: think of a regular array of atoms in a crystal, each in a very specific place. To describe the whole crystal microscopically only requires the *pattern* of the crystal and a few atoms to orient things. So the loss of information in going to the large scale is smaller: less entropy.
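You can put that in terms of how many bits of extra description the microscopic picture needs beyond the large scale one. The numbers here are made up (24 possible orientations for the crystal, 2 equally likely possibilities per site for the mess), but the contrast is the point:

```python
import math

N = 1_000_000   # atoms

# Perfect crystal: the macro description ("this pattern, starting here,
# with this orientation") already pins down every atom.  The extra
# microscopic information needed is only a handful of bits.
bits_crystal = math.log2(24)     # a few possible orientations (made-up number)

# Disordered solid: each site's arrangement has to be recorded separately;
# say each site has 2 equally likely possibilities.
bits_mess = N * math.log2(2)     # one bit per site

print(bits_crystal)   # ~4.6 bits
print(bits_mess)      # 1,000,000 bits
```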
Now, when we heat something up, it turns out that there are more microscopic possibilities, so the small scale description requires a lot more specifics, and the large scale description loses more information: a higher entropy. This is why heat is a wonderful way to increase the entropy of a system.
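A standard toy model makes this concrete (this is the usual "Einstein solid" counting exercise, not something specific to this thread): energy comes in indivisible quanta, and heating a solid means giving it more quanta to spread among its N oscillators. The number of ways to spread q quanta is C(q + N - 1, q), and it grows explosively with q:

```python
import math

N = 50   # oscillators (think: atoms that can store vibrational energy)

def ways(q: int) -> int:
    """Number of ways to distribute q indivisible energy quanta among N oscillators."""
    return math.comb(q + N - 1, q)

# Heating the solid = adding quanta.  Both the microstate count W and
# the entropy ln W grow rapidly with the added energy.
for q in (10, 50, 100):
    print(q, ways(q), math.log(ways(q)))
```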
Now, some specifics: when water freezes, the ice is a crystal and has much lower entropy than the more random liquid water. But to freeze water, you have to remove heat from it, which warms the surroundings, and that increases the entropy of the surroundings. The trade-off between those two is what determines whether the water will turn into ice or not. If the temperature is low enough, the entropy increase from the heat released into the surroundings outweighs the entropy decrease of the water becoming ice, which is why water freezes only at low temperatures.
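To put rough numbers on that trade-off (my numbers, not part of the original question: the latent heat of fusion of water is about 6.01 kJ/mol, and I'm ignoring its temperature dependence), the water loses roughly H_fus/273 K of entropy by freezing, while surroundings at temperature T gain roughly H_fus/T:

```python
H_FUS = 6010.0      # J/mol, heat released when one mole of water freezes
T_MELT = 273.15     # K, normal melting point of water

def total_entropy_change(T_surroundings: float) -> float:
    """Approximate total entropy change, in J/(mol*K), when one mole of
    water freezes while the surroundings sit at T_surroundings."""
    dS_water = -H_FUS / T_MELT                 # the water becomes more ordered: entropy drops
    dS_surroundings = H_FUS / T_surroundings   # the released heat warms the surroundings
    return dS_water + dS_surroundings

print(total_entropy_change(263.15))  # positive: below 0 C, freezing wins
print(total_entropy_change(283.15))  # negative: above 0 C, the water stays liquid
```

The sign flips exactly at the melting point: below it the surroundings' gain beats the ice's loss, above it the reverse.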
I hope this helps.