Entropy

We'll deal with entropy in two ways: first from a microscopic view, how we can model entropy, and then from a macroscopic view, how we can quantify it.

Entropy is related to the universe slowly moving to its most probable configuration, with all the energy dispersed evenly everywhere. There is no force driving things to this configuration; it is simply the natural consequence of this being the most likely outcome. The universe is a big place to explore. Eventually the energy will "wander off" and spread out all over it. For now, we find it concentrated in particular areas, but given a chance, it will wander off.

Entropy is related to the dispersal of energy. From a microscopic view this refers to the number of different ways that the energy of the system can be distributed. How many different indistinguishable ways can we "arrange" the given amount of energy (matter) in the system? Each of these "arrangements" is called a microstate. The number of these microstates is a measure of the entropy: more microstates = more entropy. Thus we discover that very special arrangements of molecules (like crystalline solids) have very low entropy, while gaseous materials (with molecules wandering all over) have high entropy.
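This counting is captured quantitatively by Boltzmann's formula, which relates the entropy S of a system to the number of microstates W available to it (kB is the Boltzmann constant):

\[S = k_{\rm B} \ln W\]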

The reason systems tend toward higher entropy states is simply that those states are more probable.

Imagine all the gas molecules in the room you are sitting in. Now in your mind divide the room into two sides (left and right). Each molecule (which is carrying some amount of energy) could be anywhere in that room. Each molecule has a 50/50 chance of being on either the right side or the left side of the room. What are the odds that all of the molecules (and there are a whole lot of them) are on the right side of the room? The chances are not just small. They are so small that we don't have to worry about this ever happening. What are the odds that the molecules are essentially evenly distributed between the two sides of the room? The odds are astronomically large. So large that we can be assured that this is how we will find the molecules. That is the essence of the microscopic view of entropy. There are fewer ways to arrange the molecules and have them all on one side of the room. Fewer microstates = lower entropy. There are more ways to arrange the molecules and have them evenly distributed. More microstates = more entropy.

Not only that, you can predict what will happen in moving between these two possibilities. Imagine all the molecules happened to start out on the left-hand side because there was a barrier in the middle of the room. What would happen if you removed the barrier? The molecules would all move out into the room until they were evenly distributed. The system would go spontaneously from the low entropy state (all on the left) to the high entropy state (all spread out). The reverse, however, would never happen.
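To put rough numbers on this, here is a minimal Python sketch that counts the arrangements with the binomial coefficient. The room size N here is arbitrary and absurdly small; a real room holds on the order of 10^25 molecules, which makes the odds even more lopsided.

```python
from math import comb

# Each of N molecules independently has a 1/2 chance of being on the right side.
# The number of microstates with k molecules on the right is C(N, k).
N = 100  # tiny compared with a real room (~10^25 molecules)

all_on_right = comb(N, N)       # exactly 1 microstate
evenly_split = comb(N, N // 2)  # the most probable arrangement

print(f"microstates with all {N} molecules on the right: {all_on_right}")
print(f"microstates with a 50/50 split: {evenly_split:.3e}")
print(f"P(all on the right) = (1/2)^{N} = {0.5 ** N:.3e}")
```

Even at N = 100, the even split is favored by a factor of about 10^29, and the probability of finding every molecule on one side is about 10^-30.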

On a macroscopic scale, we "quantify" entropy by measuring heat flow. The change in entropy is defined as the "reversible" heat divided by the absolute temperature.

\[\Delta S = {q_{\rm rev} \over T}\]
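For example, melting one mole of ice at its normal melting point is a reversible process at 273.15 K, and the reversible heat is the enthalpy of fusion, 6.01 kJ/mol:

\[\Delta S = {6010~{\rm J/mol} \over 273.15~{\rm K}} = 22.0~{\rm J\ mol^{-1}\ K^{-1}}\]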

For the processes we will mostly be discussing (temperature changes and phase changes), the heat we have been calculating is the "reversible" heat. Although the term reversible can convey many possible meanings, know that in thermodynamics it really means that the process is going forward along a very specific path. So even though heat (q) and work (w) are not state functions themselves, because they are path dependent, a specifically defined path (the reversible one) allows them to match the changes in certain state functions. So think of "reversible" as a specifically defined path for a given process. For some processes, like gas expansion or chemistry, this can be somewhat more complicated. We will not spend lots of time on gas expansion, but we will look a lot at chemistry. For chemical reactions, we are interested in the standard entropy change for the reaction. This we will relate not to the heat of the reaction, but to the entropy of the final state (the products) minus the entropy of the initial state (the reactants). Thus we will need some measure of the entropy of different compounds.

\[\Delta S_{\rm rxn}^\circ = \Sigma n S_{\rm products}^\circ - \Sigma n S_{\rm reactants}^\circ\]
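As a sketch of this bookkeeping, here is a small Python example. The standard molar entropies below are typical tabulated values in J/(mol K); use the data table assigned in your course, since tabulated values vary slightly between sources.

```python
# Standard molar entropies, S° in J/(mol K), typical tabulated values.
S_standard = {
    "H2(g)": 130.7,
    "O2(g)": 205.2,
    "H2O(g)": 188.8,
}

def delta_S_rxn(reactants, products):
    """Sum of n*S° over products minus sum of n*S° over reactants.

    Each side is a list of (coefficient, species) pairs.
    """
    side_total = lambda side: sum(n * S_standard[species] for n, species in side)
    return side_total(products) - side_total(reactants)

# 2 H2(g) + O2(g) -> 2 H2O(g): three moles of gas become two moles of gas,
# so we expect the entropy of the system to decrease.
dS = delta_S_rxn(reactants=[(2, "H2(g)"), (1, "O2(g)")],
                 products=[(2, "H2O(g)")])
print(f"ΔS°rxn = {dS:.1f} J/K")  # about -89.0 J/K
```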

Finally, entropy change is related to the "amount" of thermal energy in a sample. As energy "leaves" (q < 0), the entropy will decrease. As energy "comes in" (q > 0), the entropy will increase. What matters is how large this transfer is compared to the average amount of thermal energy already in the system (or surroundings). That average thermal energy is measured by the temperature. When the temperature is very high, heat flow results in only a minor change in the entropy. In contrast, when the temperature is very low, the same heat flow results in an enormous change. It is because of this that heat flows spontaneously from high temperature to low temperature. Heat flowing into the lower temperature object increases the entropy of that object, while heat flowing out of the higher temperature object lowers its entropy. In both cases, the amount of energy is identical. However, since the entropy change is the heat divided by the temperature, the increase in entropy for the low temperature object will be larger than the decrease in entropy for the high temperature object.
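For example, with illustrative numbers: let 100 J of heat flow from an object at 400 K to one at 300 K. The cold object gains more entropy than the hot object loses:

\[\Delta S_{\rm total} = {+100~{\rm J} \over 300~{\rm K}} + {-100~{\rm J} \over 400~{\rm K}} = +0.33~{\rm J/K} - 0.25~{\rm J/K} = +0.08~{\rm J/K}\]

The total entropy increases, so flow from hot to cold is the spontaneous direction.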