2nd Law of Thermodynamics

The second law tells us the direction in which things happen. Heat flows from high temperature to low temperature (but not the other way around). Balls roll downhill (not up). In chemistry, we are interested in the direction of chemical change.

The second law is related to entropy. Entropy can be a challenging concept, and worse yet, there are many misconceptions about it, such as the idea that entropy is "disorder." Entropy is not disorder.




The 2nd Law

The 2nd Law of Thermodynamics states that any process that happens spontaneously will lead to an increase in the entropy of the universe. Entropy is given the symbol S.

Or

\[\Delta S_{\rm univ} > 0\]

By "spontaneously" we means that it is a process (a reaction for chemists) that tends to happen in the direction indicated (as written). Spontaneous reactions are poised to go forward provided there is nothing there to stop it. When a process or reaction is determined to be non-spontaneous, then it tends to not happen as written (because the reverse process is actually the spontaneous one). Chemical reactions that are spontaneous in a thermodynamic sense do not necessarily happen quickly or in many cases, happen at all. This stopping point that prevents spontaneous reactions from actually happening is a whole other area of chemistry called kinetics which deals with the rates of reactions. For a full understanding of chemical reactions you need to consider both the thermodynamics of the reaction and the kinetics of the reaction. But thermodynamics allows us to predict the range of possible things that could happen.

What do we mean by "universe"? When looking at changes in entropy, we have to consider not only changes for the system (the stuff we are studying), but also changes for the surroundings. We can add these two together to get the total entropy change for everything. In the largest sense, everything = the universe.

Next we need to think about the new state function, entropy.


Entropy

We'll deal with entropy in two ways: first from a microscopic view, how we can model entropy, and then from a macroscopic view, how we can quantify it.

Entropy is related to the universe slowly moving to its most probable configuration with all the energy equally dispersed in all places. There is no force driving things to this configuration; it is simply the natural consequence of this being the most likely outcome. The universe is a big place to explore. Eventually the energy will "wander off" to spread out all over it. For now, we find it concentrated in particular areas, but given a chance, it will wander off.

Entropy is related to the dispersal of energy. From a microscopic view this refers to the number of different ways that the energy of the system can be distributed. How many different ways can we "arrange" the given amount of energy (and matter) in the system? Each of these "arrangements" is called a microstate. The number of these microstates is a measure of the entropy. More microstates = more entropy. Thus we discover that very special arrangements of molecules (like crystalline solids) have very low entropy, while gaseous materials (with molecules wandering all over) have high entropy.

The reason systems tend toward higher entropy states is simply that those states are more probable.

Imagine all the gas molecules in the room you are sitting in. Now, in your mind, divide the room into two sides (left and right). Each molecule (which is carrying some amount of energy) could be anywhere in that room. Each molecule has a 50/50 chance of being either on the right side or the left side of the room. What are the odds that all of the molecules (and there are a whole lot of them) are on the right side of the room? The chances are not just small. They are so small that we don't have to worry about this ever happening. What are the odds that the molecules are essentially evenly distributed between both sides of the room? The odds are astronomically large. So large that we can be assured that this is how we will find the molecules. That is the essence of the microscopic view of entropy. There are fewer ways to arrange the molecules and have them all on one side of the room. Fewer microstates = lower entropy. There are more ways to arrange the molecules and have them evenly distributed. More microstates = more entropy. Not only that, you can predict what will happen in moving between these two possibilities. Imagine all the molecules happened to start out on the left-hand side because there was a barrier in the middle of the room. What would happen if you removed the barrier? The molecules would all move out into the room until they were equally distributed. The system would go spontaneously from the low entropy state (all on the left) to the high entropy state (all spread out). The reverse, however, would never happen.
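
To get a feel for just how lopsided these odds are, here is a minimal Python sketch. The molecule counts (10, 100, 1000) are arbitrary, chosen only to show the trend; a real room contains something like 10^27 molecules, making the odds unimaginably smaller still.

```python
import math

# Probability that *all* N molecules end up on one chosen side of the room,
# assuming each molecule independently has a 50/50 chance of being on either side.
for n_molecules in (10, 100, 1000):
    p_all_one_side = 0.5 ** n_molecules
    print(f"N = {n_molecules:5d}: P(all on one side) = {p_all_one_side:.3e}")

# Compare the number of microstates for a perfectly even split versus
# every molecule on one side, for N = 1000 molecules.
n = 1000
ways_even = math.comb(n, n // 2)   # microstates with half the molecules on each side
ways_all = math.comb(n, n)         # microstates with all molecules on one side (just 1)
print(f"ways(500/500) / ways(1000/0) = {ways_even / ways_all:.3e}")
```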

On a macroscopic scale, we "quantify" entropy by measuring heat flow. The change in entropy is defined as the "reversible" heat divided by the absolute temperature.

\[\Delta S = {q_{\rm rev} \over T}\]

For the processes we will mostly be discussing (temperature change, phase change), the heat we have been calculating is the "reversible heat." Although the term reversible can convey many possible meanings, know that in thermodynamics it really means that the process is going forward along a very specific path. So even though heat (q) and work (w) are not state functions themselves because they are path dependent, a specifically defined path (the reversible one) allows them to match the changes in certain state functions. So think of "reversible" as a specifically defined path for a given process. For some processes, like gas expansion or chemistry, this can be somewhat more complicated. We will not spend lots of time on gas expansion, but we will look a lot at chemistry. For chemical reactions, we are interested in the standard entropy change for the reaction. This we will relate not to the heat of the reaction, but to the entropy of the final state (the products) minus the entropy of the initial state (the reactants). Thus we will need some measure of the entropy of different compounds.

\[\Delta S_{\rm rxn}^\circ = \Sigma n S_{\rm products}^\circ - \Sigma n S_{\rm reactants}^\circ\]

Finally, entropy change is related to the "amount" of thermal energy in a sample. As energy "leaves" (q < 0), the entropy will decrease. As energy "comes in" (q > 0), the entropy will increase. What matters is how much this energy is compared to the average amount that is already in the system (or surroundings). The average thermal energy is measured by the temperature. When the temperature is very high, heat flow results in only a minor change in the entropy. In contrast, when the temperature is very low, the same heat flow results in a much larger change. It is because of this that heat flows spontaneously from high temperature to low temperature. Heat flowing into the lower temperature object will increase the entropy of that object. Heat flowing out of the high temperature object will decrease the entropy of that object. In both cases, the amount of energy transferred is identical. However, since the entropy change is the heat divided by the temperature, the increase in entropy for the low temperature object will be larger than the decrease in entropy for the high temperature object.
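
A quick numerical sketch of this argument (the 1000 J of heat and the 500 K and 300 K temperatures are made-up values for illustration):

```python
# Hypothetical example: 1000 J of heat flows from a hot object (500 K)
# to a cold object (300 K). Both objects are assumed large enough that
# their temperatures stay essentially constant during the transfer.
q = 1000.0        # J, magnitude of heat transferred
T_hot = 500.0     # K
T_cold = 300.0    # K

dS_hot = -q / T_hot    # hot object loses heat, so its entropy goes down
dS_cold = +q / T_cold  # cold object gains heat, so its entropy goes up

dS_total = dS_hot + dS_cold
print(f"dS_hot   = {dS_hot:+.2f} J/K")
print(f"dS_cold  = {dS_cold:+.2f} J/K")
print(f"dS_total = {dS_total:+.2f} J/K  (> 0, so the heat flow is spontaneous)")
```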


Numbers of microstates

An isolated system will spontaneously transition between states such that the entropy of the system is increased to its maximum value. Why is this? Is there some strange force pushing things to higher entropy states? The simple fact is that a final state with higher entropy is simply the more likely one out of the myriad of possible states. These states contain distributions of molecules and energies that are the most probable. What distributions are the most probable? The ones with the greatest number of microstates. A microstate is a specific way in which we can arrange the energy of the system. Many microstates are indistinguishable from each other. The more indistinguishable microstates, the higher the entropy.

The best way to wrap your head around this idea is to look at a very small-scale example. Often the idea of "order" and "disorder" is invoked when thinking about entropy. Technically this is not correct, but as we look at microstates you might see why this example arises.

Imagine that we have a certain amount of energy that needs to be distributed between three molecules. Each molecule will have "quantized" energy states in which we can put that energy (like the energy levels of atomic or molecular orbitals). For our example, just to keep it relatively simple, the energy levels will be equally spaced. Now imagine the total energy of our system of three molecules is 3 energy units. These energy units must be distributed between the three molecules. Let's look at how many different ways there are to do this.

We can give each molecule one unit of energy (1 + 1 + 1 = 3 total). There is only one way to accomplish this. Each molecule needs to have one unit of energy. Alternatively, we can give all three units to just one molecule, and zero energy to the other two (3 + 0 + 0 = 3 total). There are three ways to accomplish this since we can give all the energy to either molecule a, or molecule b, or molecule c. Since molecules a, b, and c are equivalent we will consider these three microstates to be equivalent (all degenerate in energy). Finally, we can give 1 unit of energy to one of the molecules, and 2 units of energy to another one. It turns out there are six possible ways to accomplish this distribution of energy. The diagram below illustrates each of these distributions that we have mentioned.

You can see that there are 10 total possible distributions (microstates). We can classify these based on the distributions mentioned in the previous paragraph. Group i has all the energy in one molecule - although it could be in any of the three molecules a, b, or c. So there are three possible microstates. Group iii has one unit of energy in each molecule. This turns out to be quite special since there is only one way to accomplish this and thus it has only one microstate. Group ii is where one molecule has one energy unit, the next has two, and the last has zero. There are six ways to do this.

Now if we imagine all the possibilities (all 10 microstates) and we simply select one at random, it is clear that it is most likely to be from "group ii." Since we can't tell the difference between the molecules, all of these microstates in group ii are the same. If we now ask what will happen in general, you'll see that 60% of the time we get group ii, 30% of the time we get group i, and only 10% of the time do we get the "special" group iii. What is most likely to happen? We are most likely to get group ii.
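
This counting can be checked by brute force. The short Python sketch below enumerates every way to hand out 3 units of energy to molecules a, b, and c; the distributions (3,0,0), (2,1,0), and (1,1,1) correspond to groups i, ii, and iii above.

```python
from collections import Counter
from itertools import product

TOTAL_ENERGY = 3  # total units of energy shared by the three molecules

# Each microstate assigns some number of energy units (0-3) to each of
# molecules a, b, and c; keep only those that add up to 3 units total.
microstates = [state for state in product(range(TOTAL_ENERGY + 1), repeat=3)
               if sum(state) == TOTAL_ENERGY]
print(f"total microstates: {len(microstates)}")   # 10

# Group the microstates by their energy distribution, ignoring which
# molecule is which: (3,0,0) vs (2,1,0) vs (1,1,1).
groups = Counter(tuple(sorted(state, reverse=True)) for state in microstates)
for distribution, count in groups.items():
    print(f"distribution {distribution}: {count} microstates "
          f"({count / len(microstates):.0%} of the time)")
```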

The odds are the best for group ii above (6 out of 10), but not amazingly higher. However, as we move to systems with larger numbers (Avogadro's number is very, very large), what we see is that the "most likely" configurations are overwhelmingly likely. That is, the most likely configurations are essentially the only configurations that will ever be seen. There are many very similar configurations around the average, so we observe small fluctuations, but we never see the extreme "special" configurations.

You can also note that configurations in group iii seem "ordered" and configurations in group ii seem "disordered." So to put this in historic perspective, an "ordered" state is one with very little variability, possibly just a handful of microstates possible. In contrast, a "disordered" state is one with a multitude of possibilities (thousands upon thousands of microstates) - so much so that no apparent patterns are present and we perceive "disorder." It is best to avoid the order/disorder arguments though. Order and disorder are value judgments that we humans impose on arrangements based on our perception. Entropy is a quantifiable measure of the dispersion of energy and our personal perceptions have no place here.

Boltzmann invented this idea of microstates and its relation to macroscopic entropy. He defined entropy as proportional to the natural log of the number of microstates, Ω. (The proportionality constant is the Boltzmann constant \(k\).)

\[S = k \ln \Omega \]

The greater the number of microstates (Ω), the greater the entropy. Boltzmann found it difficult to explain such behavior to those who were not predisposed to understanding the scientific rigor needed for such an explanation. So he "simplified" things by saying it's like order and disorder. This was much easier for the layman and has been perpetuated ever since. The order/disorder perception of entropy must be discontinued if we are ever going to better understand entropy and the second law of thermodynamics. We must instead think of energy dispersal: energy becomes more dispersed when more microstates are available.
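
To put numbers on the Boltzmann formula, here is a small sketch. It computes the entropy of the ten-microstate example above and, for a sense of macroscopic scale, the entropy gained per mole if each molecule has twice as many microstates available (the factor of 2 is just an illustrative choice).

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
N_A = 6.02214076e23     # Avogadro's number, 1/mol

# Entropy of the tiny 3-molecule system with 10 accessible microstates.
omega = 10
S = k_B * math.log(omega)
print(f"S = k ln(10) = {S:.3e} J/K")   # a very small number for only 3 molecules

# If every molecule in a mole independently has twice as many microstates,
# omega for the whole sample is multiplied by 2**N_A, so per mole:
# dS = k * ln(2**N_A) = N_A * k * ln(2) = R * ln(2)
dS_per_mole = N_A * k_B * math.log(2)
print(f"dS per mole for doubling microstates = {dS_per_mole:.2f} J/(mol K)")  # ~5.76
```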



Entropy change

We can break down entropy change as resulting from changes in specific properties of our system. There are changes in five things that will lead to a change in the entropy of the system.

  1. Temperature
  2. Volume
  3. Phase
  4. Mixing
  5. Composition (chemistry)

Changes in temperature will lead to changes in entropy. The higher the temperature the more thermal energy the system has; the more thermal energy the system has, the more ways there are to distribute that energy; the more ways there are to distribute that energy, the higher the entropy. Increasing the temperature will increase the entropy.

Changes in volume will lead to changes in entropy. The larger the volume the more ways there are to distribute the molecules in that volume; the more ways there are to distribute the molecules (energy), the higher the entropy. An increase in volume will increase the entropy. This is typically only important for systems involving gases since they are the only materials that undergo large volume changes.

Changes in phase will lead to changes in entropy. Some phases have larger numbers of microstates and thus higher entropy. Solids have the fewest microstates and thus the lowest entropy. Liquids have more microstates (since the molecules can translate) and thus have a higher entropy. When a substance is a gas it has many more microstates and thus has the highest entropy.

Mixing of substances will increase the entropy. This is because there are many, many more microstates for the mixed system than for the un-mixed system. More microstates means greater entropy. This is one of the examples from which the misconception that entropy is "disorder" arises.

Lastly, the entropy can change as the result of chemistry. Different molecules have different entropies, so it can be difficult to look at a reaction and guess whether the entropy is going up or going down. However, in general, if the products have a larger number of molecules than the reactants, then the entropy is likely to increase. Additionally, if the products are in phases of higher entropy than the reactants, then the entropy is likely to increase. For example

\[\rm 4Fe(s) + 3O_2(g) \rightarrow 2Fe_2O_3(s)\] \[ \Delta S_{\rm rxn} < 0\]

The change in entropy for this reaction will be negative. This is because the reactants include both a solid and a gaseous species while the product is simply a solid compound.

For these particular changes in entropy we will be doing quantitative calculations for the entropy changes that arise from temperature change, phase change, and chemistry.


Entropy and temperature change

To calculate the entropy change for a temperature change, we have a slightly different formula than just the heat divided by the temperature. This is because as the heat is flowing, the temperature is changing. So we need to integrate (add up) the heat as a function of temperature. The resulting formula for heating at constant pressure (with a constant heat capacity) is

\[\Delta S = n\; C_{\rm m} \ln\left({T_f \over T_i}\right)\]

As we will only consider situations of constant pressure with heat capacities that don't change with temperature, this formula can be used for all of our temperature change situations. Note that the "\(n\;C_{\rm m}\)" part of this equation is just the number of moles times the molar heat capacity - the same thing you use to calculate heat (\(q\)) flow but with \(\Delta T\).
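
For example, here is a short sketch of this calculation for heating liquid water. The amount of water and the temperatures are arbitrary choices, and the molar heat capacity of liquid water (about 75.3 J/(mol K)) is an approximate literature value.

```python
import math

# Heating 2.0 mol of liquid water from 25 C to 75 C at constant pressure,
# assuming the molar heat capacity is constant over this range.
n = 2.0             # mol
C_m = 75.3          # J/(mol K), approximate molar heat capacity of liquid water
T_i = 298.15        # K (25 C)
T_f = 348.15        # K (75 C)

dS = n * C_m * math.log(T_f / T_i)
print(f"dS = {dS:.1f} J/K")   # positive, since the temperature went up
```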


Entropy and phase change

The entropy change for a phase change is just the heat (which is the reversible heat) divided by the phase transition temperature. Again, we are almost always looking at constant pressure, so the heat is the enthalpy change for the phase transition. Thus the entropy change is

\[\Delta S_{\rm trans} = {\Delta H_{\rm trans} \over T_{\rm trans}}\]
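
For instance, a sketch for melting ice at its normal melting point, using the commonly tabulated value ΔH_fus ≈ 6.01 kJ/mol:

```python
# Entropy change for melting 1.0 mol of ice at 0 C and constant pressure.
n = 1.0                 # mol
dH_fus = 6010.0         # J/mol, approximate enthalpy of fusion of water
T_melt = 273.15         # K

dS_fus = n * dH_fus / T_melt
print(f"dS_fus = {dS_fus:.1f} J/K")   # about +22 J/K; melting increases entropy
```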


Entropy Change of Surroundings (and Total)

The second law depends on the entropy change of everything, not just the system. It is possible for a process to occur that lowers the entropy of the system but raises the entropy of the surroundings. As long as the entropy of the surroundings increases more than the entropy of the system decreases, the process will occur. Water freezing in constant-temperature surroundings at -5 °C is an example of this. The system (the water) will decrease in entropy, first because its temperature will go down and second because it will go from a liquid to a solid. The entropy of the surroundings will increase since energy (heat) is flowing into the surroundings from the system.

How do we calculate entropy changes for the surroundings? In most cases, the surroundings will be at a constant temperature. Therefore the entropy change will simply be related to the amount of energy that enters the surroundings in the form of heat divided by the temperature of the surroundings. Since we typically give heat a sign based on the system, the heat from the perspective of the surroundings is equal to the heat of the system but opposite in sign. Heat flowing out of the system is flowing into the surroundings. The entropy change of the surroundings is then

\[\Delta S_{\rm surroundings} = {q_{surroundings} \over T_{surroundings}} = {-q_{system} \over T_{surroundings}}\]

As heat is generally defined from the perspective of the system, the subscript "system" is often left off of the heat in the last version of this equation.

If we have the entropy changes of the system and surroundings, we can calculate the total entropy change. The total entropy change is simply the sum of the system and the surroundings.

\[\Delta S_{total} = \Delta S_{system} + \Delta S_{surrounding}\]
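
Returning to the freezing example above, here is a rough sketch of the bookkeeping. For simplicity it approximates the system's entropy change by -ΔH_fus/T_fus and ignores the small heat-capacity corrections for being 5 °C below the normal melting point.

```python
# Freezing 1.0 mol of water in surroundings held at -5 C (268.15 K).
dH_fus = 6010.0       # J/mol, approximate enthalpy of fusion of water
T_fus = 273.15        # K, normal melting point
T_surr = 268.15       # K, temperature of the surroundings

# System: freezing releases the heat of fusion, so the system's entropy goes down.
dS_system = -dH_fus / T_fus            # crude estimate, about -22.0 J/K

# Surroundings: that same heat flows in at the (constant) surroundings temperature.
q_surr = +dH_fus                       # heat released by the system enters the surroundings
dS_surr = q_surr / T_surr              # about +22.4 J/K

dS_total = dS_system + dS_surr
print(f"dS_system = {dS_system:+.1f} J/K")
print(f"dS_surr   = {dS_surr:+.1f} J/K")
print(f"dS_total  = {dS_total:+.1f} J/K  (> 0, so freezing at -5 C is spontaneous)")
```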


Entropy for Reactions

To calculate the entropy change for reactions, we simply look at the entropy of the final state minus the entropy of the initial state. The final state is the products in their standard state and the initial state is the reactants in their standard state. We define the entropy of a pure crystalline solid at T = 0 K to be zero. This is the third law of thermodynamics. Since we know the entropy at T = 0 K and we can calculate the entropy change for a temperature or phase change, we can then calculate the absolute entropy for any substance at any temperature. These numbers are thankfully tabulated (usually along with the enthalpies of formation). So for a reaction we simply take the sum of the entropies of the products (times the number of moles in our thermochemical equation) minus the sum of the entropies of the reactants (times the number of moles in the thermochemical equation).

\[\Delta S_{\rm rxn}^\circ = \Sigma n S_{\rm products}^\circ - \Sigma n S_{\rm reactants}^\circ\]

Note: \(\Delta S_{rxn}\) is not equal to \(\Delta H_{\rm rxn} / T \). This is only true at a very particular condition of equilibrium that will be investigated extensively in CH302.

\(\Delta H_{rxn}\) is very important for finding the entropy change of the surroundings. Since at constant temperature \(q = \Delta H\), we can use the enthalpy of the reaction to find the entropy change of the surroundings.

\[\Delta S_{\rm surroundings} = {-q \over T_{surroundings}} = {-\Delta H_{sys} \over T_{surroundings}}\]
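
Putting the last few equations together for the iron oxidation reaction from earlier, here is a sketch using approximate tabulated values (S° in J/(mol K): Fe(s) ≈ 27.3, O2(g) ≈ 205.2, Fe2O3(s) ≈ 87.4; ΔH°f of Fe2O3(s) ≈ -824.2 kJ/mol).

```python
# 4 Fe(s) + 3 O2(g) -> 2 Fe2O3(s), at 298.15 K and constant pressure.
T = 298.15                                    # K
S_Fe, S_O2, S_Fe2O3 = 27.3, 205.2, 87.4       # J/(mol K), approximate standard entropies
dHf_Fe2O3 = -824.2e3                          # J/mol, approximate enthalpy of formation

# Standard entropy change of the reaction: sum(n*S, products) - sum(n*S, reactants)
dS_rxn = 2 * S_Fe2O3 - (4 * S_Fe + 3 * S_O2)  # about -550 J/K

# Enthalpy of reaction (elements in their standard states have dHf = 0)
dH_rxn = 2 * dHf_Fe2O3                        # about -1.65e6 J

# Surroundings: at constant T and P, q = dH, so dS_surr = -dH_rxn / T
dS_surr = -dH_rxn / T                         # about +5530 J/K

dS_total = dS_rxn + dS_surr
print(f"dS_rxn   = {dS_rxn:+.0f} J/K")
print(f"dS_surr  = {dS_surr:+.0f} J/K")
print(f"dS_total = {dS_total:+.0f} J/K  (> 0, so the reaction is spontaneous)")
```

Even though the system's entropy goes down, the large amount of heat released into the surroundings makes the total entropy change positive, consistent with the second law.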


Entropy examples

Below is a worked example of calculating the entropy change for a process that involves both a temperature change and a phase change.