Probability theory is nothing but common sense reduced to calculation.
Laplace
To begin, we must recognize that the disorder of a system can change in two ways. First, disorder occurs due to the physical arrangement (distribution) of atoms, and we represent this with the configurational entropy. There is also a distribution of kinetic energies of the particles, and we represent this with the thermal entropy. For an example of kinetic energy distributions, consider that a system of two particles, one with a kinetic energy of 3 units and the other of 1 unit, is microscopically distinct from the same system when both particles have 2 units of kinetic energy, even when the configurational arrangement of atoms is the same. This second type of entropy is more difficult to treat on the microscopic scale, so we focus on the configurational entropy in this section.
Configurational entropy is associated with spatial distribution. Thermal entropy is associated with kinetic energy distribution.
Entropy and Spatial Distributions: Configurational Entropy
Given N molecules and M boxes, how can these molecules be distributed among the boxes? Is one distribution more likely than another? Consideration of these issues will clarify what is meant by microstates and macrostates and how entropy is related to disorder. Our consideration will focus on the case of distributing particles between two boxes.
Distinguishability of particles is associated with microstates. Indistinguishability is associated with macrostates.
First, let us suppose that we distribute N = 2 ideal gas molecules in M = 2 boxes, and let us suppose that the molecules are labeled so that we can identify which molecule is in a particular box. We can distribute the labeled molecules in four ways, as shown in Fig. 4.1. These arrangements are called microstates because the molecules are labeled. For two molecules and two boxes, there are four possible microstates. However, a macroscopic perspective makes no distinction between which molecule is in which box. The only macroscopic characteristic that matters is how many particles are in a box, not which particles are there. For macrostates, then, we need to track only the number of particles in a given box. It might help to think about connecting pressure gauges to the boxes. The pressure gauge could distinguish between zero, one, and two particles in a box, but could not distinguish which particles are present. Therefore, microstates α and δ correspond to different macrostates because the distribution of particles is different; however, microstates β and γ give the same macrostate. Thus, from our four microstates, we have only three macrostates.
Figure 4.1. Illustration of configurational arrangements of two molecules in two boxes, showing the microstates. Note that β and γ would have the same macroscopic value of pressure.
To find out which arrangement of particles is most likely, we apply the “principle of equal a priori probabilities.” This “principle” states that all microstates of a given energy are equally likely. Since all of the states we are considering for our non-interacting particles are at the same energy, they are all equally likely. From a practical standpoint, we are interested in which macrostate is most likely. The probability of a macrostate is found by dividing the number of microstates in the given macrostate by the total number of microstates in all macrostates, as shown in Table 4.1. For our example, the probability of the first macrostate is 1/4 = 0.25. The probability of the evenly distributed state is 2/4 = 0.5. That is, one-third of the macrostates (one of three) possess 50% of the probability. The “most probable distribution” is the evenly distributed case.
Table 4.1. Illustration of Macrostates for Two Particles and Two Boxes
What happens when we consider more particles? It turns out that the total number of microstates for N particles in M boxes is M^N, so the counting gets tedious. For five particles in two boxes, the calculations are still manageable. There will be two microstates where all the particles are in one box or the other. Let us consider the case of one particle in box A and four particles in box B. Recall that the macrostates are identified by the number of particles in a given box, not by which particles are in which box. Therefore, the five microstates for this macrostate appear as given in Table 4.2(a).
Table 4.2. Microstates for the Second and Third Macrostates for Five Particles Distributed in Two Boxes
The counting of microstates for putting two particles in box A and three in box B is slightly more tedious, and is shown in Table 4.2(b). It turns out that there are 10 microstates in this macrostate. The distributions for (three particles in A) + (two in B) and for (four in A) + (one in B) mirror the distributions (two in A) + (three in B) and (one in A) + (four in B), respectively. These three cases are sufficient to determine the overall probabilities. There are M^N = 2^5 = 32 microstates total, summarized in the table below.
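For small systems, these counts are easy to verify by brute-force enumeration. The short Python sketch below is our own illustration, not part of the derivation: it lists all 2^5 microstates of five labeled particles in two boxes and tallies how many fall in each macrostate.

    # Enumerate all microstates of N labeled particles in M = 2 boxes and
    # tally how many microstates belong to each macrostate (count in box A).
    from itertools import product
    from collections import Counter

    N = 5
    microstates = list(product("AB", repeat=N))   # each particle goes to A or B
    macrostates = Counter(state.count("A") for state in microstates)

    print(len(microstates))            # 32 = M^N = 2^5 total microstates
    print(sorted(macrostates.items()))
    # [(0, 1), (1, 5), (2, 10), (3, 10), (4, 5), (5, 1)]
    # i.e., the 1/5/10/10/5/1 split of Table 4.2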
Note now that one-third of the macrostates (two out of six) possess 62.5% of the microstates. Thus, the distribution is now more peaked toward the most evenly distributed states than it was for two particles, where one-third of the macrostates possessed 50% of the microstates. This is one of the most important aspects of the microscopic approach. As the number of particles increases, it won’t be long before 99% of the microstates are in one-third of the macrostates. The trend will continue, and increasing the number of particles further will quickly yield 99% of the microstates in just one-tenth of the macrostates. In the limit as N→∞ (the “thermodynamic limit”), virtually all of the microstates are in just a few of the most evenly distributed macrostates, even though the system has a very slight finite possibility that it can be found in a less evenly distributed state. Based on this discussion, and considering the microscopic definition of entropy (Eqn. 4.2), entropy is maximized at equilibrium for a system of fixed energy and total volume.
With a large number of particles, the most evenly distributed configurational state is most probable, and the probability of any other state is small.
Generalized Microstate Formulas
To extend the procedure for counting microstates to large values of N (~10^23), we cannot imagine listing all the possibilities and counting them up. It would require 40 years simply to count to 10^9 if we did nothing but count night and day. We must systematically analyze the probabilities as we consider configurations and develop model equations describing the process.
How do we determine the number of microstates for a given macrostate for large N? For the first step in the process, it is fairly obvious that there are N ways of moving one particle to box B, i.e., 1 came first, or 2 came first, and so on, which is what we did to create Table 4.2(a). However, counting gets more complicated when we have two particles in a box. Since there are N ways of moving the first particle to box B, and there are (N – 1) particles left, we apply the same logic to the (N – 1) remaining particles. For example, with five particles, there would be five ways of placing the first particle, then four ways of placing the second particle, for a total of 20 possible ways of putting two particles in box B. One way of writing this would be 5·4, which is equivalent to (5·4·3·2·1)/(3·2·1), which can be generalized to N!/(N – m)!, where m is the number of particles we have placed in the first box. (N! is read “N factorial,” and calculated as N·(N – 1)·(N – 2)·…·2·1.) Our formula gives 20 ways, but Table 4.2(b) shows only 10 ways. What are we missing? Answer: When we count this way, we are implicitly double counting some microstates. Note in Table 4.2(b) that a given pair of particles could have been placed in either order, but the order of placement does not change the microstate. Using N!/(N – m)! implicitly distinguishes between the orders in which particles are placed. For counting microstates, the history of how a particular microstate was achieved does not interest us. Therefore, we say there are only 10 distinguishable microstates.
Factorials are a quick tool for counting arrangements.
It turns out that it is fairly simple to correct for this overcounting. For two particles in a box, they could have been placed in the order 1-2 or in the order 2-1, which gives two possibilities. For three particles, they could have been placed 1-2-3, 1-3-2, 2-1-3, 2-3-1, 3-1-2, 3-2-1, for six possibilities. For m particles in a box, the uncorrected formula overcounts by m!. Therefore, we modify the above formula by dividing by m! to correct for overcounting. Finally, the number of microstates for arranging N particles in two boxes, with m particles in one of the boxes, is:

p = N!/[m!(N – m)!]
The general formula for M boxes is:

pj = N!/(m1j! m2j! ··· mMj!)   (4.4)
General formula for number of microstates for N particles in M boxes.
mij is the number of particles in the ith box at the jth macrostate. We will not derive this general formula, but it is a straightforward extension of the formula for two boxes which was derived above. Therefore, with 10 particles, and three in the first box, two in the second box and five in the third box, we have 10!/(3!2!5!) = 3,628,800/(6·2·120) = 2520 microstates for this macrostate.
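The same arithmetic is easy to automate. Here is a minimal Python sketch (our illustration, under the same counting rules) of the general multinomial formula:

    # Multinomial count of microstates for one macrostate: N!/(m1! m2! ... mM!)
    from math import factorial

    def n_microstates(counts):
        """counts[i] = number of particles in box i for this macrostate."""
        p = factorial(sum(counts))
        for m in counts:
            p //= factorial(m)   # integer division is exact here
        return p

    print(n_microstates([3, 2, 5]))   # 2520, matching the hand calculation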
Recall the microscopic definition of entropy given by Eqn. 4.2. Let us use it to calculate the entropy change for an ideal gas due to an isothermal change in volume. The statistics we have just derived apply, since an ideal gas consists of non-interacting particles whose energy is independent of their nearest neighbors. During an expansion like the one described next, the energy is constant because the system is isolated. Therefore, the temperature is also constant, because dU = CV dT for an ideal gas.
Entropy and Isothermal Volume/Pressure Change for Ideal Gases
Suppose an insulated container, partitioned into two equal volumes, contains N molecules of an ideal gas in one section and no molecules in the other. When the partition is withdrawn, the molecules quickly distribute themselves uniformly throughout the total volume. How is the entropy affected? Let subscript 1 denote the initial state and subscript 2 denote the final state. Here we take for granted that the final state will be evenly distributed.
We can develop an answer by applying Eqn. 4.4, and noting that 0! = 1:

p1 = N!/(N!·0!) = 1;  p2 = N!/[(N/2)!(N/2)!]

Substituting into Eqn. 4.2, and recognizing S = k ln(p),

ΔS = S2 – S1 = k ln(p2/p1) = k{ln(N!) – 2 ln[(N/2)!]}
Stirling’s approximation may be used for ln(N!) when N > 100:

ln(N!) ≈ N ln(N) – N
The approximation is a mathematical simplification, and not, in itself, related to thermodynamics.
Applying the approximation, ΔS = k{N ln(N) – N – 2[(N/2)ln(N/2) – N/2]} = kN ln(2) = nR ln(2). Therefore, the entropy of the system has increased by Nk ln(2) when the volume has doubled at constant T. Suppose the box initially holding the particles is three times as large as the empty box. In this case, the increase in volume will be 33%. Then what is the entropy change? The trick is to imagine four equal-size boxes, with three equally filled at the beginning.
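A quick numerical check (an illustrative sketch of ours, using the exact log-gamma function for ln N!) shows how ΔS/(Nk) approaches ln(2) as N grows:

    # Compare exact k ln(p2/p1) per particle with the limiting value ln(2)
    from math import lgamma, log

    for N in (10, 100, 1000, 100000):              # even N so N/2 is an integer
        dS_over_k = lgamma(N + 1) - 2 * lgamma(N // 2 + 1)   # ln(N!) - 2 ln((N/2)!)
        print(N, dS_over_k / N)                    # 0.553, 0.667, 0.689, 0.693...
    print(log(2))                                  # limiting value 0.693147...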
Entropy of a constant temperature system increases when volume increases.
A similar application of Stirling’s approximation gives

ΔS = Nk ln(4/3) = nR ln(V2/V1)
We may generalize the result by noting the pattern with this result and the previous result,

(ΔS)T = nR ln(V2/V1)   (4.6)
where the subscript T indicates that this equation holds at constant T. For an isothermal ideal gas, we may also express this in terms of pressure by substituting V = RT/P in Eqn. 4.6:

(ΔS)T = nR ln(P1/P2) = –nR ln(P2/P1)   (4.7)
Formulas for isothermal entropy changes of an ideal gas.
Therefore, the entropy decreases when the pressure increases isothermally. Likewise, the entropy decreases when the volume decreases isothermally. These concepts are extremely important in developing an understanding of entropy, but by themselves, are not directly helpful in the initial objective of this chapter—that of determining inefficiencies and maximum work. The following example provides an introduction to how these conceptual points relate to our practical objectives.
Example 4.1. Entropy change and “lost work” in a gas expansion
An isothermal ideal gas expansion produces maximum work if carried out reversibly, and less work if friction or other losses are present. One way of generating “other losses” is if the force of the gas on the piston is not balanced with the opposing force during the expansion, as shown in part (b) below. Consider a piston/cylinder containing one mole of nitrogen at 5 bars and 300 K that is expanded isothermally to 1 bar.
a. Suppose that the expansion is reversible. How much work could be obtained and how much heat is transferred? What is the entropy change of the gas?
b. Suppose the isothermal expansion is carried out irreversibly by removing a piston stop and expanding against the atmosphere at 1 bar. Suppose that heat transfer is provided to permit this to occur isothermally. How much work is done by the gas and how much heat is transferred? What is the entropy change of the gas? How much work is lost compared to a reversible isothermal process and what percent of the reversible work is obtained (the efficiency)?
Solution
Basis: 1 mole, closed unsteady-state system.
a. The energy balance for the piston/cylinder is ΔU = Q + WEC = 0 because the gas is isothermal and ideal. dWEC = –PdV = –(nRT/V)dV; WEC = –nRT ln(V2/V1) = –nRT ln(P1/P2) = –(1)(8.314)(300)ln(5) = –4014 J. By the energy balance, Q = 4014 J. The entropy change is given by Eqn. 4.7: ΔS = –nR ln(P2/P1) = –(1)(8.314)ln(1/5) = 13.38 J/K.
b. The energy balance does not depend on whether the work is reversible, and is the same. Taking the atmosphere as the system, the work is WEC,atm = –Patm(V2,atm – V1,atm) = –WEC = –Patm(V1 – V2) = Patm(nRT/P2 – nRT/P1) = nRT(Patm/P2 – Patm/P1) ⇒ WEC = nRT(Patm/P1 – Patm/P2) = (1)(8.314)(300)(1/5 – 1) = –1995 J, and Q = 1995 J. The entropy change depends only on the state change, and this is the same as in (a): 13.38 J/K. The lost work is Wlost = 4014 – 1995 = 2019 J, and the percent of the reversible work obtained (the efficiency) is 1995/4014 · 100% = 49.7%.
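The arithmetic of this example is collected in the short Python sketch below, an illustration under the same ideal gas assumptions:

    # Reversible vs. irreversible isothermal expansion of an ideal gas
    from math import log

    R, T, n = 8.314, 300.0, 1.0            # J/(mol K), K, mol
    P1, P2, Patm = 5.0, 1.0, 1.0           # bar

    W_rev = -n * R * T * log(P1 / P2)      # part (a): -4014 J
    W_irr = n * R * T * (Patm / P1 - Patm / P2)   # part (b): -1995 J
    dS = -n * R * log(P2 / P1)             # Eqn. 4.7: 13.38 J/K, same either way

    W_lost = W_irr - W_rev                 # 2019 J of work not obtained
    efficiency = W_irr / W_rev             # 0.497, i.e., 49.7%
    print(round(W_rev), round(W_irr), round(dS, 2),
          round(W_lost), round(100 * efficiency, 1))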
An important point is suggested by Example 4.1, even though the example is limited to ideal gas constraints. We saw that the isothermal entropy change for the gas was the same for the reversible and irreversible changes because the gas state change was the same. Though Eqn. 4.7 is limited to ideal gases, the relation between entropy changes and state changes is generalizable, as we prove later. We will also show later that case (b) always generates more entropy.
Entropy of Mixing for Ideal Gases
Mixing is another important process to which we may apply the statistics that we have developed. Suppose that one mole of pure oxygen vapor and three moles of pure nitrogen vapor at the same temperature and pressure are brought into intimate contact and held in this fashion until the nitrogen and oxygen have completely mixed. The resultant vapor is a uniform, random mixture of nitrogen and oxygen molecules. Let us determine the entropy change associated with this mixing process, assuming ideal-gas behavior.
Since the initial T and P of both ideal gases are the same, the initial volumes satisfy V_N2 = 3V_O2, and V_tot = 4V_O2. Ideal gas molecules are point masses, so the presence of O2 in the N2 does not affect anything as long as the pressure is constant. The main effect is that the O2 now has a larger volume to access, and so does the N2. The component contributions of entropy change versus volume change can simply be added. Entropy change for O2:
ΔS_O2 = n_O2 R ln(4) = n_tot R[–x_O2 ln(0.25)] = n_tot R[–x_O2 ln(x_O2)]
Entropy change for N2:

ΔS_N2 = n_N2 R ln(4/3) = n_tot R[–x_N2 ln(0.75)] = n_tot R[–x_N2 ln(x_N2)]
Entropy change for the total fluid:

ΔS_mix = ΔS_O2 + ΔS_N2 = –n_tot R[x_O2 ln(x_O2) + x_N2 ln(x_N2)] = –n_tot R Σ x_i ln(x_i)
This is an important result as it gives the entropy change of mixing for non-interacting particles. Remarkably, it is also a reasonable approximation for ideal solutions where energy and total volume do not change on mixing. This equation provides the underpinning for much of the discussion of mixtures and phase equilibrium in Unit III.
The entropy of mixing for a mixed ideal gas or an ideal solution, here both denoted with a superscript “is”:

ΔS_mix^is = –nR Σ x_i ln(x_i)
Note that these equations apply to ideal gases if we substitute y for x. In this section we have shown that a system of ideal gas molecules at equilibrium is most likely to be found in the most randomized (distributed) configuration because this is the macrostate with the largest fraction of microstates. In other words, the entropy of a state is maximized at equilibrium for a system of fixed U, V, and N.
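For the oxygen/nitrogen example above, the mixing entropy is easily evaluated numerically; the sketch below is a minimal illustration of the formula, not a general-purpose routine:

    # Ideal-gas entropy of mixing: dS_mix = -R * sum(n_i * ln(x_i))
    from math import log

    R = 8.314                         # J/(mol K)
    moles = {"O2": 1.0, "N2": 3.0}
    n_tot = sum(moles.values())

    dS_mix = -R * sum(ni * log(ni / n_tot) for ni in moles.values())
    print(round(dS_mix, 2))           # 18.70 J/K for the 1:3 mixture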
But how can temperature be related to disorder? We consider this issue in the next subsection.
Entropy and Temperature Change: Thermal Entropy
One key to understanding the connection between thermal entropy and disorder is the appreciation that energy is quantized. Thus, there are discrete energy levels in which particles may be arranged. These energy levels are analogous to the boxes in the spatial distribution problem. The effect of increasing the temperature is to increase the energy of the molecules and make higher energy levels accessible.
To see how this affects the entropy, consider a system of three molecules and three energy levels ε0, ε1 = 2ε0, ε2 = 3ε0. Suppose we are at a low temperature and the total energy is U = 3ε0. The only way this can be achieved is by putting all three particles in the lowest energy level. The other energy levels are not accessible, and S = S0. Now consider raising the temperature to give the system U = 4ε0. One macrostate is possible (one molecule in ε1 and two in ε0), but there are now three microstates, so ΔS = k ln(3). Can you show that when U = 6ε0, the macrostate with one particle in each level results in ΔS = k ln(6)? Real systems are much larger and the molecules are more complex, but the same qualitative behavior is exhibited: increasing T increases the accessible energy levels, which increases the number of microstates, increasing the entropy.
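You can confirm these counts by enumerating the level assignments directly. The sketch below (illustrative Python, with energies in units of ε0) groups the microstates into macrostates for each total energy:

    # Three molecules distributed over levels with energies 1, 2, 3 (units of eps0)
    from itertools import product
    from collections import Counter

    levels = (1, 2, 3)
    for U in (3, 4, 6):
        macro = Counter(tuple(sorted(s))          # macrostate: occupancies only
                        for s in product(levels, repeat=3) if sum(s) == U)
        print(U, dict(macro))
    # U=3: {(1,1,1): 1}                -> S = S0
    # U=4: {(1,1,2): 3}                -> dS = k ln(3)
    # U=6: {(1,2,3): 6, (2,2,2): 1}    -> one-per-level macrostate has 6 microstates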
We can advance our understanding of thermal effects on entropy by contemplating the Einstein solid. Albert Einstein’s (1907) proposal was that a solid could be treated as a large number of identical vibrating monatomic sites, modeling the potential energies as springs that follow Hooke’s law. The quantum mechanical analog to the energy balance is known as Schrödinger’s equation, which relates the momentum (kinetic energy) and potential energy. An exact solution is possible only for equally spaced energy levels, known as quantum levels. The equally spaced quantized states for each oscillator are separated by hf, where h is Planck’s constant and f is the frequency of the oscillator. Thus, the system is described as a system of harmonic oscillators. Assuming that each oscillator and each dimension (x, y, z) is independent, we can develop an expression for the internal energy and the heat capacity.
Albert Einstein (1879 – 1955) was a German-born physicist. He contributed to an understanding of quantum behavior and the general theory of relativity. He was awarded the 1921 Nobel Prize in physics.
The Einstein solid model was one of the earliest and most convincing demonstrations of the limitations of classical mechanics. It serves today as a simple illustration of the manner in which quantum mechanics and statistical mechanics lead to consistent and experimentally verifiable descriptions of thermodynamic properties. The assumptions of the Einstein solid model are as follows:
• The total energy in a solid is the sum of M harmonic oscillator energies, where M/3 is the number of atoms because the atoms oscillate independently in three dimensions. Since the energy of each oscillator is quantized, we can say that the total internal energy is

U = qM εq + Mεq/2
where εq is the (constant) energy step for each quantum level, and qM gives the total quantum multiplier for all oscillator quantum energies added. The term Mεq/2 represents the ground-state energy that oscillators have in the lowest energy level. It is often convenient to relate the energy to the average quantum multiplier,

<qM> = qM/M,  so that  U = Mεq(<qM> + 1/2)
• Each oscillator in each dimension is independent, so we can allocate integer multiples of εq to any oscillator as long as the total sum of multipliers is qM. Each independent specification represents a microstate. For M = 3 oscillators, (3,1,1) specifies three units of energy in the first oscillator and one unit in each of the other two for a total of qM = 5, U = 5εq + 3εq/2.
• Raising the magnitude of qM (by adding heat to raise T) makes more microstates accessible, increasing the entropy.
For qM = 3 units of energy distributed in an Einstein solid with M = 4 oscillators, the possible distributions of the energy are the 20 ways of writing 3 as an ordered sum of four nonnegative integers: the 4 permutations of (3,0,0,0), the 12 permutations of (2,1,0,0), and the 4 permutations of (1,1,1,0). That is, there are 20 different distributions of three units of energy among four oscillators (a “multiplicity” of 20).
If we are trying to develop a description of a real solid with Avogadro’s number of oscillators, enumeration is clearly impractical. Fortunately, mathematical expressions for the multiplicity make the task manageable. Callen gives the general formula for the number of microstates as pi = (qMi + Mi – 1)!/[qMi!(Mi – 1)!]. There is a clever way to understand this formula. Instead of distributing qMi quanta among Mi oscillator “boxes,” consider that there are Mi – 1 “partitions” between the oscillator “boxes”; with four oscillators, for example, there are three partitions. The quanta can be redistributed by all the permutations of the quanta and partitions, (qMi + Mi – 1)!. However, these permutations overcount in that the qMi quanta are indistinguishable, so we divide by qMi!, and the (Mi – 1) partitions are indistinguishable, so we divide by (Mi – 1)!. To apply the formula for qM = 2, M = 2:

p = (2 + 2 – 1)!/[2!(2 – 1)!] = 3!/2! = 3
For the case with qM = 3 and M = 4, pi = (3 + 4 – 1)!/[3!·3!] = 720/36 = 20, as enumerated above.
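This “quanta and partitions” count is exactly a binomial coefficient, so it is simple to compute. The sketch below is our illustration of Callen’s formula:

    # Multiplicity of an Einstein solid: (qM + M - 1)! / [qM! (M - 1)!]
    from math import comb

    def multiplicity(qM, M):
        # choose which slots among the (qM + M - 1) positions hold the quanta
        return comb(qM + M - 1, qM)

    print(multiplicity(2, 2))                        # 3
    print(multiplicity(3, 4))                        # 20, the enumeration above
    print(multiplicity(3, 3), multiplicity(4, 3))    # 10 and 15, used in Example 4.2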
Example 4.2. Stirling’s approximation in the Einstein solid
a. Show that Callen’s formula is consistent with enumeration for:
(1) qM = 3, M = 3; (2) qM = 4, M = 3
b. Use the general formula to develop an expression for S = S(qM,M) when M > 100. Express the answer in terms of the average quantum multiplier, <qM>.
c. Plot S/Mk versus <qM>. What does this indicate about entropy changes when heat is added?
Solution
a.
1. (3,0,0),(0,3,0),(0,0,3),(2,1,0),(2,0,1),(1,2,0),(1,0,2),(0,2,1),(0,1,2),(1,1,1) = 10. Check.
2. (4,0,0), (0,4,0), (0,0,4), (3,1,0), (3,0,1), (1,3,0), (1,0,3), (0,3,1), (0,1,3), (2,1,1), (1,2,1), (1,1,2), (2,0,2), (2,2,0), (0,2,2) = 15. Check.
b. Si = k ln(pi) = k { ln[(qMi+Mi–1)!] – ln[qMi!(Mi–1)!] }. Applying Stirling’s approximation,
S/k = [(qM + M – 1)ln(qM + M – 1) – (qM + M – 1)] – [qM ln(qM) – qM] – [(M – 1)ln(M – 1) – (M – 1)]
= qM ln[(qM + M – 1)/qM] + (M – 1)ln[(qM + M – 1)/(M – 1)]

Dividing by M,

S/Mk = (qM/M)ln[(qM + M – 1)/qM] + [(M – 1)/M]ln[(qM + M – 1)/(M – 1)]
Recognizing that M – 1 ≈ M for M > 100 and substituting <qM> = qM/M,

S/Mk = <qM> ln(1 + 1/<qM>) + ln(<qM> + 1)
c. Fig. 4.2 shows that S increases with <qM> = qM/M (= U/(Mεq) – ½). When T increases, U will increase, meaning that <qM> and S increase. It would be nice to relate the change in entropy quantitatively to the change in temperature, but a complete analysis of the entire temperature range requires advanced derivative manipulations that distract from the main concepts at this stage. We return to this problem in Chapter 6.
Figure 4.2. Entropy of the Einstein solid with increasing energy and T as explained in Example 4.2.
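The comparison behind Fig. 4.2 is easy to reproduce numerically. The sketch below (an illustration; M = 1000 is an arbitrary choice) evaluates the exact multiplicity and the Stirling-approximation result side by side:

    # Exact S/Mk from the multiplicity vs. the Stirling-approximation formula
    from math import comb, log

    def S_exact(qM, M):
        return log(comb(qM + M - 1, qM)) / M      # S/Mk = ln(multiplicity)/M

    def S_approx(q_avg):
        return q_avg * log(1 + 1 / q_avg) + log(q_avg + 1)

    M = 1000
    for q_avg in (0.5, 1.0, 2.0, 5.0, 10.0):
        qM = int(q_avg * M)
        print(q_avg, round(S_exact(qM, M), 4), round(S_approx(q_avg), 4))
    # Both columns agree closely and rise monotonically with <qM>, as in Fig. 4.2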
You should notice that we represented the interactions of the monatomic sites as if they were connected by Hooke’s law springs in the solid phase. They are not rigorously connected this way, but the simple model approximates the behavior of two interacting molecules in a potential energy well. If the spring analogy were exact, the potential energy well would be a parabola (thus a harmonic oscillator). Look back at the Lennard-Jones potential in Chapter 1 and you can see that the shape is a good approximation if the atoms do not vibrate too far from the minimum in the well. The Einstein model gives qualitatively the right behavior, but the Debye model that followed in 1912 is more accurate because it represents collective waves moving through the solid. We omit discussion of the Debye model because our objectives are met with the Einstein model. We briefly extend the concept of the Einstein model in Chapter 6, where we develop more powerful methods for manipulation of derivatives.
In the present day, the subtle relations between entropy and molecular distributions are complex but approachable. Imagine how difficult gaining this understanding must have been for Boltzmann in 1880, before the advent of quantum mechanics. Many scientists at the time refused even to accept the existence of molecules. Trying to explain to people the nature and significance of his discoveries must have been extremely frustrating. What we know for sure is that Boltzmann took his own life in 1906. Try not to take your frustrations with entropy quite so seriously.
Test Yourself
1. Does molar entropy increase, decrease, or stay about the same for an ideal gas if: (a) volume increases isothermally?; (b) pressure increases isothermally?; (c) temperature increases isobarically?; (d) the gas at the vapor pressure condenses?; (e) two pure gas species are mixed?
2. Does molar entropy increase, decrease, or stay about the same for a liquid if: (a) temperature increases isobarically?; (b) pressure increases isothermally?; (c) the liquid evaporates at the vapor pressure?; (d) two pure liquid species are mixed?