
Chapter 22 - Second Law of Thermodynamics and Entropy

The second law of thermodynamics

The first law relates heat energy, work and the internal thermal energy of a system, and is essentially a statement of conservation of energy.

The second law of thermodynamics adds a restriction on the direction of thermodynamic processes.

One of the earliest statements of the second law, due to Rudolf Clausius, is that:

Heat cannot spontaneously flow from a cold object to a hot one, whereas the reverse, a spontaneous flow of heat from a hot object to a cold one, is possible.

We should note that the first law

$\Delta E_{int}=Q-W$

would not prohibit such a process, so the second law adds something fundamentally new to our understanding of thermodynamics.

Heat Engine

A heat engine is a system for turning the temperature difference between two thermal reservoirs into mechanical work. We will consider heat engines that operate on a continuous cycle, which means that the system always returns to its initial state at the end of each cycle and there is no change in its internal energy.

The first law tells us that in this case

$Q_{H}=W+Q_{L}$

In writing this equation we have adopted a new sign convention, in which the heats $Q_{H}$, $Q_{L}$ and the work done $W$ are all positive.
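As a minimal numerical sketch of this sign convention (the heat values below are assumed, not from the lecture), the work output per cycle follows directly from the first-law bookkeeping:

```python
# First-law bookkeeping for one engine cycle, with Q_H, Q_L and W all
# treated as positive magnitudes (assumed illustrative values).
Q_H = 500.0   # heat taken from the hot reservoir per cycle (J)
Q_L = 300.0   # waste heat dumped to the cold reservoir per cycle (J)

W = Q_H - Q_L          # work output per cycle, since Q_H = W + Q_L
print(W)               # 200.0 J of work per cycle
```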

Steam Engines

Some of the earliest engines were steam engines, though steam engines should not be thought of as historical relics: about 80% of the world's electricity comes from steam turbines. The earliest "steam engine", the Aeolipile, does not do very much work. To get work out of an engine one needs to design an efficient heat engine cycle.


The efficiency, $e$, of an engine is defined as the ratio of the work we get from the engine $W$ to the input heat $Q_{H}$

$e=\frac{W}{Q_{H}}$
As we know that $W=Q_{H}-Q_{L}$, the efficiency can be written

$e=\frac{Q_{H}-Q_{L}}{Q_{H}}=1-\frac{Q_{L}}{Q_{H}}$
The lower the waste heat $Q_{L}$, the more efficient the engine; however, the second law of thermodynamics prevents $Q_{L}$ from being zero. Kelvin in fact stated the second law explicitly in these terms:

No device is possible whose sole effect is to transform a given amount of heat directly into work.


Carnot Cycle

To find the hypothetical maximum efficiency of a heat engine we can consider a cycle called the Carnot cycle, first proposed by Sadi Carnot.

Carnot cycle animation

The Carnot cycle is based entirely on reversible processes, which are not achievable in reality, as each process would have to be executed infinitely slowly so that it could be considered a continuous progression through equilibrium states. We can, however, consider the Carnot cycle as a theoretical ideal which can be approached.

There are 4 processes in the Carnot cycle, which we will consider in terms of the expansion and compression of an ideal gas.

  1. From A to B. An isothermal expansion, in which an amount of heat $Q_{H}$ is added to the gas.
  2. From B to C. An adiabatic expansion, in which no heat is exchanged and the temperature of the gas is lowered.
  3. From C to D. An isothermal compression, in which an amount of heat $Q_{L}$ is removed from the gas.
  4. From D to A. An adiabatic compression, returning the system to its original high temperature state.

Efficiency of the Carnot Cycle

The work done in the first isothermal process is

$W_{AB}=nRT_{H}\ln\frac{V_{B}}{V_{A}}$
and as the process is isothermal the internal energy is unchanged, which means that the heat added is equal to the work done.

$Q_{H}=nRT_{H}\ln\frac{V_{B}}{V_{A}}$
The heat lost in the second isothermal process will be

$Q_{L}=nRT_{L}\ln\frac{V_{C}}{V_{D}}$
For the adiabatic processes

$P_{B}V_{B}^{\gamma}=P_{C}V_{C}^{\gamma}$ and $P_{D}V_{D}^{\gamma}=P_{A}V_{A}^{\gamma}$

and from the ideal gas law

$\frac{P_{B}V_{B}}{T_{H}}=\frac{P_{C}V_{C}}{T_{L}}$ and $\frac{P_{D}V_{D}}{T_{L}}=\frac{P_{A}V_{A}}{T_{H}}$

These equations can be used to eliminate the temperatures and show that

$\frac{V_{B}}{V_{A}}=\frac{V_{C}}{V_{D}}$
which can be used with the equations for the isothermal processes to show that

$\frac{Q_{L}}{Q_{H}}=\frac{T_{L}}{T_{H}}$
making the efficiency

$e=1-\frac{Q_{L}}{Q_{H}}=1-\frac{T_{L}}{T_{H}}$
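As a numerical check of this derivation (all temperatures and volumes below are assumed values), we can compute $Q_{H}$ and $Q_{L}$ from the isothermal-work formulas and confirm that the efficiency depends only on the reservoir temperatures:

```python
import math

# Carnot cycle for n moles of ideal gas: compute the heats along the two
# isotherms and check that e = 1 - T_L/T_H. Values are illustrative.
n, R = 1.0, 8.314          # mol, J/(mol K)
T_H, T_L = 500.0, 300.0    # reservoir temperatures (K)
V_A, V_B = 1.0e-3, 2.0e-3  # volumes at A and B (m^3)

# The adiabatic relations force V_C/V_D = V_B/V_A, so pick V_D and get V_C.
V_D = 4.0e-3
V_C = V_D * (V_B / V_A)

Q_H = n * R * T_H * math.log(V_B / V_A)   # heat in along the hot isotherm
Q_L = n * R * T_L * math.log(V_C / V_D)   # heat out along the cold isotherm

e = (Q_H - Q_L) / Q_H
print(e, 1 - T_L / T_H)    # both ≈ 0.4
```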

Carnot's theorem

Carnot's theorem generalizes the result we just derived by stating:

All reversible engines operating between two constant temperatures $T_{H}$ and $T_{L}$ have the same efficiency.


Any irreversible engine operating between the same two fixed temperatures will have a lower efficiency.

Otto Cycle

A four stroke car engine runs on a cycle that can be approximated by the Otto cycle.

Otto cycle animation

In this cycle neither AB nor CD is isothermal; both are adiabatic processes. BC and DA can be considered to be isovolumetric.

As the heat input and exhaust steps occur at constant volume, $Q_{H}=nc_{V,m}(T_{C}-T_{B})$ and $Q_{L}=nc_{V,m}(T_{D}-T_{A})$

and the efficiency of an Otto cycle is

$e=1-\frac{Q_{L}}{Q_{H}}=1-\frac{T_{D}-T_{A}}{T_{C}-T_{B}}$
Using the fact that for an adiabatic process $PV^{\gamma}=\mathrm{constant}$ and for an ideal gas $P=\frac{nRT}{V}$, it can be shown that

$T_{A}V_{A}^{\gamma - 1}=T_{B}V_{B}^{\gamma - 1}$ and $T_{C}V_{C}^{\gamma - 1}=T_{D}V_{D}^{\gamma - 1}$

which combined with the fact that $V_{C}=V_{B}$ and $V_{A}=V_{D}$ gives (after some manipulation!)

$e=1-\left(\frac{V_{B}}{V_{A}}\right)^{\gamma-1}$
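A short sketch of what this result implies for a real engine (the compression ratio and $\gamma$ below are assumed, typical values):

```python
# Otto-cycle efficiency from the compression ratio r = V_A/V_B:
# e = 1 - (V_B/V_A)^(gamma-1) = 1 - r^(1-gamma). Values are illustrative.
gamma = 1.4      # diatomic ideal gas (air)
r = 8.0          # typical compression ratio for a gasoline engine

e = 1.0 - r ** (1.0 - gamma)
print(round(e, 3))   # ≈ 0.565
```

A higher compression ratio raises the ideal efficiency, which is one reason engine designers push $r$ as high as the fuel's knock resistance allows.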


Refrigerators

We can produce refrigeration only by doing work; to do otherwise would violate the second law of thermodynamics. We can achieve refrigeration by going around one of the cycles we discussed earlier in the opposite direction.

The coefficient of performance, $COP$, of a refrigerator is defined as the heat removed $Q_{L}$ divided by the work done $W$. As before we apply the first law, $Q_{L}+W=Q_{H}$, so

$COP=\frac{Q_{L}}{W}=\frac{Q_{L}}{Q_{H}-Q_{L}}$
As with a heat engine we can consider the Carnot cycle to be the ideal case, which means that for an ideal refrigerator

$COP_{ideal}=\frac{T_{L}}{T_{H}-T_{L}}$
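To get a feel for these formulas (all temperatures and heats below are assumed illustrative values), we can compare the Carnot limit with the performance of a hypothetical real refrigerator:

```python
# Coefficient of performance: ideal (Carnot) limit vs a hypothetical
# real refrigerator. All numerical values are assumed for illustration.
T_L, T_H = 263.0, 298.0    # freezer interior and room temperature (K)

COP_ideal = T_L / (T_H - T_L)     # Carnot limit
Q_L, W = 100.0, 40.0              # heat removed and work input per cycle (J)
COP_real = Q_L / W

print(round(COP_ideal, 2), COP_real)   # ideal ≈ 7.51, real 2.5
```

Note that the ideal $COP$ grows as $T_{H}-T_{L}$ shrinks: refrigeration across a small temperature difference is cheap, across a large one expensive.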

Methods of cooling

Most household refrigerators run on a vapor compression cycle. Let's pretend that the steps in this process can be approximated as those in the Carnot cycle, and recall that we are going around the cycle in the reverse direction to when we use it to produce work. In this approximation the stages of the cycle in which heat transfer occurs are isothermal, but in fact this is very much not the case. For a refrigerator to work we actually rely on these stages to convert the refrigerant from liquid to vapor in the evaporator (due to the $Q_{L}$ added to the refrigerant) and from vapor to liquid in the condenser (due to the $Q_{H}$ removed from the refrigerant). In the compressor we do work on the gas; in the expansion valve the gas does work (but less than the compressor does).

The same cycle is the basis of air conditioning, though in this case the heat is removed from inside the house and dumped outside.

Heat Pump

A geothermal heat pump is an efficient way of both heating and cooling a house.

In winter a heat pump will mechanically expand a refrigerant, lowering its temperature to below that of a thermal reservoir (which will have a temperature of around 10°C-15°C all year round) so that it can absorb heat $Q_{L}$ from the reservoir. It will then mechanically compress the refrigerant, increasing its temperature so that it can dump heat $Q_{H}$ into the house, before being expanded again and passed back to the reservoir.

This is more efficient than direct heating, because the total heat supplied to the house $Q_{H}$ is equal to $Q_{L}+W$ and the $Q_{L}$ is supplied for free; only the $W$ needs to be produced electrically. Direct electric heating would require all the heat to be provided via electricity. In summer the pumping direction is reversed and the heat pump acts as an efficient air conditioner (efficient because the heat transfer to the thermal reservoir is generally more effective than a typical A/C unit's heat transfer to the air).

What is Entropy?

We have introduced the second law of thermodynamics, which places limits on the physically possible thermodynamic processes above and beyond conservation of energy (the first law).

Now we will introduce a quantity, entropy, which we can use to build a deeper understanding of the second law. We'll come to see that entropy relates to the disorder of a system, and is also a measure of the thermal energy available for a process to occur.

We will define the change in entropy in a reversible process at constant temperature as

$\Delta S =\frac{Q}{T}$

If we want to treat non-constant temperature cases we can express the change of entropy in differential form

$dS=\frac{dQ}{T}$
and then the change in entropy in going from state $a$ to $b$ will be

$\Delta S =S_{b}-S_{a}=\int_{a}^{b}\,dS=\int_{a}^{b}\frac{dQ}{T}$
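As a sketch of this integral in action (mass, specific heat, and temperatures below are assumed values), for an object with $dQ=mc\,dT$ the integral evaluates to $mc\ln\frac{T_{2}}{T_{1}}$; a crude numerical sum confirms the closed form:

```python
import math

# For an object with heat capacity mc heated from T1 to T2, dQ = mc dT,
# so ΔS = ∫ mc dT/T = mc ln(T2/T1). Check with a midpoint Riemann sum.
# All numerical values are assumed for illustration.
m, c = 1.0, 4186.0          # kg and J/(kg K), roughly water
T1, T2 = 300.0, 350.0       # K

N = 100_000
dT = (T2 - T1) / N
dS_numeric = sum(m * c * dT / (T1 + (i + 0.5) * dT) for i in range(N))
dS_exact = m * c * math.log(T2 / T1)

print(dS_numeric, dS_exact)   # both ≈ 645.3 J/K
```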


Entropy as a state variable

A state variable is a variable that describes the current state of a dynamical system. Heat is not a state variable; it depends on the path taken to get the system to its state. Entropy, on the other hand, is a state variable: the change in entropy required to take a system from one state to another via a reversible process is independent of the path taken.

We can recall that in the Carnot Cycle we were able to derive that

$\frac{Q_{L}}{Q_{H}}=\frac{T_{L}}{T_{H}}$ or $\frac{Q_{L}}{T_{L}}=\frac{Q_{H}}{T_{H}}$

where $Q_{L}$ is the magnitude of the heat flowing out of the system and $Q_{H}$ is the magnitude of the heat flowing into the system. To make these compatible with our equation for entropy we give $Q_{H}$ a positive sign and $Q_{L}$ a negative sign, so that over a complete cycle

$\Delta S=\frac{Q_{H}}{T_{H}}-\frac{Q_{L}}{T_{L}}=0$
And we can see on the PV diagram below that either of the paths from A to C will have the same change in entropy. We can also see that a complete Carnot cycle has a net change of entropy of zero.

Consequence of entropy being a state variable

We can calculate the entropy change for a reversible process using

$\Delta S=\int_{a}^{b}\frac{dQ}{T}$
This equation is only valid for reversible processes. However, if we want to find the entropy change for an irreversible process (i.e. any real process) that goes from state A to state B, we can calculate the entropy change for a reversible process that goes from A to B, and the entropy change of the system will be the same for the two processes.

As we have stated that the entropy of a system is a state variable, it stands to reason that for any closed cycle the system's entropy will return to its initial value after the completion of a full cycle.

So what distinguishes a reversible cycle from an irreversible one? The distinction concerns the change in entropy of the environment the system is exchanging heat with. The definition of a reversible cycle is that the system is always infinitesimally close to being in thermal equilibrium with its environment, in which case we can see that the change of entropy in the environment is also zero. If this is not the case then a heat engine will instead increase the entropy of its environment with each cycle, and as the system itself does not have a change in entropy this leads to a net increase in the entropy of the universe whenever work is done by any real engine cycle.

Any reversible cycle can be represented as a series of Carnot cycles, with each Carnot cycle contributing no increase of entropy; therefore any reversible cycle results in no increase of entropy.

A reversible process will have equal and opposite entropy change if it is reversed.

We will now look at the entropy change for some irreversible processes.

Entropy change for mixing

We can consider the entropy change when two objects transfer heat from one to another, for example when we mix water at two different temperatures. When the temperature difference is fairly small we can approximate the entropy change for each object as

$\Delta S=\frac{Q}{\overline{T}}$

where $\overline{T}$ is an average temperature for the process.

In the example of mixing equal quantities of water at different temperatures $T_{H}$ and $T_{L}$, an amount of heat $Q$ will be transferred from the hot water until both have reached a new temperature $T_{M}=\frac{T_{H}+T_{L}}{2}$. For the water which is cooling the entropy change will be negative

$\Delta S_{cooling}=-\frac{Q}{T_{cooling}}$

where $T_{H}>T_{cooling}>T_{M}$

For the water whose temperature is increasing the change in entropy is

$\Delta S_{heating}=\frac{Q}{T_{heating}}$

where $T_{L}<T_{heating}<T_{M}$

As $T_{cooling}>T_{heating}$ we can see that the total entropy change of the system

$\Delta S_{cooling}+\Delta S_{heating}>0$
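The same conclusion follows from the exact expression $\Delta S=mc\ln\frac{T_{M}}{T}$ for each body (obtained by integrating $\frac{dQ}{T}$ rather than using the average-temperature approximation). A sketch with assumed values:

```python
import math

# Exact entropy change for mixing equal masses of water at T_H and T_L:
# ΔS = mc ln(T_M/T_H) + mc ln(T_M/T_L), with T_M = (T_H + T_L)/2.
# All numerical values are assumed for illustration.
m, c = 1.0, 4186.0         # kg, J/(kg K)
T_H, T_L = 360.0, 280.0    # K
T_M = (T_H + T_L) / 2

dS_cooling = m * c * math.log(T_M / T_H)   # negative: hot water cools
dS_heating = m * c * math.log(T_M / T_L)   # positive: cold water warms

print(dS_cooling + dS_heating > 0)   # True: mixing increases total entropy
```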

Transfer of entropy to the environment

In many cases when considering the total entropy change of a system we need to consider the entropy change of the environment. For example, take an object which cools by losing heat to the environment through a quasistatic process where $dQ=mc\,dT$

$\Delta S_{object}=\int \frac{dQ}{T}=mc\int_{T_{1}}^{T_{2}}\frac{dT}{T}=mc\ln\frac{T_{2}}{T_{1}}=-mc\ln\frac{T_{1}}{T_{2}}$

If we consider the environment as thermal reservoir at a fixed temperature $T_{2}$ the entropy change is

$\Delta S_{environment}=mc\frac{T_{1}-T_{2}}{T_{2}}=mc(\frac{T_{1}}{T_{2}}-1)$

$\Delta S_{total}=mc\left(\frac{T_{1}}{T_{2}}-1-\ln\frac{T_{1}}{T_{2}}\right)$ which we can see is always greater than zero, since $x-1>\ln x$ for any $x\neq1$.
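A quick numerical sketch of this result (material properties and temperatures below are assumed values): the object's entropy drops, the environment's rises by more, and the total is positive.

```python
import math

# An object at T1 cools to the environment temperature T2. Combining the
# two formulas above: ΔS_total = mc((T1/T2 - 1) - ln(T1/T2)) >= 0.
# All numerical values are assumed for illustration.
m, c = 0.5, 900.0          # kg, J/(kg K), e.g. a piece of aluminium
T1, T2 = 400.0, 300.0      # K

dS_object = -m * c * math.log(T1 / T2)       # negative: the object cools
dS_environment = m * c * (T1 / T2 - 1)       # positive: reservoir absorbs Q
dS_total = dS_object + dS_environment

print(dS_total > 0)   # True, since x - 1 > ln x for x != 1
```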

Second law of thermodynamics from an entropy viewpoint

The examples we have discussed fit with our expectations of how things work. If we mix hot and cold water they will equilibrate to the same temperature and won't spontaneously separate again into hot and cold water (which would decrease entropy). If we have a hot object in a cooler environment it will transfer heat to the environment; the reverse, which would decrease entropy, won't happen.

We can express the second law of thermodynamics in terms of entropy:

The entropy of an isolated system never decreases; it either stays constant (for a reversible process) or increases (for an irreversible process). As all real processes are irreversible, the total entropy of a system and its environment increases as a result of any natural process.

While we can decrease the entropy of part of the universe, some other part's entropy will be increased by a greater amount, leading to a continual overall increase of the universe's entropy.


Order, disorder and availability of energy

Entropy can be seen as a measure of the order of a system.

For example, when we have hot and cold fluids we have a form of order, which is lost when we mix them together. We also lose the capacity to use them for work: while they were separated we could have used them to drive a heat engine, which requires a temperature difference to do work; once they are mixed we cannot get work from them, even though no energy has been lost.

We can view the continual change of order to disorder as a gradual heating of the universe to a uniform temperature (expansion of the universe complicates this, as it would result in a lower final temperature, which might eventually tend to absolute zero). The long term consequence of the universe acquiring a uniform temperature and maximal entropy would be its eventual heat death, in which all mechanical energy would be lost.

Entropy and Statistics

So far we have considered entropy along the lines on which it was first proposed by Clausius.

Boltzmann was responsible for giving entropy a statistical basis.

So far when we have considered the state of a system we have been referring to its macrostate, i.e. its pressure, volume, temperature etc. For each of these macrostates there is some set of microstates which give rise to it.

If we consider a gas, knowing the microstate of the gas would imply that we know the velocity and position of each and every molecule. This is impossible to know, but to determine the probability of a given macrostate we don't need to know the details of the microstates, simply how many microstates there are which correspond to that macrostate. Those macrostates which have the greatest number of microstates have the greatest probability of occurrence. Boltzmann expressed entropy in terms of the number of microstates: the entropy of a given macrostate is

$S=k\ln W$

where $W$ is the number of microstates corresponding to that macrostate.
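A toy sketch of Boltzmann's formula (this model system is an illustration, not from the lecture): for $N$ particles that can each sit in the left or right half of a box, the macrostate "$n$ particles on the right" has $W=\binom{N}{n}$ microstates, so the evenly split macrostate has the highest entropy.

```python
import math

# Boltzmann entropy S = k ln W for a toy two-state system of N particles.
# W for the macrostate "n particles on the right" is the binomial C(N, n).
k = 1.380649e-23   # Boltzmann constant, J/K
N = 100

def S(n):
    """Entropy of the macrostate with n of the N particles on the right."""
    return k * math.log(math.comb(N, n))

print(S(50) > S(40) > S(10))   # True: more microstates, more entropy
```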

Second law as a consequence of statistics

As we have now made a link between entropy and probability

$S=k\ln W$

we can now see that the second law of thermodynamics is simply a statement that a change of a system will be towards one that is more probable. For example, it is extremely unlikely that all the molecules of gas in a room will arrange themselves neatly ordered on one side of the room, because this would be a single microstate, whereas there are a very large number of microstates which obey the Maxwell distribution in which the molecules are evenly distributed and move randomly.
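To see just how improbable the ordered arrangement is, note that the chance of finding all $N$ independently moving molecules in one chosen half of the room is $(1/2)^{N}$. A sketch with an assumed, absurdly small $N$:

```python
# Probability that all N gas molecules happen to be in one chosen half of
# a room: (1/2)^N. Even N = 100 (a vanishingly tiny "gas" compared with
# the ~10^25 molecules in a real room) makes this hopeless.
N = 100
p = 0.5 ** N
print(p)   # ≈ 7.9e-31
```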

Maxwell's Demon

However, the statistical viewpoint actually leads us to the conclusion that the second law may not be as rigid as we have so far presented it. Processes which decrease entropy are not strictly forbidden; they are just very unlikely to occur, a concept Maxwell considered through the thought experiment known as Maxwell's demon. Irrespective of this idea, over time the occurrence of a few statistically unlikely events will have little effect on the overall direction.

Nevertheless, this has not prevented this concept from being considered as a means of interstellar travel.

An application of entropy closer to home

The salt we added to the ice in our previous class lowers its melting point. This effect is a colligative property, which means it depends on the number of molecules dissolved in a solvent. Colligative properties are a consequence of entropy. When we add salt to the ice, it makes a solution with the liquid water which is always present on the surface of the ice, even if the temperature is below zero Celsius. This solution has higher entropy than pure liquid water. The salt has no effect on the entropy of the ice itself, which remains pure. As we have increased the entropy of the solution, or in other words the number of configurations available in the liquid salt-water phase compared to the number of configurations available in the solid phase, we have increased the probability of a transition from ice to the liquid phase and shifted the equilibrium temperature at which salt water and ice can coexist to a lower temperature.


A consequence of the lower melting point of salty water is that ice in the sea actually forms as relatively pure ice with channels of liquid brine (very salty water). When the air is very cold compared to the sea, the brine will have a temperature significantly lower than the sea water and so will seep out of the channels at the bottom of the ice, rapidly freezing the fresher water it comes into contact with and forming a brinicle or "finger of death".

Third law of thermodynamics

The statistical definition of entropy leads logically to the third law of thermodynamics which defines the absolute value of entropy.

As $S=k\ln W$

we can see that the entropy is equal to zero when there is a single microstate for the system, i.e. $W=1$, which corresponds to a perfectly ordered state occurring at $T=0\,\mathrm{K}$.
