
Fall 2018: Lecture 36 - Entropy

What is Entropy?

In our last class we introduced the second law of thermodynamics, which places limits on which thermodynamic processes are physically possible, above and beyond the requirement of energy conservation (the 1st law).

Here we introduce a quantity, entropy, which we can use to build a deeper understanding of the second law. We'll come to see that entropy relates to the disorder of a system, and also to how much of a system's thermal energy is available to do useful work.

We will define the change in entropy in a reversible process at constant temperature as

$\Delta S =\frac{Q}{T}$

If we want to treat non-constant temperature cases we can express the change of entropy in differential form

$dS=\frac{dQ}{T}$

and then the change in entropy in going from state $a$ to $b$ will be

$\Delta S =S_{b}-S_{a}=\int_{a}^{b}\,dS=\int_{a}^{b}\frac{dQ}{T}$
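
To make the integral concrete, here is a minimal numeric sketch (the mass, specific heat, and temperatures are chosen purely for illustration) that integrates $dQ/T$ for reversible heating of water, where $dQ=mc\,dT$, and compares it with the closed form $mc\ln(T_{2}/T_{1})$:

```python
import numpy as np

# Hypothetical example: 1 kg of water heated reversibly from 280 K to 350 K.
m = 1.0       # mass in kg
c = 4186.0    # specific heat of water in J/(kg K)
T1, T2 = 280.0, 350.0

# Numerical integration of dS = dQ/T with dQ = m c dT
T = np.linspace(T1, T2, 100001)
dT = T[1] - T[0]
dS_numeric = np.sum(m * c * dT / T[:-1])

# Closed form: Delta S = m c ln(T2/T1)
dS_exact = m * c * np.log(T2 / T1)

print(f"numeric: {dS_numeric:.2f} J/K, exact: {dS_exact:.2f} J/K")
```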

Entropy as a state variable

A state variable is a variable that describes the current state of a dynamical system. Heat is not a state variable: it depends on the path taken to bring the system to its state. Entropy, on the other hand, is a state variable; the change in entropy in going from one state to another via a reversible process is independent of the path taken.

We can recall that for the Carnot cycle we were able to derive

$\frac{Q_{L}}{Q_{H}}=\frac{T_{L}}{T_{H}}$ or $\frac{Q_{L}}{T_{L}}=\frac{Q_{H}}{T_{H}}$

where $Q_{L}$ is the magnitude of the heat flowing out of the system and $Q_{H}$ is the magnitude of the heat flowing into the system. To make these compatible with our sign convention for entropy we take $Q_{H}$ to be positive (heat in) and $Q_{L}$ to be negative (heat out), giving

$\frac{Q_{H}}{T_{H}}+\frac{Q_{L}}{T_{L}}=0$

And we can see on the PV diagram below that either of the paths from A to C will have the same change in entropy. We can also see that a complete Carnot cycle has a net change of entropy of zero.
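
We can check this bookkeeping numerically. The sketch below (all values hypothetical: a monatomic ideal gas with arbitrarily chosen reservoir temperatures and volumes) computes the heat exchanged on each isotherm of a Carnot cycle and verifies that $\frac{Q_{H}}{T_{H}}+\frac{Q_{L}}{T_{L}}=0$:

```python
import numpy as np

# Hypothetical Carnot cycle for n moles of a monatomic ideal gas.
n, R = 1.0, 8.314          # moles, gas constant in J/(mol K)
gamma = 5.0 / 3.0          # monatomic ideal gas
T_H, T_L = 500.0, 300.0    # reservoir temperatures in K
Va, Vb = 1.0e-3, 2.0e-3    # volumes at start/end of the hot isotherm (m^3)

# Adiabatic legs obey T V^(gamma-1) = const, fixing the cold-isotherm volumes
Vc = Vb * (T_H / T_L) ** (1.0 / (gamma - 1.0))
Vd = Va * (T_H / T_L) ** (1.0 / (gamma - 1.0))

# Heat absorbed/rejected on the isotherms (the adiabats exchange no heat)
Q_H = n * R * T_H * np.log(Vb / Va)    # positive: heat in
Q_L = -n * R * T_L * np.log(Vc / Vd)   # negative: heat out

print(f"Q_H/T_H + Q_L/T_L = {Q_H / T_H + Q_L / T_L:.3e} J/K")  # ~0 up to rounding
```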

Consequence of entropy being a state variable

We can calculate the entropy change for a reversible process using

$dS=\frac{dQ}{T}$

This equation is only valid for reversible processes. However, if we want to find the entropy change for an irreversible process (i.e., any real process) that goes from state A to state B, we can calculate the entropy change for a reversible process that goes from A to B; because entropy is a state variable, the entropy change of the system will be the same for the two processes.
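
A classic application (an example we are assuming here, not one worked in class) is the free expansion of an ideal gas into vacuum: the real process exchanges no heat, so naively $\int dQ/T=0$, but evaluating the integral along a reversible isotherm between the same endpoints gives the true entropy change:

```python
import math

# Hypothetical free expansion: n moles of ideal gas doubles its volume into vacuum.
# The real process is irreversible (Q = 0, W = 0, T unchanged), so we evaluate
# the entropy change along a reversible isotherm connecting the same two states,
# where dQ = dW = n R T dV / V and hence dS = dQ/T = n R dV / V.
n, R = 1.0, 8.314
V1, V2 = 1.0, 2.0   # arbitrary units; only the ratio matters

dS = n * R * math.log(V2 / V1)
print(f"Delta S = {dS:.2f} J/K > 0, even though no heat flowed")
```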

As we have stated that the entropy of a system is a state variable, it stands to reason that for any closed cycle the system's entropy will return to its initial value after the completion of a full cycle.

So what distinguishes a reversible cycle from an irreversible one? The distinction lies in the change in entropy of the environment the system exchanges heat with. The definition of a reversible cycle is that the system is always infinitesimally close to being in thermal equilibrium with its environment, in which case the change in entropy of the environment is also zero. If this is not the case, then a heat engine will instead increase the entropy of its environment with each cycle, and as the system itself has no net change in entropy, this leads to a net increase in the entropy of the universe whenever work is done by any real engine cycle.

Any reversible cycle can be approximated as a series of Carnot cycles, with each Carnot cycle contributing no increase in entropy; therefore any reversible cycle results in no increase in entropy.

A reversible process will have equal and opposite entropy change if it is reversed.

We will now look at the entropy change for some irreversible processes.

Entropy change for mixing

We can consider the entropy change when two objects transfer heat from one to the other, for example when we mix water at two different temperatures. When the temperature difference is fairly small we can approximate the entropy change for each object as

$\Delta S=\frac{Q}{\overline{T}}$

where $\overline{T}$ is an average temperature for the process.

In the example of mixing equal quantities of water at different temperatures $T_{H}$ and $T_{L}$, an amount of heat $Q$ will be transferred from the hot water to the cold water until both reach the final temperature $T_{M}=\frac{T_{H}+T_{L}}{2}$. For the water which is cooling, the entropy change will be negative

$\Delta S_{cooling}=-\frac{Q}{T_{cooling}}$

where $T_{H}>T_{cooling}>T_{M}$

For the water whose temperature is increasing the change in entropy is

$\Delta S_{heating}=\frac{Q}{T_{heating}}$

where $T_{L}<T_{heating}<T_{M}$

As $T_{cooling}>T_{heating}$, the positive term $\frac{Q}{T_{heating}}$ outweighs the negative term $-\frac{Q}{T_{cooling}}$, so the total entropy change of the system is positive:

$\Delta S_{cooling}+\Delta S_{heating}>0$
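
A short numeric sketch (with hypothetical masses and temperatures) compares the average-temperature approximation above with the exact result obtained by integrating $mc\,dT/T$ for each mass of water:

```python
import math

# Hypothetical mixing: equal masses of water at T_H = 330 K and T_L = 290 K.
m, c = 1.0, 4186.0
T_H, T_L = 330.0, 290.0
T_M = (T_H + T_L) / 2.0
Q = m * c * (T_H - T_M)            # heat leaving the hot water

# Average-temperature approximation from the notes
dS_approx = -Q / ((T_H + T_M) / 2) + Q / ((T_L + T_M) / 2)

# Exact result from integrating m c dT / T for each mass
dS_exact = m * c * (math.log(T_M / T_H) + math.log(T_M / T_L))

print(f"approx: {dS_approx:.2f} J/K, exact: {dS_exact:.2f} J/K (both > 0)")
```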

Transfer of entropy to the environment

In many cases when considering the total entropy change we need to include the entropy change of the environment. For example, take an object that cools from $T_{1}$ to $T_{2}$ by losing heat to the environment. Treating the cooling as a quasistatic reversible process with $dQ=mc\,dT$, the object's entropy change is

$\Delta S_{object}=\int \frac{dQ}{T}=mc\int_{T_{1}}^{T_{2}}\frac{dT}{T}=mc\ln\frac{T_{2}}{T_{1}}=-mc\ln\frac{T_{1}}{T_{2}}$

If we consider the environment as a thermal reservoir at a fixed temperature $T_{2}$, it absorbs the heat $Q=mc(T_{1}-T_{2})$, so its entropy change is

$\Delta S_{environment}=mc\frac{T_{1}-T_{2}}{T_{2}}=mc(\frac{T_{1}}{T_{2}}-1)$

$\Delta S_{total}=\Delta S_{object}+\Delta S_{environment}=mc\left(\frac{T_{1}}{T_{2}}-1-\ln\frac{T_{1}}{T_{2}}\right)$, which we can see is always greater than zero when $T_{1}\neq T_{2}$.
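
As a numeric check (mass and temperatures chosen arbitrarily for illustration), the sketch below evaluates both contributions and confirms the total is positive:

```python
import math

# Hypothetical example: 0.5 kg of water cooling from T1 = 350 K to the
# environment temperature T2 = 290 K (environment treated as a reservoir).
m, c = 0.5, 4186.0
T1, T2 = 350.0, 290.0

dS_object = m * c * math.log(T2 / T1)    # negative: the object cools
dS_env = m * c * (T1 - T2) / T2          # positive: the reservoir absorbs Q
dS_total = dS_object + dS_env

print(f"object: {dS_object:.1f} J/K, environment: {dS_env:.1f} J/K, "
      f"total: {dS_total:.1f} J/K > 0")
```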

Second law of thermodynamics from an entropy viewpoint

The examples we have discussed fit with our expectations of how things work. If we mix hot and cold water, the two will equilibrate to the same temperature and won't spontaneously separate again into hot and cold water (which would decrease entropy). If we have a hot object in a cooler environment, it will transfer heat to the environment; the reverse, which would decrease entropy, won't happen.

We can express the second law of thermodynamics in terms of entropy:

The entropy of an isolated system never decreases; it either stays constant (for a reversible process) or increases (for an irreversible process). As all real processes are irreversible, the total entropy of a system and its environment increases as a result of any natural process.

While we can decrease the entropy of part of the universe, some other part's entropy will be increased by a greater amount, leading to a continual overall increase of the universe's entropy.

Order, disorder and availability of energy

Entropy can be seen as a measure of the order of a system.

For example, when we have separate hot and cold fluids we have a form of order, which is lost when we mix them together. We also lose the capacity to extract work: while they were separated we could have used them to drive a heat engine, which requires a temperature difference to do work; once they are mixed we cannot get work from them, even though no energy has been lost.

We can view the continual change from order to disorder as a gradual heating of the universe toward a uniform temperature (the expansion of the universe complicates this, as it would result in a lower final temperature, which might eventually tend toward absolute zero). The long-term consequence of the universe reaching a uniform temperature and a state of maximal entropy would be its eventual heat death, in which all mechanical energy would be lost.

Entropy and Statistics

So far we have considered entropy along the lines it was first proposed by Clausius.

Boltzmann was responsible for giving entropy a statistical basis.

So far when we have considered the state of a system we have been referring to its macrostate, i.e., its pressure, volume, temperature, etc. For each of these macrostates there is some set of microstates which give rise to it.

If we consider a gas, knowing its microstate would mean knowing the velocity and position of each and every molecule. This is impossible in practice, but to determine the probability of a given macrostate we don't need the details of the microstates, only how many of them correspond to that macrostate. The macrostates with the greatest number of microstates have the greatest probability of occurring. Boltzmann expressed entropy in terms of the number of microstates: the entropy of a given macrostate is

$S=k\ln W$

where $W$ is the number of microstates corresponding to that macrostate and $k$ is the Boltzmann constant.
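
As an illustration (a toy model we are assuming here, not one from the lecture), consider $N$ molecules that can each sit in the left or right half of a box. The macrostate with $n$ molecules on the left has $W=\binom{N}{n}$ microstates, so its entropy follows directly from Boltzmann's formula:

```python
import math

# Hypothetical model: N molecules, each independently in the left or right
# half of a box. The macrostate "n molecules on the left" has W = C(N, n)
# microstates, and S = k ln W.
k = 1.381e-23   # Boltzmann constant in J/K
N = 100

for n in (0, 25, 50):
    W = math.comb(N, n)       # number of microstates for this macrostate
    S = k * math.log(W)       # Boltzmann entropy; S = 0 when W = 1
    print(f"n = {n:3d}: W = {W:.3e}, S = {S:.3e} J/K")
```

The entropy is maximal for the evenly split macrostate ($n=N/2$) and zero for the perfectly ordered one ($n=0$, $W=1$).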

Second law as a consequence of statistics

As we have now made a link between entropy and probability

$S=k\ln W$

we can now see that the second law of thermodynamics is simply a statement that a system will change towards the macrostate which is most probable. For example, it is extremely unlikely that all the molecules of gas in a room will arrange themselves neatly on one side of the room, because only a tiny fraction of microstates correspond to that arrangement, whereas there are a very large number of microstates which obey the Maxwell distribution, in which the molecules are evenly distributed and move randomly.
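
A quick calculation (assuming, for simplicity, that each molecule is independently equally likely to be in either half of the room) shows how fast this probability collapses as $N$ grows:

```python
# Hypothetical illustration: the chance that all N molecules happen to be in
# the left half of the room is (1/2)^N -- one arrangement out of 2^N equally
# likely left/right assignments.
for N in (10, 100, 1000):
    print(f"N = {N:5d}: P(all on one side) = 2^-{N} = {2.0**-N:.3e}")
```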

Third law of thermodynamics

The statistical definition of entropy leads logically to the third law of thermodynamics, which defines the absolute value of entropy.

As $S=k\ln W$

we can see that the entropy is zero when there is only a single microstate available to the system, $W=1$, which corresponds to a perfectly ordered state at $T=0\,\mathrm{K}$.
