/* phy141kk:lectures:36-18 [2018/11/30 10:27] (current) kkumar created */

~~SLIDESHOW~~

====== Fall 2018: Lecture 36 - Entropy ======

/*

----

If you need a pdf version of these notes you can get it [[http://www.ic.sunysb.edu/class/phy141md/lecturepdfs/141lecture38F11.pdf|here]]

===== Video of lecture =====

<html>
<video id="video" width="640" height="360" controls="true">
<source src="lecturevids/phy141f13lecture38.mp4" type="video/mp4"></source>
<source src="lecturevids/phy141f13lecture38.webm" type="video/webm"></source>

<object width="640" height="384" type="application/x-shockwave-flash" data="player.swf">
<param name="movie" value="player.swf" />
<param name="flashvars" value="file=lecturevids/phy141f13lecture38.mp4" />
</object>

</video>
</html>

*/

+ | |||

===== What is Entropy? =====

In our last class we introduced the second law of thermodynamics, which places limits on the physically possible thermodynamic processes above and beyond conservation of energy (the 1st law).

Here we introduce a quantity, [[wp>Entropy|entropy]], which we can use to build a deeper understanding of the second law. We'll come to see that entropy relates to the disorder of a system, and is also a measure of how much thermal energy is unavailable to do useful work.

We define the change in entropy in a reversible process at constant temperature as

$\Delta S =\frac{Q}{T}$

If we want to treat cases where the temperature is not constant we can express the change of entropy in differential form

$dS=\frac{dQ}{T}$

and then the change in entropy in going from state $a$ to state $b$ is

$\Delta S =S_{b}-S_{a}=\int_{a}^{b}\,dS=\int_{a}^{b}\frac{dQ}{T}$
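To make the integral concrete, here is a minimal numeric sketch (the mass, specific heat, and temperatures are illustrative values, not from the lecture): it integrates $dS=\frac{mc\,dT}{T}$ for water heated from $273\,\mathrm{K}$ to $373\,\mathrm{K}$ and compares the result to the closed form $mc\ln\frac{T_{2}}{T_{1}}$.

```python
import math

def entropy_change(m, c, T1, T2, steps=100000):
    """Numerically integrate dS = dQ/T with dQ = m*c*dT
    as the temperature rises from T1 to T2 (in kelvin)."""
    dT = (T2 - T1) / steps
    S = 0.0
    T = T1
    for _ in range(steps):
        S += m * c * dT / (T + dT / 2)  # midpoint rule on each small step
        T += dT
    return S

# Illustrative values: 1 kg of water (c = 4186 J/kg K) heated from 273 K to 373 K
m, c, T1, T2 = 1.0, 4186.0, 273.0, 373.0
numeric = entropy_change(m, c, T1, T2)
closed_form = m * c * math.log(T2 / T1)  # mc ln(T2/T1)
print(numeric, closed_form)  # both about 1.3e3 J/K
```

The numerical sum and the closed form agree closely, illustrating that the integral depends only on the initial and final states.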

+ | |||

===== Entropy as a state variable =====

A [[wp>State_variable|state variable]] is a variable that describes the current state of a dynamical system. Heat is not a state variable: it depends on the path taken to bring the system to its state. Entropy, on the other hand, is a state variable; the change in entropy required to take a system from one state to another via a reversible process is independent of the path taken.

Recall that for the [[phy141:lectures:38#efficiency_of_the_carnot_cycle|Carnot Cycle]] we were able to derive

$\frac{Q_{L}}{Q_{H}}=\frac{T_{L}}{T_{H}}$ or $\frac{Q_{L}}{T_{L}}=\frac{Q_{H}}{T_{H}}$

where $Q_{L}$ is the magnitude of the heat flowing out of the system and $Q_{H}$ is the magnitude of the heat flowing into the system. To make these compatible with our equation for entropy we give $Q_{H}$ a positive sign and $Q_{L}$ a negative sign, so that

$\frac{Q_{H}}{T_{H}}+\frac{Q_{L}}{T_{L}}=0$
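A quick numeric sketch of this sign convention (the reservoir temperatures and $Q_{H}$ are illustrative values): with heat in counted positive and heat out counted negative, the two terms cancel exactly.

```python
# Sign convention check for one Carnot cycle (reservoir temperatures and
# Q_H are illustrative values, not from the lecture).
T_H, T_L = 500.0, 300.0        # hot and cold reservoir temperatures (K)
Q_H = 1000.0                   # heat absorbed from the hot reservoir (J), positive
Q_L = -Q_H * T_L / T_H         # heat expelled to the cold reservoir, negative
total = Q_H / T_H + Q_L / T_L  # net entropy change of the working substance
print(total)  # 0.0
```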

+ | |||

And we can see on the PV diagram below that either of the paths from A to C will have the same change in entropy. We can also see that a complete Carnot cycle has a net change of entropy of zero.

{{carnot.png?400}}

+ | |||

===== Consequence of entropy being a state variable =====

We can calculate the entropy change for a reversible process using

$dS=\frac{dQ}{T}$

This equation is only valid for reversible processes. However, if we want to find the entropy change for an irreversible process (i.e. any real process) that goes from state A to state B, we can calculate the entropy change for a reversible process that goes from A to B, and the entropy change of the system will be the same for the two processes.

As we have stated that the entropy of a system is a state variable, it stands to reason that for any closed cycle the system's entropy will return to its initial value after the completion of a full cycle.

So what distinguishes a reversible cycle from an irreversible one? The distinction lies in the change of entropy of the environment the system is exchanging heat with. The definition of a reversible cycle is that the system is always infinitesimally close to thermal equilibrium with its environment, in which case the change of entropy of the environment is also zero. If this is not the case then a heat engine will instead increase the entropy of its environment with each cycle, and as the system itself has no net change in entropy, this leads to a net increase in the entropy of the universe whenever work is done by any real engine cycle.

Any reversible **cycle** can be represented as a series of Carnot cycles, with each Carnot cycle contributing no increase of entropy; therefore any reversible cycle results in no increase of entropy.

A reversible process will have an equal and opposite entropy change if it is reversed.

+ | |||

We will now look at the entropy change for some irreversible processes.

===== Entropy change for mixing =====

We can consider the entropy change when two objects transfer heat from one to another, for example when we mix water at two different temperatures. When the temperature difference is fairly small we can approximate the entropy change for each object as

$\Delta S=\frac{Q}{\overline{T}}$

where $\overline{T}$ is an average temperature for the process.

In the example of mixing equal quantities of water at different temperatures $T_{H}$ and $T_{L}$, an amount of heat $Q$ will be transferred from the hot water to the cold water until both reach the new temperature $T_{M}=\frac{T_{H}+T_{L}}{2}$. For the water which is cooling the entropy change will be negative

$\Delta S_{cooling}=-\frac{Q}{T_{cooling}}$

where $T_{H}>T_{cooling}>T_{M}$

For the water whose temperature is increasing the change in entropy is

$\Delta S_{heating}=\frac{Q}{T_{heating}}$

where $T_{L}<T_{heating}<T_{M}$

As $T_{cooling}>T_{heating}$ we can see that the total entropy change of the system is positive:

$\Delta S_{cooling}+\Delta S_{heating}>0$
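For a sharper check than the average-temperature approximation, we can integrate $dS=\frac{mc\,dT}{T}$ exactly for each portion of water, giving $mc\ln\frac{T_{final}}{T_{initial}}$. A minimal sketch (the masses and temperatures are illustrative values):

```python
import math

# Mixing equal masses of hot and cold water (masses and temperatures are
# illustrative). The exact entropy change of each portion follows from
# integrating dS = mc dT / T, giving mc ln(T_final / T_initial).
m, c = 1.0, 4186.0               # 1 kg each, specific heat of water (J/kg K)
T_H, T_L = 363.0, 283.0          # hot and cold temperatures (K)
T_M = (T_H + T_L) / 2            # final mixed temperature

dS_cooling = m * c * math.log(T_M / T_H)  # negative: the hot water cools
dS_heating = m * c * math.log(T_M / T_L)  # positive: the cold water warms
total = dS_cooling + dS_heating
print(dS_cooling, dS_heating, total)  # total comes out positive
```

The negative and positive contributions do not cancel: the warming water gains more entropy than the cooling water loses, so the total is positive, as argued above.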

+ | |||

===== Transfer of entropy to the environment =====

In many cases when considering the total entropy change of a system we need to include the entropy change of the environment. For example, take an object which cools by losing heat to the environment through a quasistatic reversible process, where $dQ=mc\,dT$:

$\Delta S_{object}=\int \frac{dQ}{T}=mc\int_{T_{1}}^{T_{2}}\frac{dT}{T}=mc\ln\frac{T_{2}}{T_{1}}=-mc\ln\frac{T_{1}}{T_{2}}$

If we treat the environment as a thermal reservoir at a fixed temperature $T_{2}$, which absorbs the heat $Q=mc(T_{1}-T_{2})$, its entropy change is

$\Delta S_{environment}=mc\frac{T_{1}-T_{2}}{T_{2}}=mc(\frac{T_{1}}{T_{2}}-1)$

so that

$\Delta S_{total}=mc((\frac{T_{1}}{T_{2}}-1)+\ln\frac{T_{2}}{T_{1}})$ which we can see is always greater than zero when $T_{1}>T_{2}$.

+ | |||

{{coolingentropy.png}}
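A quick numeric sketch of this result (the mass, specific heat, and temperatures are illustrative values): evaluating $\Delta S_{total}$ for an object cooling from $T_{1}$ to the reservoir temperature $T_{2}$ gives a positive number.

```python
import math

def total_entropy_change(m, c, T1, T2):
    """Total entropy change (object + reservoir) when an object of heat
    capacity m*c cools from T1 to the reservoir temperature T2 (kelvin)."""
    dS_env = m * c * (T1 / T2 - 1)      # reservoir at fixed T2 absorbs Q = mc(T1 - T2)
    dS_obj = m * c * math.log(T2 / T1)  # object: mc ln(T2/T1), negative when cooling
    return dS_env + dS_obj

# Illustrative values: 1 kg of water cooling from 350 K into a 300 K environment
print(total_entropy_change(1.0, 4186.0, 350.0, 300.0))  # positive, about 52 J/K
```

Note that when $T_{1}=T_{2}$ the total is exactly zero, consistent with a reversible exchange at equal temperatures.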

+ | |||

===== Second law of thermodynamics from an entropy viewpoint =====

The examples we have discussed fit with our expectations of how things work. If we mix hot and cold water they will equilibrate to the same temperature, and they won't spontaneously separate again into hot and cold water (which would decrease entropy). If we have a hot object in a cooler environment it will transfer heat to the environment; the reverse, which would decrease entropy, won't happen.

We can express the second law of thermodynamics in terms of entropy:

The entropy of an isolated system never decreases; it either stays constant (for a reversible process) or increases (for an irreversible process). As all real processes are irreversible, the total entropy of a system and its environment increases as a result of any natural process.

While we can decrease the entropy of part of the universe, some other part's entropy will be increased by a greater amount, leading to a continual overall increase of the universe's entropy.

+ | |||

===== Order, disorder and availability of energy =====

Entropy can be seen as a measure of the order of a system.

For example, when we have hot and cold fluids we have a form of order, which is lost when we mix them together. We also lose the capacity to use them for work: while they were separated we could have used them to drive a heat engine, which requires a temperature difference to do work, but once they are mixed we cannot get work from them, even though no energy has been lost.

We can view the continual change of order to disorder as a gradual heating of the universe to a uniform temperature (expansion of the universe complicates this, as it would result in a lower final temperature, which might eventually tend to absolute zero). The long term consequence of the universe acquiring a uniform temperature and a maximal entropy state would be its eventual [[wp>Heat_death_of_the_universe|heat death]], in which all mechanical energy would be lost.

+ | |||

===== Entropy and Statistics =====

So far we have considered entropy along the lines it was first proposed by Clausius.

Boltzmann was responsible for giving entropy a statistical basis.

Until now, when we have considered the state of a system we have been referring to its macrostate, i.e. its pressure, volume, temperature, etc. For each of these macrostates there is some set of microstates which give rise to the macrostate.

If we consider a gas, knowing the microstate of the gas would imply that we know the velocity and position of each and every molecule. This is impossible to know, but to determine the probability of a given macrostate we don't need to know the details of the microstate, simply how many microstates there are which correspond to that macrostate. Those macrostates which have the greatest number of microstates have the greatest probability of occurrence. Boltzmann expressed entropy in terms of the number of microstates; the entropy of a given macrostate is

$S=k\ln W$

where $W$ is the number of microstates corresponding to that state.
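We can put numbers to $S=k\ln W$ with a toy model (the model and numbers are illustrative): $N$ molecules that can each sit in the left or right half of a room, so the macrostate with $n$ molecules on the left has $W=\binom{N}{n}$ microstates.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant (J/K)

def boltzmann_entropy(W):
    """S = k ln W for a macrostate with W microstates."""
    return k_B * math.log(W)

# Toy model: N molecules, each in the left or right half of a room.
# The macrostate "n molecules on the left" has W = C(N, n) microstates.
N = 100
W_one_side = math.comb(N, 0)   # 1 microstate: all molecules on one side
W_even = math.comb(N, N // 2)  # the evenly split macrostate

print(boltzmann_entropy(W_one_side))  # 0.0: a unique microstate has zero entropy
print(boltzmann_entropy(W_even) > boltzmann_entropy(W_one_side))  # True
```

The evenly split macrostate has vastly more microstates, and therefore higher entropy, than the all-on-one-side macrostate, which previews the statistical reading of the second law below.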

+ | |||

===== Second law as a consequence of statistics =====

Having made the link between entropy and probability,

$S=k\ln W$

we can see that the second law of thermodynamics is simply a statement that a system will change towards a state that is more probable. For example, it is extremely unlikely that all the molecules of gas in a room will arrange themselves neatly ordered on one side of the room, because this would be a single microstate, whereas there are a very large number of microstates which obey the Maxwell distribution, in which the molecules are evenly distributed and move randomly.

+ | |||

/*
===== Maxwell's Demon =====

However, the statistical viewpoint actually leads us to the conclusion that the second law may not be as rigid as we have so far presented it. Processes which decrease entropy are not strictly forbidden; they are just very unlikely to occur, a concept Maxwell considered through the thought experiment known as [[wp>Maxwell's_demon|Maxwell's demon]]. Irrespective of this idea, over time the occurrence of a few statistically unlikely events will have little effect on the overall direction.

This has not stopped the harnessing of such events being proposed as [[http://www.youtube.com/watch?v=hbvQmu-ULF8&feature=related|a means of faster than light travel]].

===== An application of entropy closer to home =====

The salt we added to the ice at the beginning of the class lowers its melting point. This effect is a [[wp>Colligative_properties|colligative property]], which means it depends on the number of molecules in a solvent. Colligative properties are a consequence of entropy. When we add salt to the ice, it makes a solution with liquid water (which is always present on the surface of the ice, even if the temperature is below zero Celsius). This solution has higher entropy than pure liquid water. The salt has no effect on the entropy of the ice itself, which remains pure. As we have increased the entropy of the solution, or in other words the number of configurations available in the liquid salt-water phase compared to the number of configurations available in the solid phase, we have increased the probability of a transition from ice to the liquid phase and shifted the equilibrium temperature where salt-water and ice can co-exist to a lower temperature.

Because the water in the bottles is fairly pure and has been cooled slowly and without agitation, we will hopefully find that at least some of the bottles are supercooled.

===== Brinicles =====

A consequence of the lower melting point of salty water is that ice in the sea actually forms as relatively pure ice with channels of liquid brine (very salty water). When the air is very cold compared to the sea temperature, the brine will have a temperature significantly lower than the sea and so will seep out of the channels at the bottom of the ice, rapidly freezing the fresher water it comes into contact with, forming a [[http://www.bbc.co.uk/nature/15835017|brinicle or finger of death]].

*/

+ | |||

===== Third law of thermodynamics =====

The statistical definition of entropy leads logically to the [[wp>Third_law_of_thermodynamics|third law of thermodynamics]], which defines the absolute value of entropy.

As $S=k\ln W$

we can see that the entropy is equal to zero when there is a single microstate for the system, $W=1$, which corresponds to a perfectly ordered state occurring at $T=0\,\mathrm{K}$.