Basic Thermodynamics 2: The First Law, Entropy, and the Second Law
Last time, we wrapped up by stating the first law.
It is no more and no less than saying that energy is conserved. In other words, you can’t win. You can never get more energy out of a process than was put in.
Specifically, dU = dQ – dW.
Mathematically, this says that the change in the total energy of a system equals the heat added to the system minus the work done by that system on something else.
Let’s try to make that concrete. Last time, I talked a lot about how heating a gas means that its molecules or atoms have more kinetic energy: they literally move faster, and smack the walls of the tank harder. Hence, pressure goes up.
So consider the following setup. I have a cylinder filled with some gas, with a piston on the top. I am going to add some heat to it with a small burner, or perhaps by putting it in a warm bath; it doesn’t matter.
Moving the piston takes energy. How much isn’t hard to calculate. You just need to remember that pressure is force per area and integrate the force over the distance traversed; remember that work is the integral of force through distance. The differentials work out to integrating PdV. For those who don’t want to worry about the actual calculation, don’t worry about it. The important point is that it took some energy, which came from somewhere, to move the piston.
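For the curious, that PdV integral can be checked numerically. This is a minimal sketch, assuming an ideal gas held at a constant 300 K while it expands (the amounts and volumes are invented for illustration); the numerical answer is compared against the standard isothermal result W = nRT ln(V2/V1).

```python
# A sketch of the work integral W = ∫ P dV for an ideal gas held at constant
# temperature (so P = nRT/V). All numbers are made up for illustration.
import math

R = 8.314   # gas constant, J/(mol·K)
n = 1.0     # moles of gas (assumed)
T = 300.0   # temperature in kelvin (assumed)

def pressure(V):
    """Ideal-gas pressure at volume V (in cubic meters)."""
    return n * R * T / V

def work_done(V1, V2, steps=100_000):
    """Numerically integrate P dV from V1 to V2 with the trapezoid rule."""
    dV = (V2 - V1) / steps
    total = 0.0
    for i in range(steps):
        Va = V1 + i * dV
        total += 0.5 * (pressure(Va) + pressure(Va + dV)) * dV
    return total

W_numeric = work_done(0.010, 0.020)            # expand from 10 L to 20 L
W_exact = n * R * T * math.log(0.020 / 0.010)  # analytic isothermal result
print(W_numeric, W_exact)  # the two agree closely
```

The point is only that "work done by the gas" is a perfectly concrete, computable quantity, not that you should memorize the formula.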
So, we are heating up the gas, which in turn, is moving its atoms or molecules faster, and smacking the piston harder. The piston goes up.
What the first law is telling us is that we dumped some energy into the system, and that all of the energy is accounted for. We didn’t get anything for free. Any change in the overall energy of the system was exactly accounted for by heating the gas (and cylinder), plus whatever work was done in moving the piston. As the gas pushed, it transferred energy (which we covered in the first lecture). Where did that energy go? Well, a lot of it went to moving the piston! If we lived in a “perfect” world from the view of an engineer, all of that energy would go to the piston. In the real world, it did not, and that will be the subject of entropy.
Let’s take a step back though before getting into that, and play some more with our cylinder.
Last time, we wrote:
Heat is a measure of the amount of energy that went into raising something to a certain temperature. Heat is energy. Temperature is a manifestation of that energy that is different for different substances. Different items in thermal contact with each other exchange heat, until they come to the same temperature. Mathematically, a change in heat Q is related to a change in temperature T by the heat capacity of a substance.
From this point of view, temperature can almost be thought of as a measure of how much heat a substance has “free” to give up compared to something at a lower temperature. Let me refine that. I wrote last time about all sorts of different possible motions on the microscale; let me just call it waggling here.
If I have something hot touch something cold, the hot thing’s atoms and molecules are waggling a lot more than the cold thing’s atoms and molecules are. On a microscopic scale, the hot thing causes the cold thing’s bits to waggle more, by knocking into them and shaking them up, or by spitting off photons which the cold thing’s bits catch, while the cold thing causes the hot thing’s bits to waggle less by accepting energy from them. If the two items are at the same temperature, they both internally waggle exactly as much as their given heat capacities demand they do for that temperature. Putting them in thermal contact with each other causes neither one to change. But if one is hotter than the other, then the hot one will cool and the cool one will warm – until they come to the same temperature.
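That equilibration can be sketched in a few lines. Assuming the simple relation Q = C·ΔT with constant heat capacities (the objects and all the numbers below are invented for illustration), the final shared temperature follows from requiring that the heat the hot object gives up equals the heat the cold one absorbs:

```python
# Two objects in thermal contact, assuming Q = C * ΔT with constant heat
# capacities. The heat capacities and temperatures are invented numbers.
def equilibrium_temperature(C1, T1, C2, T2):
    """Final common temperature Tf: heat lost by the hot object equals heat
    gained by the cold one, so C1*(T1 - Tf) = C2*(Tf - T2)."""
    return (C1 * T1 + C2 * T2) / (C1 + C2)

# A hot brick meets a larger tank of cool water (capacities in J/K, temps in °C).
Tf = equilibrium_temperature(C1=2000.0, T1=90.0, C2=4000.0, T2=20.0)
print(Tf)  # lands between the two starting temperatures, closer to the water's
```

Notice that the object with the bigger heat capacity drags the final temperature toward its own starting point, which is the “free to give up” intuition in numbers.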
So let’s apply this idea in the real world. Suppose I have a box. Inside the box is a cylinder, partially filled with some fluid that has a low specific heat, with its piston pushed halfway down. I move the piston up. As I do this, the temperature of the gas in the cylinder goes down. Since the gas has a low specific heat, it goes down a lot. (Look at the first law and the previous discussion of temperature to convince yourself this is true: the expanding gas did PdV work, and that energy had to come from somewhere.) Now the air in my box is at some warmer temperature than the gas in the cylinder, so that air warms up the gas and cools a bit itself in the process. Heat went from the air in the box to the gas in my expanded cylinder. Energy is conserved; more heat now “lives” in the gas than did before.
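To put a rough number on how much the expansion cools the gas: assuming an ideal gas that expands quickly enough to exchange no heat during the stroke (an adiabatic expansion, with γ = 1.4 as for a diatomic gas – an assumption for illustration, not something derived above), the quantity T·V^(γ−1) stays constant:

```python
# Cooling from an adiabatic expansion of an ideal gas, where
# T * V**(gamma - 1) is constant. gamma = 1.4 is typical of diatomic gases;
# the starting temperature and volumes are invented for illustration.
def adiabatic_T(T1, V1, V2, gamma=1.4):
    """Temperature after an adiabatic volume change from V1 to V2."""
    return T1 * (V1 / V2) ** (gamma - 1)

T_after = adiabatic_T(T1=293.0, V1=1.0, V2=2.0)  # double the volume
print(T_after)  # well below the starting 293 K, so heat flows in from the box
```

Doubling the volume drops the temperature by tens of kelvin, which is why the surrounding air in the box is suddenly the warmer party and heat flows into the gas.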
Now for the clever part…
If I compress the piston now, the gas has more internal heat than it did before, and in the compression it gets much hotter than it was before. But this time, I don’t let it sit there and exchange heat with the air in my box. Rather, I open a valve and send the gas out to a long tube outside of the box. It is happy to go, since it is at high pressure, and it squirts right out! The gas is now in the long tube and, because of the compression, is hotter than the outside air. There is a lot of surface area over which to exchange heat, and remember that the gas or fluid involved has a low specific heat, so it gives up heat quickly and equilibrates to the cooler outside temperature quickly. We just pumped heat to the outside air! Once it has cooled, the piston sucks the now cooler gas back inside to start the process over again.
This is how a refrigerator works. If you don’t believe me, check the back of your fridge. The back is certainly hot (that is where that long tube is!) and it is both radiating heat and having heat carried away by conduction and convection with the air in your house. In this case, I am supplying energy to use the piston to act on the fluid and move heat around.
A common canard amongst people who deny AGW, trotted out when current shifts occur and the Atlantic thermohaline pump is affected, is “how can you make something hot in order to make it cold?” Well, there is one example of how, and almost everyone in the Western world has seen it in action in their very own home. Heat pumps are real things.
If I run the process in reverse, and use energy stored in the fluid to move my piston, I have an internal combustion engine. In that case, I take the energy stored in the chemical bonds of the fluid, release it (with a spark plug and an oxidation reaction), and push the piston to do work on something else, like turning wheels.
In all cases though, the First law is just telling me that in the course of moving all of this energy around, I am never getting more out than I put in to begin with.
There are more details for those who are interested of course. I haven’t really gone into changes in pressure and volume and Carnot Cycles and all of that. I leave those topics to those who wish to go further on their own.
But all is not perfect. All of the heat does not go into pumping the piston. I have to spend more energy than the first law (if naively applied) might lead me to believe to get what I want. This is the second law. You can’t break even.
Another statement of the second law which is equally useful comes from Kelvin.
No process is possible whose sole result is the absorption of heat from a reservoir and the conversion of this heat into work.
Some of the heat went into warming the walls of the piston, or into friction with the piston and the cylinder wall. That isn’t doing me any good. It is lost energy. Entropy is the measure of this.
Lots of people have heard that entropy is some sort of measure of disorder in a system. When pressed further as to what that might mean, they wave their hands. This does not stop them from saying something like: “Well, entropy always increases according to the second law, and evolution goes from lower to higher order, so there must be a contradiction!”
We are going to tackle that nonsense right now. Not only did they misstate the law (entropy increases in an isolated system – the isolated part is important!), but they have no clue what entropy is in the first place.
First, let’s talk about what entropy is.
Consider a system with a lot of small parts – like any macroscopic thing you have ever encountered, since it is made up of atoms and molecules. Consider every possible way you could rearrange those parts, or every different possible way those parts could be waggling or exchanging heat. Of all those possible ways it could be or things it could be doing, only some of them are doing work for you, or retaining a form you like. Everything else is something that it can also do, that is not useful to you. If the system can go into those states or modes, it will.
Let me give an everyday example. Why do your computer cords always tangle? You certainly laid them out straight. Well, the cords have some tension in them when you lay them out and they are going to coil and move when you set them down. Of all the possible ways the cords can end up, there is only one way for them to be straight, but many, many ways for them to tangle. As they coil up, the chances of them staying straight, when there are so many other ways they could move into a state of being tangled are almost nil. In this case, the tendency to get tangled is an example of entropy.
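The counting argument behind the tangled cords can be made explicit with a toy model. Treat the cord as n short segments, each of which can settle into one of k orientations (both numbers are invented for illustration); only one of the k^n configurations is the straight one:

```python
# Toy model of the tangled-cord example: a cord of n segments, each able to
# settle into one of k orientations. Exactly one of the k**n configurations
# is "perfectly straight". The values of n and k are invented.
def straight_probability(n_segments, k_orientations):
    """Chance that a random configuration happens to be the straight one."""
    return 1.0 / (k_orientations ** n_segments)

print(straight_probability(10, 3))  # 1 in 3**10 = 59049 — essentially nil
```

The exact numbers don’t matter; what matters is that the count of “tangled” configurations grows exponentially with the number of segments, swamping the single straight one.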
Entropy is in short, a measure of all the ways that a system can manifest its heat, energy or configuration in a way you do not like. In the case of an internal combustion engine, it is heat getting lost to friction and the walls of the cylinder rather than pushing the pistons. There are all sorts of mathematical ways to calculate this, and there is a way to say this firmly without any hand waving in terms of equations, but let’s leave that for those who want to dig deeper and stay purely conceptual.
The Second Law says that entropy will always increase in an isolated system. Strictly speaking, an isolated system exchanges neither mass nor energy with its surroundings; a merely closed system exchanges no mass but can still exchange heat, and its entropy can go down so long as the entropy of its surroundings goes up by at least as much.
It is possible to fight entropy, but not to win forever.
If you are taking in energy, then it might be possible to use that energy to fight entropy and put things in the order you desire or need, and it might be possible to impose a certain state at the expense of another.
As an additional thought:
Don’t let the language of desire or need put you off. You want an efficient engine; you do what you can to minimize losses to entropy. This can either be a process done by an engineer, or it can be done by natural selection. A better heart, for example, is a survival advantage. You also “want” for instance, to maintain your own cellular structure, so you take in energy and material to do so.
So consider the following example of an isolated system: you and Bessie the cow. Isolated systems do not exchange mass or energy with the outside world. (This pair is really a subset of the larger system of you, Bessie, the plants Bessie eats, and the Sun.) Your cells are undergoing normal wear and tear – going into configurations that you would not want, because if they get too messed up, you die. Fortunately, you have all kinds of mechanisms to correct this, if you give them energy to do so. You make Bessie into steaks and eat her. Your entropy went down, but hers went way up! Getting sliced into steaks, chewed up, and digested is unlikely to be a configuration she likes! Total entropy in the system increased.
Where does the energy ultimately come from to fuel Bessie and almost everything else? The Sun! But as the Sun burns, its entropy is going up! In the system of the Sun, the Earth, and everything on it getting power from the Sun, the total entropy is still going up all the time as the Sun burns. One day, it will burn out.
Let’s take another example. Back in the day, people used magnetic tapes and disks to store information. Little magnetic crystals would be put into either an up or a down configuration. These were little magnets, and by scanning them with the right electronics, the changing magnetic fields could be converted into changing voltages that were then read by a machine, converted into a signal, or played through a speaker. That part doesn’t matter. What matters is that the more you used them, the more the little magnets might wobble out of place. Eventually bits of data would be lost here or there, or your tape would sound more and more fuzzy. This too is a manifestation of entropy.
To take it further, suppose you go to copy a stream of bits – 0s and 1s – and you have a really good machine to do it. There is only one configuration of those bits which is the information you want. For each bit that gets recorded, there is a chance of it being recorded wrong – as something different. Out of 10 bits, there is only one way to have whatever your 10-bit message was, but there are 1023 ways for it to be something else. If transmission were completely random, then you would have less than a one in 1000 chance of getting the right message. But it isn’t completely random; any given bit is most likely to be copied correctly. However, over multiple copies, errors will and must occur eventually. Entropy becomes a measure of how easy it is to get into one of those other states.
So let’s say that you have a really good machine to copy your information and you build the mechanism so it is hard to get a bit wrong. There is still some chance that it will mess up somewhere. Eventually, a copy will have a bit wrong. Eventually, some copy of that will have two or more bits wrong. This is entropy increasing.
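Here is a small simulation of that idea, assuming a hypothetical copier that flips each bit with probability 0.001 on every copy (both the error rate and the message are invented for illustration):

```python
# Repeated copying with a small per-bit error rate. The copier, its 0.001
# error rate, and the all-zeros message are all invented for illustration.
import random

def copy_bits(bits, error_rate, rng):
    """Copy a bit string, flipping each bit with probability error_rate."""
    return [b ^ 1 if rng.random() < error_rate else b for b in bits]

rng = random.Random(42)  # fixed seed so the run is repeatable
message = [0] * 1000     # the one configuration we want to preserve
copy = message
for generation in range(5000):
    copy = copy_bits(copy, error_rate=0.001, rng=rng)

errors = sum(copy)  # every 1 is a bit that drifted away from the original
print(errors)       # after enough copies, errors have crept in
```

Any single copy is almost certainly fine; it is the accumulation over many generations that is unavoidable, which is exactly the point about entropy.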
Here we come to the part where we nail the evolution deniers hard. DNA is nothing more than a coded string with many bits of data. We have really good cellular mechanisms for copying it from one generation to the next. But what is a mutation if not a change in the DNA? Eventually, over time, all sorts of errors will creep in. If those “errors” end up being beneficial, then they get copied to the next generation, which makes more babies. If not, they die off. In fact, on the level of encoded information, evolution is exactly an example of entropy increasing.
For those that wish to dig deeper, I invite you to look into Shannon’s information theory.
Next up, the third law, absolute zero and you can’t leave the game.