CHEM 125a: Freshman Organic Chemistry I
CHEM 125a - Lecture 36 - Bond Energies, the Boltzmann Factor and Entropy
Chapter 1. Chupka and Inghram’s Determination of Graphite’s Heat of Atomization [00:00:00]
Professor Michael McBride: Okay, at the end last time we were looking at how you could possibly know the heat of atomization of graphite; how much energy it takes to put a carbon atom into the gas phase from graphite. Why would you want to know that? Why is that a key value? For what purpose? If you want to be able to know whether you can add together bonds and get the energy of a molecule, that means you start with separated atoms and you then see how much energy is given off when those come together to give a particular molecule, to make a whole bunch of bonds. Right? Now you can measure the energy of a molecule with respect to CO2 quite easily. How do you do that? Burn it. And you can measure the CO2 relative to graphite. How do you do that? How do you know the energy of a carbon in — or the energy of CO2 relative to the energy of graphite plus oxygen? You burn graphite. Okay? But the last thing you need is the energy of graphite relative to the atom. And if you then knew that, then you could complete this scheme and know, just by burning things, whether you can add together bond energies to get the energy of a molecule.
Okay, so how do you get the energy from graphite to an atom? One way is spectroscopy, and we talked about that last time, that if you see what the minimum amount of energy you can put into CO2 and have it break into two atoms, that gives you the energy of the two atoms relative to CO2 — pardon me, relative to CO I should be saying. Right? And then you burn CO and get it relative to CO2, and then you have everything you need. Okay? Okay, except that people who were very smart, Nobel Prize winners and so on, differed on interpreting this, because some of them thought that when you break [CO] into atoms, with the light, the atom you get is not the lowest energy state of the atom, but a higher energy state of the atom. So some of the energy you’re putting in is going into making an excited atom. And the true energy of forming the minimum-energy carbon atom from graphite is lower than what you do spectroscopically. So the spectroscopic value is very precise. You can measure the position, the color of the light, very accurately. But they didn’t know what it corresponded to, so they needed a different way to know what the energy of the carbon atom was relative to graphite.
Now Professor Sharpless said — you know, he talked about increasing dimension of carbon; start with an atom, go to a line, polyacetylene and then to a bunch of double bonds, and finally to a saturated hydrocarbon — he said that carbon atoms are very hard to come by. You need a really, really low vacuum to get it. Right? That is, the equilibrium constant for carbon atoms coming together to form bonds is very, very favorable. Right? So it’s hard to get atoms. So how are you ever going to get atoms? Well let’s think about that problem here. So we know that the equilibrium constant at room temperature is 10^(-(3/4)ΔE). So you can measure K. Then you know ΔE. So, but if the equilibrium you’re trying to measure is between graphite and carbon atoms, and the energy to take a carbon atom out of graphite is 170 kilocalories/mol, then the equilibrium constant is 10^-127. Right? And that means, since there are only something of the order of 10^80 atoms in the universe, there would not be, at room temperature, a single carbon atom at equilibrium with graphite if everything in the universe was graphite. Right? So it’s not a very favorable equilibrium constant. What could you do about it, if you wanted to measure the equilibrium constant, and in that way get the energy difference? Any knobs you can twist? Lucas?
Professor Michael McBride: You could increase the temperature. Right? Because it’s ΔE/kT. And remember, we’re talking about room temperature. So if we’re much higher than room temperature, then the exponent gets much smaller. So suppose we went to ten times room temperature, to 3000 Kelvin. Right? Then it would be, instead of 3/4ths, it would be 3/40ths, because the denominator would be ten times bigger. And now it would be 1 in 10^13 that would be an atom. And now that’s a substantial number, if you’re talking about Avogadro’s number. So if you had something really, really sensitive, to measure atoms, you might be able to measure the atoms at equilibrium. But you have to establish equilibrium at something of the order of 3000°, or at least really, really hot. Right? So you could write the equilibrium constant, the concentration of atoms over the concentration of graphite, in this way. That has to do — that is the heat of formation, atoms relative to graphite. Right?
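The temperature knob can be put in a short sketch. The lecture’s “3/4” is shorthand for 1/(2.303·R·T) at room temperature with energies in kcal/mol, so the code below just uses the gas constant directly; the function name `log10_K` is chosen here for illustration, and entropy is neglected exactly as in the lecture’s rule of thumb.

```python
R = 1.987e-3  # gas constant, kcal/(mol*K)

def log10_K(delta_E_kcal, T):
    """log10 of the equilibrium constant for a process uphill by
    delta_E_kcal, enthalpy only (the lecture's 3/4 rule, generalized
    to any temperature)."""
    return -delta_E_kcal / (2.303 * R * T)

dE = 170.0  # kcal/mol to pull one C atom out of graphite

# At room temperature: about 10^-125 (the lecture's 10^-127 comes
# from using exactly 3/4 instead of 1/(2.303*R*298)).
print(log10_K(dE, 298))

# At ten times room temperature: about 10^-12.5, i.e. roughly
# one atom in 10^13 -- detectable with a sensitive enough instrument.
print(log10_K(dE, 2980))
```

The point of the sketch is just that the exponent scales as 1/T, so multiplying the temperature by ten divides the (enormous) negative exponent by ten.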
Now, of course, exactly what one means by the concentration of graphite needs to be — you scratch your head a little bit about that. Right? But you don’t actually need to know it, because if you could measure the pressure of the C atoms at equilibrium with graphite, at very high temperature — call that the pressure of carbon, right? — that is some constant, and that constant will include whatever this concentration of graphite should be. So B. So multiply both sides by that concentration of graphite. Right? So we get some constant on the right, and then that heat of formation, in the exponent, that we want to find. So that means we could write — if we took the log of both sides. You have the log of the pressure of carbon atoms, is the log of whatever B is — that’s some constant — minus this other term. And what that means is that that minus the heat of formation of the carbon atom, divided by R, is the slope of a plot of the log of the pressure of carbon atoms, versus 1/T. Does everyone see that? That equation says that there’s an intercept, which is the log of B, and a slope, which is -ΔH of formation of carbon, divided by R, if you plot against 1/T. Everybody with me on that? Nod if you see it. Okay. So all you need to do is plot the log of the pressure of carbon atoms, which will be very low, versus 1/T, at very high temperature — right?; of the order of 3000 Kelvin.
So that’s not easy to do. But it was done in 1955 by Chupka and Inghram. And this is the sketch of their instrument, and I’ll show you how they did it. First there’s a graphite cylinder. Okay? And around it is a can made out of tantalum. Now why tantalum? Because that’s about the highest melting thing you can get; it’s the highest melting metal. So 3293 Kelvin is its melting point. So you could really heat the heck out of this thing. Okay? And now you surround that with wires, and those wires are made of tungsten, because that’s very high melting too. And so you connect wire — you connect electricity to the tungsten, and also to the tantalum. So electrons boil off the tungsten and bombard the tantalum, and when the electrons hit it, they heat it up. So you can heat the heck out of this tantalum can by bombarding it with electrons. And that, of course, heats the graphite that’s inside. Now, so inside there’s going to be a gas of carbon, in equilibrium with the graphite. Now it won’t just be C atoms. It’ll also be C2, C3, C60, C70, and so on; but lots of different forms of carbon, as a gas. So you can’t just measure the pressure in there; and indeed it would be tough to measure the pressure inside, at 3000°. But you at least have these things there at equilibrium.
Now what they did, Chupka and Inghram, was to drill a tiny hole, in the top of this thing. And that will allow a little bit of the gas to escape; not so much that you destroy the equilibrium inside, just a really slow leak. Right? So now, if you could measure the amount of these different carbon species leaking out, you could know what the pressure of them was inside the can. Okay? Now, so there’s a beam of carbon species coming out, gaseous carbon species. This is all at very, very high vacuum. Right? And then up there, an electron beam comes across and hits these carbon things, and knocks electrons out of them. And that converts them into cations: C1+, C2+, C3+, C60+. Okay? So you have a beam coming out of these charged carbon species. Now why do you want them to be charged? For two reasons. One, so you can detect them; because you can detect it when charge hits a plate. And the other one is so you can deflect the beams with a magnetic field. Right? So these various carbon species come out, and they go into a magnetic field, which puts a lateral force on them and causes them to bend. And, of course, the lighter they are, the easier they are to bend if they all have the same charge. So you’re going to get curved trajectories. C3+, C2+, C1+ will be the most bent. And now if you put a detector at these different positions, you can see how much C1, how much C2, how much C3 there was. Okay? So that’s how you’re going to do it. But this machine has to be pretty special, because you’re heating it so high. Lucas?
Student: How do you know the electron beam only knocks off one electron?
Professor Michael McBride: It can knock off two, but then you get it at a very different position. You can tell these things. This thing is called mass spectroscopy. But it knocks off one much more often than it knocks off two or three.
Student: And isn’t this kind of the same problem as we had with the other thing, where the C is ionized?
Professor Michael McBride: Which other thing? I can’t remember what problem —
Student: You said that we couldn’t measure the other one because it was a high energy state, spectrographically.
Professor Michael McBride: Oh. Might it be an excited state?
Professor Michael McBride: No because it’s in equilibrium. [Note: the initial fragments are at equilibrium. No one cares if the ions are in their ground state. They are only being used to measure the abundance of fragments, and their only relevant properties are charge and mass.]
Professor Michael McBride: The excited state would be much, much less at equilibrium, because it’s higher in energy. Right? That’s a good point though; good that you thought about that. Now, so you have to put shielding around this stuff, so it doesn’t melt everything. So you use tantalum, which is high melting, and a series of can-inside-a-can, like Russian dolls. Right? So the innermost one is very, very hot; then a little less hot, a little less hot. Okay? So you shield it. And now you need to know the temperature inside. And you do it by drilling a tiny hole through those shields so you can have what’s called an optical pyrometer. That’s just something that looks at the color of something that’s really hot. And something that’s hot gives — even something that’s cold — gives off black-body radiation. And the color has to do with the temperature; you know, there’s red heat, white heat, blue heat, and so on. So by measuring the color, you can see what the temperature was inside. And that window, that the light comes through, has to be made of quartz, not Pyrex glass. You know why?
Student: Pyrex would melt.
Professor Michael McBride: Because Pyrex glass would melt, even at that distance. Right? So you use quartz glass. Okay, so that’s what you do. And here’s a graph of the pressure of these various species, measured with that mass spectrometer, at different temperatures, measured as one over temperature here. So it goes from 2150 Kelvin to 2450 Kelvin, and the pressure increases. And it’s plotted as the log of the intensity of the signal; that is, the pressure times temperature, because the pressure has to be corrected for temperature. Because if you have the same number of things giving the pressure, but they’re hotter, they’ll be pressing harder on things. It’s the intensity of the signal times the temperature that you plot there. And from the slope, you can see up there at the top, that for C1 you get — the slope says it’s 171 kilocalories/mol; Q.E.D. Right?
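The slope-to-energy step can be sketched numerically. The intensities below are hypothetical, generated from the model itself just to show how a least-squares slope of log10(I·T) against 1/T gives back ΔH; they are not Chupka and Inghram’s actual data, and the intercept value is arbitrary.

```python
R = 1.987e-3  # gas constant, kcal/(mol*K)
dH = 171.0    # heat of formation of C(g), kcal/mol -- the value we plant

# Hypothetical log10(intensity * T) readings over the lecture's
# temperature range, generated straight from the van 't Hoff model.
temps = [2150, 2250, 2350, 2450]
log_IT = [10.0 - dH / (2.303 * R * T) for T in temps]

# Least-squares slope of log10(I*T) versus 1/T.
xs = [1.0 / T for T in temps]
n = len(xs)
xbar = sum(xs) / n
ybar = sum(log_IT) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, log_IT)) \
        / sum((x - xbar) ** 2 for x in xs)

# slope = -dH / (2.303 * R), so invert it:
dH_recovered = -slope * 2.303 * R
print(dH_recovered)  # ~171 kcal/mol, as planted
```

With real, noisy intensities the recovered value carries the scatter of the points, which is why the experimenters needed data over a range of temperatures.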
Now you know which was the right one, measured by spectroscopy. It was the one that said 171. So this experiment, actually measuring the equilibrium between carbon atoms and graphite, by a really gargantuan kind of experiment, is what settled the question finally. So that when you look at the appendix of this book, Streitwieser, Heathcock and Kosower, you find that there are heats of formation for atoms and radicals, measured by spectroscopy and things like this. And there you see carbon, 170.9. And that was done by Professor Chupka, who was in this department and who used to come and tell people how he did this experiment. But as you can see, he passed away in 2007. So thanks to Professor Chupka. And the nice thing is, once that’s done, it’s done. Now you know it and you just plug it in when you burn your stuff and want to know what its energy is. You can get it relative to carbon atoms in the gas phase.
Chapter 2. Calculating Equilibrium Constants from Bond Dissociation Energies [00:14:20]
Okay, now how good are these spectroscopic experiments? Well this is a neat thing. The heat of atomization of methane, measured in the ways we’ve just been talking about, is 397.5 kilocalories/mol. Now that comes from mating a carbon atom with four hydrogen atoms; that is 397.5. Right? So we know the average bond energy: there are four such bonds, so each one’s worth 99.4 kilocalories/mol. So about 100 kilocalories/mol for a C-H bond; that’s convenient to remember. Okay? But that’s not how much it costs to take a single hydrogen atom away from methane. Taking a single hydrogen atom away from methane, the so-called “bond dissociation energy”, which is the actual experimental energy it takes to do some particular process — average bond energy is just an average, but the individual ones are not the same — taking a hydrogen away from methane is 104.99, plus or minus 0.03 kilocalories/mol. Close, but not the same thing. And then you have CH3. If you take a second H off that, it’s 110.4. The next one is 101.3, and the final one, taking H away from C, is only 80.9. Right? Now these are done by spectroscopy. And Barney Ellison may come and talk to us in the spring; he’s often traveling through and talks about how he measured these things. But those are done by spectroscopy. But the neat thing about it is if you add all those four numbers together — so, pardon me, I was going to say no individual bond equals the average. Right? But if you add them together you get back 397.5, which is precisely the heat of atomization.
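The arithmetic is quick to check: the four successive bond dissociation energies, none of which equals the average, sum to the heat of atomization.

```python
# Successive bond dissociation energies of methane (kcal/mol),
# the spectroscopic values quoted in the lecture:
bde = {
    "CH4 -> CH3 + H": 104.99,
    "CH3 -> CH2 + H": 110.4,
    "CH2 -> CH  + H": 101.3,
    "CH  -> C   + H": 80.9,
}

total = sum(bde.values())
print(total)      # 397.59 -- matches the heat of atomization, 397.5
print(total / 4)  # 99.4 -- the "average bond energy" of C-H
```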
So these are very good experiments. So we know, through heroic spectroscopy and this work of Chupka and Inghram, we know what these energies are, bond energies, and bond dissociation energies. So here are average bond energies in a table that you have at the end of this organic chemistry text. And it says a carbon-hydrogen bond is 99; and now you see where we get that. And you see that a carbon-carbon bond is 83. But the second carbon-carbon bond, in a double bond, is only 63. Right? Why is it weaker? Why is the second bond of a carbon-carbon double bond weaker than the first? Pardon me? Devin, what do you say?
Student: The overlap.
Professor Michael McBride: Yeah, you have bad overlap between the π electrons. In fact, the first, the single bond of a double bond, is probably stronger than a normal single bond. Can anybody see why that would be?
Student: More s character.
Professor Michael McBride: It’s got more s character, better overlap; sp2-sp2. So the second one is probably more than 20 kilocalories weaker than the single one. But at any rate it’s 146, that you add up to get a double bond, and 200 for a triple bond; which means the third bond is worth only 54 kilocalories/mol. Okay? And in C-O, it’s about the same as C-C. So C-O is 86. But the double bond, notice, is different in this case. Now the second bond is stronger than the first. So the carbonyl group is an especially stable group. Okay? So you have the question, can you sum up these average bond energies and get useful heats of atomization? So can you look at a structure and say how stable it’s going to be?
Okay, so let’s try it. So here’s heats of atomization by additivity of average bond energies. So we have these average bond energies from the table. A C-C single-bond is 83. A C-H is 99. C double bond C is 146, 86, 111 and so on. And we’re going to sum them all up to get the heat of atomization — compare it with the actual heat of atomization. Okay, so for ethene, there are four C-H bonds, there’s one C-C double bond. Add them up, and you get 542. The actual heat of atomization is 537.7. So there’s an error of 4.3 kilocalories/mol, which is less than 1%. That’s pretty good. But on the other hand, ethene probably entered in to determining these average bond energies. Right? So it’s not 100% fair. Okay? How about cyclohexane? Now we have 6 carbon-carbon single bonds and 12 carbon-hydrogen single bonds: 1686 versus 1680.1. An error of only 5.9, less than half a percent error. So pretty good, by adding up bonds. Cyclohexanol. Remember we had quite a bit of trouble with these partly oxidized things before, when we were trying to just do it on the basis of the elements. But if you add bonds together, you get within 0.3% of the right value. Or if you do glucose, which has lots of oxygens in it, then you get within again less than 1%; 0.7% of the right value.
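The bookkeeping can be sketched as a few lines of code, using the average bond energies quoted from the table; the dictionary layout and the function name `atomization` are choices made here for illustration.

```python
# Average bond energies from the table (kcal/mol)
E = {"C-C": 83, "C-H": 99, "C=C": 146, "C-O": 86, "O-H": 111}

def atomization(bond_counts):
    """Predicted heat of atomization: sum of average bond energies
    over a molecule's bond inventory."""
    return sum(E[bond] * n for bond, n in bond_counts.items())

ethene      = {"C-H": 4, "C=C": 1}
cyclohexane = {"C-C": 6, "C-H": 12}

print(atomization(ethene))       # 542 vs. actual 537.7 (off by 4.3)
print(atomization(cyclohexane))  # 1686 vs. actual 1680.1 (off by 5.9)
```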
So this is pretty darn good; very impressive, very small errors, to predict these. But the question is, is it useful? How accurate does it have to be, to be useful? So why do you need to know the values? Because you want to know equilibrium constants. You want to know which direction a reaction will go, for example. Okay, so we know that the — if we want to calculate an equilibrium constant, we can do it at room temperature with this 3/4ths trick. So the calculated equilibrium constant is whatever we’re calculating here, for energy, between two things. We have two things: calculate the energy of these, the energy of these, compare the energies, and that’ll give us the equilibrium constant, according to this formula. But notice I’m doing it on the basis of calculation. Now so that’s — whatever — the calculated energy is whatever the true energy would be. But there’s also some error in there. Okay? But if you add two exponents, that’s the same as multiplying two things together. So the calculated equilibrium constant is the true equilibrium constant — that’s the first part, the part that it has with ΔH true — times the part that has this exponent. Right? 3/4ths of the ΔH error.
So if you want the error to be small, that factor to be small, then ΔH error has to be small; small, not on a percentage basis, but absolutely it has to be small. That error, not the percentage error, determines this error factor. Right? So to keep the error less than a factor of ten, in the equilibrium constant, you need to know the energies within 1.3 kilocalories/mol, so that 3/4ths of it will be one, and that would mean you’d be within a factor of ten. Everybody with me on this? So you need to do even better than this. You can’t use the average bond energies and get something that’s very useful, because if you’re off by sixteen down here, in the case of sugar, that means you’re off by a factor of 10^12 in predicting the equilibrium constant, which wouldn’t be acceptable probably. Okay?
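The error factor is worth making concrete: because the energy error sits in an exponent, an absolute error of a few kilocalories blows up multiplicatively, no matter how small it is as a percentage.

```python
def error_factor(dH_error_kcal):
    """Factor by which a room-temperature equilibrium constant is off,
    for a given absolute error (kcal/mol) in the predicted energy,
    using the lecture's 3/4 rule of thumb."""
    return 10 ** (0.75 * dH_error_kcal)

print(error_factor(1.3))  # ~10: the biggest tolerable error
print(error_factor(5.9))  # the cyclohexane-sized error, ~10^4.4
print(error_factor(16))   # ~10^12: the glucose-sized error
```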
So let’s try it with the equilibrium between a ketone and the so-called enol, which is an isomer of a ketone in which a hydrogen has been taken from the methyl group on the right and put on the oxygen, and the double bond moved. Okay? So that’s a very important equilibrium that we’ll encounter when we talk about the chemistry of ketones. Let’s see what the equilibrium constant — should there be more ketone or more enol? Do you have a guess right at the outset of which one would be more stable? I would guess the ketone, because of what I just told you, that the C-O double bond is remarkably stable. Right? In the other case you have a C-C double bond. Okay, let’s see. Now we could add together all the bonds. But most of them, most bonds are the same between the starting material and product. We only need to compare the ones that change. Okay? So we’ve highlighted in red the bonds that change between the two forms, the two isomers, because we’re interested in the difference in energy between these two, to get the equilibrium constant. Okay, so these are the numbers I took from the table, that you see on the top left there: 179 for C-O double bond and so on. And I sum them up and that’s 361, for those bonds. And the new set of bonds, in the enol, sum to 343. So the ketone indeed is more stable, it appears, by 18 kilocalories/mol. So 18 kilocalories/mol means that you have a factor of 10^13; the equilibrium constant is 10^13.5. So it should lie, for practical purposes, entirely in the direction of the ketone. However, if you do it experimentally, you find that the equilibrium constant is only 10^7, not 10^14. Right? So the true energy difference is 9.3 kilocalories; not the eighteen that we got by adding bonds together. Right? So that means we’re going to have to deal with addressing why the enol is too stable. It’s 9 kilocalories/mol too stable, compared to our predictions, on the basis of adding bonds together.
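The keto-enol comparison, with the table values quoted in the lecture, fits in a few lines; only the bonds that differ between the two isomers are tallied, since everything else cancels.

```python
# Bonds that differ between the ketone and its enol (kcal/mol,
# average bond energies from the table):
ketone_bonds = {"C=O": 179, "C-C": 83, "C-H": 99}   # sums to 361
enol_bonds   = {"C-O": 86, "C=C": 146, "O-H": 111}  # sums to 343

dE_predicted = sum(ketone_bonds.values()) - sum(enol_bonds.values())
print(dE_predicted)        # 18 kcal/mol favoring the ketone
print(0.75 * dE_predicted) # predicted log10 K ~ 13.5

# Experiment: K is only ~10^7, i.e. a true difference of ~9.3 kcal/mol,
# so the enol is ~9 kcal/mol more stable than bond additivity predicts.
```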
Now why? Well one thing is that we — that those bonds that we cancelled, the C-H bonds that didn’t change, in fact did change between the starting material and the product. Why could I say that they did change? In both cases, on both sides, there are single carbon-carbon, carbon-hydrogen bonds. How can I say they change? Angela?
Student: Well with the ketone, they’re sp3 hybridized.
Professor Michael McBride: Ah ha.
Student: In the enol they’re sp2 hybridized.
Professor Michael McBride: Right. They’re changing hybridization. Actually, yeah, they go from sp3 to sp2, on the carbon, as you go across. And the sp2’s on the right should be more stable. Okay? So the sp2-H should be stronger. So these things that I was saying cancelled do not actually cancel, if we take hybridization into account. So that’s one factor. And there’s another as well, which is you have that unshared pair, on the top right here, on the oxygen, is adjacent to a double bond. That means that this high HOMO can be stabilized by the low π* LUMO; it’ll overlap. That isn’t a possibility here, where the unshared pairs on the oxygen do not overlap with the π* orbital. So you get intramolecular HOMO/LUMO mixing in the enol that you don’t get in the ketone; which will help stabilize the enol, with that — we could draw that resonance structure. So those two things together make up that 9 kilocalorie error; or at least we can think — they contribute to it at least. So constitutional energy, what we would get by adding bonds together, has to be corrected for various “effects”, we’ll call them, such as resonance, that’s what we just looked at, like this HOMO/LUMO thing, such as hybridization changes, or such as strain, as in the case of axial methylcyclohexane, that we looked at. So there are lots and lots of these corrections that you have to apply to this model, where you add together bond energies in order to predict the energies of a particular molecule.
But for many cases now, you can do a pretty good job of predicting these things, and actually not do so bad at predicting relative energies of isomers, and therefore equilibrium constants. And these effects, of course, are a polite name for error. Right? They’re correcting — various ways of correcting errors that you think there should be in this scheme of just adding bonds together. Now, energy determines what can happen. Things always move toward equilibrium. Right? So if the ratio of two things is something, but the equilibrium ratio is different, the ratio will always move toward that, toward the equilibrium, if it’s in isolation. But there’s another, equally important, thing: how fast will it go there? And that, as we’ve seen before, can be approximated as 10^13/second, times this same kind of factor, relating to the barrier.
Chapter 3. The Boltzmann Factor: How Is Temperature Related to Energy? [00:27:55]
Now both of these things suggest that being low in energy is good. Right? You favor things that are low in energy. But you might ask why? That’s not what people say about money. They don’t say the less money you have the better. Right? Why the less energy the better? This is a really interesting case, and it has to do with statistics. And especially at Yale we should talk about this, because in 1902, when Yale celebrated its bicentennial, they published a number of books showing off the scholarship of Yale; as you can see here. And the most important of those books, by about 500 miles, was this one: Elementary Principles in Statistical Mechanics and the Rational Foundation of Thermodynamics, by J. Willard Gibbs. So it’s statistical mechanics. It’s trying to understand the behavior of chemical substances, on the basis of statistics.
Now when you do this, you get exponents. And the organization of our presentation here is going to have to do with three different ways in which statistics enters into exponents, for purposes of doing equilibrium. So there’s the Boltzmann Factor; that’s what we’ve been talking about, the 10^(-(3/4)ΔH), that’s called the Boltzmann Factor. It includes the Boltzmann Constant. Then there are things that have to do with entropy, which often seems to be a very confusing topic. And finally there’s a thing called the Law of Mass Action. And all of these things have exponents in them. And if you understand how the exponents behave, you understand what’s going on. So let’s look first at the Boltzmann Factor. So here’s Ludwig Boltzmann, who committed suicide in 1906. And this is his important paper on “The Relationship between the Second Law of Thermodynamics and Probability Calculations” — so statistics — “Regarding the Laws of Thermal Equilibrium,” in 1877. And his key equation is S = k ln W. So log relates to an exponential, and we’ll see why that is. And here’s his tombstone in the cemetery in Vienna. And you’ll notice, up at the top there, S = k ln W. Okay?
So what Boltzmann considered was the implication of random distribution of energy. Suppose you have a certain amount of energy, in a system, but it’s distributed at random. Right? So purely statistically. Then how should it be distributed? How much energy should any particular molecule have, is the question. And we can visualize this in a simple case, which is very like what he did, except he did it analytically and in a much bigger system. But just using four containers, which are like molecules, and each one can have a certain amount of energy in it. And we’ll consider the energy to be — to come in bits. He used that idea, that there were bits of energy to be distributed among molecules, or degrees of freedom within molecules. He didn’t think that energy came in bits, but it made it possible to do the statistics, and then he just took the limit when these bits get very, very, very, very small, so that it becomes like a continuum of stuff, like a whole sand of energy bits.
But anyhow, let’s just count up how many different ways there are of putting three different bits of energy — or actually not different, they’re all the same — but three bits of energy into the red container. Okay? So if you — how many complexions — that’s what, any particular arrangement he called a complexion — how many different complexions have a certain number of bits in the first container? Well suppose you put all three of those energy bits into the first container. How many different ways are there of doing that? Just one. Okay? But suppose you put only two into the first container? Now how many different ways are there of arranging it so that there are two in the first container? How many different ways? There are three: 1, 2, 3. So there’ll be three ways of putting two bits in the first container. How many of putting one in the first container? Well we put one there; there’s 1, 2, 3, 4, 5, 6 ways of doing it. Okay? So there’s six ways of putting one in there. And how many of putting none at all in there? 1, 2, 3, 4, 5, 6, 7, 8, 9, 10. Okay? So there are 10 ways of doing that. So let’s make a graph and see how many ways — what’s the probability that you’ll have a certain number of bits of energy in the first container? Okay, it looks like that: 10, 6, 3, 1. And what does that curve look like, if you made a plot of that? What type of curve does it look like? Is it a straight line? Anybody got a name for it, or something that looks a little bit like that?
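The counting above can be verified by brute force. This sketch enumerates every complexion — every way of splitting 3 indistinguishable energy bits among 4 containers — and tallies them by how many bits sit in the first container; the variable names are chosen here, not from the lecture.

```python
from itertools import product
from collections import Counter

BITS, CONTAINERS = 3, 4

# A complexion is a tuple (n1, n2, n3, n4) of bits per container.
# Enumerate all tuples with entries 0..BITS and keep those summing to BITS.
counts = Counter()
for split in product(range(BITS + 1), repeat=CONTAINERS):
    if sum(split) == BITS:
        counts[split[0]] += 1  # tally by bits in the first container

# Complexions with 3, 2, 1, 0 bits in the first container:
print([counts[k] for k in range(BITS, -1, -1)])  # [1, 3, 6, 10]
```

Twenty complexions in all, and the more bits you force into the first container, the fewer ways remain to permute the rest — exactly the lecture’s 1, 3, 6, 10.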
Professor Michael McBride: It’s exponential decay. Now it’s not truly exponential, in this case, of three bits among four containers. But if you do 30 bits among 20 containers, then it looks like that, and there is an exponential. Right? So what Boltzmann was able to do, to show mathematically, was that the limit, when you have very many, very small energy bits, is truly an exponential. So the probability of having a certain amount of energy, in a degree of freedom, is exponential: e^-(that energy, divided by kT). So Boltzmann showed that that was the limit for lots of infinitesimal energy bits. And the idea behind it is quite clear. If all the complexions for a given total are equally likely — and that’s what he assumes; it’s random, they can be any place they want to be — then shifting energy into any one degree of freedom, of one molecule, is disfavored. Because when you put more in one molecule, there are fewer ways to distribute the rest among the others. So there’ll be fewer ways, the more you put in this one. Is that clear? Because that’s really the key concept. The more bits of energy you put in this one, the fewer different ways there are of permuting what’s left among the others. Right? And it’s exponential. Right? So if you have fewer ways among the others, then it’s less likely. Right?
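For the bigger case, brute-force enumeration is hopeless, but the number of complexions with exactly k bits in the first container has a closed form: it is the number of ways to distribute the remaining bits among the other containers, a binomial coefficient. A decay is exponential when the ratio of successive values is constant, and this sketch shows that ratio is nearly constant for 30 bits among 20 containers.

```python
from math import comb

BITS, CONTAINERS = 30, 20

def ways(k):
    """Complexions with exactly k bits in the first container:
    distribute the remaining BITS-k bits among CONTAINERS-1 containers
    (stars and bars)."""
    return comb(BITS - k + CONTAINERS - 2, CONTAINERS - 2)

counts = [ways(k) for k in range(6)]
ratios = [counts[k + 1] / counts[k] for k in range(5)]
print(ratios)  # ~0.63, 0.62, 0.61, 0.60, 0.59 -- nearly constant,
               # the signature of (approximately) exponential decay
```

In Boltzmann’s limit of very many, very small bits, the ratio becomes exactly constant and the distribution exactly exponential.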
So it turns out that if you do this, the average energy is ½ kT, in this degree of freedom; which is to say that k, the Boltzmann Constant, relates temperature to average energy; which is to say that temperature is average energy. Temperature and average energy at equilibrium are the same thing — okay? — for each degree of freedom. You can put — what did we mean by these little buckets into which we could put energy? We had a way of putting energy into the molecule, like stretching this bond, or stretching this bond, or bending some bonds, or torsion, or something like that. Now truly, we deal with quantum states. So you put energy into different quantum states and you count up the quantum states, to see how likely things are. Okay, so that’s where the Boltzmann Factor is. That exponential, that 10^(-(3/4)ΔH), comes just because that’s what you expect. If you randomly distribute things, it’ll come out that way.
Chapter 4. Entropy and the Tendency toward “Disordered Arrangements” [00:36:24]
Now how about the entropy factor? And this one is fun. Feynman, in his wonderful Lectures on Physics, says: “It is the change from an ordered arrangement to a disordered arrangement, which is the source of irreversibility.” Have you heard this said, that entropy is disorder, and that you increase entropy in order to increase disorder? Okay, so that’s what Feynman is saying here. “The change from an ordered arrangement to a disordered arrangement.” Okay. Now here are two arrangements of the same number of dots. Which one is more ordered, left or right?
Professor Michael McBride: Okay. You know I’m setting you up, right? So you’re — but I know what you would have voted for. Right? So I’m not going to ask you to vote. Okay? But look at the one on the right, from a different point of view. [Laughter] So what do you conclude? Which one is more ordered? The one on the right is just as ordered as the one on the left, but we didn’t perceive the order. Okay? And that’s like constellations; you know, the shepherds lay out and saw bears and dragons and things, in the sky — right? — and thought they were ordered. Okay? Now disorder, reversibility and Couette flow. Now I brought an experiment to do here, and it’s — but I’m not really sure it would work. So what I’m going to do — because I didn’t practice it before I came. I’ve done it before, but I didn’t practice it today, and my pipette broke and I had to get a new one made before class. So if you want to see that, come after class and we’ll try the actual experiment. But I’m going to show you a movie of it instead. Here. So this is the same thing here. What it is, is a — well you’ll see in the movie. I’ll just start it up.
Professor Michael McBride: Okay, so it’s a glass rod that goes up inside a glass cylinder. So there’s like a doughnut inside, right? So I’m now going to pour Karo syrup in there; I brought Karo syrup with me to show you. So it’s in that annulus between the rod and the cylinder. Okay? And now I’m going to take some yellow dye and put a strip of it — first I’m going to mark it, so I can tell — I’m going to rotate that outside cylinder, so you can see that it’s rotating. And I’m putting a strip of yellow dye between the rod and the cylinder. Everybody see what I’m doing? And then I’m going to stir it up. And the way I’m going to stir it is by rotating the outside cylinder. Okay, so we’ll zoom in and you’ll see the watch, not very well, but to show that I’m not just running the movie backward or anything. Okay? Okay, now I start rotating the outside. So there’s one rotation, two rotations, three rotations. So now it’s all mixed up. And now watch. I’m unrotating. And it comes back. So if you want to see that happen, we’ll try it after class here.
Student: Oh wow.
Professor Michael McBride: So you unmix things. That doesn’t sound like entropy is working right. Okay? Now here’s the way it actually happened. So there was the syrup, between the rod and the cylinder — a look down from the top — and we put a strip of ink in between, and then started rotating. And as we rotated, the ink spread out, like this. Right? Because the outer part moved, and the inner part didn’t move, where it was in contact with the rod. So after I’d done three rotations, it looked like that. It wasn’t really evenly mixed up. It’s just that when we looked at it, it looked like it was mixed up — right? — when we looked through it. And now when we unrotate it, the whole thing — nothing diffused and molecules didn’t move at random. They just got spread out that way, but in a particular way. So it came back again. Okay? So the rotated state only seemed to be disordered. So that’s the basis of the trick. Right? But that raises a very fundamental question. If disorder is in the mind of the beholder — in this case, or in the case of that dinosaur, connect the dots — if disorder is in the mind of the beholder, how can it measure a fundamental property, like entropy, if it depends on who’s looking at it, to say whether it’s disordered or not? Right? In fact, a disordered arrangement is an oxymoron, because arrangement is arrangement, and disordered is not arrangement. Right?
So how can you have a disordered arrangement, if the shepherd sees a dragon? Okay? The situation favored at equilibrium, by entropy, is one where particles have diffused every which-away, not into a coiled up piece of paper like the yellow thing, or not into a dinosaur. Every which-away; the key word is ‘every’. That’s what’s statistical about it. A disordered arrangement is a code word for a collection of random distributions, whose individual structures are not obvious. So if a thing looks like, you know, a regular lattice like that, I say, “Ah ha, that’s a regular lattice.” But if it looks like this, I don’t say it’s exactly that; I say it’s disordered, by which I mean I can’t tell the difference between that one and this one, or this one, or this one. Right? So there are a whole bunch of those arrangements that I count together when I say ‘disordered’. It’s a collective word. So if all of them are equally likely, it’s much more likely to have disordered — many, many, many arrangements — than the particular ordered ones that we’re thinking about, even if they’re all equally likely.
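[Editor's note: the "entropy is counting in disguise" point can be made concrete with a toy count. This is a sketch under assumed toy parameters — N = 20 labeled particles split between the two halves of a box is an illustrative example, not the lecturer's — showing that a "disordered" macrostate is a collective name covering vastly more arrangements than a particular "ordered" one.]

```python
from math import comb, log

# Toy count: N labeled particles, each in the left or right half of a box.
# Every specific arrangement is equally likely; "disordered" just names
# the huge collection of half-and-half arrangements.
N = 20
W_ordered = comb(N, N)       # all 20 on the left: exactly 1 arrangement
W_mixed = comb(N, N // 2)    # 10 left / 10 right: 184,756 arrangements

print(W_ordered, W_mixed)

# Entropy is counting in disguise: S = k ln W, so the "mixed" name
# carries ln(W_mixed) - ln(W_ordered) more entropy, in units of k.
print(log(W_mixed) - log(W_ordered))
```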
So that’s the idea. It is favored at equilibrium because it includes so many individual distributions. So entropy is actually counting, in disguise. You count all these different arrangements, or all the different quantum levels, and the more you have, under a certain name, the higher the entropy associated with that name is. So, for example, a very common value of the entropy difference between two things is 1.377 entropy units. That seems a weird number, right? Now 1.377 happens to be R times the natural log of two. Now, consider the difference in entropy between gauche- and anti-butane. Okay? So the equilibrium constant is e^(-ΔG/RT). Do you remember what G is? That’s the Gibbs Free Energy, which includes both heat — the kind of thing, bond energy, that we’ve been talking about — and entropy is included in there too. So we can split it apart, into the part that has to do with heat — or enthalpy, the ΔH between the two things, gauche and anti — and TΔS, the part that has to do with entropy. I suspect you’ve seen this G = H - TS before. But let’s just split it apart. Since they’re in the exponent we can multiply two things together. So we have the first part, the one we’ve been dealing with, e^(-ΔH/RT). Right? And then we have the red part, that has to do with entropy, e^(TΔS/RT). But you can simplify that. How can you simplify the part that has to do with entropy, right off the bat?
Student: Cancel the T’s.
Professor Michael McBride: Cancel the T’s. Okay, so it’s actually e^(ΔS/R). Now suppose that the value of ΔS is Rln2; which I said was a very common entropy difference. Right? Now you can simplify it further. Can you see how to simplify it further, for that particular entropy difference? Well obviously the R’s cancel. And what’s e raised to the power ln2?
Professor Michael McBride: Two. So actually what that is, is our e^(-ΔH/RT) times two. So when you see 1.377 entropy units, that’s somebody who likes math telling you that there’s a factor of two involved. That sounds more reasonable. Right? Two. Why should there be a factor of two, that favors gauche- over anti-butane? Yes?
Student: There are twice as many gauches.
Professor Michael McBride: There are twice as many gauches as there are anti’s, because it can be right-handed or left-handed gauche. So you see what a crock this is, to say that the entropy difference between gauche- and anti-butane is 1.377 entropy units? It’s just that there are twice as many of one as the other. Right? So that, the fact that ΔS occurs in an exponent, is just a complicated way of telling you that there’s a statistical factor. You have to count how many of these things there are. Okay? Because you have two gauche butanes. So the conclusion: it just means a factor of two. And then that the equilibrium constant depends on temperature because of ΔH, not because of ΔS. Often people think that because the free energy is H minus TS, the entropy term must be changing as T changes. But in fact that’s not true, because you divide by T to get anything out of it again. Right? So what really changes with temperature is the contribution due to ΔH. So sometimes that’s just used to obscure what is fundamentally very simple.
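[Editor's note: the two points above — that ΔS = Rln2 is just a factor of two, and that the temperature dependence of K comes from ΔH, not ΔS — can be shown in a few lines. A minimal sketch; the gauche–anti enthalpy difference of about 0.9 kcal/mol is an assumed textbook-style value, not a number from the lecture.]

```python
import math

# K = e^(-dG/RT) = e^(dS/R) * e^(-dH/RT) for the anti -> gauche equilibrium.
R = 1.987             # gas constant, cal/(mol K)
dS = R * math.log(2)  # 1.377 "entropy units": two gauche conformers per anti
dH = 900.0            # cal/mol, gauche above anti (assumed example value)

print(round(dS, 3))   # the "weird number" 1.377 is just R ln 2

for T in (200.0, 300.0, 400.0):
    entropy_factor = math.exp(dS / R)          # exactly 2, at every T
    enthalpy_factor = math.exp(-dH / (R * T))  # this part changes with T
    print(T, entropy_factor, round(entropy_factor * enthalpy_factor, 3))
```

Note that `entropy_factor` is the same at all three temperatures — the T's cancel — while `enthalpy_factor` grows with T, which is the lecture's point about where the temperature dependence really lives.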
Okay, we’re going to stop here. And just so everybody’s on the same page, we’ll have the final lecture on Wednesday. But then I’ll be here at class time on Friday too, and we can have a discussion then, to review for the exam. And I’m willing to have another one. I forget. When did I say? On Monday night. Now do people have — there are not exams at night, are there; or are there? Does anybody have a — is Monday night okay to have the review? It’s probably the best time to have it, so you have a full day after that before the exam. So I’ll get a room for next Monday night, a week from tonight, for a review session. But also I’ll be here on Friday at lecture time. So we’ll see you. If anybody wants to see this experiment, we’ll do it.
[end of transcript]