This despite the fact that (as I told T-Stone) I was presenting arguments from Brian Greene in *The Fabric of the Cosmos*. Greene happens to be a physicist. T-Stone happens to be dimwitted. Which one wins this contest?
Parsimony isn't a statistical evaluation; it's an evaluation of *economy*.

Unfortunately for T-Stone:
The notion of entropy was first developed during the industrial revolution by scientists concerned with the operation of furnaces and steam engines, who helped develop the field of thermodynamics. Through many years of research, the underlying ideas were sharply refined, culminating in Boltzmann’s approach. His version of entropy, expressed concisely by the equation on his tombstone [S = k log W], uses **statistical reasoning** to provide a link between the huge number of individual ingredients that make up a physical system and the overall properties the system has.
Greene, Brian. 2004. The Fabric of the Cosmos. New York: Vintage Books, p. 151 (emphasis in bold added)
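(A gloss of my own, not Greene's: the "log" on the tombstone is the natural logarithm, W counts the microscopic arrangements, or microstates, compatible with the system's overall appearance, and k is what we now call Boltzmann's constant. In modern notation:)

```latex
S = k_B \ln W, \qquad k_B \approx 1.381 \times 10^{-23}\ \mathrm{J/K}
```

The more microscopic arrangements W that look macroscopically the same, the larger the entropy S, which is all the page-counting below is quantifying.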
To carry on with Greene’s thought:
…[I]magine unbinding a copy of War and Peace, throwing its 693 double-sided pages high into the air, and then gathering the loose sheets into a neat pile. When you examine the resulting stack, it is enormously more likely that the pages will be out of order than in order. The reason is obvious. There are many ways in which the order of the pages can be jumbled, but only one way for the order to be correct. …A simple but essential observation is that, all else being equal, the more ways something can happen, the more likely it is that it will happen. And if something can happen in enormously more ways, like the pages landing in the wrong numerical order, it is enormously more likely that it will happen….

Naturally, there are some differences between this and physics:
Entropy is a concept that makes this idea precise by counting the number of ways, consistent with the laws of physics, in which any given physical situation can be realized. High entropy means that there are many ways; low entropy means there are few ways. If the pages of War and Peace are stacked in proper numerical order, that is a low-entropy configuration, because there is one and only one ordering that meets the criterion. If the pages are out of numerical order, that is a high-entropy situation, because a little calculation shows that there are [Greene then writes a number that continues for the next page and a half, which only a masochist would reproduce here]—about 10^1878—different out-of-order page arrangements.
(ibid., pp. 151-153, all italics his)
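As a quick sanity check on that figure (my own sketch, not from the book): each of the 693 double-sided sheets can occupy any position in the stack and land on either face, giving 693! × 2^693 distinguishable arrangements, and Python's arbitrary-precision integers can confirm the exponent directly:

```python
# Sanity check on Greene's "about 10^1878" count of page arrangements:
# 693 sheets admit 693! stacking orders, and each sheet can land
# face-up or face-down, for 693! * 2**693 arrangements in all.
import math

arrangements = math.factorial(693) * 2**693
print(len(str(arrangements)) - 1)  # prints 1878: the count is ~10^1878
```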
Greene continues:

Of course, in making the concept of entropy precise and universal, the physics definition does not involve counting the number of page rearrangements of one book or another that leave it looking the same, either ordered or disordered. Instead, the physics definition counts the number of rearrangements of fundamental constituents—atoms, sub-atomic particles, and so on—that leave the gross, overall, “big-picture” properties of a given physical system unchanged. As in the example of War and Peace, low entropy means that very few rearrangements would go unnoticed, so the system is highly ordered, while high entropy means that many rearrangements would go unnoticed, and that means the system is very disordered.

Greene next shows just how unlikely it is for even one Coke bottle's worth of carbon dioxide to randomly re-form out of a high-entropy situation, which sets the stage for the paradox:
For a good physics example, and one that will shortly prove handy, let’s think about [a] bottle of Coke… When gas, like the carbon dioxide that was initially confined in the bottle, spreads evenly throughout a room, there are many rearrangements of the individual molecules that will have no noticeable effect. For example, if you flail your arms, the carbon dioxide molecules will move to and fro, rapidly changing positions and velocities. But overall, there will be no qualitative effect on their arrangements. The molecules were spread uniformly before you flailed your arms, and they will be spread uniformly after you’re done. …By contrast, if the gas were spread in a smaller space, as it was in the bottle, or confined by a barrier to a corner of the room, it would have significantly lower entropy. The reason is simple. Just as thinner books have fewer page reorderings, smaller spaces provide fewer places for molecules to be located, and so allow for fewer rearrangements.
But when you twist off the bottle’s cap or remove the barrier, you open up a whole new universe to the gas molecules, and through their bumping and jostling they quickly disperse to explore it. Why? It’s the same statistical reasoning as with the pages of War and Peace. No doubt, some of the jostling will move a few gas molecules purely within the initial blob of gas or nudge a few that have left the blob back toward the initial dense gas cloud. But since the volume of the room exceeds that of the initial cloud of gas, there are many more rearrangements available to the molecules if they disperse out of the cloud than there are if they remain within it. On average, then, the gas molecules will diffuse from the initial cloud and slowly approach the state of being spread uniformly throughout the room. Thus, the lower-entropy initial configuration, with the gas all bunched in a small region, naturally evolves toward the higher-entropy configuration, with the gas uniformly spread in the larger space….
The tendency of physical systems to evolve toward states of higher entropy is known as the second law of thermodynamics. (The first law is the familiar conservation of energy.) As above, the basis of the law is simple statistical reasoning: there are more ways for a system to have higher entropy, and “more ways” means it is more likely that a system will evolve into one of these high-entropy configurations. [I note in passing that this is the third time Greene has used “statistical reasoning” in regards to entropy; perhaps T-Stone should e-mail him to correct Greene’s obvious stupidity!] Notice, though, that this is not a law in the conventional sense since, although such events are rare and unlikely, something can go from a state of high entropy to one of lower entropy. When you toss a jumbled stack of pages into the air and then gather them into a neat pile, they can turn out to be in perfect numerical order. You wouldn’t want to place a high wager on its happening, but it is possible. It is also possible that the bumping and jostling will be just right to cause all the dispersed carbon dioxide molecules to move in concert and swoosh back into your open bottle of Coke. Don’t hold your breath waiting for this outcome either, but it can happen.
The large number of pages in War and Peace and the large number of gas molecules in the room are what makes the entropy difference between the disordered and ordered so huge, and what causes low-entropy outcomes to be so terribly unlikely. If you tossed only two double-sided pages in the air over and over again, you’d find that they landed in the correct order about 12.5 percent of the time. With three pages this would drop to about 2 percent of the tosses, with four pages it’s about .3 percent, with five pages it’s about .03 percent, and with 693 pages the percentage of tosses that would yield the correct order is so small—it involves so many zeros after the decimal point—that I’ve been convinced by the publisher not to use another page to write it out explicitly. Similarly, if you dropped only two gas molecules side by side into an empty Coke bottle, you’d find that at room temperature their random motion would bring them back together (within a millimeter of each other), on average, roughly every few seconds. But for a group of three molecules, you’d have to wait days, for four molecules you’d have to wait years, and for an initial dense blob of a million billion billion molecules it would take a length of time far greater than the current age of the universe for their random, dispersive motion to bring them back together into a small, ordered bunch. With more certainty than death and taxes, we can count on systems with many constituents evolving toward disorder.
(ibid., pp. 153-157, italics his)
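Greene's percentages follow the same counting (a minimal sketch of my own, assuming every stacking order and every face orientation is equally likely): the chance that n tossed double-sided pages land back in perfect order is 1/(n! · 2^n).

```python
# Probability that n tossed double-sided pages land back in perfect
# order, assuming all n! orders and all 2^n face orientations are
# equally likely: p = 1 / (n! * 2^n).
import math

for n in [2, 3, 4, 5]:
    p = 1 / (math.factorial(n) * 2**n)
    print(f"{n} pages: {100 * p:.3f}%")
# 2 pages: 12.500%  3 pages: 2.083%  4 pages: 0.260%  5 pages: 0.026%
```

These are Greene's 12.5 percent, roughly 2 percent, 0.3 percent, and 0.03 percent, and the same formula at n = 693 yields the unprintable number from earlier.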
Greene goes on:

Earlier, we introduced the dilemma of past versus future by comparing our everyday observations with properties of Newton’s laws of classical physics. We emphasized that we continually experience an obvious directionality to the way things unfold in time but the laws themselves treat what we call forward and backward in time on an exactly equal footing. As there is no arrow within the laws of physics that assigns a direction to time, no pointer that declares, “Use these laws in this temporal orientation but not in reverse,” we were led to ask: If the laws underlying experience treat both temporal orientations symmetrically, why are the experiences themselves so temporally lopsided, always happening in one direction but not the other?

Brian Greene includes a note here that illustrates even more clearly how absurd T-Stone has been in questioning what I previously wrote:
Notice that in our discussion of entropy and the second law, we did not modify the laws of classical physics in any way. Instead, all we did was use the laws in a “big picture” statistical [there’s that word again, T-Stone] framework: we ignored fine details…and instead focused our attention on gross, overall features…. We found that when physical systems are sufficiently complicated (books with many pages, fragile objects that can splatter into many fragments, gas with many molecules), there is a huge difference in entropy between their ordered and disordered configurations. And this means that there is a huge likelihood that the systems will evolve from lower to higher entropy, which is a rough statement of the second law of thermodynamics. But the key fact to notice is that the second law is derivative: it is merely a consequence of probabilistic reasoning applied to Newton’s laws of motion.
This leads us to a simple but astounding point: Since Newton’s laws of physics have no built-in temporal orientation, all of the reasoning we have used to argue that systems will evolve from lower to higher entropy toward the future works equally well when applied toward the past. Again, since the underlying laws of physics are time-reversal symmetric, there is no way for them even to distinguish between what we call the past and what we call the future. …Thus, not only is there an overwhelming probability that the entropy of a physical system will be higher in what we call the future, but there is the same overwhelming probability that it was higher in what we call the past. …
This is the key point for all that follows, but it’s also deceptively subtle. A common misconception is that if, according to the second law of thermodynamics, entropy increases toward the future, then entropy necessarily decreases toward the past. But that’s where the subtlety comes in. The second law actually says that if at any given moment of interest, a physical system happens not to possess the maximum possible entropy, it is extraordinarily likely that the physical system will subsequently have and previously had more entropy. …With laws that are blind to the past-versus-future distinction, such time symmetry is inevitable.
That’s the essential lesson. It tells us that the entropic arrow of time is double-headed. From any specified moment, the arrow of entropy increase points toward the future and toward the past. And that makes it decidedly awkward to propose entropy as the explanation of the one-way arrow of experiential time.
Think about what the double-headed entropic arrow implies in concrete terms. If it’s a warm day and you see partially melted ice cubes in a glass of water, you have full confidence that half an hour later the cubes will be more melted, since the more melted they are, the more entropy they will have. But you should have exactly the same confidence that half an hour earlier they were also more melted, since exactly the same statistical reasoning implies that entropy should increase toward the past. And the same conclusion applies to the countless other examples we encounter every day….
Toward this end, imagine it’s 10:30 p.m. and for the past half hour you’ve been staring at a glass of ice water (it’s a slow night at the bar), watching the cubes slowly melt into small, misshapen forms. You have absolutely no doubt that a half hour earlier the bartender put fully formed ice cubes into the glass; you have no doubt because you trust your memory. And if, by some chance, your confidence regarding what happened during the last half hour should be shaken, you can ask the guy across the way, who was also watching the ice cubes melt (it’s a really slow night at the bar), or perhaps check the video taken by the bar’s surveillance camera, both of which would confirm that your memory is accurate….
But as we’ve seen, such entropic reasoning—reasoning that simply says things are more likely to be disordered since there are more ways to be disordered, reasoning which is demonstrably powerful at explaining how things unfold toward the future—proclaims that entropy is just as likely to have been higher in the past. This would mean that the partially melted cubes you see at 10:30 p.m. would actually have been more melted at earlier times; it would mean that at 10:00 p.m. they did not begin as solid ice cubes, but, instead, slowly coalesced out of room-temperature water on the way to 10:30 p.m., just as surely as they will slowly melt into room-temperature water on their way to 11:00 p.m.
No doubt, that sounds weird—or perhaps you’d say nutty. For this to be true, not only would H2O molecules in a glass of room-temperature water have to coalesce spontaneously into partially formed cubes of ice, but the digital bits in the surveillance camera, as well as the neurons in your brain and those in the brain of the guy across the way, would all need to spontaneously arrange themselves by 10:30 p.m. to attest to there having been a collection of fully formed ice cubes that melted, even though there never was. Yet this bizarre-sounding conclusion is where a faithful application of entropic reasoning—the same reasoning that you embrace without hesitation to explain why the partially melted ice you see at 10:30 p.m. continues to melt toward 11:00 p.m.—leads when applied in the time-symmetric manner dictated by the laws of physics. This is the trouble with having fundamental laws of motion with no inbuilt distinction between past and future, laws whose mathematics treats the future and past of any given moment in exactly the same way….
Math and intuition concur that if there really were fully formed ice cubes at 10 p.m., then the most likely sequence of events would be for them to melt into the partial cubes you see at 10:30 p.m.: the resulting increase in entropy is in line both with the second law of thermodynamics and with experience. But where math and intuition deviate is that our intuition, unlike math, fails to take account of the likelihood, or lack thereof, of actually having fully formed ice cubes at 10 p.m., given the observation we are taking as unassailable, as fully trustworthy, that right now, at 10:30 p.m., you see partially melted cubes.
This is the pivotal point, so let me explain. The main lesson of the second law of thermodynamics is that physical systems have an overwhelming tendency to be in high-entropy configurations because there are so many ways such states can be realized. And once in such high-entropy states, physical systems have an overwhelming tendency to stay in them. High entropy is the natural state of being. You should never be surprised by or feel the need to explain why any physical system is in a high-entropy state. Such states are the norm. On the contrary, what does need explaining is why any given physical system is in a state of order, a state of low entropy. These states are not the norm. They can certainly happen. But from the viewpoint of entropy, such ordered states are rare aberrations that cry out for an explanation. So the one fact in the episode we are taking as unquestionably true—your observation at 10:30 p.m. of low-entropy partially formed ice cubes—is in fact in need of an explanation.
And from the point of view of probability, it is absurd to explain this low-entropy state by invoking the even lower-entropy state, the even less likely state, that at 10 p.m. there were even more ordered, more fully formed ice cubes being observed in a more pristine, more ordered environment. Instead, it is enormously more likely that things began in an unsurprising, totally normal, high-entropy state: a glass of uniform liquid water with absolutely no ice. Then, through an unlikely but ever-so-often-expectable statistical fluctuation, the glass of water went against the grain of the second law and evolved to a state of lower entropy in which partially formed ice cubes appeared. This evolution, although requiring rare and unfamiliar processes, completely avoids the even lower-entropy, the even less likely, the even more rare state of having fully formed ice cubes. At every moment between 10 p.m. and 10:30 p.m., this strange-sounding evolution has higher entropy than the normal ice-melting scenario…and so it realizes the accepted observation at 10:30 p.m. in a way that is more likely—hugely more likely—than the scenario in which fully formed ice cubes melt. That is the crux of the matter.
(ibid., pp. 157-165, italics his)
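Greene's double-headed arrow is easy to see in a toy model (my own illustration, not from the book, with non-interacting particles and a crude ten-bin coarse-graining standing in for Greene's "big-picture" properties). The dynamics below are deterministic and time-reversal symmetric, so starting from a low-entropy bunched state at t = 0 and running the same law toward negative times spreads the gas out just as surely as running it forward does:

```python
# Toy model of my own devising, not Greene's: non-interacting particles
# in a 1-D box with reflecting walls. The dynamics are time-reversal
# symmetric, so a low-entropy state at t = 0 has higher coarse-grained
# entropy in BOTH time directions.
import numpy as np

rng = np.random.default_rng(0)
N, BINS = 100_000, 10

x0 = rng.uniform(0.0, 0.1, N)  # gas bunched in 10% of the box: low entropy
v = rng.normal(0.0, 1.0, N)    # random velocities, symmetric in +/-

def coarse_entropy(x, bins=BINS):
    """Shannon entropy of the coarse-grained bin occupancies (max = ln bins)."""
    counts, _ = np.histogram(x, bins=bins, range=(0.0, 1.0))
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def evolve(x, v, t):
    """Free flight for time t with reflecting walls at 0 and 1 (unfold-and-fold trick)."""
    y = (x + v * t) % 2.0                  # unfold the box to [0, 2)
    return np.where(y > 1.0, 2.0 - y, y)   # fold back: reflection

for t in [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]:
    S = coarse_entropy(evolve(x0, v, t))
    print(f"t = {t:+.1f}  coarse-grained entropy = {S:.3f}  (max {np.log(BINS):.3f})")
```

Run it and the printed entropy is zero at t = 0 and climbs toward the maximum ln 10 on both sides of t = 0 (statistically symmetrically): the low-entropy moment is the aberration, exactly as the quoted passage says.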
Greene continues in an endnote:

Remember, on pages 152-53 we showed the huge difference between the number of ordered and disordered configurations for a mere 693 double-sided sheets of paper. We are now discussing the behavior of roughly 10^24 H2O molecules, so the difference between the number of ordered and disordered configurations is breathtakingly monumental. Moreover, the same reasoning holds for all other atoms and molecules within you and within the environment (brains, security cameras, air molecules, and so on). Namely, in the standard explanation in which you can trust your memories, not only would the partially melted ice cubes have begun, at 10 p.m., in a more ordered—less likely—state, but so would everything else: when a video camera records a sequence of events, there is a net increase in entropy (from the heat and noise released by the recording process); similarly, when a brain records a memory, although we understand the microscopic details with less accuracy, there is a net increase in entropy (the brain may gain order but as with any order-producing process, if we take account of heat generated, there is a net increase in entropy). Thus, if we compare the total entropy in the bar between 10 p.m. and 10:30 p.m. in the two scenarios—one in which you trust your memories, and the other in which things spontaneously arrange themselves from an initial state of disorder to be consistent with what you see, now, at 10:30 p.m.—there is an enormous entropy difference. The latter scenario, every step of the way, has hugely more entropy than the former scenario, and so, from the standpoint of probability, is hugely more likely.

(ibid., p. 165, note)

So now we can see that when T-Stone offers his “trick question” about two decks of cards and asks which has more entropy, he's not even in the right playing field. The fact is that the entropy paradox does exist, and scientists do choose the value that is less statistically likely—that is, they trust that their observations are correct. In so doing, they continually stipulate that the further back we push time, the less entropy there was in the universe, which means that the further back in time we go, the less likely it is that things spontaneously arose this way, and the more likely it is that our memories are wrong. But since very few people want to live in a universe where we cannot trust our own experiences, the net result is that scientists ignore the entropy paradox and assume the least parsimonious explanation. To be sure, there have been attempts to imagine how the big bang could have introduced low entropy at the beginning of the universe, but these theories are untestable (and, as T-Stone is so fond of saying, untestability means it's not science).
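To put a rough number on the note's "breathtakingly monumental" difference (a back-of-envelope illustration of my own, not Greene's, using the same volume-counting as the Coke-bottle passage): merely confining N molecules to half their available volume cuts the number of arrangements by a factor of 2^N, so for N = 10^24,

```latex
\frac{W_{\text{half}}}{W_{\text{full}}} = 2^{-N}, \qquad
\Delta S = N k_B \ln 2
         \approx 10^{24} \times \left(1.381 \times 10^{-23}\,\mathrm{J/K}\right) \times 0.693
         \approx 9.6\ \mathrm{J/K}.
```

A few joules per kelvin looks tame until it is converted back into odds: the probability of that fluctuation is 2^(-10^24), roughly 10^(-3 × 10^23), which makes the 10^1878 page count look positively cozy.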
Now, T-Stone can certainly feel free to continue to mock me if he wishes, but his protestations do not affect reality. To use a metaphor from the book, he can continue to flail his arms around in the air, but it will not override the reality that the air is uniformly mixed.