From Cosmos to Chaos The Science of Unpredictability by Peter Coles
Author: Peter Coles
Format: epub
Tags: Science
ISBN: 0198567626
Publisher: Oxford UP USA
Published: 2010-09-25T03:00:00+00:00
p(v) = 4\pi \left( \frac{m}{2\pi kT} \right)^{3/2} v^{2} \exp\left( -\frac{mv^{2}}{2kT} \right),
where m is the molecular mass; the mean kinetic energy is just 3kT/2. This may be ringing vague bells about the way I sneaked in a different definition of entropy in Chapter 4. There I introduced the quantity

S = -\int p(x) \log \frac{p(x)}{m(x)} \, dx
as a form of entropy without any reference to thermodynamics at all. One can actually derive the Maxwell–Boltzmann distribution using this form too: the distribution of velocities in each direction is found by maximising the entropy subject to the constraint that the variance is constant (as the variance determines the mean square velocity and hence the mean kinetic energy). This means that the distribution of each component of the velocity must be Gaussian, and if the system is statistically isotropic each component must be independent of the others. The speed is thus given by

v = \sqrt{v_{x}^{2} + v_{y}^{2} + v_{z}^{2}}
and each of the components has a Gaussian distribution with variance kT/m. A straightforward simplification leads to the Maxwell–Boltzmann form. But what was behind this earlier definition of entropy? In fact the discrete form (with uniform measure)
S = -I = -\sum_{i} p_{i} \log p_{i}
derives not from physics but from information theory. Claude Shannon derived the expression for the information content I of a probability distribution defined for a discrete distribution in which i runs from 1 to n. Information is sometimes called negentropy because in Shannon's definition entropy is simply negative information: the state of maximum entropy is the state of least information. If one uses logarithms to the base 2, the information entropy is equal to the number of yes-or-no questions required to take our state of knowledge from wherever it is now to one of certainty. If we are certain already we do not need to ask any questions, so the entropy is zero. If we are ignorant then we have to ask a lot; our entropy is maximized.

The similarity of this statement of entropy to that involved in the Gibbs algorithm is not a coincidence. It hints at something of great significance, namely that probability enters into the field of statistical mechanics not as a property of a physical system but as a way of encoding the uncertainty in our knowledge of the system. The missing link in this chain of reasoning was supplied in 1965 by the remarkable and much undervalued physicist Ed Jaynes. He showed that if we set up a system according to the Gibbs algorithm, i.e. so that the starting configuration corresponds to the maximum Gibbs entropy, the subsequent evolution of the Gibbs entropy is numerically identical to the macroscopic definition given by Clausius that I introduced right at the beginning of this chapter. This is an amazingly beautiful result that is remarkably poorly known.
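The question-counting picture of entropy can be checked with a short calculation. The following sketch (plain Python, written for this excerpt; the function name is my own) computes the base-2 Shannon entropy: a uniform distribution over eight states needs log2(8) = 3 yes-or-no questions, while a state of certainty needs none.

```python
import math

def shannon_entropy_bits(probs):
    """Shannon information entropy in bits: S = -sum_i p_i log2 p_i.
    Terms with p_i = 0 contribute nothing, since p log p -> 0 as p -> 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Eight equally likely states: three yes/no questions pin down the answer.
print(shannon_entropy_bits([1/8] * 8))          # 3.0

# Certainty: one state has probability 1, so no questions are needed.
print(shannon_entropy_bits([1.0, 0.0, 0.0]))    # 0.0

# Any non-uniform distribution over eight states carries fewer bits,
# i.e. less missing information than the maximum-entropy (uniform) case.
print(shannon_entropy_bits([0.5, 0.25, 0.125] + [0.125/5] * 5))
```

The uniform case is the maximum-entropy state of least information, exactly as described in the text.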
This interpretation often causes hostility among physicists, who use the word 'subjective' to describe its perceived shortcomings. I do not think subjective is really the correct word to use, but there is some sense in which it does apply to thermodynamics.
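The maximum-entropy construction described above — three independent Gaussian velocity components, from which the Maxwell–Boltzmann speed distribution follows — can also be verified numerically. This is a minimal sketch, not anything from the book; the unit choice kT = m = 1 and the sample size are my own assumptions. It checks that the resulting mean kinetic energy comes out at 3kT/2.

```python
import math
import random

random.seed(1)

kT, m = 1.0, 1.0           # units chosen so kT = m = 1 (assumption for this demo)
sigma = math.sqrt(kT / m)  # each velocity component is Gaussian with variance kT/m

n = 200_000
mean_ke = 0.0
for _ in range(n):
    # Draw the three independent Gaussian velocity components.
    vx = random.gauss(0.0, sigma)
    vy = random.gauss(0.0, sigma)
    vz = random.gauss(0.0, sigma)
    # Speed enters only through v^2 = vx^2 + vy^2 + vz^2;
    # the kinetic energy of one molecule is (1/2) m v^2.
    mean_ke += 0.5 * m * (vx * vx + vy * vy + vz * vz)
mean_ke /= n

print(mean_ke)  # close to 3*kT/2 = 1.5, as the text states
```

Each component contributes kT/2 on average, and the three together give the 3kT/2 quoted earlier for the Maxwell–Boltzmann distribution.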