

STATISTICAL PHYSICS:
A Fresh Take on Disorder, Or Disorderly Science?

Adrian Cho*

For nearly 70 years the definition of entropy has been literally etched in stone. A few physicists want to carve a new one, but others say the idea is cracked

Near the middle of Vienna's sprawling Central Cemetery stands the imposing tomb of Ludwig Boltzmann, the 19th century Austrian physicist who first connected the motions of atoms and molecules to temperature, pressure, and other properties of macroscopic objects. Carved into the gravestone, a single short equation serves as the great man's epitaph: S = k ln W. No less important than Einstein's E = mc², the equation provides the mathematical definition of entropy, a measure of disorder that every physical system strives to maximize. The equation serves as the cornerstone of "statistical mechanics," and it has helped scientists decipher phenomena ranging from the various states of matter to the behavior of black holes to the chemistry of life.

But roll over, Boltzmann. A maverick physicist has proposed a new definition of entropy, and his idea has split the small and already contentious community of statistical physicists like a cue ball opening a game of pool. Supporters say the new definition extends the reach of statistical mechanics to important new classes of problems. Skeptics counter that the new theory amounts to little more than fiddling with a fudge factor.

The new definition gives insight into the myriad physical systems that verge on a kind of not-quite-random unpredictability called "chaos," says Constantino Tsallis of the Brazilian Center for Research in Physics in Rio de Janeiro. Tsallis proposed the definition in 1988, and since then researchers have applied it to subjects from the locomotion of microorganisms to the collisions of subatomic particles, and from the motions of stars to the swings in stock prices. The new definition appears to account for subtleties in the data exceedingly well, Tsallis says. It also probes a gap in Boltzmann's reasoning that Einstein spotted nearly a century ago.

But many physicists remain highly skeptical. So-called Tsallis entropy simply adds another mathematical parameter that physicists can twiddle to make their formulae better match the data, says Itamar Procaccia of the Weizmann Institute of Science in Rehovot, Israel. "It's just mindless curve-fitting," he says. Joel Lebowitz of Rutgers University in Piscataway, New Jersey, says that researchers crank out papers on the new entropy by the dozen (Tsallis lists nearly 1000 of them on his Web page, tsallis.cat.cbpf.br) but that most contain few physical insights. "The ratio of papers to ideas has gone to infinity," he says.

Several well-respected physicists, however, say that the skeptics have closed their minds to a potentially fruitful innovation. "It's ridiculous to reject this out of hand," says E. G. D. Cohen of Rockefeller University in New York City. Michel Baranger of the Massachusetts Institute of Technology (MIT) in Cambridge says that behind the skepticism lurk more personal misgivings about Tsallis, who traverses the globe stumping for his idea. "He spends an enormous amount of time making sure his work gets recognition," Baranger admits, but that doesn't mean his idea isn't a good one.


Figure 1. Into the mix. Proponents hope a new entropy will help physicists untangle tortuous subjects such as turbulence. [CREDIT: PETER VOROBIEFF/UNIVERSITY OF NEW MEXICO AND ROBERT E. ECKE/LOS ALAMOS NATIONAL LABORATORY]


Counting the ways. Anyone who has ever touched a hot burner should have an intuitive feel for the concept of entropy. As heat flows from the metal of the burner into the flesh of a finger, it jiggles the atoms and molecules in the digit, knocking them out of their usual, painless order and leaving them in excruciating disarray. The amount of disorder determines the entropy of the fingertip.

In the 1870s, when most physicists still doubted the very existence of atoms and molecules, Boltzmann provided the essential mathematical link between the positions and velocities of the tiny particles and macroscopic quantities such as heat and temperature. Boltzmann realized that the positions and velocities of the atoms or molecules within an object could be rearranged in many different ways without changing the object's macroscopic properties. The entropy of the object, he reasoned, simply equals a constant, k in modern notation, times the logarithm of the number of equivalent microscopic arrangements--a gargantuan number denoted W. With that definition Boltzmann bridged the conceptual chasm between the macroscopic and microscopic realms.
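To make the count concrete, here is a standard textbook illustration (not drawn from the article itself): consider a system of N particles, each of which can sit in one of two states. The number of equivalent arrangements is W = 2^N, so

S = k ln W = k ln(2^N) = Nk ln 2.

The entropy grows in direct proportion to the system's size: double the number of particles and the entropy doubles. That property, called extensivity, is exactly what Tsallis's alternative gives up.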

Or rather, he vaulted across it. At a key point in his analysis, Boltzmann simply assumed that the molecules shift from one microscopic configuration to the next in such a way that every possible arrangement is equally likely. But that isn't necessarily true, as Einstein noted in 1910. How the system moves from one configuration to the next depends on the precise interactions between the molecules, and the details of these "dynamics" might make some configurations more likely than others, Einstein observed. If that's the case, Cohen says, the equation for entropy might take a different form: "Tsallis entropy is the first example in classical statistical mechanics that there is something to Einstein's idea."

Tsallis allows the probabilities of different configurations of particles to vary only in certain ways. Each configuration can be thought of as a single point in a vast abstract "phase space," typically with six times as many dimensions as there are particles in the system (three for position, three for velocity). As the particles change configuration, the system traces out a complicated path in this space.

Boltzmann essentially assumed that the system would wander so that it spent the same amount of time in each equally sized region of phase space. In contrast, Tsallis assumes that the system follows a path that has the shape of a fractal, a curious mathematical object that can have, for example, 2.381 dimensions and that looks essentially the same no matter how much it is magnified (see figure). The fractal limits the ways the system can get from one patch of phase space to another much as an airline's routes might limit the ways a traveler can get from New Orleans to Chicago, Tsallis says. "The two airports aren't connected," he says, "so you can't go from one to the other without going through Houston."


Figure 2. Don't go there. Where normal systems wander all over "phase space" (left), Tsallis's systems stick to patchy fractals (right). [ILLUSTRATION: C. SLAYDEN]


To account for such fractal paths, Tsallis changed the mathematical form of the definition of entropy and introduced a new parameter, q (see box). The new definition encompasses the old one, Tsallis says, as the two formulae are identical when q equals 1. But when q differs from 1, the new entropy behaves in important new ways. For example, the entropy of an entire system no longer equals the sum of the entropies of its various parts. Systems that behave this way are called nonextensive, Tsallis says, and many systems on the verge of chaos display this property.
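Here is a minimal numerical sketch of that nonadditivity in Python, assuming the standard published form of the Tsallis entropy, S_q = k(1 - Σ p_i^q)/(q - 1) (the box just below describes the equal-probability case):

import numpy as np

def tsallis_entropy(p, q, k=1.0):
    # S_q = k * (1 - sum(p_i^q)) / (q - 1); at q = 1 this becomes
    # the usual Boltzmann-Gibbs form, -k * sum(p_i ln p_i).
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        return -k * np.sum(p * np.log(p))
    return k * (1.0 - np.sum(p**q)) / (q - 1.0)

# Two independent subsystems, each with four equally likely configurations.
pA = np.full(4, 0.25)
pB = np.full(4, 0.25)
# The combined system's distribution is the product of the parts.
pAB = np.outer(pA, pB).ravel()

for q in (1.0, 0.5, 2.0):
    sA, sB, sAB = (tsallis_entropy(p, q) for p in (pA, pB, pAB))
    # For independent parts (with k = 1):
    # S_q(A+B) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B)
    print(f"q={q}: S(A)+S(B)={sA + sB:.4f}  S(A+B)={sAB:.4f}  "
          f"pseudo-additive={sA + sB + (1 - q) * sA * sB:.4f}")

Only at q = 1 does the cross term vanish, so that the entropy of the whole again equals the sum of its parts; for any other q the combined entropy follows the pseudo-additive rule instead.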


Doing the Math. Tsallis entropy involves a power law: for an isolated system, W is raised to the power 1 - q. But when the parameter q goes to 1, the Tsallis entropy equals the logarithmic Boltzmann entropy.
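In symbols (a reconstruction of the box's description, using the standard published form): for an isolated system with W equally likely configurations,

S_q = k (W^(1-q) - 1) / (1 - q).

Because W^(1-q) = e^((1-q) ln W) ≈ 1 + (1 - q) ln W when q is close to 1, the expression reduces to Boltzmann's S = k ln W in the limit q → 1.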


Figure 3. Although the equation on Boltzmann's grave captures the essence of his insight into entropy, he never wrote it down himself. It was German physicist Max Planck who, in 1900, first put it into the form that became Boltzmann's epitaph.


Tsallis stresses that conventional entropy still applies whenever an object or system is in thermodynamic equilibrium, a placid state in which it has a well-defined uniform temperature. The new entropy comes into play, Tsallis says, primarily when a system is far from equilibrium, either because of some peculiarity of its dynamics or because an outside force continually perturbs it. But such systems are hardly rare special cases, says MIT's Baranger. "Actually, most of the systems in the universe are not in thermal equilibrium," he says, so Tsallis's work might open many new avenues of research.

A theory of q. Both proponents and skeptics agree that it's not enough to extract the value of q from the data; the case for the new entropy rests on determining what q means and how to predict its value. Christian Beck of the University of London might have taken a key first step in that direction last year with his analysis of turbulence. Beck studied data accumulated by Harry Swinney and Gregory Lewis, of the University of Texas, Austin, who had produced turbulent flows by placing a liquid in the space between two cylinders and then spinning the inner cylinder. Swinney and Lewis compared the velocity of the flow at two different positions around the cylinder.

Beck showed that the Tsallis approach nicely accounted for the observed variations in velocity--something that Boltzmann's entropy can't do. More important, Beck produced an equation that connected the value of q to temperature variations from place to place in the roiling liquid--the first time anyone had derived q from details of a system's interactions.

The work doesn't quite clinch the case for Tsallis entropy, Swinney says, because no one has proved that the temperature varies in just the way Beck presumed. Although measuring the temperature distribution won't be easy, Swinney says, "that's a hypothesis that can be tested."

Ultimately, nature will reveal whether Tsallis entropy belongs among the established concepts of statistical mechanics or on the scrap heap of bright, but failed, ideas. And if Boltzmann's fate is any guide, even affirmation might come only slowly and cruelly. Boltzmann's work met with hostility during his lifetime, and the physicist hanged himself in 1906--just a few years before his ideas were vindicated. Not until 1933 did authorities move his body to a place of honor and erect the headstone that memorializes his great insight. In the study of entropy, it seems, acceptance comes only at the end of a long and disorderly path.

Adrian Cho is a freelance writer in Grosse Pointe Park, Michigan.



Volume 297, Number 5585, Issue of 23 Aug 2002, pp. 1268-1269.
Copyright © 2002 by The American Association for the Advancement of Science. All rights reserved.
