Fluctuation theorem
The fluctuation theorem, which originated in statistical mechanics, deals with the relative probability that the entropy of a system which is currently away from thermodynamic equilibrium (i.e., maximum entropy) will increase or decrease over a given amount of time. While the second law of thermodynamics predicts that the entropy of an isolated system should tend to increase until it reaches equilibrium, it became apparent after the discovery of statistical mechanics that the second law is only a statistical one, suggesting that there should always be some nonzero probability that the entropy of an isolated system might spontaneously decrease; the fluctuation theorem precisely quantifies this probability.
Statement of the fluctuation theorem
Roughly, the fluctuation theorem relates to the probability distribution of the time-averaged irreversible entropy production [1], denoted $\overline{\Sigma}_t$. The theorem states that, in systems away from equilibrium over a finite time t, the ratio between the probability that $\overline{\Sigma}_t$ takes on a value A and the probability that it takes the opposite value, −A, will be exponential in At. In other words, for a finite non-equilibrium system in a finite time, the FT gives a precise mathematical expression for the probability that entropy will flow in a direction opposite to that dictated by the second law of thermodynamics.
Mathematically, the FT is expressed as:

$$\frac{\Pr\left(\overline{\Sigma}_t = A\right)}{\Pr\left(\overline{\Sigma}_t = -A\right)} = e^{A t}.$$

This means that as the time or system size increases (since $\overline{\Sigma}_t$ is extensive), the probability of observing an entropy production opposite to that dictated by the second law of thermodynamics decreases exponentially. The FT is one of the few expressions in non-equilibrium statistical mechanics that is valid far from equilibrium.
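As a simple numerical illustration (not part of the original article), consider the special case in which $\overline{\Sigma}_t$ happens to be Gaussian distributed; the FT then requires the mean and variance to satisfy $\mathrm{var}(\overline{\Sigma}_t) = 2\langle\overline{\Sigma}_t\rangle / t$. The short Python sketch below samples such a distribution and checks that the histogram ratio reproduces $e^{At}$; all parameter values are illustrative.

```python
import numpy as np

# Illustrative parameters (not from the article): averaging time and the mean
# of the time-averaged entropy production.
t = 2.0
mean_sigma = 0.5
var_sigma = 2.0 * mean_sigma / t   # Gaussian case: the FT forces var = 2*mean/t

rng = np.random.default_rng(1)
samples = rng.normal(mean_sigma, np.sqrt(var_sigma), size=2_000_000)

# Compare ln[ P(sigma_bar ~ +A) / P(sigma_bar ~ -A) ] with A*t for a few values of A.
bin_width = 0.05
for A in (0.2, 0.5, 1.0):
    p_plus = np.mean(np.abs(samples - A) < bin_width / 2)
    p_minus = np.mean(np.abs(samples + A) < bin_width / 2)
    print(f"A = {A:.1f}:  ln(P(+A)/P(-A)) = {np.log(p_plus / p_minus):+.3f}   A*t = {A * t:+.3f}")
```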
The FT was first proposed and tested using computer simulations by Denis Evans, E.G.D. Cohen and Gary Morriss in 1993 in the journal Physical Review Letters. The first mathematical proof was given by Evans and Debra Searles in 1994. Since then, much mathematical and computational work has been done to show that the FT applies to a variety of statistical ensembles. The first laboratory experiment that verified the validity of the FT was carried out in 2002. In this experiment, a plastic bead was pulled through a solution by a laser. Fluctuations in the velocity were recorded that were opposite to what the second law of thermodynamics would dictate for macroscopic systems; see Wang et al. [Phys Rev Lett, 89, 050601 (2002)] and later Carberry et al. [Phys Rev Lett, 92, 140601 (2004)]. This work was widely reported in the press: Second law of thermodynamics "broken" (New Scientist, 19 July 2002); Nature, 23 July 2002, http://www.nature.com/nsu/020722/020722-2.html .
Note that the FT does not state that the second law of thermodynamics is wrong or invalid. The second law of thermodynamics is a statement about macroscopic systems. The FT is more general. It can be applied to both microscopic and macroscopic systems. When applied to macroscopic systems, the FT is equivalent to the Second Law of Thermodynamics.
Second law inequality
A simple consequence of the fluctuation theorem given above is that if we carry out an arbitrarily large ensemble of experiments from some initial time t = 0, and perform an ensemble average of the time averages of the entropy production, then an exact consequence of the FT is that the ensemble average cannot be negative for any value of the averaging time t:

$$\left\langle \overline{\Sigma}_t \right\rangle \ge 0 \quad \text{for all } t.$$

This inequality is called the Second Law Inequality [Searles & Evans, Aust J Chem, 57, 1119 (2004)]. It can be proved for systems subject to time-dependent fields of arbitrary magnitude and arbitrary time dependence.
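The inequality follows directly from the FT. A brief sketch of the standard argument, writing $p(A)$ for the probability density of $\overline{\Sigma}_t$ and using $p(-A) = p(A)\,e^{-At}$:

$$\left\langle \overline{\Sigma}_t \right\rangle = \int_{-\infty}^{\infty} A\, p(A)\, dA = \int_{0}^{\infty} A\,\bigl[p(A) - p(-A)\bigr]\, dA = \int_{0}^{\infty} A\, p(A)\left(1 - e^{-A t}\right) dA \;\ge\; 0,$$

since every factor in the final integrand is non-negative for $t \ge 0$.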
It is important to understand what the Second Law Inequality does not imply. It does not imply that the ensemble-averaged entropy production is non-negative at every instant; a counterexample is the entropy production in a viscoelastic fluid subject to a sinusoidal, time-dependent shear rate. In that example, however, the ensemble average of the time integral of the entropy production is non-negative, as expected from the Second Law Inequality.
Nonequilibrium partition identity
Another remarkably simple and elegant consequence of the FT is the so-called "nonequilibrium partition identity" (NPI):

$$\left\langle e^{-\overline{\Sigma}_t\, t} \right\rangle = 1 \quad \text{for all } t;$$

see Carberry et al., J Chem Phys, 121, 8179 (2004). Thus, in spite of the Second Law Inequality, which might lead you to expect that this average would decay exponentially with time, the exponential probability ratio given by the FT exactly cancels the negative exponential inside the average, leading to an average which is unity for all times!
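The cancellation can be made explicit in one line; with $p(A)$ again denoting the probability density of $\overline{\Sigma}_t$ and $p(A)\,e^{-At} = p(-A)$ from the FT:

$$\left\langle e^{-\overline{\Sigma}_t\, t} \right\rangle = \int_{-\infty}^{\infty} p(A)\, e^{-A t}\, dA = \int_{-\infty}^{\infty} p(-A)\, dA = 1.$$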
There are many important implications of the FT. One is that small machines (such as nanomachines, or even mitochondria in a cell) will spend part of their time actually running in "reverse". By "reverse", it is meant that they function so as to run in a way opposite to that for which they were presumably designed. As an example, consider a jet engine: if a jet engine were to run in "reverse" in this context, it would take in ambient heat and exhaust fumes to generate kerosene and oxygen.
Dissipation function
[1] Strictly speaking, the fluctuation theorem refers to a quantity known as the dissipation function. In thermostatted nonequilibrium states that are close to equilibrium, the long-time average of the dissipation function is equal to the average entropy production. However, the FT refers to fluctuations rather than averages. The dissipation function is defined as

$$\overline{\Omega}_t\, t = \int_0^t \Omega(\Gamma(s))\, ds \equiv \ln\!\left[\frac{f(\Gamma(0),0)}{f(\Gamma(t),0)}\right] + \frac{\Delta Q(\Gamma;0,t)}{k T},$$

where k is Boltzmann's constant, $f(\Gamma,0)$ is the initial (t = 0) distribution of molecular states $\Gamma$, and $\Gamma(t)$ is the molecular state arrived at after time t under the exact time-reversible equations of motion. $f(\Gamma(t),0)$ is the initial (t = 0) distribution evaluated at those time-evolved states.
Note: in order for the FT to be valid we require that $f(\Gamma(t),0) \neq 0$ whenever $f(\Gamma(0),0) \neq 0$. This condition is known as the condition of ergodic consistency. It is widely satisfied in common statistical ensembles, e.g. the canonical ensemble.
The system may be in contact with a large heat reservoir in order to thermostat the system of interest. If this is the case, $\Delta Q(\Gamma;0,t)$ is the heat lost to the reservoir over the time interval (0, t) and T is the absolute equilibrium temperature of the reservoir; see Williams et al. [Phys Rev E, 70, 066113 (2004)]. With this definition of the dissipation function, the precise statement of the FT simply replaces the entropy production with the dissipation function in each of the FT equations above.
Example: if one considers electrical conduction across an electrical resistor in contact with a large heat reservoir at temperature T, then the dissipation function is the total electric current density J multiplied by the voltage drop across the circuit and the system volume V, divided by the absolute temperature T of the heat reservoir times Boltzmann's constant. Thus the dissipation function is easily recognised as the Ohmic work done on the system divided by the temperature of the reservoir. Close to equilibrium, the long-time average of this quantity is (to leading order in the voltage drop) equal to the average spontaneous entropy production per unit time; see de Groot and Mazur, "Non-equilibrium Thermodynamics" (Dover), equation (61), page 348. However, the fluctuation theorem applies to systems arbitrarily far from equilibrium, where the definition of the spontaneous entropy production is problematic.
The fluctuation theorem and Loschmidt's paradox
The second law of thermodynamics, which predicts that the entropy of an isolated system out of equilibrium should tend to increase rather than decrease or stay constant, stands in apparent contradiction with the time-reversible equations of motion for classical and quantum systems. The time-reversal symmetry of the equations of motion shows that if one films a given time-dependent physical process, then playing the movie of that process backwards does not violate the laws of mechanics. It is often argued that for every forward trajectory in which entropy increases, there exists a time-reversed anti-trajectory where entropy decreases; thus, if one picks an initial state randomly from the system's phase space and evolves it forward according to the laws governing the system, decreasing entropy should be just as likely as increasing entropy. It might seem that this is incompatible with the second law of thermodynamics, which predicts that entropy tends to increase. The problem of deriving irreversible thermodynamics from time-symmetric fundamental laws is referred to as Loschmidt's paradox.
The mathematical proof of the fluctuation theorem, and in particular of the Second Law Inequality, shows that, given a non-equilibrium starting state, the probability of seeing its entropy increase is greater than the probability of seeing its entropy decrease; see the review "The Fluctuation Theorem", Advances in Physics, 51, 1529 (2002). However, as noted in section 6 of that paper, one could also use the same laws of mechanics to extrapolate backwards from a later state to an earlier state, and in this case the same reasoning used in the proof of the FT would lead us to predict that the entropy was likely to have been greater at earlier times than at later times. This second prediction would be frequently violated in the real world, since it is often true that a given nonequilibrium system was at an even lower entropy in the past (although the prediction would be correct if the nonequilibrium state were the result of a random fluctuation in entropy in an isolated system that had previously been at equilibrium - in this case, if you happen to observe the system in a lower-entropy state, you are most likely seeing the minimum of the random dip in entropy, in which case entropy would be higher on either side of this minimum).
So, it seems that the problem of deriving time-asymmetric thermodynamic laws from time-symmetric laws cannot be solved by appealing to statistical derivations which show entropy is likely to increase when you start from a nonequilibrium state and project it forwards. Many modern physicists believe the resolution to this puzzle lies in the low-entropy state of the universe shortly after the big bang, although the explanation for this initial low entropy is still debated.
Summary
The fluctuation theorem is of fundamental importance to nonequilibrium statistical mechanics. The FT (together with the Axiom of Causality) gives a generalisation of the second law of thermodynamics that includes the conventional second law as a special case. It is then easy to prove the Second Law Inequality and the Nonequilibrium Partition Identity. When combined with the central limit theorem, the FT also implies the famous Green-Kubo relations for linear transport coefficients close to equilibrium. The FT is, however, more general than the Green-Kubo relations because, unlike them, it applies to fluctuations far from equilibrium. In spite of this, scientists have not yet been able to derive the equations of nonlinear response theory from the FT.
The FT does not imply or require that the distribution of the time-averaged dissipation be Gaussian. There are many known examples where the distribution of the time-averaged dissipation is non-Gaussian, and yet the FT (of course) still correctly describes the probability ratios.
Lastly, the theoretical constructs used to prove the FT can be applied to nonequilibrium transitions between two different equilibrium states. When this is done, the so-called Jarzynski equality, or nonequilibrium work relation, can be derived. This equality shows how equilibrium free energy differences can be computed or measured (in the laboratory) from nonequilibrium path integrals. Previously, quasi-static (equilibrium) paths were required.
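As a schematic illustration (not from the original article), the Jarzynski equality can be checked numerically for the special case of a Gaussian work distribution, for which $\langle e^{-W/kT} \rangle = e^{-\Delta F/kT}$ holds exactly with $\Delta F = \langle W \rangle - \mathrm{var}(W)/(2kT)$. The Python sketch below draws work values from such a distribution and compares the Jarzynski free-energy estimate with that known value; all numbers are made up for illustration.

```python
import numpy as np

# Illustrative (made-up) parameters, in units where kT = 1.
kT = 1.0
mean_work = 2.0        # average work done on the system, <W>
var_work = 1.5         # variance of the work distribution
n_samples = 200_000    # number of simulated nonequilibrium trajectories

# For a Gaussian work distribution the Jarzynski equality holds exactly with
# Delta F = <W> - var(W)/(2 kT).
delta_F_exact = mean_work - var_work / (2.0 * kT)

rng = np.random.default_rng(0)
work = rng.normal(mean_work, np.sqrt(var_work), size=n_samples)

# Jarzynski estimator: Delta F = -kT ln < exp(-W/kT) >
delta_F_estimate = -kT * np.log(np.mean(np.exp(-work / kT)))

print(f"Exact Delta F        = {delta_F_exact:.3f}")
print(f"Jarzynski estimate   = {delta_F_estimate:.3f}")
print(f"Mean work <W>        = {work.mean():.3f}  (>= Delta F, as the second law requires)")
```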
The reason why the fluctuation theorem is so fundamental is that its proof requires so little. It requires:
- knowledge of the mathematical form of the initial distribution of molecular states,
- that all time-evolved final states at time t must be present with nonzero probability in the distribution of initial states (t = 0) - the so-called condition of ergodic consistency - and
- an assumption of time-reversal symmetry.
In regard to the latter "assumption", all the equations of motion for either classical or quantum dynamics are in fact time reversible.
For an alternative view on the same subject see http://www.scholarpedia.org/article/Fluctuation_theorem
See also
- Linear response function
- Green's function (many-body theory)
- Loschmidt's paradox
- Le Chatelier's principle - a nineteenth century principle that defied a mathematical proof until the advent of the fluctuation theorem
- Crooks fluctuation theorem - an example of a transient fluctuation theorem relating the dissipated work in nonequilibrium transformations to free energy differences
- Jarzynski equality - another nonequilibrium equality closely related to the fluctuation theorem and to the second law of thermodynamics
- Green-Kubo relations - there is a deep connection between the fluctuation theorem and the Green-Kubo relations for linear transport coefficients like shear viscosity or thermal conductivity
- Boltzmann
- Thermodynamics
- Brownian motor