Maxwell–Boltzmann statistics
In statistical mechanics, Maxwell–Boltzmann statistics describes the statistical distribution of material particles over various energy states in thermal equilibrium, when the temperature is high enough and the density is low enough to render quantum effects negligible.

The expected number of particles with energy \varepsilon_i for Maxwell–Boltzmann statistics is N_i, where:

    N_i = \frac{g_i}{e^{(\varepsilon_i - \mu)/kT}} = \frac{N}{Z}\, g_i\, e^{-\varepsilon_i/kT}

where:
  • N_i is the number of particles in state i
  • \varepsilon_i is the energy of the i-th state
  • g_i is the degeneracy of energy level i, the number of particle states (excluding the "free particle" state) with energy \varepsilon_i
  • μ is the chemical potential
  • k is Boltzmann's constant
  • T is absolute temperature
  • N is the total number of particles: N = \sum_i N_i
  • Z is the partition function: Z = \sum_i g_i\, e^{-\varepsilon_i/kT}
  • e^{(\ldots)} is the exponential function

Equivalently, the distribution is sometimes expressed as

    N_i = \frac{N}{Z}\, e^{-\varepsilon_i/kT}

where the index i now specifies a particular state rather than the set of all states with energy \varepsilon_i, and Z = \sum_i e^{-\varepsilon_i/kT}.
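As a quick numerical illustration of these formulas, the sketch below (level energies and degeneracies are invented for illustration, in units where kT = 1) computes the fractions N_i/N from degeneracies and energies:

```python
import math

def mb_occupation(energies, degeneracies, kT):
    """Return the Maxwell-Boltzmann fractions N_i/N = g_i exp(-eps_i/kT) / Z."""
    weights = [g * math.exp(-e / kT) for e, g in zip(energies, degeneracies)]
    Z = sum(weights)                      # partition function
    return [w / Z for w in weights]

# Hypothetical three-level system; energies in units of kT.
fracs = mb_occupation([0.0, 1.0, 2.0], [1, 3, 5], kT=1.0)
print(fracs)  # fractions sum to 1 by construction
```

By construction the fractions are normalized, and the ratio of any two occupations is g_i e^{-\varepsilon_i/kT} / (g_j e^{-\varepsilon_j/kT}), independent of Z.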

A derivation of the Maxwell–Boltzmann distribution

Suppose we have a container with a huge number of very small identical particles. Although the particles are identical, we still identify them by drawing numbers on them, the way lottery balls are labelled with numbers and even colors.

All of those tiny particles are moving inside the container in all directions with great speed. Because the particles are moving quickly, they possess some energy. The Maxwell–Boltzmann distribution is a mathematical function describing how many particles in the container have a certain energy.

It can happen that many particles have the same energy \varepsilon. The number of particles with energy \varepsilon is N(\varepsilon), and the number possessing another energy \varepsilon' is N(\varepsilon'). In physical terms, this is stated by saying that the many particles with the same energy \varepsilon all occupy a so-called "energy level" \varepsilon. The concept of an energy level is used to describe and analyse, graphically and mathematically, the properties of particles and the events they experience. Physicists take into account the ways particles can arrange themselves: there is more than one way of occupying an energy level, and that is why the particles were tagged like lottery balls, so that each one can be tracked individually.

To begin with, let us ignore the degeneracy problem: assume that there is only one way to put N_i particles into energy level i. What follows next is a bit of combinatorial thinking, which has little to do with accurately describing the reservoir of particles.

The number of different ways of performing an ordered selection of one single object from N objects is obviously N. The number of different ways of selecting two objects from N objects, in a particular order, is thus N(N − 1), and that of selecting n objects in a particular order is seen to be N!/(N − n)!. The number of ways of selecting 2 objects from N objects without regard to order is N(N − 1) divided by the number of ways 2 objects can be ordered, which is 2!. It can be seen that the number of ways of selecting n objects from N objects without regard to order is the binomial coefficient: N!/(n!(N − n)!). If we now have a set of boxes labelled a, b, c, d, e, ..., k, then the number of ways of selecting Na objects from a total of N objects and placing them in box a, then selecting Nb objects from the remaining N − Na objects and placing them in box b, then selecting Nc objects from the remaining N − Na − Nb objects and placing them in box c, and continuing until no object is left outside is

    W = \frac{N!}{N_a!\,(N-N_a)!} \times \frac{(N-N_a)!}{N_b!\,(N-N_a-N_b)!} \times \cdots \times \frac{(N-N_a-\cdots-N_l)!}{N_k!\,(N-N_a-\cdots-N_l-N_k)!}

and because not even a single object is to be left outside the boxes, the sum of the terms Na, Nb, Nc, ..., Nk must equal N; thus the term (N − Na − Nb − ⋯ − Nk)! in the relation above evaluates to 0! = 1, which makes it possible to write the relation as

    W = N! \prod_{i=a,b,\ldots,k} \frac{1}{N_i!}
Now going back to the degeneracy problem which characterizes the reservoir of particles. If the i-th box has a "degeneracy" of g_i, that is, it has g_i "sub-boxes", such that any way of filling the i-th box where the numbers in the sub-boxes are changed is a distinct way of filling the box, then the number of ways of filling the i-th box must be increased by the number of ways of distributing the N_i objects in the g_i "sub-boxes". The number of ways of placing N_i distinguishable objects in g_i "sub-boxes" is g_i^{N_i}, since each object can go into any of the g_i sub-boxes independently. Thus the number of ways W that a total of N particles can be classified into energy levels according to their energies, while each level i has g_i distinct states such that the i-th level accommodates N_i particles, is:

    W = N! \prod_i \frac{g_i^{N_i}}{N_i!}
This is the form for W first derived by Boltzmann. Boltzmann's fundamental equation S = k ln W relates the thermodynamic entropy S to the number of microstates W, where k is the Boltzmann constant. It was pointed out by Gibbs, however, that the above expression for W does not yield an extensive entropy, and is therefore faulty. This problem is known as the Gibbs paradox. The problem is that the particles considered by the above equation are not indistinguishable. In other words, for two particles (A and B) in two energy sublevels the population represented by [A,B] is considered distinct from the population [B,A], while for indistinguishable particles they are not. If we carry out the argument for indistinguishable particles, we are led to the Bose–Einstein expression for W:

    W = \prod_i \frac{(N_i + g_i - 1)!}{N_i!\,(g_i - 1)!}
Both the Maxwell–Boltzmann distribution and the Bose–Einstein distribution are valid only for temperatures well above absolute zero, implying that g_i ≫ 1. The Maxwell–Boltzmann distribution also requires low density, implying that g_i ≫ N_i. Under these conditions, we may use Stirling's approximation for the factorial:

    N! \approx N^N e^{-N}

to write:

    W \approx \prod_i \frac{(N_i+g_i)^{N_i+g_i}\, e^{-N_i}}{N_i!\; g_i^{g_i}} = \prod_i \frac{g_i^{N_i}\,(1+N_i/g_i)^{g_i+N_i}\, e^{-N_i}}{N_i!}

Using the fact that (1+N_i/g_i)^{g_i} \approx e^{N_i} for g_i ≫ N_i, we can write:

    W \approx \prod_i \frac{g_i^{N_i}}{N_i!}

This is essentially a division by N! of Boltzmann's original expression for W, and this correction is referred to as correct Boltzmann counting.
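The dilute-limit claim above can be spot-checked numerically. The sketch below (occupation numbers and degeneracies invented for illustration) compares the exact Bose–Einstein count with the correct-Boltzmann-counting expression when g_i ≫ N_i:

```python
import math

def w_bose_einstein(occupations, degeneracies):
    """Exact count: product of (N_i + g_i - 1)! / (N_i! (g_i - 1)!)."""
    return math.prod(
        math.comb(n + g - 1, n) for n, g in zip(occupations, degeneracies)
    )

def w_boltzmann_corrected(occupations, degeneracies):
    """Correct Boltzmann counting: product of g_i**N_i / N_i!."""
    return math.prod(
        g ** n / math.factorial(n) for n, g in zip(occupations, degeneracies)
    )

# Dilute case: degeneracies much larger than occupation numbers.
ns, gs = [2, 3], [1000, 2000]
ratio = w_boltzmann_corrected(ns, gs) / w_bose_einstein(ns, gs)
print(ratio)  # close to 1 in the dilute limit
```

For occupations comparable to the degeneracies the two counts diverge, which is exactly where quantum statistics matter.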

We wish to find the N_i for which the function W is maximized, subject to the constraint that there is a fixed number of particles (N = \sum_i N_i) and a fixed energy (E = \sum_i N_i \varepsilon_i) in the container. The maxima of W and ln(W) are achieved by the same values of N_i and, since it is easier to accomplish mathematically, we will maximize the latter function instead. We constrain our solution using Lagrange multipliers, forming the function:

    f(N_1, N_2, \ldots, N_n) = \ln W + \alpha\Bigl(N - \sum_i N_i\Bigr) + \beta\Bigl(E - \sum_i N_i \varepsilon_i\Bigr)

Finally, using \ln N_i! \approx N_i \ln N_i - N_i, this becomes

    f(N_1, N_2, \ldots, N_n) = \sum_i \left(N_i \ln g_i - N_i \ln N_i + N_i\right) + \alpha\Bigl(N - \sum_i N_i\Bigr) + \beta\Bigl(E - \sum_i N_i \varepsilon_i\Bigr)
In order to maximize the expression above we apply Fermat's theorem (stationary points), according to which local extrema, if they exist, must be at critical points, where the partial derivatives vanish:

    \frac{\partial f}{\partial N_i} = \ln g_i - \ln N_i - \alpha - \beta \varepsilon_i = 0
By solving the equations above (\partial f/\partial N_i = 0, one for each i) we arrive at an expression for N_i:

    N_i = g_i\, e^{-\alpha - \beta \varepsilon_i}
Substituting this expression for N_i into the equation for ln W and assuming that N ≫ 1 yields:

    \ln W = \alpha N + \beta E + N

or, differentiating and rearranging (using \partial \ln W/\partial N_i = \ln(g_i/N_i) = \alpha + \beta\varepsilon_i, so that d\ln W = \alpha\, dN + \beta\, dE):

    dE = \frac{1}{\beta}\, d\ln W - \frac{\alpha}{\beta}\, dN
Boltzmann realized that this is just an expression of the second law of thermodynamics. Identifying dE as the change in internal energy, the second law of thermodynamics states that, for variation only in entropy (S) and particle number (N):

    dE = T\, dS + \mu\, dN
where T is the temperature and μ is the chemical potential. Boltzmann's famous equation S = k ln W is the realization that the entropy is proportional to ln W, with the constant of proportionality being Boltzmann's constant. It follows immediately that β = 1/kT and α = −μ/kT, so that the populations may now be written:

    N_i = g_i\, e^{(\mu - \varepsilon_i)/kT}
Note that the above formula is sometimes written:

    N_i = z\, g_i\, e^{-\varepsilon_i/kT}

where z = e^{\mu/kT} is the absolute activity.

Alternatively, we may use the fact that

    \sum_i N_i = N

to obtain the population numbers as

    N_i = N\, \frac{g_i\, e^{-\varepsilon_i/kT}}{Z}

where Z is the partition function defined by:

    Z = \sum_i g_i\, e^{-\varepsilon_i/kT}
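The constrained maximization can also be spot-checked numerically. The sketch below (three levels with invented, evenly spaced energies and invented degeneracies) perturbs the derived populations along a direction that preserves both the particle number and the total energy, and confirms that ln W decreases:

```python
import math

eps = [0.0, 1.0, 2.0]       # level energies (units of kT), evenly spaced
g = [1.0, 2.0, 4.0]         # degeneracies (illustrative)
kT = 1.0
Ntot = 1000.0

# Populations predicted by the derivation: N_i = N g_i exp(-eps_i/kT) / Z.
Z = sum(gi * math.exp(-e / kT) for gi, e in zip(g, eps))
N = [Ntot * gi * math.exp(-e / kT) / Z for gi, e in zip(g, eps)]

def log_w(ns):
    """Stirling-approximated ln W = sum_i (N_i ln g_i - N_i ln N_i + N_i)."""
    return sum(n * math.log(gi) - n * math.log(n) + n for n, gi in zip(ns, g))

# The direction (1, -2, 1) preserves sum(N_i) and, because the energies are
# evenly spaced, also preserves sum(N_i * eps_i).
d = [1.0, -2.0, 1.0]
for t in (-1.0, 1.0):
    perturbed = [n + t * di for n, di in zip(N, d)]
    assert log_w(perturbed) < log_w(N)
```

Since ln W is strictly concave in each N_i, any constraint-preserving perturbation away from the stationary point lowers it, which is what the assertions verify.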
Another derivation (not as fundamental)

In the above discussion, the Boltzmann distribution function was obtained by directly analysing the multiplicities of a system. Alternatively, one can make use of the canonical ensemble. In a canonical ensemble, a system is in thermal contact with a reservoir. While energy is free to flow between the system and the reservoir, the reservoir is thought to have an infinitely large heat capacity, so that it maintains a constant temperature, T, for the combined system.

In the present context, our system is assumed to have the energy levels \varepsilon_i with degeneracies g_i. As before, we would like to calculate the probability that our system has energy \varepsilon_i.

If our system is in state s_1, then there would be a corresponding number of microstates available to the reservoir. Call this number \Omega_R(s_1). By assumption, the combined system (of the system we are interested in and the reservoir) is isolated, so all microstates are equally probable. Therefore, for instance, if \Omega_R(s_1) = 2\,\Omega_R(s_2), we can conclude that our system is twice as likely to be in state s_1 as in state s_2. In general, if P(s_i) is the probability that our system is in state s_i,

    \frac{P(s_1)}{P(s_2)} = \frac{\Omega_R(s_1)}{\Omega_R(s_2)}
Since the entropy of the reservoir is S_R = k \ln \Omega_R, the above becomes

    \frac{P(s_1)}{P(s_2)} = \frac{e^{S_R(s_1)/k}}{e^{S_R(s_2)/k}}
Next we recall the thermodynamic identity (from the first law of thermodynamics):

    dS_R = \frac{1}{T}\,\left(dU_R + P\, dV_R - \mu\, dN_R\right)
In a canonical ensemble, there is no exchange of particles, so the dN_R term is zero. Similarly, dV_R = 0. This gives

    S_R(s_1) - S_R(s_2) = \frac{1}{T}\,\bigl(U_R(s_1) - U_R(s_2)\bigr) = -\frac{1}{T}\,\bigl(E(s_1) - E(s_2)\bigr)

where U_R(s_i) and E(s_i) denote the energies of the reservoir and of the system when the system is in state s_i, respectively. For the second equality we have used the conservation of energy. Substituting into the first equation relating P(s_1) and P(s_2):

    \frac{P(s_1)}{P(s_2)} = \frac{e^{-E(s_1)/kT}}{e^{-E(s_2)/kT}}
which implies, for any state s of the system,

    P(s) = \frac{1}{Z}\, e^{-E(s)/kT}

where Z is an appropriately chosen "constant" that makes the total probability 1. (Z is constant provided that the temperature T is invariant.) It is obvious that

    Z = \sum_s e^{-E(s)/kT}

where the index s runs through all microstates of the system. Z is sometimes called the Boltzmann sum over states (or "Zustandssumme" in the original German). If we index the summation via the energy eigenvalues instead of all possible states, degeneracy must be taken into account. The probability of our system having energy \varepsilon_i is simply the sum of the probabilities of all corresponding microstates:

    P(\varepsilon_i) = \frac{g_i}{Z}\, e^{-\varepsilon_i/kT}

where, with the obvious modification,

    Z = \sum_i g_i\, e^{-\varepsilon_i/kT}

This is the same result as before.
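The grouping step above can be illustrated with a short sketch (level energies and degeneracies are invented): enumerate the microstates of a small system, weight each by e^{-E/kT}, and check that summing state probabilities within a level reproduces P(\varepsilon_i) = g_i e^{-\varepsilon_i/kT}/Z:

```python
import math

kT = 1.0
# (energy, degeneracy) pairs for three made-up levels
levels = [(0.0, 1), (1.0, 3), (2.0, 2)]

# expand the levels into individual microstates
states = [e for e, gdeg in levels for _ in range(gdeg)]

Z = sum(math.exp(-e / kT) for e in states)
p_state = [math.exp(-e / kT) / Z for e in states]

# probability of each energy level: g_i exp(-eps_i/kT) / Z
p_level = {e: gdeg * math.exp(-e / kT) / Z for e, gdeg in levels}

print(sum(p_state))  # total probability is 1 by construction
```

Summing the per-state probabilities at a given energy gives the same number as the degeneracy-weighted level formula, which is the point of the "obvious modification" to Z.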

Comments

  • Notice that in this formulation, the initial assumption "... suppose the system has total N particles..." is dispensed with. Indeed, the number of particles possessed by the system plays no role in arriving at the distribution. Rather, how many particles would occupy states with energy \varepsilon_i follows as an easy consequence.

  • What has been presented above is essentially a derivation of the canonical partition function. As one can tell by comparing the definitions, the Boltzmann sum over states is really no different from the canonical partition function.

  • Exactly the same approach can be used to derive Fermi–Dirac and Bose–Einstein statistics. However, there one would replace the canonical ensemble with the grand canonical ensemble, since there is exchange of particles between the system and the reservoir. Also, the system one considers in those cases is a single-particle state, not a particle. (In the above discussion, we could have assumed our system to be a single atom.)

Limits of applicability

The Bose–Einstein and Fermi–Dirac distributions may be written:

    N_i = \frac{g_i}{e^{(\varepsilon_i - \mu)/kT} \pm 1}

(the plus sign for Fermi–Dirac statistics, the minus sign for Bose–Einstein statistics). Assuming the minimum value of \varepsilon_i is small, it can be seen that the condition under which the Maxwell–Boltzmann distribution is valid is when

    e^{-\mu/kT} \gg 1
For an ideal gas, we can calculate the chemical potential using the development in the Sackur–Tetrode article to show that:

    \mu = \left(\frac{\partial E}{\partial N}\right)_{S,V} = -kT \ln\!\left(\frac{V}{N\Lambda^3}\right)

where E is the total internal energy, S is the entropy, V is the volume, and \Lambda is the thermal de Broglie wavelength. The condition for the applicability of the Maxwell–Boltzmann distribution for an ideal gas is again shown to be

    \frac{V}{N\Lambda^3} \gg 1
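As an order-of-magnitude sketch (illustrative numbers, not from the article), the condition can be evaluated for a dilute helium-like gas at room temperature, using \Lambda = h/\sqrt{2\pi m kT}:

```python
import math

h  = 6.62607015e-34   # Planck constant, J s
kB = 1.380649e-23     # Boltzmann constant, J/K
m  = 6.646e-27        # mass of a helium-4 atom, kg (approximate)
T  = 300.0            # temperature, K

# thermal de Broglie wavelength: Lambda = h / sqrt(2 pi m kB T)
lam = h / math.sqrt(2 * math.pi * m * kB * T)

n = 2.5e25            # number density N/V near 1 atm, m^-3 (approximate)
classicality = 1 / (n * lam ** 3)   # = V / (N Lambda^3)
print(lam, classicality)
```

The result is a wavelength of tens of picometres and V/(N\Lambda^3) of order 10^5, comfortably in the Maxwell–Boltzmann regime; cold, dense quantum gases drive this ratio toward 1 and below.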
The source of this article is wikipedia, the free encyclopedia.  The text of this article is licensed under the GFDL.
 