Boltzmann distribution

In classical statistical physics, the Boltzmann distribution expresses the relative probability that a subsystem of a physical system has a certain energy. The subsystem must be part of a system that is in thermal equilibrium, that is, the total system must have a well-defined (absolute) temperature. For instance, a single molecule can be a subsystem of one mole of an ideal gas, and the Boltzmann distribution then applies to the energies of the gas molecules, provided the total amount of ideal gas is in thermal equilibrium.

The Boltzmann distribution (also known as the Maxwell-Boltzmann distribution) was proposed in 1859 by the Scotsman James Clerk Maxwell for the statistical distribution of the kinetic energies of ideal gas molecules. The Maxwell-Boltzmann law can be formulated as the ratio of two numbers. Consider an ideal gas at temperature T. Let n1 be the number of molecules with energy E1 and n2 the number with energy E2; then, according to the Maxwell-Boltzmann distribution law,

$$
\frac{n_1}{n_2} = \frac{e^{-E_1/(kT)}}{e^{-E_2/(kT)}},  \qquad\qquad\qquad\qquad (1)
$$

where k is the Boltzmann constant. Most noticeable in this expression are (i) the exponential forms, (ii) the inverse temperature in the exponents, and (iii) the appearance of the natural constant k.
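Equation (1) is straightforward to evaluate numerically. The following sketch (the function name and the choice of example energies are illustrative, not from the source) computes the population ratio of two energy levels at a given temperature:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_ratio(e1, e2, temperature):
    """Return n1/n2 = exp(-E1/kT) / exp(-E2/kT) = exp(-(E1 - E2)/(kT))."""
    return math.exp(-(e1 - e2) / (K_B * temperature))

# Example: two levels separated by exactly one thermal quantum kT at T = 300 K
e_low = 0.0
e_high = K_B * 300.0
print(boltzmann_ratio(e_high, e_low, 300.0))  # exp(-1) ≈ 0.368
```

Note that only the energy difference E1 − E2 matters, since the common factor in numerator and denominator cancels.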

The molecular energies may, in addition to kinetic energies,  contain interactions with an external field. For instance, if a system, say a column of air, is in the gravitational field of the Earth, each molecular energy may contain the additional term mgh, where m is the molecular mass, g the gravitational acceleration and h the height of the molecule above the surface of the Earth.
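With the gravitational term mgh included, equation (1) gives the familiar barometric density ratio n(h)/n(0) = exp(−mgh/kT) for an isothermal column of gas. A minimal sketch, assuming a constant g and an isothermal column (the function name and example values for N2 are illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
G = 9.81            # gravitational acceleration, m/s^2 (assumed constant with height)

def relative_density(mass, height, temperature):
    """Ratio n(h)/n(0) from the Boltzmann factor with potential energy m*g*h."""
    return math.exp(-mass * G * height / (K_B * temperature))

# Nitrogen molecule (N2), m ≈ 28 u, isothermal column at 288 K, h = 8 km
m_n2 = 28.0 * 1.66054e-27  # kg
print(relative_density(m_n2, 8000.0, 288.0))  # ≈ 0.4
```

The result illustrates why roughly 8 km is called the scale height of the atmosphere: over that height the density drops by about a factor of e under these idealized assumptions.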

Generalization to non-ideal gases
Maxwell's finding was generalized in 1871 by the Austrian Ludwig Boltzmann, who gave the distribution of molecules of real gases over (kinetic plus potential) energies. A few years later (ca. 1877) the American Josiah Willard Gibbs gave a further formalization. He introduced an ensemble (which plays the role of the total system) of identical vessels (which play the role of subsystems). Each vessel contains the same large number N of the same real-gas molecules at the same temperature and pressure. The ensemble, consisting of a large number of identical subsystems, is in thermal equilibrium, i.e., the vessels are in thermal contact and can exchange heat.

The basic assumption is that the same law, equation (1), holds for relative probabilities, but now the energies in equation (1) are the total energies of the gas in the vessels (the subsystems), rather than the energies of the individual molecules (which are the subsystems in the case of an ideal gas). Because of the presence of molecular interactions (which are absent in an ideal gas), the energy of a real gas cannot be expressed in terms of one-molecule energies; this is the main reason why the generalization to ensembles is necessary.

The absolute probability that a subsystem has total energy Ek can be obtained from the relative probability by normalizing the distribution. Let us assume, for the sake of convenience, that the energies of the subsystems (vessels filled with real gas) are discrete (as they are in quantum mechanics); then

$$
\mathcal{P}(E_k) = \frac{e^{-E_k/(kT)}}{Q} \quad \hbox{with} \quad Q \equiv \sum_{i=0}^\infty e^{-E_i/(kT)}.
$$

The quantity Q is known as the partition function of the gas consisting of N interacting molecules (in the older literature: Zustandssumme, which is German for "sum over states").
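The normalization can be made concrete with a small numerical sketch. Assuming, for illustration, equally spaced energy levels measured in units of kT (so the sum over states is a geometric series that can be checked against its closed form), one can compute Q and the normalized probabilities as follows (the function name is hypothetical):

```python
import math

def boltzmann_probabilities(energies, kT):
    """Normalize Boltzmann factors: P(E_k) = exp(-E_k/kT) / Q."""
    factors = [math.exp(-e / kT) for e in energies]
    q = sum(factors)                     # partition function Q (truncated sum)
    return [f / q for f in factors], q

# Equally spaced levels E_i = i (in units of kT), truncated at 50 terms
probs, q = boltzmann_probabilities([float(i) for i in range(50)], kT=1.0)
print(q)           # ≈ 1/(1 - e^{-1}) ≈ 1.582, the geometric-series value
print(sum(probs))  # 1.0: the distribution is normalized
```

The truncation at 50 terms is harmless here because the Boltzmann factors decay exponentially; in general the infinite sum must converge for Q to exist.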

In classical statistical physics the partition function of a system of N molecules is an integral over the 6N-dimensional phase space (the space of momenta and positions). In the "old quantum theory" (around 1913) the classical partition function of N molecules was multiplied by a quantum factor and became

$$
Q_\mathrm{class} = \frac{1}{N!\, h^{3N}} \int e^{-E(p,q)/(kT)}\, \mathrm{d}\mathbf{q}_1 \cdots \mathrm{d}\mathbf{q}_N \,\mathrm{d}\mathbf{p}_1 \cdots \mathrm{d}\mathbf{p}_N,
$$

where h is Planck's constant and N! = 1 × 2 × ... × N.
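For an ideal monatomic gas (no intermolecular interactions, E the kinetic energy only) the momentum integrals are Gaussians and can be done analytically, giving the standard result Q = V^N / (N! Λ^{3N}) with the thermal de Broglie wavelength Λ = h / √(2π m kT). The following sketch evaluates this closed form; the function name and the helium example are illustrative, and the logarithm is used because Q itself overflows floating point even for modest N:

```python
import math

H = 6.62607015e-34   # Planck constant, J s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def log_ideal_gas_partition_function(n, volume, mass, temperature):
    """ln Q for Q = V^N / (N! Λ^{3N}), the analytic classical partition
    function of N non-interacting point particles in a volume V."""
    lam = H / math.sqrt(2.0 * math.pi * mass * K_B * temperature)
    # lgamma(n + 1) = ln(n!); work with logs to avoid overflow for large N
    return n * math.log(volume) - math.lgamma(n + 1) - 3.0 * n * math.log(lam)

# Helium atom (m ≈ 4 u) in 1 m^3 at 300 K, N = 100 particles
print(log_ideal_gas_partition_function(100, 1.0, 4.0 * 1.66054e-27, 300.0))
```

For a real gas the intermolecular potential couples the position integrals, and no such closed form exists; the 3N-dimensional configuration integral must then be treated approximately or numerically.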