Intermolecular Potentials / Molecular Separations / Many-Body Problem
In this section,
Feynman analyzes intermolecular potentials, molecular separations, and the
complexity of many-body problems in liquids. A more accurate title could be “The
Role of Intermolecular Potentials in Determining Molecular Separations”—or,
more briefly, “Intermolecular Potentials and Molecular Separations.” By
contrast, the title “Evaporation of a Liquid” would be misleading, since
the section is not specifically about the evaporation process. Evaporation refers
to the phase transition from liquid to vapor at a surface, governed by vapor
pressure, surface interactions, and the molecular energy distribution. However,
evaporation appears only briefly as a side remark connected to the
many-body problem, rather than as the main focus. In fact, Feynman discusses
evaporation directly in Chapter 42, under the section titled “Evaporation.”
1. Intermolecular potentials:
“Let us take the case of just two molecules: the e^(−P.E./kT) would be the probability of finding them at various mutual distances r (Feynman et al., 1963, p. 40-3).”
A shortcoming in Feynman’s description is that he equates the Boltzmann factor directly with probability, while it only provides relative probability weights. Strictly speaking, a proper probability requires normalization through the partition function. In Statistical Mechanics, Feynman (1972) writes: “The key principle of statistical mechanics is as follows: If a system in equilibrium can be in one of N states, then the probability of the system having energy E_n is (1/Q)e^(−E_n/kT), where… Q is called the partition function” (p. 1). In other words, the relative probability of finding two molecules separated by a distance r is proportional to e^(−V(r)/kT). To obtain the actual probability distribution, the Boltzmann factor must be normalized by the partition function. Thus, the actual probability density of finding the molecules at separation r is P(r) = (1/Z)e^(−V(r)/kT), where Z is the normalization factor (partition function) that ensures the total probability integrates to 1.
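To make the normalization concrete, here is a minimal numerical sketch (not from the text): it assumes a Lennard–Jones form for V(r) with illustrative reduced units (sigma, epsilon, and kT are arbitrary choices), and normalizes e^(−V(r)/kT) on a one-dimensional grid of separations. For simplicity it omits the 4πr² volume factor that a full three-dimensional radial distribution would carry.

```python
import numpy as np

# Illustrative reduced units (assumptions, not values from the text).
kT = 1.0          # thermal energy
epsilon = 1.0     # LJ well depth
sigma = 1.0       # LJ length scale

def V(r):
    """Lennard-Jones pair potential: 4*eps*[(sigma/r)**12 - (sigma/r)**6]."""
    return 4.0 * epsilon * ((sigma / r)**12 - (sigma / r)**6)

# Radial grid; avoid r = 0, where V(r) diverges.
r = np.linspace(0.8, 5.0, 2000)
dr = r[1] - r[0]

weights = np.exp(-V(r) / kT)      # relative Boltzmann weights only
Z = np.sum(weights) * dr          # normalization constant (the role of the partition function)
P = weights / Z                   # probability density that integrates to 1

print("integral of P(r) dr ≈", np.sum(P) * dr)          # ≈ 1.0
print("most probable separation ≈", r[np.argmax(P)])    # near 2**(1/6) * sigma
```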
“The total potential energy in this case would be the sum over all the pairs, supposing that the forces are all in pairs... Then the probability for finding molecules in any particular combination of r_ij’s will be proportional to exp[−Σ_ij V(r_ij)/kT] (Feynman et al., 1963, p. 40-3).”
Feynman’s expression Σ_ij V(r_ij) counts each pair of molecules twice: once as (i, j) and again as (j, i). Since the interaction is symmetric, V(r_ij) = V(r_ji), this double counts the total potential energy. A simple analogy is counting handshakes in a group: if everyone records “I shook hands with you,” each handshake is counted twice. To correct this, we can either divide the total by two or sum over unique pairs only. In statistical mechanics, the total intermolecular potential energy is therefore written as U = ½ Σ_(i≠j) V(r_ij), or equivalently U = Σ_(i<j) V(r_ij). This fixes the summation-index problem, makes it explicit that U is a sum over unique pairs (i < j), and defines U properly before substituting it into the Boltzmann factor. With this correction, the pairwise interactions are counted exactly once.
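The point is easy to check in a few lines of code. The sketch below is an illustration, not from the text: it assumes a Lennard–Jones pair potential and a handful of hypothetical molecular positions, and sums V(r_ij) over unique pairs i < j. Summing over all ordered pairs i ≠ j would return exactly twice the value.

```python
import numpy as np

def lj(r, epsilon=1.0, sigma=1.0):
    """Illustrative Lennard-Jones pair potential V(r)."""
    return 4.0 * epsilon * ((sigma / r)**12 - (sigma / r)**6)

def total_potential(positions, pair_potential=lj):
    """Sum V(r_ij) over unique pairs i < j, so each interaction is counted once."""
    U = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):              # j > i: unique pairs only
            r_ij = np.linalg.norm(positions[i] - positions[j])
            U += pair_potential(r_ij)
    return U

# Three molecules at hypothetical positions (reduced units).
positions = np.array([[0.0, 0.0, 0.0],
                      [1.2, 0.0, 0.0],
                      [0.0, 1.3, 0.0]])
print("U over unique pairs:", total_potential(positions))
# A sum over all ordered pairs (i, j) with i != j would give exactly twice this value.
```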
2. Molecular separations:
“Let us take the case of just two molecules: the e^(−P.E./kT) would be the probability of finding them at various mutual distances r. Clearly, where the potential goes most negative, the probability is largest, and where the potential goes toward infinity, the probability is almost zero, which occurs for very small distances. That means that for such atoms in a gas, there is no chance that they are on top of each other, since they repel so strongly. But there is a greater chance of finding them per unit volume at the point r0 than at any other point (Feynman et al., 1963, p. 40-3).”
The probability of
finding a molecule at a given separation r from another is governed by the intermolecular
potential V(r) and the Boltzmann factor. This probability can be understood in
three characteristic regions:
1. Very close (r ≈ 0): At extremely small separations, V(r) → +∞ due to strong repulsion. The exponent −V(r)/kT → −∞, so e^(−V(r)/kT) ≈ 0. Thus, the probability of molecules overlapping is essentially zero.
2. Equilibrium position (r = r0): At the equilibrium distance, V(r0) reaches its minimum (most negative value), corresponding to the strongest attractive interaction. Here, −V(r0)/kT is maximally positive, making e^(−V(r0)/kT) largest. Molecules are most likely to be found near r0.
3. Far apart (r >> r0): At large separations, V(r) → 0, meaning negligible interaction. The exponent −V(r)/kT → 0, so e^(−V(r)/kT) → 1. This weight is lower than at r0 but much greater than at r ≈ 0, and the relative likelihood of large separations grows with temperature as thermal motion overcomes the attractive well.
In short, the
Boltzmann factor provides a probabilistic map of molecular separations, showing
how intermolecular potentials and temperature jointly determine the spatial
distribution of molecules in a liquid.
Summary:
- r ≈ 0: Probability ≈ 0 (molecules are kept apart by strong repulsion).
- r = r0: Maximum probability (most favorable separation).
- r → ∞: Relative probability approaches 1, below the maximum at r0; the typical molecular separation increases with temperature.
Note: The Lennard–Jones (LJ) potential is one of the most extensively studied intermolecular potentials.
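A short numerical check of the three regions, using an assumed Lennard–Jones potential in reduced units (an illustration, not a calculation from the text):

```python
import numpy as np

# Hypothetical reduced units: sigma = epsilon = kT = 1.
sigma, epsilon, kT = 1.0, 1.0, 1.0

def V_lj(r):
    """Lennard-Jones potential; its minimum is V = -epsilon at r0 = 2**(1/6)*sigma."""
    return 4.0 * epsilon * ((sigma / r)**12 - (sigma / r)**6)

r0 = 2.0**(1.0 / 6.0) * sigma
for label, r in [("very close  (r = 0.7 sigma)", 0.7 * sigma),
                 ("equilibrium (r = r0)       ", r0),
                 ("far apart   (r = 5 sigma)  ", 5.0 * sigma)]:
    w = np.exp(-V_lj(r) / kT)   # relative Boltzmann weight
    print(f"{label}  V = {V_lj(r):9.3f}   exp(-V/kT) = {w:8.3f}")

# Expected pattern: weight ~ 0 at small r, a maximum of about e**(epsilon/kT) at r0,
# and a value approaching 1 at large r.
```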
“Now, if the temperature is very high, so that kT>>|V(r0)|, the exponent is relatively small almost everywhere, and the probability of finding a molecule is almost independent of position… As the temperature falls, the atoms fall together, clump in lumps, and reduce to liquids, and solids, and molecules, and as you heat them up they evaporate (Feynman et al., 1963, p. 40-3).”
Evaporation does not require a liquid to reach a high overall temperature, because it is governed by the statistical distribution of molecular energies, not by the average alone. At any temperature above absolute zero, the molecules in a liquid have a spread of kinetic energies described by the Maxwell–Boltzmann distribution, and a fraction of the surface molecules will always possess enough energy to overcome the intermolecular attractions and escape into the vapor. At very high temperatures, where kT >> |V(r0)|, nearly all molecules are energetic enough to escape, and the liquid boils rather than merely evaporating from its surface. Thus, evaporation can occur at any T > 0, without the liquid needing to reach its boiling point. Even if the mean kinetic energy is relatively small, the high-energy tail of the distribution allows some molecules to exceed the binding energy (roughly, the latent heat of vaporization per molecule) and leave the liquid. That said, at very low temperatures the fraction of such energetic molecules becomes vanishingly small, so the evaporation rate is extremely slow.
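A rough estimate of how small that fraction can be, using the Maxwell–Boltzmann speed distribution with illustrative numbers for water near room temperature (the escape energy is approximated here by the latent heat of vaporization per molecule, an assumption made only for the sake of the example):

```python
import numpy as np

kB = 1.380649e-23       # J/K
T = 300.0               # K
m = 2.99e-26            # kg, mass of one H2O molecule
E_escape = 6.8e-20      # J, ~ latent heat of vaporization per molecule (~40.7 kJ/mol)

# Maxwell-Boltzmann speed distribution is proportional to v**2 * exp(-m v**2 / (2 kB T)).
v = np.linspace(0.0, 20000.0, 200001)
dv = v[1] - v[0]
f = v**2 * np.exp(-m * v**2 / (2.0 * kB * T))
f /= np.sum(f) * dv                        # normalize numerically

v_escape = np.sqrt(2.0 * E_escape / m)     # speed whose kinetic energy equals E_escape
fraction = np.sum(f[v >= v_escape]) * dv
print(f"v_escape ≈ {v_escape:.0f} m/s; fraction of molecules above it ≈ {fraction:.1e}")
```

Only a tiny high-energy tail clears the threshold, which is why a glass of water at room temperature evaporates slowly yet never stops doing so.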
“The requirements for the determination of exactly how things evaporate, exactly how things should happen in a given circumstance, involve the following. First, to discover the correct molecular-force law V(r), which must come from something else, quantum mechanics, say, or experiment (Feynman et al., 1963, p. 40-4).”
The stringent requirements for determining the exact mechanism of evaporation make the task effectively impossible. A more practical approach is provided by the Hertz–Knudsen equation, which predicts evaporation rates under the assumption of an ideal gas in thermal equilibrium. Derived from the Maxwell–Boltzmann distribution of molecular speeds, this relation, together with experimentally measured saturation pressures and an empirical condensation coefficient, yields reliable estimates without requiring detailed knowledge of the intermolecular potential. On the other hand, the radial distribution function obtained from realistic models, such as the Lennard–Jones (LJ) potential sampled in Monte Carlo simulations, exhibits multiple maxima and minima (Hansen & McDonald, 2006). This behavior differs from the simpler two-body model, where the distribution shows only a single maximum at the equilibrium separation and no oscillatory structure. Thus, even though detailed microscopic calculations reveal the complexity of molecular organization, simple models like the Hertz–Knudsen relation can still provide insight into evaporation without the need to satisfy such stringent requirements.
[Figure: distribution function with multiple maxima and minima. Source: Hansen & McDonald, 2006]
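For concreteness, a minimal sketch of the Hertz–Knudsen estimate is shown below. The numbers are illustrative assumptions (water near 298 K, a saturation pressure of about 3.17 kPa, roughly 50% relative humidity, and a condensation coefficient of 0.5), not values taken from the text.

```python
import numpy as np

def hertz_knudsen_flux(P_sat, P_vap, T, m, alpha=1.0):
    """Net evaporation flux J = alpha * (P_sat - P_vap) / sqrt(2 * pi * m * kB * T),
    in molecules per square meter per second; alpha is the empirical
    evaporation/condensation coefficient (0 < alpha <= 1)."""
    kB = 1.380649e-23
    return alpha * (P_sat - P_vap) / np.sqrt(2.0 * np.pi * m * kB * T)

m_H2O = 2.99e-26   # kg per molecule
J = hertz_knudsen_flux(P_sat=3.17e3, P_vap=1.5e3, T=298.0, m=m_H2O, alpha=0.5)
print(f"net evaporation flux ≈ {J:.2e} molecules per m^2 per s")
```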
3. Many-Body Problem:
“It is often called an example of a “many-body problem,” and it really has been a very interesting thing. In that single formula must be contained all the details, for example, about the solidification of gas, or the forms of the crystals that the solid can take, and people have been trying to squeeze it out, but the mathematical difficulties are very great, not in writing the law, but in dealing with so enormous a number of variables (Feynman et al., 1963, p. 40-4).”
In classical
mechanics, the many-body problem generalizes the three-body problem—the
challenge of predicting the motion of three mutually interacting objects.
Unlike the two-body case, which has exact closed-form solutions (Kepler’s
laws), the three-body problem admits no general analytic solution. This shows
that even Newton’s deterministic laws can lead to motions too complex for exact
solutions. While special cases, such as Lagrange’s equilateral-triangle configuration, can be solved exactly, and numerical simulations can track orbits with high precision for finite times, no practical general closed-form solution exists. The difficulty arises not simply from the large number of variables but from intrinsically nonlinear interactions, in which tiny differences in initial conditions produce chaotic divergence of the trajectories.
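This sensitivity is easy to demonstrate numerically. The sketch below, an illustration with hypothetical initial conditions rather than a calculation from the text, integrates a planar three-body gravitational system twice with a simple leapfrog scheme, the second time after shifting one coordinate by one part in a million, and reports how far apart the two final states end up; in chaotic regions of the problem the separation grows far beyond the size of the perturbation.

```python
import numpy as np

def accelerations(pos, masses, G=1.0):
    """Pairwise gravitational accelerations for a planar N-body system."""
    acc = np.zeros_like(pos)
    for i in range(len(masses)):
        for j in range(len(masses)):
            if i != j:
                d = pos[j] - pos[i]
                acc[i] += G * masses[j] * d / np.linalg.norm(d)**3
    return acc

def integrate(pos, vel, masses, dt=1e-3, steps=20000):
    """Leapfrog (velocity Verlet) integration of Newton's equations of motion."""
    pos, vel = pos.copy(), vel.copy()
    acc = accelerations(pos, masses)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, masses)
        vel += 0.5 * dt * acc
    return pos

# Hypothetical equal-mass planar three-body initial conditions (reduced units).
masses = np.array([1.0, 1.0, 1.0])
pos0 = np.array([[-1.0, 0.0], [1.0, 0.0], [0.0, 0.5]])
vel0 = np.array([[0.0, -0.5], [0.0, 0.5], [0.3, 0.0]])

# Shift one coordinate by one part in a million and compare the final states.
pos0_perturbed = pos0.copy()
pos0_perturbed[2, 0] += 1e-6
final_a = integrate(pos0, vel0, masses)
final_b = integrate(pos0_perturbed, vel0, masses)
print("final-state separation caused by a 1e-6 perturbation:",
      np.linalg.norm(final_a - final_b))
```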
“That then, is the distribution of particles in space. That is the end of classical statistical mechanics, practically speaking, because if we know the forces, we can, in principle, find the distribution in space, and the distribution of velocities is something that we can work out once and for all, and is not something that is different for the different cases (Feynman et al., 1963, p. 40-4).”
Feynman’s statement
captures the “minimalist” foundation of statistical mechanics: if the
intermolecular forces are known and the Boltzmann principle is invoked, you can
deduce the distribution. The shortcoming lies in presenting this foundation as
the “end,” when it is better understood as the beginning. For example, Monte
Carlo methods enriched classical statistical mechanics by overcoming its central
mathematical obstacle—the high-dimensional integral—thus enabling predictive modeling
of complex many-body systems. More recently, artificial intelligence and machine learning
models have advanced the field further by recognizing patterns, improving
sampling, and exploring emergent behaviors. What Feynman called the “end” is,
in practice, the starting point for some of the most important and exciting
modern challenges.
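As a flavor of how Monte Carlo sampling sidesteps the high-dimensional integral, here is a bare-bones Metropolis sketch (illustrative only; the Lennard–Jones potential, box size, temperature, and particle number are all assumptions). Instead of integrating e^(−U/kT) over all configurations, it generates configurations with the correct Boltzmann weight by accepting trial moves with probability min(1, e^(−ΔU/kT)).

```python
import numpy as np

rng = np.random.default_rng(0)

def lj(r, epsilon=1.0, sigma=1.0):
    """Illustrative Lennard-Jones pair potential."""
    return 4.0 * epsilon * ((sigma / r)**12 - (sigma / r)**6)

def total_energy(pos, box):
    """Pairwise energy over unique pairs, with minimum-image periodic distances."""
    U = 0.0
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            d = pos[i] - pos[j]
            d -= box * np.round(d / box)          # minimum-image convention
            U += lj(np.linalg.norm(d))
    return U

def metropolis(pos, box, kT=1.0, steps=2000, max_disp=0.1):
    """Metropolis sampling: accept a trial move with probability min(1, exp(-dU/kT))."""
    U = total_energy(pos, box)
    for _ in range(steps):
        i = rng.integers(len(pos))
        trial = pos.copy()
        trial[i] = (trial[i] + rng.uniform(-max_disp, max_disp, 3)) % box
        U_trial = total_energy(trial, box)
        if U_trial <= U or rng.random() < np.exp(-(U_trial - U) / kT):
            pos, U = trial, U_trial
    return pos, U

# Hypothetical small system: 20 particles in a cubic box (reduced units).
box = 5.0
positions = rng.uniform(0.0, box, size=(20, 3))
positions, U = metropolis(positions, box)
print("energy of a sampled configuration:", U)
```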
Historical note:
Classical
statistical mechanics, developed by Maxwell, Boltzmann, and Gibbs in the 19th
century, provided the framework to derive macroscopic properties from
microscopic particle interactions. However, solving the governing equations exactly for real many-body systems is analytically intractable. This led to the
development of modern computational tools:
1. Monte Carlo
Methods (Metropolis et al., 1953): Building on earlier random sampling
ideas from the 1940s Los Alamos project, the Metropolis algorithm introduced
stochastic sampling of configurations to evaluate equilibrium properties. Using
“fast electronic computing machines” and bypassing direct integration of
Newton’s equations of motion, it provided an efficient means of analyzing
collective phenomena such as phase transitions.
2. Molecular
Dynamics (Alder & Wainwright, 1957): Using electronic computers, Alder
and Wainwright developed a method that directly integrates Newton’s equations of
motion for many interacting particles. This computational approach enabled the
simulation of physical trajectories, allowing for the study of time-dependent
behavior, transport properties, and microscopic mechanisms in gases, liquids,
and solids.
3. AI and Machine
Learning (2010s–present): Advances in algorithms and computing power have
enabled AI to transform approaches to statistical mechanics. Recent studies
link AI approaches to classical frameworks, such as the Yang–Lee theory of
phase transitions, which is directly relevant to processes like evaporation. For example, artificial neural-network representations of many-body states (Carleo & Troyer, 2017) marked a breakthrough, while later work on Boltzmann generators (Noé et al., 2019) showed how deep learning can sample equilibrium states and accelerate the study of phase transitions and collective behaviors.
In summary, the
field has evolved from theoretical foundations (Maxwell, Boltzmann, Gibbs), to
computational simulation (Monte Carlo and Molecular Dynamics), to AI-enhanced
prediction and analysis. Each stage overcame the calculational barriers of its
predecessor, extending the reach of statistical mechanics into previously
inaccessible domains.
Review
Questions:
1. Do you agree with Feynman’s expression relating the intermolecular potential to the probability of finding molecules at a given separation?
2. How does the
intermolecular potential determine the separation of molecules? Can you provide
a physical intuition or real-world analogy?
3. Feynman
describes the distribution of particles in space as “the end of classical
statistical mechanics.” Do you agree with this view? How do modern developments
extend the scope beyond Feynman’s characterization?
Key Takeaway (in
Feynman’s spirit):
The point of this whole discussion isn't really about evaporation; that's just a specific example that comes later. What we're really getting at is something much more beautiful and fundamental: the molecular “dance.” Once you understand this dance, how the potential tells the molecules where to go, you understand why matter sometimes spreads into a gas, clusters as a liquid, or locks into the regular structure of a solid. Evaporation? That’s just what happens when some dancers get so energetic that they break away from the group. But the real story, the music they're all dancing to, is the intermolecular potential.
The Moral of the
Lesson: Feynman presented the spatial distribution of particles as the end
of classical statistical mechanics, but in reality, it is only the beginning.
Education is not about memorizing final results, but about entering the
never-ending process of inquiry. Foundations are not endpoints—they are
starting points, laid down so that future understanding may be built upon them.
In short: what may appear to
be the “end of knowledge” is, in truth, the foundation for the next discovery.
References:
Alder, B. J., & Wainwright, T. E. (1957). Phase transition for a hard sphere system. The Journal of Chemical Physics, 27(5), 1208–1209.
Carleo, G., & Troyer, M. (2017). Solving the quantum many-body problem with artificial neural networks. Science, 355(6325), 602–606.
Feynman, R. P. (1972). Statistical mechanics: A set of lectures. W. A. Benjamin.
Feynman, R. P., Leighton, R. B., & Sands, M. (1963). The Feynman Lectures on Physics, Vol. I: Mainly mechanics, radiation, and heat. Reading, MA: Addison-Wesley.
Hansen, J. P., & McDonald, I. R. (2006). Theory of Simple Liquids (3rd ed.). Academic Press.
Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H., & Teller, E. (1953). Equation of state calculations by fast computing machines. The Journal of Chemical Physics, 21(6), 1087–1092.
Noé, F., Olsson, S., Köhler, J., & Wu, H. (2019). Boltzmann generators: Sampling equilibrium states of many-body systems with deep learning. Science, 365(6457), eaaw1147.