Saturday, October 11, 2025

Section 40–4 The distribution of molecular speeds

 Molecular collisions / Independent of direction / Relativistic effects

 

This section can be examined from the perspectives of molecular collisions, directional independence, and relativistic effects. Strictly speaking, Feynman is not explaining the distribution of molecular speeds, but the distribution of molecular velocities. For example, Figure 40-5 depicts a velocity distribution function with a Gaussian, bell-shaped form, which differs from the speed distribution function (a chi distribution). However, the section could be titled “The Maxwell–Boltzmann distribution,” which refers broadly to the probability distribution governing molecular motion in an ideal gas at thermal equilibrium.

 

1. Molecular collisions:

“Now we return to the question about the neglect of collisions: Why does it not make any difference? We could have pursued the same argument, not with a finite height h, but with an infinitesimal height h, which is so small that there would be no room for collisions between 0 and h. But that was not necessary: the argument is evidently based on an analysis of the energies involved, the conservation of energy, and in the collisions that occur there is an exchange of energies among the molecules. However, we do not really care whether we follow the same molecule if energy is merely exchanged with another molecule. So it turns out that even if the problem is analyzed more carefully (and it is more difficult, naturally, to do a rigorous job), it still makes no difference in the result (Feynman et al., 1963).”

 

Feynman emphasizes that neglecting collisions in deriving the molecular velocity distribution does not alter the final result, because what matters is the total molecular energy, not which molecule carries it. Even though collisions continuously redistribute energy among molecules, total energy is conserved, so the statistical distribution of velocities remains unchanged. In Illustrations of the dynamical theory of gases, Maxwell (1860) writes: “the mean distance travelled over by a particle between consecutive collisions, = 1/447000th of an inch, and each particle makes 8,077,200,000 collisions per second (p. 32).” This shows that Maxwell did not simply ignore collisions; rather, he derived the mean free path—the average distance a molecule travels between collisions—without tracking individual trajectories. However, Maxwell did not elaborate on how collisions drive gases toward thermal equilibrium.


In contrast, Boltzmann placed collisions at the heart of his model: they enable the transfer of energy among molecules, gradually reshaping the distribution of energies and velocities until the equilibrium distribution is established. This is related to Boltzmann’s H-theorem, which shows that when collisions between molecules are allowed, such distributions tend irreversibly toward the minimum value of H. Molecular collisions are therefore essential for explaining not just the form of the distribution, but also its dynamic emergence. Feynman’s discussion, which downplays collisions, is better seen as a pedagogical simplification to focus on intuition and the final distribution. However, it is worthwhile to recognize the distinct contributions of Maxwell and Boltzmann in shaping our understanding of the molecular velocity distribution.

 

Note: In Further Studies on the Thermal Equilibrium of Gas Molecules, Boltzmann (1872) writes: “This is essentially the result already obtained in another way by Maxwell: once this velocity distribution has been reached, it will not be disturbed by collisions (p. 263). … As a result of collisions, many molecules will acquire larger velocities and others will come to have smaller velocities, until finally a distribution of velocities among the molecules is established such that it is not changed by further collisions. In this final distribution, in general all possible velocities from zero up to a very large velocity will occur (p. 265).” It shows that Boltzmann explicitly identified collisions as the mechanism by which gases approach thermal equilibrium.

 

2. Independent of direction:

“So far we have, of course, only the distribution of the velocities “vertically.” We might want to ask, what is the probability that a molecule is moving in another direction? Of course these distributions are connected, and one can obtain the complete distribution from the one we have, because the complete distribution depends only on the square of the magnitude of the velocity, not upon the z-component. It must be something that is independent of direction, and there is only one function involved, the probability of different magnitudes (Feynman et al., 1963).”

 

In Illustrations of the Dynamical Theory of Gases, Maxwell (1860) assumed “the existence of the velocity x does not in any way affect that of the velocities y or z, since these are all at right angles to each other and independent (p. 22).” This independence assumption (or intuition) allowed him to factorize the probability distribution function. However, his contemporaries were uneasy with his approach: while isotropy could be accepted as a consequence of spatial symmetry, statistical independence of perpendicular components seemed a less obvious claim (Brush, 1976, pp. 177–179). In On the Dynamical Theory of Gases, Maxwell (1867) responded to their concerns by providing a more rigorous justification, though it still was not a direct consequence of Newton’s laws of motion. With this statistical proof, Maxwell strengthened the foundations of kinetic theory, despite the limits of its applicability.

 

It is important to recognize that isotropy (independence of direction) and the statistical independence of velocity components are distinct concepts. Isotropy means all directions of motion are equally probable in an ideal gas and the distribution is rotationally invariant. Statistical independence means the probability distribution for one component of velocity (e.g., vx) is independent of the distributions for the perpendicular components (vy, vz). Currently, the Maxwell–Boltzmann distribution is derived from the canonical ensemble, where the Gaussian form of the velocity distribution leads to both isotropy and statistical independence as results rather than assumptions. In a sense, Maxwell effectively inverted the modern reasoning by assuming statistical independence right at the beginning. However, neither directional independence nor statistical independence is a universally valid feature of velocity distributions.
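The force of Maxwell’s two assumptions can be made explicit with a short worked derivation (a standard textbook reconstruction, not a quotation from Maxwell):

```latex
% Independence: F(v_x, v_y, v_z) = f(v_x) f(v_y) f(v_z)
% Isotropy:     F depends only on v^2 = v_x^2 + v_y^2 + v_z^2
f(v_x)\, f(v_y)\, f(v_z) = \phi(v_x^2 + v_y^2 + v_z^2)
% Taking logarithms, the only function that is additive in the
% squared components is linear in them, so
\ln f(v_x) = a - b\, v_x^2
\;\Rightarrow\;
f(v_x) = A\, e^{-b v_x^2},
\qquad b = \frac{m}{2kT}
\ \text{(fixed by } \langle v_x^2 \rangle = kT/m \text{)}
```

Together, the two assumptions force the Gaussian form; neither assumption alone is sufficient.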

 

Note: Maxwell’s derivation of the distribution of molecular velocities rested on two key assumptions: (1) isotropy (independence of direction) and (2) statistical independence (Brush, 1976). Maxwell later recognized that the second assumption was precarious. The assumption of isotropy does not necessarily imply the statistical independence of the variables along different directions (Walstad, 2013). To be precise, the Maxwell–Boltzmann distribution was derived under the assumptions of an ideal gas with no external force fields or significant gravitational effects.

 

“Of course these distributions are connected, and one can obtain the complete distribution from the one we have, because the complete distribution depends only on the square of the magnitude of the velocity, not upon the z-component (Feynman et al., 1963).”

 

The word connected used by Feynman is potentially misleading, since in an ideal gas at equilibrium the velocity components are statistically independent. One possible clarification is that the connection is mathematical rather than physical. Mathematically, the Gaussian form of the full distribution f(vx, vy, vz) implies the probability law for any single component (e.g., vz), so the “connection” follows from probability theory and symmetry. Physically, however, the velocity components are uncorrelated: random collisions do not correlate vx and vz, for example, and no force continuously couples them. Recognizing this distinction prevents the potential misconception that the distribution of molecular velocities arises from some inherent physical linkage between the velocity components, rather than from statistical symmetry.
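A quick numerical check makes this distinction concrete. In the sketch below (a minimal simulation, assuming unit mass and kT = 1), the sampled components are uncorrelated, yet the statistics of any one component are recoverable from the full Gaussian form:

```python
import numpy as np

rng = np.random.default_rng(0)
kT, m, N = 1.0, 1.0, 100_000

# Sample velocities from the Maxwell-Boltzmann (Gaussian) distribution:
# each component is an independent Gaussian with variance kT/m.
v = rng.normal(0.0, np.sqrt(kT / m), size=(N, 3))

# Statistical independence: off-diagonal correlations vanish
# (up to sampling noise).
corr = np.corrcoef(v.T)
print("corr(vx, vz) ~", round(corr[0, 2], 4))   # ~0.0

# The "connection" is mathematical: the variance of vz agrees with
# the value kT/m implied by the full three-dimensional Gaussian.
print("var(vz) ~", round(v[:, 2].var(), 4))      # ~1.0 = kT/m
```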

 

3. Relativistic effects:

“Since velocity and momentum are proportional, we may say that the distribution of momenta is also proportional to e^{−K.E./kT} per unit momentum range. It turns out that this theorem is true in relativity too, if it is in terms of momentum, while if it is in velocity it is not, so it is best to learn it in momentum instead of in velocity: f(p)dp = Ce^{−K.E./kT}dp (Feynman et al., 1963).”

 

Feynman emphasizes that the Maxwell–Boltzmann distribution can be expressed in terms of momentum rather than velocity. In the non-relativistic case, momentum and velocity are proportional (p = mv), so the probability distributions in either variable are equivalent, with the distribution proportional to e^{−K.E./kT}. In the relativistic case, the proportionality between momentum and velocity breaks down, so the velocity distribution cannot be represented by a simple exponential form. However, expressing the distribution in terms of momentum, f(p)dp = Ce^{−K.E./kT}dp, still preserves the exponential form even relativistically. Therefore, Feynman suggests that the distribution in terms of momentum is more fundamental and general, allowing it to be applied beyond classical speeds. This approach highlights that momentum-based distributions provide a consistent description of thermal equilibrium for both classical and relativistic particles.

 

Feynman’s statement requires careful interpretation in the context of special relativity, where momentum and velocity are no longer simply proportional and directional effects become significant. The correct relativistic generalization is the Maxwell–Jüttner distribution. Unlike the Maxwell–Boltzmann distribution, which is isotropic in the rest frame of the gas, the Maxwell–Jüttner distribution exhibits apparent directional dependence when viewed from a moving frame because of relativistic transformations. In this respect, Feynman’s explanation oversimplifies the relativistic case and risks leaving the impression that the classical Maxwell–Boltzmann form remains valid in special relativity. However, the Maxwell–Jüttner distribution reduces to the Maxwell–Boltzmann distribution in the non-relativistic limit, thereby unifying the description of thermal equilibrium for both classical and relativistic gases.
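For readers who want to see the relativistic form concretely, here is a minimal numerical sketch of the Maxwell–Jüttner distribution over the Lorentz factor γ, in units where m = c = 1 and with θ = kT/mc² an assumed parameter. It uses the standard form f(γ) = γ²β e^{−γ/θ} / [θ K₂(1/θ)], where K₂ is a modified Bessel function:

```python
import numpy as np
from scipy.special import kn  # modified Bessel function of the second kind

def maxwell_juttner(gamma, theta):
    """Maxwell-Juttner distribution over the Lorentz factor gamma
    (units m = c = 1; theta = kT / m c^2)."""
    beta = np.sqrt(1.0 - gamma**-2)
    return gamma**2 * beta * np.exp(-gamma / theta) / (theta * kn(2, 1.0 / theta))

gammas = np.linspace(1.0001, 5.0, 1000)
for theta in (0.1, 1.0):
    f = maxwell_juttner(gammas, theta)
    print(f"theta={theta}: distribution peaks at gamma ~ {gammas[np.argmax(f)]:.3f}")
# For theta << 1 the peak sits just above gamma = 1: the non-relativistic
# (Maxwell-Boltzmann) limit mentioned above.
```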

 

“Of course these distributions are connected, and one can obtain the complete distribution from the one we have, because the complete distribution depends only on the square of the magnitude of the velocity, not upon the z-component. (Feynman et al., 1963).”

 

Feynman’s claim that the full velocity distribution can be built from one component by assuming independence of directions overlooks a key caution raised by Maxwell himself: the assumption that velocity components along x, y, and z are statistically independent is not self-evident. Maxwell introduced this hypothesis in his 1860 paper Illustrations of the Dynamical Theory of Gases, and later conceded (1867) that it “may appear precarious,” since collisions or hidden correlations might couple motion in different directions. While in the classical, nonrelativistic regime this assumption works and leads to the Maxwell–Boltzmann form, in special relativity it fails outright: the finite speed limit c ties the velocity components together, so they cannot be treated as independent Gaussians. Thus, both Maxwell’s original caution and relativistic constraints suggest that Feynman’s simplified statement about independence of directions is not generally correct.

 

Difference between distribution of molecular velocities and molecular speeds:

The distribution of molecular velocities gives the probability that a molecule has a specific velocity vector, i.e., that its components lie between vx and vx + dvx, vy and vy + dvy, and vz and vz + dvz. By contrast, the distribution of molecular speeds gives the probability that a molecule has a certain speed, regardless of direction. In this case, the probability of finding a molecule with speed between v and v + dv is obtained by summing over all velocity vectors whose magnitude is v. Geometrically, this corresponds to integrating over the spherical shell of radius v in velocity space. Thus, the velocity distribution is directional and vector-based, while the speed distribution is scalar and direction-independent.
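The shell integration can be written out explicitly. A standard worked form, consistent with the Gaussian component distributions discussed above, is:

```latex
% Velocity distribution (vector form):
f(\mathbf{v})\, d^3v
  = \left(\frac{m}{2\pi kT}\right)^{3/2} e^{-m v^2 / 2kT}\, dv_x\, dv_y\, dv_z
% Integrating over the spherical shell of radius v (area 4 pi v^2)
% gives the speed distribution:
F(v)\, dv
  = 4\pi v^2 \left(\frac{m}{2\pi kT}\right)^{3/2} e^{-m v^2 / 2kT}\, dv
```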

 

Analogy: To see the difference between the distribution of molecular velocities and the distribution of molecular speeds, imagine throwing darts at a flat target. The velocity distribution is like asking for the probability of a dart landing in a tiny square at some coordinates (x, y) on the target—it tracks direction as well as magnitude. The speed distribution, by contrast, is like asking for the probability that a dart lands at a given distance from the bullseye, regardless of angle; what matters is the radius, not the coordinates. Only one point lies at the bullseye (speed = 0), but entire rings of points exist at larger radii (higher speeds), so intermediate speeds turn out to be the most likely. The velocity components follow Gaussian (bell-curve) distributions, while the speed follows a chi distribution that rises from zero, peaks, and then tails off asymmetrically.
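The analogy can be checked numerically. In the sketch below (assuming unit mass and kT = 1, and using scipy’s built-in Maxwell speed distribution for comparison), Gaussian velocity components produce speeds that peak away from zero:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
kT, m, N = 1.0, 1.0, 200_000
a = np.sqrt(kT / m)  # scale parameter of the Maxwell speed distribution

# Gaussian components (the "velocity distribution")...
v = rng.normal(0.0, a, size=(N, 3))
# ...whose magnitudes follow the Maxwell speed distribution.
speeds = np.linalg.norm(v, axis=1)

hist, edges = np.histogram(speeds, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
err = np.max(np.abs(hist - stats.maxwell.pdf(centers, scale=a)))
print(f"max deviation from Maxwell speed pdf: {err:.3f}")  # small
print(f"most probable speed ~ {centers[np.argmax(hist)]:.2f}, "
      f"theory sqrt(2kT/m) = {np.sqrt(2 * kT / m):.2f}")
```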

 

Historical Note:

In his 1867 paper On the Dynamical Theory of Gases, Maxwell reflected on earlier mistakes: “I also gave a theory of diffusion of gases, which I now know to be erroneous, and there were several errors in my theory of the conduction of heat in gases which M. Clausius has pointed out in an elaborate memoir on that subject (p. 51).” Later in the same paper, he addressed a key assumption in his derivation: “I have given an investigation of this case, founded on the assumption that the probability of a molecule having a velocity resolved parallel to x lying between given limits is not in any way affected by the knowledge that the molecule has a given velocity resolved parallel to y. As this assumption may appear precarious, I shall now determine the form of the function in a different manner (p. 62).” His remark reveals an early awareness that the independence of velocity components was not self-evident but required justification. The fact that he attempted to resolve the potential problem within the same work shows both his humility and scientific thoroughness.

 

Key Takeaway:

The behavior of gas molecules can be described only statistically, not deterministically, with equilibrium distributions emerging from symmetry, randomness, and conservation laws. Maxwell’s work shows that assumptions such as isotropy and statistical independence—when carefully justified—lead naturally to the Gaussian form of the velocity distribution (the Maxwell–Boltzmann distribution), which successfully explains macroscopic properties like pressure and temperature. His treatment also highlights the importance of making assumptions explicit, testing their validity, and, where possible, providing multiple lines of justification, as he did with the independence of velocity components. Importantly, the Maxwell–Boltzmann distribution is not obvious—without Maxwell’s insight, one might wrongly expect velocities to be uniformly distributed, missing the statistical order in molecular chaos.

 

The Moral of the lesson: Maxwell’s willingness to acknowledge weaknesses in his earlier paper, and to replace them with a more rigorous derivation, reflects both intellectual honesty and scientific integrity.

 

Review Questions:

1. Should molecular collisions be neglected when deriving the distribution of molecular velocities, and what role do they play in reaching equilibrium?

2. Are molecular velocities truly independent of direction in the context of the Maxwell–Boltzmann distribution?

3. Does Maxwell’s distribution of molecular velocities remain the same or valid at relativistic speeds if it is expressed in terms of momentum?

 

References

Boltzmann, L. (1872). Further studies on the thermal equilibrium of gas molecules. In The kinetic theory of gases: An anthology of classic papers with historical commentary (pp. 262–349).

Brush, S. G. (1976). The Kind of Motion We Call Heat: A History of the Kinetic Theory of Gases in the 19th Century. Amsterdam: North-Holland.

Feynman, R. P., Leighton, R. B., & Sands, M. (1963). The Feynman Lectures on Physics, Vol I: Mainly mechanics, radiation, and heat. Reading, MA: Addison-Wesley.

Maxwell, J. C. (1860). Illustrations of the dynamical theory of gases.—Part I. On the motions and collisions of perfectly elastic spheres. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science, 19(124), 19–32, 281–291.

Maxwell, J. C. (1867). On the dynamical theory of gases. Philosophical Transactions of the Royal Society of London, 157, 49–88.

Walstad, A. (2013). On deriving the Maxwellian velocity distribution. American Journal of Physics, 81(7), 555–557.

Saturday, September 20, 2025

Section 40–3 Evaporation of a liquid

Intermolecular potentials / Molecular separations / Many-Body problem

 

In this section, Feynman analyzes intermolecular potentials, molecular separations, and the complexity of many-body problems in liquids. A more accurate title could be “The Role of Intermolecular Potentials in Determining Molecular Separations”—or, more briefly, “Intermolecular Potentials and Molecular Separations.” By contrast, the title “Evaporation of a Liquid” would be misleading, since the section is not specifically about the evaporation process. Evaporation refers to the phase transition from liquid to vapor at a surface, governed by vapor pressure, surface interactions, and the molecular energy distribution. However, evaporation appears only briefly as a side remark connected to the many-body problem, rather than as the main focus. In fact, Feynman discusses evaporation directly in Chapter 42, under the section titled “Evaporation.”

 

1. Intermolecular potentials:

“Let us take the case of just two molecules: the e^{−P.E./kT} would be the probability of finding them at various mutual distances r (Feynman et al., 1963, p. 40-3).”

 

A shortcoming in Feynman’s description is that he equates the Boltzmann factor directly with probability, while it only provides relative probability weights. Strictly speaking, proper probability requires normalization through the partition function. In Statistical Mechanics, Feynman (1972) writes: “The key principle of statistical mechanics is as follows: If a system in equilibrium can be in one of N states, then the probability of the system having energy En is (1/Q)e^{−En/kT}, where… Q is called the partition function (p. 1).” In other words, the relative probability of finding two molecules separated by a distance r is proportional to e^{−V(r)/kT}. To obtain the actual probability distribution, the Boltzmann factor must be normalized by the partition function. Thus, the actual probability density of finding the molecules at separation r is P(r) = (1/Z)e^{−V(r)/kT}, where Z is the normalization factor (partition function) that ensures the total probability sums to 1.
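To make the normalization explicit, here is a minimal numerical sketch. It assumes a Lennard–Jones pair potential with ε = σ = kT = 1 and a finite integration range; all of these choices are illustrative, not from Feynman:

```python
import numpy as np
from scipy.integrate import trapezoid

eps, sigma, kT = 1.0, 1.0, 1.0   # illustrative reduced units

def V(r):
    """Lennard-Jones pair potential (an assumed model)."""
    return 4 * eps * ((sigma / r)**12 - (sigma / r)**6)

r = np.linspace(0.8, 3.0, 4000)            # finite range; avoids r -> 0
boltzmann = np.exp(-V(r) / kT)             # relative weight per unit volume

# Normalization: integrate over spherical shells 4*pi*r^2 dr.
Z = trapezoid(4 * np.pi * r**2 * boltzmann, r)
P = 4 * np.pi * r**2 * boltzmann / Z       # proper probability density in r

print("total probability:", round(trapezoid(P, r), 6))   # 1.0
print("per-unit-volume weight peaks at r ~",
      round(r[np.argmax(boltzmann)], 3))   # ~2^(1/6) = 1.122, Feynman's r0
```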

 

“The total potential energy in this case would be the sum over all the pairs, supposing that the forces are all in pairs... Then the probability for finding molecules in any particular combination of rij’s will be proportional to exp[−∑i,jV(rij)/kT] (Feynman et al., 1963, p. 40-3).”

 

Feynman’s expression ∑i,j V(rij) counts each pair of molecules twice: once as (i, j) and again as (j, i). Since the interaction is symmetric, this double counts the total potential energy. A simple analogy is counting handshakes in a group: if you let everyone record “I shook hands with you,” each handshake is counted twice. To correct this, we can either divide the total by two or sum over unique pairs only. In statistical mechanics, the intermolecular potential energy can be written as ½ ∑i≠j V(rij) or, equivalently, ∑i<j V(rij). This fixes the summation index problem, making it clear that U is the sum over unique pairs (i < j), and accurately defines U before substituting it into the Boltzmann factor. With this correction, we can account for pairwise interactions without double counting.
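A tiny numerical check of the double-counting point, using random positions and the same illustrative Lennard–Jones potential as above (the configuration is arbitrary):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)
pos = rng.uniform(0.0, 5.0, size=(6, 3))  # 6 molecules, arbitrary positions

def V(r, eps=1.0, sigma=1.0):
    return 4 * eps * ((sigma / r)**12 - (sigma / r)**6)

# Sum over ordered pairs (i, j), i != j: counts every interaction twice.
U_double = sum(V(np.linalg.norm(pos[i] - pos[j]))
               for i in range(6) for j in range(6) if i != j)

# Sum over unique pairs i < j: each interaction counted once.
U_pairs = sum(V(np.linalg.norm(pos[i] - pos[j]))
              for i, j in combinations(range(6), 2))

print(np.isclose(U_double / 2, U_pairs))  # True
```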

 

2. Molecular separations:

“Let us take the case of just two molecules: the e^{−P.E./kT} would be the probability of finding them at various mutual distances r. Clearly, where the potential goes most negative, the probability is largest, and where the potential goes toward infinity, the probability is almost zero, which occurs for very small distances. That means that for such atoms in a gas, there is no chance that they are on top of each other, since they repel so strongly. But there is a greater chance of finding them per unit volume at the point r0 than at any other point (Feynman et al., 1963, p. 40-3).”

 

The probability of finding a molecule at a given separation r from another is governed by the intermolecular potential V(r) and the Boltzmann factor. This probability can be understood in three characteristic regions:

1. Very close (r ≈ 0): At extremely small separations, V(r) → +∞ due to strong repulsion. The exponent −V(r)/kT → −∞, so e^{−V(r)/kT} ≈ 0. Thus, the probability of molecules overlapping is close to zero.

2. Equilibrium position (r = r0): At the equilibrium distance, V(r0) reaches its minimum (most negative value), corresponding to maximum attractive interaction. Here, −V(r0)/kT is maximally positive, making e^{−V(r0)/kT} largest. Molecules are most likely to be found near r0.

3. Far apart (r >> r0): At large separations, V(r) → 0, meaning negligible interaction. The exponent −V(r)/kT → 0, so e^{−V(r)/kT} → 1. The probability is lower than at r0 but greater than at r ≈ 0, and the distribution flattens with increasing temperature as thermal motion overcomes attractive forces.

In short, the Boltzmann factor provides a probabilistic map of molecular separations, showing how intermolecular potentials and temperature jointly determine the spatial distribution of molecules in a liquid.

 

Summary:

  • r ≈ 0: Probability ≈ 0 (molecules are kept apart by strong repulsion).
  • r = r0: Maximum probability (the most favorable separation).
  • r → ∞: Relative probability approaches a uniform background value, lower than at r0; the distribution flattens as temperature rises.


Note: The Lennard–Jones (LJ) potential is one of the most extensively studied intermolecular potentials.
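For reference, its standard form (with well depth ε and size parameter σ) and the location of its minimum are:

```latex
V(r) = 4\varepsilon\left[\left(\frac{\sigma}{r}\right)^{12}
     - \left(\frac{\sigma}{r}\right)^{6}\right],
\qquad
V'(r_0) = 0 \;\Rightarrow\; r_0 = 2^{1/6}\sigma,
\quad V(r_0) = -\varepsilon
```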

 

“Now, if the temperature is very high, so that kT>>|V(r0)|, the exponent is relatively small almost everywhere, and the probability of finding a molecule is almost independent of position… As the temperature falls, the atoms fall together, clump in lumps, and reduce to liquids, and solids, and molecules, and as you heat them up they evaporate (Feynman et al., 1963, p. 40-3).”

 

Evaporation does not require a liquid to reach a high overall temperature because it is governed by the statistical distribution of molecular energies, not the average alone. Specifically, at any temperature above absolute zero, molecules in a liquid have a spread of kinetic energies described by the Maxwell–Boltzmann distribution. A fraction of the surface molecules will always possess enough energy to overcome intermolecular attractions and escape into the air. At very high temperatures, where kT >> |V(r0)|, most molecules have enough energy to escape, and the liquid boils rather than merely evaporating at its surface. Thus, evaporation can occur at any T > 0, without the liquid needing to reach its boiling point. Even if the mean kinetic energy is relatively small, the high-energy tail of the distribution allows some molecules to exceed the binding energy (or equivalently, the latent heat of vaporization) and leave the liquid. That said, at very low temperatures the fraction of such energetic molecules becomes vanishingly small, so the evaporation rate is extremely slow.
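The size of that high-energy tail can be estimated directly from the Maxwell–Boltzmann distribution. For kinetic energy above a threshold E_b, the fraction is Γ(3/2, x)/Γ(3/2) = erfc(√x) + (2/√π)√x e^{−x} with x = E_b/kT. A minimal sketch (the E_b/kT ratios are illustrative placeholders, not measured values):

```python
import numpy as np
from scipy.special import erfc

def tail_fraction(E_b, kT):
    """Fraction of molecules with kinetic energy above E_b,
    for the 3D Maxwell-Boltzmann distribution."""
    x = E_b / kT
    return erfc(np.sqrt(x)) + 2.0 * np.sqrt(x / np.pi) * np.exp(-x)

for x in (1, 5, 10, 20):   # illustrative ratios E_b / kT
    print(f"E_b = {x:>2} kT  ->  fraction ~ {tail_fraction(x, 1.0):.2e}")
# The fraction is tiny for E_b >> kT but never exactly zero,
# which is why evaporation proceeds at any T > 0, just very slowly.
```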

 

“The requirements for the determination of exactly how things evaporate, exactly how things should happen in a given circumstance, involve the following. First, to discover the correct molecular-force law V(r), which must come from something else, quantum mechanics, say, or experiment (Feynman et al., 1963, p. 40-4).”

 

The stringent requirements for determining the exact mechanism of evaporation make the task effectively impossible. A more practical approach is provided by the Hertz–Knudsen equation, which predicts evaporation rates under the assumption of an ideal gas in thermal equilibrium. Derived from the Maxwell–Boltzmann distribution of molecular speeds, this relation—together with experimentally measured saturation pressures and an empirical condensation coefficient—yields reliable estimates without requiring detailed knowledge of the intermolecular potential. On the other hand, the radial distribution function obtained from realistic models such as the Lennard–Jones (LJ) potential and Monte Carlo simulations exhibits multiple maxima and minima (Hansen & McDonald, 2006). This behavior differs from the simpler two-body model, where the distribution shows only a single maximum at the equilibrium separation and no oscillatory structure. Thus, even though detailed microscopic calculations reveal the complexity of molecular organization, simple models like the Hertz–Knudsen relation can still provide insights into evaporation without the need to satisfy such stringent requirements.
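As a rough illustration of the Hertz–Knudsen relation J = α (P_sat − P) / √(2πmkT), here is a minimal sketch for water at room temperature; the saturation pressure, ambient pressure, and condensation coefficient are assumed, order-of-magnitude values:

```python
import numpy as np

k_B = 1.380649e-23      # J/K
m_H2O = 2.99e-26        # kg, mass of one water molecule
T = 298.0               # K
P_sat = 3.17e3          # Pa, approx. saturation pressure of water at 25 C
P_ambient = 0.0         # Pa, evaporation into vacuum (assumed)
alpha = 1.0             # condensation coefficient (assumed upper bound)

# Hertz-Knudsen flux: molecules leaving per unit area per unit time.
J = alpha * (P_sat - P_ambient) / np.sqrt(2 * np.pi * m_H2O * k_B * T)
print(f"evaporative flux ~ {J:.2e} molecules / (m^2 s)")
```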


 

3. Many-Body Problem:

“It is often called an example of a “many-body problem,” and it really has been a very interesting thing. In that single formula must be contained all the details, for example, about the solidification of gas, or the forms of the crystals that the solid can take, and people have been trying to squeeze it out, but the mathematical difficulties are very great, not in writing the law, but in dealing with so enormous a number of variables (Feynman et al., 1963, p. 40-4).”

 

In classical mechanics, the many-body problem generalizes the three-body problem—the challenge of predicting the motion of three mutually interacting objects. Unlike the two-body case, which has exact closed-form solutions (Kepler’s laws), the three-body problem admits no general analytic solution. This shows that even Newton’s deterministic laws can lead to motions too complex for exact solutions. While special cases—such as the equilateral triangle configuration—can be solved, and numerical simulations can track orbits with high precision for finite times, a universal closed-form solution remains impossible. The difficulty arises not simply from the large number of variables but from intrinsic nonlinear interactions, where tiny differences in initial conditions cause chaotic divergence.

 

“That then, is the distribution of particles in space. That is the end of classical statistical mechanics, practically speaking, because if we know the forces, we can, in principle, find the distribution in space, and the distribution of velocities is something that we can work out once and for all, and is not something that is different for the different cases (Feynman et al., 1963, p. 40-4).”

 

Feynman’s statement captures the “minimalist” foundation of statistical mechanics: if the intermolecular forces are known and the Boltzmann principle is invoked, you can deduce the distribution. The shortcoming lies in presenting this foundation as the “end,” when it is better understood as the beginning. For example, Monte Carlo methods enriched classical statistical mechanics by overcoming its central mathematical obstacle—the high-dimensional integral—thus enabling predictive modeling of complex many-body systems. More recently, artificial intelligence and machine learning models have advanced the field further by recognizing patterns, improving sampling, and exploring emergent behaviors. What Feynman called the “end” is, in practice, the starting point for some of the most important and exciting modern challenges.
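To give a flavor of how Monte Carlo sidesteps the high-dimensional integral, here is a minimal Metropolis sketch that samples the Boltzmann distribution for a single coordinate in an assumed double-well potential (illustrative, not a full many-body simulation):

```python
import numpy as np

rng = np.random.default_rng(3)
kT = 0.5

def U(x):
    """Illustrative double-well potential."""
    return (x**2 - 1.0)**2

# Metropolis sampling: propose a random step, accept with
# probability min(1, exp(-dU/kT)); no integral over states is needed.
x, samples = 0.0, []
for _ in range(200_000):
    x_new = x + rng.normal(0.0, 0.5)
    if rng.random() < np.exp(-(U(x_new) - U(x)) / kT):
        x = x_new
    samples.append(x)

samples = np.array(samples[10_000:])            # discard burn-in
print(f"<x^2> ~ {np.mean(samples**2):.3f}")     # equilibrium average
print(f"fraction in right well ~ {np.mean(samples > 0):.2f}")  # ~0.5 by symmetry
```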

 

Historical note:

Classical statistical mechanics, developed by Maxwell, Boltzmann, and Gibbs in the 19th century, provided the framework to derive macroscopic properties from microscopic particle interactions. However, solving the modeling equations exactly for real many-body systems is analytically impossible. This led to the development of modern computational tools:

1. Monte Carlo Methods (Metropolis et al., 1953): Building on earlier random sampling ideas from the 1940s Los Alamos project, the Metropolis algorithm introduced stochastic sampling of configurations to evaluate equilibrium properties. Using “fast electronic computing machines” and bypassing direct integration of Newton’s equations of motion, it provided an efficient means of analyzing collective phenomena such as phase transitions.

2. Molecular Dynamics (Alder & Wainwright, 1957): Using electronic computers, Alder and Wainwright developed a method by directly integrating Newton’s equations of motion for many interacting particles. This computational approach enabled the simulation of physical trajectories, allowing for the study of time-dependent behavior, transport properties, and microscopic mechanisms in gases, liquids, and solids.

3. AI and Machine Learning (2010s–present): Advances in algorithms and computing power have enabled AI to transform approaches to statistical mechanics. Recent studies link AI approaches to classical frameworks, such as the Yang–Lee theory of phase transitions, which is directly relevant to processes like evaporation. For example, artificial neural-network representations of many-body states (Carleo & Troyer, 2017) marked a breakthrough, while later work (Noé et al., 2019) showed how machine learning can accelerate the study of phase transitions and collective behaviors.

In summary, the field has evolved from theoretical foundations (Maxwell, Boltzmann, Gibbs), to computational simulation (Monte Carlo and Molecular Dynamics), to AI-enhanced prediction and analysis. Each stage overcame the calculational barriers of its predecessor, extending the reach of statistical mechanics into previously inaccessible domains.

 

Review Questions:

1. Do you agree with Feynman’s expression of intermolecular potential in relation to the probability of finding molecules?

2. How does the intermolecular potential determine the separation of molecules? Can you provide a physical intuition or real-world analogy?

3. Feynman describes the distribution of particles in space as “the end of classical statistical mechanics.” Do you agree with this view? How do modern developments extend the scope beyond Feynman’s characterization?

 

Key Takeaway (in Feynman’s spirit):

The point of this whole discussion isn't really about evaporation—that's just a specific example that comes later. What we're really getting at is something much more beautiful and fundamental: the molecular “dance.” Once you understand this dance, how the potential tells the molecules where to go, you understand why matter sometimes spreads into a gas, clusters as a liquid, or locks into the regular structure of a solid. Evaporation? That’s just what happens when some dancers get so energetic that they break away from the group. But the real story, the music they're all dancing to, is the intermolecular potential.

 

The Moral of the Lesson: Feynman presented the spatial distribution of particles as the end of classical statistical mechanics, but in reality, it is only the beginning. Education is not about memorizing final results, but about entering the never-ending process of inquiry. Foundations are not endpoints—they are starting points, laid down so that future understanding may be built upon them.

 

In short: what may appear to be the “end of knowledge” is, in truth, the foundation for the next discovery.

 

References:

Alder, B. J., & Wainwright, T. E. (1957). Phase transition for a hard sphere system. The Journal of Chemical Physics, 27(5), 1208–1209.

Carleo, G., & Troyer, M. (2017). Solving the quantum many-body problem with artificial neural networks. Science, 355(6325), 602–606.

Feynman, R. P. (1972). Statistical mechanics: a set of lectures. W. A. Benjamin.

Feynman, R. P., Leighton, R. B., & Sands, M. (1963). The Feynman Lectures on Physics, Vol I: Mainly mechanics, radiation, and heat. Reading, MA: Addison-Wesley.

Hansen, J. P., & McDonald, I. R. (2006). Theory of Simple Liquids (3rd ed.). Academic Press.

Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H., & Teller, E. (1953). Equation of state calculations by fast computing machines. The Journal of Chemical Physics, 21(6), 1087–1092.

Noé, F., Olsson, S., Köhler, J., & Wu, H. (2019). Boltzmann generators: Sampling equilibrium states of many-body systems with deep learning. Science, 365(6457), eaaw1147.