Saturday, February 14, 2026

Section 41–4 The random walk

Mean square displacement / Langevin equation / Einstein-Smoluchowski relation


In this section, Feynman provides the conceptual scaffolding for the Einstein-Smoluchowski relation by analyzing Brownian motion in terms of the mean square displacement and the Langevin equation. The analysis is physically sound and captures the essence of the random walk, but it stops at the immediate result ⟨R²⟩ = 6kTt/μ rather than proceeding to the diffusion coefficient D = kT/μ, which is known as the Einstein-Smoluchowski relation. In a sense, the section almost functions as a derivation of the Einstein-Smoluchowski relation, but it is also an exploration of the concept of the random walk underpinning it.

 

1. Mean square displacement

“And so, by the same kind of mathematics, we can prove immediately that if R_N is the vector distance from the origin after N steps, the mean square of the distance from the origin is proportional to the number N of steps. That is, ⟨R_N²⟩ = NL², where L is the length of each step. Since the number of steps is proportional to the time in our present problem, the mean square distance is proportional to the time: ⟨R²⟩ = αt” (Feynman et al., 1963, p. 41-9).
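The “same kind of mathematics” Feynman refers to can be written out in one line (a standard argument, assuming independent steps of fixed length L):

\[ \langle R_N^2\rangle = \big\langle (\mathbf{R}_{N-1}+\mathbf{L})^2 \big\rangle = \langle R_{N-1}^2\rangle + 2\,\langle \mathbf{R}_{N-1}\!\cdot\!\mathbf{L}\rangle + L^2 = \langle R_{N-1}^2\rangle + L^2 , \]

because the new step is as likely to point one way as the other, so the cross term averages to zero. Iterating from ⟨R_1²⟩ = L² gives ⟨R_N²⟩ = NL².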

 

A central insight of Einstein's theory of Brownian motion is that a particle’s net displacement scales with time in a different way from the total path length it travels. In his 1905 paper, Einstein introduced the mean square displacement (MSD) and showed that it grows linearly with time, a defining signature of diffusive motion. By contrast, the word “distance” can be misleading, as it may be interpreted as the cumulative length of the particle’s random trajectory rather than its net displacement from an initial position. The term “mean square displacement” was subsequently adopted in the seminal works of Smoluchowski (1906) and Perrin (1908–1909), who followed Einstein’s approach. The MSD is defined as the mean of the squared vector displacement relative to the initial position; its root-mean-square value therefore scales as √t, not t. In addition, the MSD is a scalar obtained through statistical averaging: it carries no directional information, a property that is essential to its role in statistical physics.
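A quick numerical check of both statements, ⟨R_N²⟩ = NL² and the √t scaling of the root-mean-square displacement. This is a minimal sketch; the step length, walker count, and use of NumPy are illustrative assumptions, not anything in Feynman's text.

import numpy as np

rng = np.random.default_rng(0)
n_walkers, n_steps, L = 4000, 400, 1.0        # assumed illustration parameters

# Random steps of fixed length L and random direction in 3-D.
steps = rng.normal(size=(n_walkers, n_steps, 3))
steps *= L / np.linalg.norm(steps, axis=2, keepdims=True)

# Net displacement after every step, for every walker, and its mean square.
R = np.cumsum(steps, axis=1)
msd = np.mean(np.sum(R**2, axis=2), axis=0)   # <R_N^2> as a function of N

for N in (100, 400):
    print(f"N = {N:3d}   <R_N^2> = {msd[N-1]:6.1f}   N*L^2 = {N * L**2:6.1f}")
# Quadrupling N roughly doubles the RMS distance: the sqrt(t) scaling, not t.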

 

Feynman’s discussion focuses on the mean square displacement, i.e., on how far, on average, the sailor wanders from the initial position. However, he does not derive the underlying probability density function (PDF), which determines the shape of the "cloud" of possible particle positions. Historically, Einstein went further by obtaining the diffusion equation, ∂P/∂t = D ∂²P/∂x², where P(x,t) is the probability density and D is the diffusion coefficient. Solving the diffusion equation with the initial condition P(x,0) = δ(x) yields the familiar Gaussian distribution. Today, physicists often work with the more general Fokker-Planck equation, which governs the time evolution of the probability density. Evaluating the Gaussian integral gives the standard result for 1-D diffusion: ⟨x²⟩ = 2Dt.
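For completeness, the standard results the paragraph refers to can be written out as follows (textbook material, not Feynman's own derivation):

\[ \frac{\partial P}{\partial t} = D\,\frac{\partial^2 P}{\partial x^2}, \qquad P(x,0) = \delta(x) \;\Rightarrow\; P(x,t) = \frac{1}{\sqrt{4\pi Dt}}\exp\!\left(-\frac{x^2}{4Dt}\right), \]

\[ \langle x^2\rangle = \int_{-\infty}^{\infty} x^2\,P(x,t)\,dx = 2Dt, \qquad \langle R^2\rangle = \langle x^2\rangle + \langle y^2\rangle + \langle z^2\rangle = 6Dt . \]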

 

2. Langevin equation

“If x is positive, there is no reason why the average force should also be in that direction. It is just as likely to be one way as the other. The bombardment forces are not driving it in a definite direction. So the average value of x times F is zero. On the other hand, for the term mx(d²x/dt²) we will have to be a little fancy, and write this as mx(d²x/dt²) = m d[x(dx/dt)]/dt − m(dx/dt)²” (Feynman et al., 1963, p. 41-10).

 

Feynman’s derivation of the MSD equation could be explained as follows:
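Since the derivation itself is not reproduced here, one way to reconstruct it from the quoted steps is sketched below (the algebra is standard; μ denotes Feynman's friction coefficient, and the presentation is mine). Start from the Langevin equation, a random bombardment force plus a viscous drag:

\[ m\,\frac{d^2x}{dt^2} = F_x - \mu\,\frac{dx}{dt} . \]

Multiply by x and average, using ⟨xF_x⟩ = 0:

\[ m\Big\langle x\,\frac{d^2x}{dt^2}\Big\rangle = -\,\mu\Big\langle x\,\frac{dx}{dt}\Big\rangle . \]

With the identity x(d²x/dt²) = d[x(dx/dt)]/dt − (dx/dt)², the equipartition result m⟨(dx/dt)²⟩ = kT, and the steady-state assumption that ⟨x(dx/dt)⟩ no longer changes with time, this becomes

\[ \mu\Big\langle x\,\frac{dx}{dt}\Big\rangle = kT \;\Rightarrow\; \frac{\mu}{2}\,\frac{d\langle x^2\rangle}{dt} = kT \;\Rightarrow\; \langle x^2\rangle = \frac{2kT}{\mu}\,t , \]

and adding the three independent components gives ⟨R²⟩ = 3⟨x²⟩ = 6kTt/μ.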





Feynman’s derivation, though physically insightful, lacks mathematical rigor. It treats the stochastic force F(t) as if it were an ordinary, well-behaved function, even though Brownian paths are nowhere differentiable. Some may prefer stochastic (Itô or Stratonovich) calculus, in which the chain rule is modified and integrals are defined in a non-classical sense. Moreover, Feynman’s approach implicitly assumes that the system has reached a steady state, so that ⟨xv⟩ no longer changes with time. This skips over the early stages of motion, when inertia and short-time effects still matter, and focuses only on the long-time diffusive behavior. While his use of the equipartition theorem to replace the kinetic-energy term m⟨(dx/dt)²⟩ with kT is physically sound, it bypasses the rigorous derivation of a full probability density function, such as one obtained by solving the Fokker-Planck equation. In a sense, Feynman sacrifices mathematical completeness for pedagogical clarity, offering a shortcut that captures the core idea of the random walk.
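To see the steady-state assumption at work, here is a minimal numerical sketch (not Feynman's method): an Euler-Maruyama integration of the one-dimensional Langevin equation m dv = −μv dt + √(2μkT) dW, with parameter values chosen purely for illustration. At long times ⟨x²⟩/t settles near 2kT/μ, the diffusive behavior Feynman's argument isolates.

import numpy as np

# Illustrative, non-physical parameter values -- assumptions for this sketch.
m, mu, kT = 1.0, 5.0, 1.0                    # mass, friction coefficient, thermal energy
dt, n_steps, n_particles = 1e-3, 20000, 5000

rng = np.random.default_rng(1)
x = np.zeros(n_particles)
v = rng.normal(scale=np.sqrt(kT / m), size=n_particles)   # thermal initial velocities

msd = np.empty(n_steps)
for i in range(n_steps):
    # Euler-Maruyama step: viscous drag plus random thermal kicks
    # (noise strength fixed by the fluctuation-dissipation balance, 2*mu*kT).
    noise = rng.normal(size=n_particles) * np.sqrt(2.0 * mu * kT * dt)
    v += (-mu * v * dt + noise) / m
    x += v * dt
    msd[i] = np.mean(x**2)

t_final = dt * n_steps
print("simulated  <x^2>/t at long times:", msd[-1] / t_final)
print("predicted  2kT/mu               :", 2.0 * kT / mu)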

 

3.  Einstein-Smoluchowski relation

“Therefore the object has a mean square distance ⟨R²⟩, at the end of a certain amount of time t, equal to ⟨R²⟩ = 6kTt/μ… This equation was of considerable importance historically, because it was one of the first ways by which the constant k was determined” (Feynman et al., 1963, p. 41-10).

 

Feynman did not explicitly mention the fluctuation-dissipation relation (or theorem), but his method of obtaining the mean square distance involves both the random force (fluctuation) and the friction coefficient (dissipation). However, the equation of considerable historical importance is arguably the Einstein-Smoluchowski relation itself, which follows from Feynman's result in two more steps, as shown below:
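Since those two steps are not reproduced here, one way to spell them out is the following (μ is Feynman's friction coefficient; its inverse, 1/μ, is the mobility, the drift velocity per unit force):

Step 1. The diffusion coefficient is defined by the growth of the mean square displacement: ⟨x²⟩ = 2Dt in one dimension, hence ⟨R²⟩ = 6Dt in three dimensions.

Step 2. Comparing this with Feynman's result,

\[ 6Dt = \frac{6kTt}{\mu} \;\Rightarrow\; D = \frac{kT}{\mu} , \]

which is the Einstein-Smoluchowski relation; written in terms of the mobility it reads D = (mobility) × kT.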


Feynman begins with the stochastic concept of a random walk, using the "drunken sailor" analogy to show that the mean-square displacement of a jiggling particle grows linearly with time. He then introduces the concept of dissipation via a simplified Langevin equation, in which the macroscopic friction coefficient (μ) represents the viscous drag opposing the particle's motion. By applying the equipartition theorem, Feynman demonstrates that the random thermal “kicks” and the physical “drag” are two sides of the same microscopic molecular bombardment.

This synthesis leads to the formula ⟨R²⟩ = 6kTt/μ, which links the rate of microscopic spreading to the measurable macroscopic dissipation. It reveals the deep unity between fluctuation and dissipation, showing that the seemingly erratic motion of a particle is governed by the same physical principles that determine macroscopic friction and thermal equilibrium.

 

Note: In the formula ⟨R²⟩ = 6kTt/μ, Feynman's μ denotes the friction coefficient; it should not be confused with m, the mass of the particle, which appears in the inertial term of the Langevin equation.


“Besides the inertia of the fluid, there is a resistance to flow due to the viscosity and the complexity of the fluid. It is absolutely essential that there be some irreversible losses, something like resistance, in order that there be fluctuations. There is no way to produce the kT unless there are also losses. The source of the fluctuations is very closely related to these losses” (Feynman et al., 1963, p. 41-9).


The Unity of Loss and Noise: Einstein’s Symmetry

Einstein’s derivation of the relation D = kT/μ (equivalently, diffusion constant = mobility × kT) established a fundamental "Statistical Principle of Equivalence" between two seemingly distinct phenomena: macroscopic dissipation (viscous drag) and microscopic fluctuation (thermal noise). The equation reveals that friction, quantified by the mobility, is far more than a mere hindrance to motion; it is inseparable from a necessary source of motion, quantified by the diffusion constant. This represented a revolutionary shift in which "loss" and "noise" were no longer viewed as separate accidents of nature, but as two aspects of the same underlying molecular motion. The principle dictates a profound symmetry: there can be no dissipation without fluctuation, and no fluctuation without dissipation. In essence, Einstein revealed that, at the molecular level, dissipation and fluctuation are two sides of the same thermodynamic coin.

 

Key Takeaways:

1. Operationalizing the Unobservable: From Metaphysics to Measurement

Einstein did not treat atoms as a matter of belief. Instead, he effectively posed an operational question: If matter consists of molecules in perpetual motion, what measurable quantities account for the random motion of particles suspended in a liquid?

This shifted the debate from "Do atoms exist?" to "What numerical value emerges when we measure this jitter?"

By linking the invisible (molecules) to the visible (pollen grains) via a quantitative relation involving Avogadro's number, Einstein transformed an abstract hypothesis into an operational definition. Certain properties of atoms were no longer merely inferred; they became measurable. His work therefore did more than support atomism; it redefined what counted as scientific proof for a theoretical entity. The reality of atoms was established not by philosophical argument, but by the convergence of statistical mechanics and empirical verification. This approach exemplifies his broader “grand principle”: postulates grounded in thermodynamics constrain the permissible descriptions of nature.

 

2. The Fluctuation-Dissipation Connection

The Einstein-Smoluchowski relation is sometimes regarded as the first expression of the fluctuation-dissipation theorem because it links two historically distinct frameworks: statistical mechanics (the stochastic description) and classical thermodynamics (the macroscopic laws). Before 1905, the stochastic picture of diffusion (the random walk) and the physical picture of diffusion (viscous drag) were treated as separate subjects. Einstein’s insight was to recognize that the thermal "jiggling" (fluctuation) and the fluid's "dragging" (dissipation) were caused by the same thing: molecular collisions.

 

3. The "Agnostic" Opening: A Tactical Masterstroke

In the opening paragraph of his 1905 paper, Einstein deliberately distanced himself from the phenomenon he was explaining:

“It is possible that the motions to be discussed here are identical with the so-called 'Brownian molecular motion'; however, the information available to me... is so imprecise that I could form no definite judgment.”

This was not genuine ignorance, but strategic restraint. By presenting his goal as the prediction of a new phenomenon required by the molecular-kinetic theory, he ensured that, if his mathematics was right, the existence of molecules (or "atoms") followed as the only coherent conclusion. He was not merely solving a 19th-century puzzle; he was establishing the empirical inevitability of molecular reality.

A Semantic Shield: 55 to 1

A revealing detail lies in Einstein’s word choice. In the paper:

  • "Particle" (Teilchen): appears 55 times, anchoring the analysis in observable entities.
  • "Atom": appears only once, and even then only in a parenthetical example.

By grounding his work in the "established" (though still debated) kinetic theory, he avoided the philosophical baggage that came with the word “atom.” There is no direct evidence that Ernst Mach publicly attacked Einstein’s theory of Brownian motion, despite Mach’s anti-atomist position—indeed, Einstein sent him reprints requesting evaluation. Einstein’s approach to Brownian motion was a model of conceptual diplomacy: he did not argue for atoms, but he provided a method to count them.

 

The Moral of the Lesson:

Life's trajectory often resembles a random walk: our path is continually shaped by countless unseen variables. Recognizing this helps us avoid the trap of "just-world" thinking, the belief that outcomes are always precise rewards or punishments for our choices. This was true even for one of the sharpest minds of the 20th century, Richard Feynman. His restless curiosity led him to explore ideas, but chance intervened more than once. In September 1972, while traveling to a physics conference in Chicago, he tripped on a sidewalk hidden by tall grass and fractured his kneecap (Feynman, 2005). Over a decade later, in March 1984, eager to pick up a new personal computer, he stumbled over a curb in a parking lot. This second fall caused a severe head injury that required emergency surgery to relieve the pressure on his brain.

       Feynman’s story illustrates a humbling lesson: we cannot control every step in our personal random walk. Careful preparation and wise decisions reduce risk, but they cannot abolish the role of sheer chance. The goal, then, is not to live a perfectly safe, risk-free life, but to cultivate resilience—to accept that stumbles are part of the path, and to keep walking with curiosity nonetheless. True stability comes not from eliminating randomness, but from learning how to rise after we fall.

 

Fun facts: From Brownian Motion to Blood Sugar Control

Feynman realized that nature does not allow only one possible path; in a sense, it explores all of them at once. The Feynman-Kac formula is the mathematical way of saying: “If you want to know where the jiggling is going, don't watch one atom; solve the equation that describes the average of all possible jiggles.” This formula provides a rigorous mathematical bridge between two completely different frameworks: stochastic calculus (random "jiggling" paths) and partial differential equations (PDEs) (smooth, deterministic "clouds" of probability); a toy numerical sketch of this bridge follows below. In modern diabetes management, a patient’s blood glucose level can be modeled as a stochastic process, mathematically analogous to the Brownian motion of particles. In a sense, type 2 diabetes can be effectively reversed by managing what we eat (low-carb, high-healthy-fat), how we eat (whole foods, cooking process), and, critically, when we eat (intermittent fasting), thereby lowering insulin levels.
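Returning to the stochastic-PDE bridge: as a toy illustration (a minimal sketch of the idea only; the diffusion constant, the initial profile f, and the evaluation point are arbitrary assumptions of mine, and nothing here is a glucose model), the Feynman-Kac representation for the diffusion equation ∂u/∂t = D ∂²u/∂x² with u(x,0) = f(x) says that u(x,t) = E[f(x + √(2Dt)·Z)], i.e., average f over Brownian end-points instead of solving the PDE directly.

import numpy as np

D, t, x0 = 0.5, 1.0, 0.3                    # assumed diffusion constant, time, evaluation point
f = lambda x: np.exp(-x**2)                 # assumed initial profile u(x, 0)

# Monte Carlo (Feynman-Kac): average f over Brownian end-points started at x0.
rng = np.random.default_rng(2)
endpoints = x0 + np.sqrt(2.0 * D * t) * rng.normal(size=200_000)
u_mc = f(endpoints).mean()

# Closed form for this Gaussian initial profile (heat kernel of variance 2Dt convolved with f).
s2 = 2.0 * D * t
u_exact = np.exp(-x0**2 / (1.0 + 2.0 * s2)) / np.sqrt(1.0 + 2.0 * s2)
print(f"Monte Carlo estimate: {u_mc:.4f}   closed form: {u_exact:.4f}")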

 

By lowering the insulin baseline, it is possible to change the "magnetic north" of the system, so that fewer "drunken sailors" (glucose molecules) wander around and the body settles into a healthier equilibrium. Below are 10 strategies for glucose stability:

 

1. Master the "Food Order"

The sequence in which you eat matters. Starting a meal with fiber (vegetables), followed by protein and fats, and leaving starches and sugars for the end can significantly blunt the post-meal glucose spike. Fiber and protein slow down gastric emptying, preventing a "flood" of sugar into the bloodstream.

2. Never Eat "Naked" Carbohydrates

Avoid eating simple carbohydrates (like an apple or a piece of bread) on their own. Instead, "clothe" them with healthy fats or proteins (like peanut butter or cheese). This pairing slows the digestion of the carbohydrate, leading to a more gradual rise in blood sugar.

3. Prioritize Soluble Fiber

Focus on foods high in soluble fiber, such as beans, oats, Brussels sprouts, and flaxseeds. Soluble fiber dissolves in water to form a gel-like substance that interferes with the absorption of sugar and cholesterol.

4. Utilize the "Vinegar Trick"

Consuming a tablespoon of apple cider vinegar (diluted in water) before a high-carb meal has been shown to improve insulin sensitivity and reduce the glucose response. The acetic acid in vinegar temporarily slows the breakdown of starches into sugars.

5. Opt for Low Glycemic Index (GI) Foods

Choose complex carbohydrates that sit low on the Glycemic Index. Whole grains (barley, quinoa), legumes, and non-starchy vegetables provide a "slow burn" of energy compared to the "flash fire" of refined grains and sugary snacks.

6. Embrace Resistant Starch

When you cook and then cool certain starches (like potatoes, rice, or pasta), they undergo "retrogradation," turning some of the digestible starch into resistant starch. This starch acts more like fiber, feeding your gut microbiome rather than immediately spiking your glucose.

7. Hydrate to Dilute

When blood sugar is high, the body attempts to flush out excess glucose through urine, which requires water. Staying properly hydrated helps the kidneys filter out excess sugar and prevents the concentration of glucose in the bloodstream.

8. Focus on Magnesium-Rich Foods

Magnesium is a critical co-factor for the enzymes involved in glucose metabolism. Incorporate magnesium-heavy hitters like spinach, pumpkin seeds, almonds, and dark chocolate (at least 70% cocoa) to support your body's natural insulin signaling.

9. Incorporate "Warm" Spices

Spices like cinnamon and turmeric have shown potential in improving insulin sensitivity. Cinnamon, in particular, may mimic the effects of insulin and increase glucose transport into cells, though it works best as a consistent dietary addition rather than a "quick fix."

10. Use the "Plate Method" for Portion Control

Visual cues are often more effective than calorie counting. Aim to fill half your plate with non-starchy vegetables, one-quarter with lean protein, and one-quarter with high-fiber carbohydrates. This naturally limits glucose-heavy inputs while ensuring satiety.

 

The grand principle of glucose stability is mastering the 'what', 'when', and 'how' of your meals: the food you choose, and the order in which you eat it, are significant factors in blood sugar spikes (Fung, 2018).

 

Review questions:

1. Feynman refers to the "mean square distance" traveled by a Brownian particle, whereas the standard term is "mean square displacement" (MSD). Explain the conceptual difference between these two terms and evaluate whether Feynman's choice is pedagogically preferable.

2. How would you derive the MSD equation?

3. How would you explain that some irreversible losses (or resistance) are needed in order to have fluctuations? Would you relate it to Fluctuation-Dissipation theorem?

 

References:

Einstein, A. (1905). Über die von der molekularkinetischen Theorie der Wärme geforderte Bewegung von in ruhenden Flüssigkeiten suspendierten Teilchen [On the movement of small particles suspended in stationary liquids required by the molecular-kinetic theory of heat]. Annalen der Physik, 322(8), 549–560.

Feynman, R. P. (2005). Perfectly reasonable deviations from the beaten track: The letters of Richard P. Feynman (M. Feynman, Ed.). New York: Basic Books.

Feynman, R. P., Leighton, R. B., & Sands, M. (1963). The Feynman lectures on physics, Vol. I: Mainly mechanics, radiation, and heat. Reading, MA: Addison-Wesley.

Fung, J. (2018). The diabetes code: prevent and reverse type 2 diabetes naturally (Vol. 2). Greystone Books Ltd.

Smoluchowski, M. (1906). Essai d'une théorie cinétique du mouvement Brownien et des milieux troubles [Outline of a kinetic theory of Brownian motion and turbid media]. Bulletin International de l'Académie des Sciences de Cracovie, 577–602.
