
A Massive Problem All of Physics Completely Missed

How Unresolved Foundations in Quantum Mechanics and General Relativity Could Reshape Our Understanding of Reality

In the pantheon of scientific revolutions, few discoveries have challenged our conception of reality as profoundly as quantum mechanics and Einstein’s theory of general relativity. Quantum mechanics describes the world of the very small—atoms, electrons, photons, and other fundamental particles—while general relativity explains gravity as the curvature of spacetime itself. Both frameworks are successful at their respective scales. Yet, attempts to unify them under one umbrella, often termed “quantum gravity,” have remained stubbornly out of reach.

In a recently discussed approach known as the Indivisible Stochastic Approach, we encounter an illuminating lens through which we can re-examine quantum mechanics. This approach highlights how defining probability in a universe with fluctuating spacetime becomes highly non-trivial. Specifically, it foregrounds a major oversight: the absence of a consistent theory of probabilistic general relativity as a stepping stone to full-blown quantum gravity. While we have deterministic equations for curved spacetime and quantum formalisms that revolve around measurement outcomes, the bridging scaffolding of a fully probabilistic version of Einstein’s theory is curiously missing.

This blog post sets out to do two things. First, it will dissect the major points made in the video transcript about the structural and conceptual gaps in modern physics. Second, it will build on those points, offering additional examples, historical context, and reflections from contemporary research at the philosophical frontiers of theoretical physics. By the end, you’ll see why the challenge of defining time slices in a fluctuating universe, the meaning of “expectation values,” and the often-ignored question of how measurement is conflated with phenomena all converge into a single, massive problem. It is a problem that, once confronted, may help us reshape the foundations of physics—and could even redefine how we pursue a unification of quantum mechanics and gravity.

So, if you’ve ever wondered why decades of trying to “quantize” gravity have been fraught with conceptual snags, or why standard quantum mechanics remains philosophically incomplete, read on. This discussion goes beyond the formulae to the very heart of what we claim to know—and perhaps more importantly, what we have never properly questioned.

Defining the Challenge: Fluctuating Spacetime and the Nature of Probability

The Indivisible Stochastic Approach in a Nutshell

The video begins by describing the Indivisible Stochastic Approach to quantum mechanics. While standard quantum theory often frames itself in terms of measurement outcomes and wave functions (or state vectors in Hilbert space), the Indivisible Stochastic Approach strives to provide a probabilistic picture of reality without necessarily invoking wave function collapse or multiple worlds. It treats phenomena as inherently probabilistic—events unfold in a manner that can be understood through dynamic probabilities that evolve in time.

But immediately, a problem arises: What do we mean by time in a universe where spacetime itself is not fixed but dynamically fluctuating? In standard quantum theory, one often relies on a background notion of time. You set up “initial conditions at time t = 0” and see how they evolve into “final conditions at time t = T.” Yet, in general relativity, time is not universal; it is woven together with space, forming a four-dimensional spacetime manifold whose geometry changes in the presence of matter and energy.

Slices of Time in a Curved Universe

To talk about probabilities evolving, we normally slice the universe into “moments” of time—think of them as snapshots of all space at a particular instant. In Newtonian physics, it’s straightforward: everyone can agree on a universal time. In special relativity, though the notion of simultaneity becomes relative, we still have a well-defined flat spacetime structure that separates space-like directions from time-like directions.

However, in dynamical spacetime, that very structure is itself part of the physics. If the shape of spacetime—the metric tensor—is not only dynamic but also potentially “quantum” or “stochastic,” how do we decide which direction is time-like at any given moment? This challenge confronts any attempt to write down a probability rule—like “the probability that the system transitions from configuration A at time t1 to configuration B at time t2.” If time directions and space directions fluctuate, we no longer have a stable scaffolding on which to hang our probabilities.

This conceptual tension, as the video highlights, underscores how deeply intertwined the notion of probability is with the structure of spacetime. It throws down the gauntlet: Is it even possible to phrase a probabilistic theory when the time direction is not well-defined? For many, this is the core conundrum of quantum gravity.

Why Probabilistic General Relativity? A Gap in the Canon

Deterministic vs. Probabilistic Theories

Einstein’s General Relativity (GR) is classically deterministic. The Einstein field equations, in principle, allow you to predict how spacetime curvature evolves given an initial mass-energy distribution (under certain conditions). There is no inherent randomness in GR at the classical level—any randomness we talk about is either an artifact of incomplete information or statistical approximations in complex astrophysical scenarios.

By contrast, many physical theories outside of gravity are formulated in probabilistic or stochastic terms. Newtonian mechanics, while deterministic in its pure form, has spurred numerous effective models—Markov processes, stochastic differential equations, and so forth—to handle everything from Brownian motion to complex thermodynamic systems. These frameworks offer tools to deal with irreducible randomness or approximate real systems too complicated to solve exactly.
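To make "stochastic Newtonian mechanics" concrete, here is a minimal sketch (my own illustration, not from the video) of a damped particle obeying an Ornstein–Uhlenbeck (Langevin-type) equation, integrated with the standard Euler–Maruyama scheme. Setting the noise amplitude to zero recovers the deterministic limit:

```python
import random
import math

def simulate_ou(x0=1.0, theta=1.0, sigma=0.3, dt=0.01, steps=1000, seed=42):
    """Euler–Maruyama integration of an Ornstein–Uhlenbeck process:
    dx = -theta * x * dt + sigma * dW
    i.e. deterministic exponential relaxation plus Gaussian noise."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(steps):
        dW = rng.gauss(0.0, math.sqrt(dt))  # Wiener increment over dt
        x += -theta * x * dt + sigma * dW
        path.append(x)
    return path

# sigma = 0: purely deterministic decay toward x = 0.
det = simulate_ou(sigma=0.0)
# sigma > 0: the same law, now with irreducible randomness layered on top.
sto = simulate_ou(sigma=0.3)
```

The point of the toy is structural: the deterministic law survives inside the stochastic one as a special case, which is exactly the relationship the post argues is missing between Einstein's equations and a "probabilized" general relativity.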

The Missing Piece: A Fully Stochastic General Relativity

Now, here’s the big puzzle: Why don’t we have a widely used, fully stochastic formulation of general relativity? Why did we leap straight into trying to quantize GR—endeavoring to unify it with quantum mechanics—without first exploring a simpler intermediate step: “probabilizing” it?

If we want “quantum gravity,” we’re presumably going to have a theory of spacetime that includes quantum uncertainty or some form of fundamental probabilistic behavior. A more straightforward stepping stone might be a version of GR that is “just” probabilistic, i.e., a theory that modifies Einstein’s field equations to include randomness in how spacetime evolves, without adopting the entire quantum apparatus of Hilbert spaces, operators, etc.

Surprisingly, as the speaker points out, this is a neglected avenue—there is only limited modern work on it. Although individuals like Jonathan Oppenheim and others have begun exploring partial steps, there is no century-long lineage of fully fleshed-out “stochastic general relativity” akin to how we’ve developed “stochastic Newtonian mechanics.” If we want a robust quantum gravity theory, we might benefit from understanding how any gravitational theory can be cast in probabilistic terms.

Historical Blind Spots and the Timing of Ideas

One reason for this oversight is historical. General relativity was completed by Einstein in 1915, just as the world was embroiled in World War I. The rigorous mathematical underpinnings of probability theory—particularly Kolmogorov’s axiomatization—weren’t published until 1933. The advanced theory of stochastic processes that we use today became more widespread and formalized largely in the mid-20th century. By then, the physics community was already trying to wrestle GR into a quantum framework. The notion that we might start by “stochasticizing” GR first never took off in any mainstream way. People had other priorities—chiefly, how to reconcile quantum phenomena with the geometry of spacetime.

In the modern era, the focus on quantum field theoretic approaches such as string theory and loop quantum gravity has overshadowed simpler "intermediate" possibilities. Despite many bold attempts, a consistent quantum theory of gravity remains elusive, and so perhaps we need to circle back and ask: Should we do the simpler step first?

Conceptual Foundations: Measurement, Expectation Values, and the Dirac–von Neumann Axioms

Quantum Mechanics as a Theory of Measurement

At the heart of standard quantum mechanics are the Dirac–von Neumann axioms. In a nutshell:

  • States are represented by vectors (or wave functions) in a Hilbert space.
  • Observables (physical quantities) are represented by self-adjoint operators on that Hilbert space.
  • Measurements produce specific eigenvalues of these operators.
  • The Born rule connects the wave function to a probability distribution over possible measurement outcomes.

Crucially, these axioms treat quantum mechanics as fundamentally about measurements. A “state” evolves unitarily until a measurement occurs; then a “collapse” or a “projection” happens, giving you a random eigenvalue.

Expectation Values: A Perpetual Source of Confusion

In quantum mechanics, if you have an observable Â (read "A-hat") and a state |ψ⟩, you can compute:

⟨A⟩ = ⟨ψ|Â|ψ⟩

This "expectation value" is, by the standard Dirac–von Neumann interpretation, the statistical mean of many measurements of Â on identically prepared states. But note: it's about measurement outcomes. If no one is measuring, then by the standard postulates, ⟨A⟩ is not automatically a real, physical average of "stuff happening out there." It's only an average if you do the measurement.
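As a concrete check on this reading, here is a short numerical sketch (an arbitrary Hermitian observable and state, chosen purely for illustration): diagonalize the observable, sample simulated "measurement outcomes" with Born-rule weights, and confirm that the sample mean converges to ⟨ψ|Â|ψ⟩:

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary 2x2 Hermitian observable and a normalized state |psi>.
A = np.array([[1.0, 0.5],
              [0.5, -1.0]])
psi = np.array([0.6, 0.8])

# Expectation value <psi| A |psi>, straight from the formalism.
expval = psi @ A @ psi

# Born rule: probability of each eigenvalue is |<eigvec|psi>|^2.
eigvals, eigvecs = np.linalg.eigh(A)
probs = np.abs(eigvecs.T @ psi) ** 2

# Simulate many "measurements" on identically prepared states.
samples = rng.choice(eigvals, size=200_000, p=probs)

# The sample mean of the outcomes approaches the expectation value.
print(expval, samples.mean())
```

Note what the simulation does and does not show: the agreement is between ⟨A⟩ and the statistics of *measurement outcomes*. Nothing in the formalism licenses reading ⟨A⟩ as a property the system carries when no measurement occurs, which is precisely the conflation the next section examines.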

Yet, in typical physics applications, especially in textbooks, we often loosely conflate the expectation value with "the actual average value of a physical quantity in the system," ignoring that the standard formalism ties it explicitly to measurement. This conflation becomes more jarring in contexts like semi-classical gravity, where we talk about plugging ⟨Tμν⟩ (the expectation value of the stress-energy tensor operator) into Einstein's field equations to get a classical curvature. Are we implying the Universe is "measuring" the matter fields? Or does the Universe "see" only the average? The standard axioms don't straightforwardly justify this.

Getting from Measurement to Phenomena

Alternative interpretations—Bohmian mechanics, the Many-Worlds interpretation, or an Indivisible Stochastic approach—try to do away with the notion that quantum theory is exclusively about measurement. Instead, they aim to describe how phenomena unfold in the real world, measured or not. In these interpretations, expectation values can be re-understood as actual statistical averages of underlying variables or branching processes.

But if you stay strictly within the traditional measurement-based quantum postulates, it becomes logically tenuous to treat ⟨A⟩ as "what's really happening." This is not just a pedantic quibble; it's a fundamental interpretational gap that becomes acutely problematic in quantum gravity scenarios—where we have to say how a presumably "classical-like" geometry emerges from a quantum matter distribution if the matter is never "measured" in the usual sense.

The Problem with Semi-Classical Quantum Gravity

The Einstein Field Equations, Revisited

Einstein’s field equations relate spacetime curvature Gμν to the stress-energy tensor Tμν. Symbolically:

Gμν = (8πG / c⁴) Tμν.

In a purely classical setting, Tμν is a classical object describing the distribution and flux of energy and momentum in spacetime. However, in a semi-classical approach, matter is treated via quantum mechanics (with some quantum field theory elements), and spacetime is still classical. Physicists then write:

Gμν = (8πG / c⁴) ⟨Tμν⟩,

where ⟨Tμν⟩ is the quantum mechanical expectation value of the stress-energy tensor operator. This is done as if ⟨Tμν⟩ were simply the "average energy-momentum in the region," but that rests on the assumption we just critiqued—that the expectation value is physically equivalent to "stuff actually happening."

A Conceptual Mismatch

The mismatch is glaring: classical curvature is being sourced by what is ultimately a “measurement average” in the standard quantum formalism, but there is no measurement apparatus in outer space “observing” the fields. This is not an idle puzzle. Much of early research into black hole evaporation, Hawking radiation, and even early-universe cosmology depends heavily on such semi-classical equations. While they have sometimes led to approximate predictions that match phenomena, they remain conceptually incomplete.
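A toy one-dimensional example (my own illustration, with no gravity involved) makes the mismatch vivid. For a "two-lump" superposition, the expectation value ⟨x⟩ sits at a point where the probability density is essentially zero, so any field sourced by ⟨x⟩ would be sourced from where nothing is:

```python
import numpy as np

x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]

def gaussian(x, mu, s=0.5):
    return np.exp(-(x - mu) ** 2 / (2 * s ** 2))

# A toy "cat state": equal-weight lumps centered at x = -5 and x = +5.
psi = gaussian(x, -5.0) + gaussian(x, +5.0)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)  # normalize on the grid

density = np.abs(psi) ** 2
mean_x = np.sum(x * density) * dx  # the expectation value <x>

# <x> is ~0 by symmetry, yet the density at x = 0 is vanishingly small.
print(mean_x, density[np.argmin(np.abs(x))])
```

Replace x with the stress-energy of a macroscopic mass in a superposition of two locations and you have, in cartoon form, the worry about sourcing classical curvature with ⟨Tμν⟩: the semi-classical equation would curve spacetime toward the midpoint where, on any reading of the quantum state, no mass resides.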

That incompleteness emerges from the deeper difficulty: How do we unify a measurement-centric quantum formalism with a classical field equation that evolves deterministically? The Indivisible Stochastic Approach, or any approach that tries to define events regardless of measurement, is attempting to fix that mismatch at its root.

Non-Markovian Processes and the Nature of Time

Markov vs. Non-Markov Approaches

A Markov process is one where the future state depends only on the present state, not on the detailed history. Many physical systems—like random walks, certain chemical reactions—are well-modeled by Markov processes. However, quantum mechanics in general can exhibit memory effects and entanglement that are non-Markovian. The Indivisible Stochastic Approach suggests that if we want to accurately capture quantum phenomena, we need these non-Markovian features, meaning that the process describing the unfolding of events retains some memory of the past.
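The distinction can be sketched in a few lines of code (a standard textbook contrast, not anything specific to quantum theory): an ordinary random walk, whose next step ignores the past, versus an "elephant" random walk, whose each step is drawn by consulting the entire history:

```python
import random

def markov_walk(steps, seed):
    """Simple random walk: each step is +1 or -1, independent of history."""
    rng = random.Random(seed)
    pos, path = 0, [0]
    for _ in range(steps):
        pos += rng.choice([-1, 1])
        path.append(pos)
    return path

def elephant_walk(steps, p, seed):
    """'Elephant' random walk: each new step copies a uniformly chosen
    PAST step with probability p (otherwise reverses it), so the law at
    every time depends on the whole history -- a non-Markovian process."""
    rng = random.Random(seed)
    history = [rng.choice([-1, 1])]
    pos, path = history[0], [0, history[0]]
    for _ in range(steps - 1):
        past = rng.choice(history)            # memory of the entire past
        step = past if rng.random() < p else -past
        history.append(step)
        pos += step
        path.append(pos)
    return path

m = markov_walk(1000, seed=1)
e = elephant_walk(1000, p=0.9, seed=1)
```

Both walks look superficially similar step by step; the difference is in the conditional law. For the Markov walk, knowing the present position fixes the statistics of the future; for the elephant walk, no summary shorter than the full history does (and for memory parameter p above 3/4 this model is known to become superdiffusive).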

Why Non-Markovian for Gravity?

General relativity is highly non-linear: the presence of mass-energy modifies the geometry of spacetime, and that geometry in turn affects how mass-energy moves. A naive Markovian viewpoint might fail to account for the historical constraints of how the geometry formed. If you add in quantum or stochastic elements, you have to consider that the geometry’s evolution at any moment might be influenced by the entire past and the indefinite ways in which quantum states can overlap or entangle over time.

An indivisible approach to stochasticity might unify those aspects. “Indivisible” in this context means we cannot simply break the process into smaller, independently evolving time steps; each step depends on the entire context. In a gravitational setting, the difference between space and time can shift as geometry fluctuates, so specifying a truly local time evolution becomes that much trickier.

The Philosophical and Methodological Reboot

Debugging the Foundations

The speaker uses a computer programming analogy: sometimes, to solve the bug, you must "go down into the deep programming" of the model rather than keep adding patches or new modules on top. In physics, a patch often looks like: "Let's just assume the Universe can measure itself," or "We'll treat ⟨Tμν⟩ as if it's real, even though the axioms say it is a measurement average." This can yield workable approximations but leaves the foundational confusion unresolved.

A more radical solution might be: rewrite the entire structure of quantum theory and general relativity from the ground up in a purely probabilistic or stochastic language that does not rely on external measurement concepts. Instead, we define the ontology—what fundamentally exists—and the laws for how that ontology probabilistically evolves, even if the geometry of spacetime is dynamic and uncertain.

A Minimalist Strategy

In scientific practice, simpler stepping stones are invaluable. We seldom teach quantum field theory to a first-year physics student; we begin with Newton’s laws, then wave mechanics, and so forth. Analogously, it might prove more fruitful to approach quantum gravity by first cracking the puzzle of probabilistic general relativity. The moment we can handle small fluctuations in geometry stochastically—without contradictory assumptions—may shed considerable light on the correct path to a fully quantum gravitational framework.

That, in turn, might align with alternative quantum interpretations that already treat quantum states as describing real configurations or real events—rendering moot the special role of measurement that currently muddles the logic of the classical-quantum interface.

Historical Anecdotes and Where They Lead Us

A Century of Ambiguity

Albert Einstein finalized the field equations in November 1915, the culmination of a decade of intense intellectual development. Within a year, Karl Schwarzschild found the first exact solution, describing the gravitational field outside a spherically symmetric mass—famously, while involved in wartime service (though not literally in a muddy trench as popular lore suggests). Meanwhile, quantum theory began gestating in the early 20th century. By the 1920s, Pauli, Heisenberg, and Dirac were grappling with how to quantize fields, including attempts to apply the new quantum principles to gravity.

Yet, formal probability theory was still nascent. Andrey Kolmogorov’s 1933 axiomatization gave mathematicians a robust foundation to define random variables and probability spaces, but that was well after the initial impetus to “quantize everything.” In the subsequent rush to unify quantum and relativistic ideas, we neglected the simpler question: “What if general relativity itself were re-cast probabilistically?”

Potential Trails of Research

While not mainstream, there are interesting pockets of work that might be relevant:

  • Stochastic Gravity: Some researchers have tried adding small “noise” terms to the Einstein field equations, modeling fluctuations in the stress-energy tensor. However, these are often treated as approximations or effective theories, rather than a fundamental rethinking.
  • Relational Approaches: Philosophers and physicists like Emily Adlam, Eddy Keming Chen, and Sheldon Goldstein look for global or relational formulations of laws—ways that might circumvent the need for a strict separation into "initial conditions at time t."
  • Non-local Realist Theories: The broader field of “hidden-variable” theories or non-local realist frameworks (e.g., Bohmian mechanics) tries to extend quantum mechanics beyond measurement-based axioms. Some of these might be compatible with a reinterpretation of how spacetime emerges or evolves.

A fully consistent Indivisible Stochastic approach to general relativity might be the key to bridging these disparate ideas.

Making the Step Before Quantum Gravity

Outline of a Probabilistic GR Program

Let us imagine the broad strokes of how you would systematically build a probabilistic version of GR. One approach could be:

  • Start with the Einstein–Hilbert Action: The classical action that defines how the spacetime metric evolves.
  • Introduce Stochastic Variables: Instead of metric fields satisfying deterministic equations, allow them to have a probability distribution. Possibly define “transition probabilities” for how geometry changes from one configuration to another.
  • Non-Markovian Constraints: Ensure that the process respects the constraint equations of GR (the Gauss–Codazzi equations, etc.) across “slices” of spacetime, keeping in mind that slicing is itself dynamic.
  • Recovering Classical GR: In a suitable limit, the probabilities must become sharply peaked, reproducing deterministic classical solutions.
  • Comparisons with Observations: Check whether this approach can replicate known tests of GR (gravitational lensing, orbital precession, gravitational waves) in a regime where fluctuations are negligible.

Such a framework might already pave the way for bridging into a quantum domain. The leaps to operators, Hilbert spaces, and so forth could, in principle, follow from the specific forms of stochastic laws you adopt.
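As an admittedly drastic cartoon of the second and fourth steps (stochastic variables plus recovery of the classical limit), one can evolve a single FRW-like scale factor with Euler–Maruyama noise. This is a toy with one degree of freedom and an assumed noise form, not a theory of geometry, but it shows in miniature how "probabilities becoming sharply peaked reproduces deterministic solutions":

```python
import random
import math

def evolve_scale_factor(h0=1.0, sigma=0.0, a0=1.0, dt=1e-3, steps=2000, seed=7):
    """Toy stochastic 'geometry': Euler–Maruyama evolution of a single
    matter-dominated FRW-like scale factor,
        da = h0 * a**(-1/2) * dt + sigma * a * dW.
    With sigma = 0 this reduces to the deterministic solution
        a(t) = (a0**1.5 + 1.5 * h0 * t)**(2/3)."""
    rng = random.Random(seed)
    a = a0
    for _ in range(steps):
        dW = rng.gauss(0.0, math.sqrt(dt))
        a += h0 * a ** -0.5 * dt + sigma * a * dW
    return a

t_final = 2000 * 1e-3
exact = (1.0 + 1.5 * t_final) ** (2.0 / 3.0)  # deterministic GR-like answer
det = evolve_scale_factor(sigma=0.0)          # noise off: classical limit
noisy = evolve_scale_factor(sigma=0.05)       # noise on: fluctuating geometry
```

Everything hard about the real program is absent here: there is one global time parameter, no constraint equations, and no dynamical slicing. The cartoon only illustrates the final bullet, that a genuinely stochastic law can carry its deterministic ancestor inside it as the zero-noise limit.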

Could This Be Quantum Gravity in Disguise?

The speaker conjectures that a well-crafted, fully probabilistic GR might effectively be quantum gravity. This is a bold hypothesis: once you allow non-Markovian, globally constrained probabilities for the metric field, you might not need to separately add the standard quantum mechanical formalism. You could discover “quantum-like” interference or superposition emerges naturally from the underlying probabilities of geometry.

While this might sound radical, it aligns with an idea that quantum mechanics, stripped of measurement assumptions, is a very general theory of certain types of non-local, contextual probabilities. If gravity modifies the structure of space and time, perhaps that is precisely the missing ingredient in forging a more coherent interpretation of quantum phenomena.

Re-evaluating the Measurement Problem: The Universe as the Experimenter?

The Universe Doesn’t Measure Itself—Or Does It?

A recurring tension in quantum mechanics is: who or what constitutes a “measurement”? Copenhagen-like interpretations often require a classical measuring apparatus external to the system. But for cosmic scales, it seems bizarre to imagine the universe measuring itself. So how do the standard axioms hold up when we’re talking about entire cosmological evolutions, black holes forming and evaporating, or geometry dancing on the Planck scale?

One vantage is to say: "Everything is a measurement," and that "environmental decoherence" essentially stands in for a measuring device. But this may raise more questions than it resolves, especially if we are not comfortable anthropomorphizing or reifying "the environment" as a measuring agent.

Indivisible Stochastic Realism

The Indivisible Stochastic Approach would treat the unfolding of events probabilistically at the fundamental level—whether or not we, as humans, put a measuring device in the system. There is no special moment where a wave function collapses; rather, the system is always in some real, though probabilistically evolving, state. This eliminates the measurement conundrum by unifying “measurement” events with “natural” events as part of the same story.

From this vantage, any manifestation we call “measurement” is just a particular instance of the more general phenomenon: the unstoppable evolution of a probabilistic reality. The result is that macroscopic objects (including measuring devices) register specific, well-defined outcomes because that is one branch or solution in the overarching stochastic evolution.

Lessons for Future Theories

Avoiding Category Mistakes

One takeaway: We must carefully distinguish measurement-based axioms from statements about phenomena. Substituting a quantum-mechanical expectation value into a classical field equation may be mathematically convenient, but conceptually it conflates a measurement average with a classical property. Avoiding such category mistakes might require rethinking quantum axioms or adopting an interpretation that legitimizes talking about “expectation values” as real phenomena, no measurement needed.

Embrace Non-Markovian, Global Constraints

A second lesson is that truly unifying quantum ideas with a dynamical spacetime may force us to adopt global or non-Markovian constraints on how states evolve. If time slicing is not universal, the usual step-by-step evolution might be replaced with a more holistic, “all-at-once” approach. Philosophers of physics are already investigating such possibilities, exploring whether fundamental laws are best understood in a four-dimensional block-universe perspective.

History shaped modern physics in a way that largely skipped a thorough examination of probabilistic GR. That path might still hold untapped riches for bridging the gap to quantum gravity. As new proposals in quantum gravity (strings, loops, spin foams, causal sets) continue to multiply, it may be prudent to take stock: do any of these effectively embed a robust, consistent stochastic model of spacetime? Or are they layering quantum rules on a conceptual foundation that was never fully clarified?

The riddle of how to reconcile quantum mechanics with general relativity has vexed physicists for nearly a century. We have grown comfortable with the idea that quantum theory is about measurement outcomes, even as we talk about expectation values “as if” they reflect real properties in the absence of measurement. We have poured energies into quantum gravity programs that either rely on partial “classical” assumptions (semi-classical approaches) or rework quantum theory from the ground up (string theory, loop quantum gravity). All the while, a critical intermediate step—a fully stochastic version of general relativity—remains largely unexplored.

This gap becomes particularly acute once we recognize that general relativity is a non-linear, dynamical theory of spacetime geometry. If we can’t even define what we mean by a probability distribution over different geometries, how can we claim to have a complete quantum theory of gravity? The Indivisible Stochastic Approach and related lines of thought open the door to re-defining our basic categories. Rather than confining quantum mechanics to measurement outcomes, they offer an ontology where events and configurations evolve probabilistically at a fundamental level. Once we adopt that mindset, the notion of “expectation value” as a stand-in for “stuff happening on average” seems far more natural.

Ultimately, stepping back to debug our conceptual foundations may be the most significant challenge—and the greatest opportunity—in modern theoretical physics. If we come to see the Universe as inherently stochastic, with geometry and matter dancing according to probability distributions that are neither purely classical nor standardly quantum, we might find a simpler, more direct path to the theory of quantum gravity. Such a reformation in how we approach physics could spark new lines of research, bridging age-old divides and clarifying phenomena from black hole thermodynamics to early-universe cosmology.

Call to Action: If you are a researcher, consider investigating ways to recast general relativity in a fully probabilistic framework. If you are a student of physics or a curious outsider, don’t hesitate to question the standard textbook accounts—especially when they casually treat quantum expectation values as real physical averages without specifying who or what is measuring. By confronting these conceptual oversights head-on, we stand to gain deeper insight into the nature of reality and perhaps finally clear the path toward a consistent theory of quantum gravity.

In the end, wrestling with these unanswered questions is not just an academic exercise; it is how science moves forward. The next paradigm-shifting discovery could well emerge from taking that single conceptual step we skipped a century ago. By embracing the dice at the heart of spacetime, we might rewrite the cosmic code itself.
