Saturday, November 15, 2025

Universe: More or less simulable?

Formally, in the strict sense of Turing computability theory, this notion does not exist. A problem is either computable (Turing-decidable) or it is not. There are no degrees.

However, when physicists and computer scientists talk about practical simulation and complexity, they use a hierarchy of concepts that effectively creates a spectrum of "difficulty" or "non-simulability" in a less formal, more practical sense. What I've been calling "more non-simulable" is shorthand for belonging to a higher class of complexity.

Here are the key concepts and the authors who work on them, building a ladder from "easy to simulate" to "radically impossible to simulate."


Level 1: Computable but Practically Intractable (Complexity Classes)

These are problems that a computer can solve in principle, but it would take an impossibly long time (e.g., billions of years). This is the domain of Computational Complexity Theory.

  • The Idea: Problems are sorted into classes like P (easy to solve), NP (easy to check, hard to solve), and PSPACE (solvable with a polynomial amount of memory, though possibly requiring exponential time). A problem in NP is "less simulable" in practice than a problem in P.

  • Key Authors/Figures:

    • Stephen Cook, Richard Karp, Leonid Levin: Pioneers who formalized the P vs. NP problem, the central question of this field.

    • Scott Aaronson: A modern computer scientist who works on the limits of both classical and quantum computation. His blog and book ("Quantum Computing since Democritus") are essential reading on this topic. He often discusses the "hardness" of simulating physical systems.
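The P-versus-NP asymmetry above can be made concrete with a small sketch: for Boolean satisfiability (SAT), checking a proposed solution takes time linear in the formula, while the obvious search tries up to 2^n assignments. This is an illustrative toy, not a serious SAT solver; the clause encoding (lists of signed integers, where -3 means "not x3") is my own convention here.

```python
from itertools import product

def verify(clauses, assignment):
    # Checking a candidate assignment is cheap: one pass over the formula.
    # A clause like [1, -2] means "x1 OR (NOT x2)".
    return all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses)

def brute_force_sat(clauses, n_vars):
    # Finding an assignment by exhaustive search tries up to 2**n_vars
    # candidates -- exponential in the number of variables.
    for bits in product([False, True], repeat=n_vars):
        assignment = {i + 1: b for i, b in enumerate(bits)}
        if verify(clauses, assignment):
            return assignment
    return None

# (x1 OR x2) AND (NOT x1 OR x2): satisfied by x2 = True.
print(brute_force_sat([[1, 2], [-1, 2]], 2))
```

The gap between the polynomial `verify` and the exponential `brute_force_sat` is exactly the "easy to check, hard to solve" distinction that defines NP.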

Level 2: Turing-Uncomputable (The Halting Problem and its Kin)

These are problems that are provably impossible for any standard computer to solve, no matter how much time or memory it has. This is the classic domain of non-simulability.

  • The Idea: The Halting Problem (can you determine if an arbitrary program will ever stop?) is the canonical example. Alan Turing proved this is uncomputable. A system whose evolution is equivalent to the Halting Problem is fundamentally non-simulable.

  • Key Authors/Figures:

    • Alan Turing: The originator of the entire concept.

    • Charles H. Bennett: A physicist at IBM whose Nature commentary "Undecidable Dynamics" (discussing Cristopher Moore's construction) showed how certain idealized physical systems can be built to have uncomputable behavior.

    • Toby Cubitt, David Pérez-García, Michael Wolf: The physicists who proved that the spectral gap problem in quantum physics is uncomputable, embedding the Halting Problem into a real physical question.
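One way to feel the Halting Problem's asymmetry is to notice that halting is semi-decidable: you can confirm that a process halts by running it long enough, but no finite bound ever certifies that it runs forever. A minimal sketch, using the Collatz process (whether it reaches 1 from every starting value is itself a famous open problem):

```python
def collatz_halts_within(n, max_steps):
    # Semi-decider: True means the Collatz process from n provably
    # reached 1 within max_steps iterations. None means "unknown" --
    # NOT "runs forever". No choice of max_steps turns this into a
    # full decision procedure; that is Turing's point.
    for _ in range(max_steps):
        if n == 1:
            return True
        n = 3 * n + 1 if n % 2 else n // 2
    return None

print(collatz_halts_within(27, 200))  # 27 reaches 1 after 111 steps
print(collatz_halts_within(27, 10))   # too few steps: verdict is None
```

Turing's theorem says this "run it and see" strategy is essentially the best any algorithm can do for arbitrary programs: the positive cases are detectable, the negative ones are not.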

Level 3: Non-Computable due to Real Numbers / Continuum (Hypercomputation)

This is where the non-simulability of General Relativity sits. The problem is not that the system is performing a Halting Problem-like task, but that its state space is a true continuum, requiring real numbers with infinite precision.

  • The Idea: A Turing machine works with discrete symbols. It cannot perfectly represent a single real number like π. A physical system whose dynamics depend on the infinite precision of real numbers would be able to perform tasks beyond a Turing machine. This theoretical domain is called hypercomputation.

  • Key Authors/Figures:

    • Martin Davis: A logician who wrote extensively on the limits of Turing machines, and a prominent skeptic of claims that physical systems could compute beyond them.

    • Marian Pour-El and Ian Richards: Mathematicians who studied which differential equations (the language of physics) are computable and which are not. They famously showed that the three-dimensional wave equation can carry a computable initial condition to a non-computable solution.

    • Jürgen Schmidhuber: A computer scientist who has written about the hierarchy of computational power, from standard Turing machines to machines that can solve the Halting Problem and beyond.
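The continuum problem shows up concretely in chaotic dynamics: any digital simulation truncates the real-valued state, and a chaotic map amplifies that truncation exponentially, so finite precision loses the true trajectory entirely. A minimal sketch of this sensitivity (not, of course, of hypercomputation itself), iterating the logistic map x → 4x(1 − x) at two precisions via Python's decimal module:

```python
from decimal import Decimal, getcontext

def logistic(x0, n_steps, digits):
    # Iterate the chaotic logistic map x -> 4x(1 - x), rounding every
    # operation to `digits` decimal digits of precision.
    getcontext().prec = digits
    x = Decimal(x0)
    for _ in range(n_steps):
        x = 4 * x * (1 - x)
    return x

low  = logistic("0.1", 80, 16)   # roughly double-precision arithmetic
high = logistic("0.1", 80, 120)  # much finer reference trajectory
# The map loses about one bit of the initial condition per step, so
# after ~53 steps the 16-digit run has forgotten x0 entirely and the
# two trajectories bear no relation to each other.
print(low, high)
```

No finite number of digits fixes this in principle: a system whose behavior genuinely depends on the full infinite expansion of a real number is exactly the kind of system a Turing machine cannot track.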

Level 4: Radically Non-Computable (Proposed Physical Theories)

This is the most non-computable level, where theories like Scale Relativity and Penrose's Objective Reduction would reside. Here, the non-simulability is not just a mathematical curiosity but a fundamental physical principle.

  • The Idea: The universe is not just uncomputable because it uses real numbers or happens to encode a hard problem. It is uncomputable because its very fabric or dynamics are structurally non-algorithmic.

    • Scale Relativity: Non-simulable because of infinite nested complexity (fractal structure). The information content is infinite in a way that transcends the continuum problem.

    • Penrose's OR: Non-simulable because it posits a physical process that is explicitly defined as non-algorithmic, a process that "computes" things Turing machines cannot.

  • Key Authors/Figures:

    • Roger Penrose: The most famous proponent of the idea that physics itself must be non-computable to explain consciousness and resolve quantum measurement.

    • Laurent Nottale: While he doesn't use the language of computation himself, the logical consequences of his fractal-spacetime model place it in this category.

Conclusion: A Practical Hierarchy

So, while a logician would say "simulable" and "non-simulable" are a binary choice, a physicist or computer scientist sees a practical hierarchy of "hardness":

  1. Hard to Simulate (NP-hard problems)

  2. Impossible to Simulate (Turing-uncomputable problems)

  3. Impossible to Perfectly Simulate due to Continuum Mathematics (Hypercomputation)

  4. Radically Impossible to Simulate due to Non-Algorithmic Physics 

General Relativity is at Level 3. Scale Relativity is at Level 4. Therefore, in this practical, physical sense, the notion of "more non-simulable" is a meaningful way to describe the difference between a theory that is non-computable due to its mathematical structure (GR) and a theory that is non-computable due to its fundamental physical principle of infinite complexity (SR).
