Thursday, April 16, 2026

Computability of the Universe: Limits of the Bekenstein Bound.

Abstract

This remark analyzes the recent debate surrounding the application of algorithmic undecidability to the physical universe. While recent arguments posit that the universe can be reduced to a completely decidable Finite State Automaton by relying on the classic Bekenstein Bound, an analysis of high-dimensional and quantum-gravitational literature reveals this assumption to be premature. By examining logarithmic corrections to black hole entropy and transfinite fractal geometry, this paper demonstrates that the physical territory is likely infinitely more complex than a discrete grid, thereby leaving the door open for fundamental undecidability in physics.

Introduction
The intersection of quantum physics, cosmology, and algorithmic information theory has recently sparked a vigorous debate regarding the fundamental computability of the universe. In 2025, Faizal et al. proposed that because formal axiomatic systems are subject to Gödelian and Turing incompleteness, any purely algorithmic "Theory of Everything" is impossible. They argued that the universe must possess non-algorithmic properties, rendering the simulation hypothesis logically invalid. Concurrently, a comprehensive review by Perales-Eceiza et al. confirmed that undecidability is pervasive in the mathematical models of modern physics, from quantum many-body systems to tensor networks, often emerging when theoretical limits are pushed to infinity.

In direct opposition to Faizal et al., Karazoupis published a preprint arguing that the introduction of undecidability into physics constitutes a fundamental category error. Karazoupis asserted that the universe is constrained by the Bekenstein Bound, which places a strict, finite limit on the amount of information that can exist within any causal horizon. By this logic, the physical universe does not possess "actual infinity." Instead, it operates strictly as a Finite State Automaton. Because Finite State Automata are completely decidable and immune to the Halting Problem, Karazoupis concluded that the universe is a logically consistent, computable machine requiring no non-algorithmic meta-theory. However, this conclusion rests entirely on the assumption that the classic, linear Bekenstein Bound is an absolute and fundamental description of quantum spacetime. An analysis of existing literature on quantum entropy and high-dimensional geometry suggests this assumption is highly vulnerable.
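Before turning to the objections, it is worth seeing the scale of the number Karazoupis's premise relies on. The following back-of-the-envelope sketch (a rough illustration added here for orientation, not drawn from the papers under discussion) applies the holographic form of the bound to our causal horizon, taking the Hubble radius as a stand-in for that horizon:

import math

# Rough estimate of the information capacity of our causal horizon, using
# the holographic form of the bound, S <= A / (4 * l_p**2).
# Illustrative assumptions: the horizon radius is taken to be the Hubble
# radius c/H0; the result is an order-of-magnitude estimate only.

l_planck = 1.616e-35       # Planck length in metres
R_horizon = 1.3e26         # approximate Hubble radius in metres (assumed)

area = 4 * math.pi * R_horizon ** 2     # horizon area in m^2
S_nats = area / (4 * l_planck ** 2)     # entropy bound in natural units
S_bits = S_nats / math.log(2)           # convert nats to bits

print(f"Holographic information bound: about 10^{math.log10(S_bits):.0f} bits")

The result, roughly 10^122 bits, is enormous but finite, and it is exactly this kind of finite figure that the Finite State Automaton argument appeals to. The question examined below is whether such simple counting survives at the Planck scale.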

The Breakdown of the Finite State Automaton Assumption
The characterization of the universe as a discrete, finite informational grid relies on applying low-dimensional, macroscopic approximations to the fundamental quantum realm. Work by Castro and Granik demonstrates that the linear relationship between entropy and area in the Bekenstein-Hawking formulation is merely an effective theory recovered in the long-range limit. At the Planck scale, quantum effects introduce logarithmic and higher-order corrections to the entropy equation. Rather than resolving into a simple, discrete lattice of finite states, spacetime at the fundamental level transitions into a continuous, Cantorian-fractal geometry. Because fractals possess infinite depth and self-similarity, they require infinite precision to be perfectly described. A Finite State Automaton cannot process infinite Kolmogorov complexity, meaning the fundamental dynamics of the universe transcend simple finite computation.
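To give a feel for how such corrections behave, here is a small numerical sketch using a generic logarithmically corrected entropy formula. The coefficient alpha is model-dependent; the value used below is a placeholder for illustration and is not taken from the Castro and Granik paper:

import math

def corrected_entropy(area_planck_units, alpha=-1.5):
    # Bekenstein-Hawking entropy with a generic logarithmic correction:
    #   S = A/4 + alpha * ln(A/4) + ...
    # alpha is model-dependent; -3/2 is one value quoted in the
    # quantum-gravity literature, used here purely as a placeholder.
    leading = area_planck_units / 4
    return leading, alpha * math.log(leading)

for area in (10, 100, 1e6, 1e60):       # horizon areas in Planck units
    leading, correction = corrected_entropy(area)
    print(f"A = {area:g}: leading term = {leading:g}, "
          f"log correction = {correction:+.2f}")

The correction is invisible for macroscopic horizons but becomes comparable to the leading term as the horizon area approaches the Planck scale, which is precisely the regime in which the Finite State Automaton picture is being invoked.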

This complexity is further compounded when examining the universe through the lens of extra dimensions. As elucidated by El Naschie, it is a mathematical fallacy to apply low-dimensional intuition to quantum gravity. Utilizing Dvoretzky’s Theorem on measure concentration, El Naschie highlights that in the high-dimensional spaces required by advanced physics models, geometry behaves counterintuitively, with the vast majority of "volume" concentrating near the surface. Consequently, the classic Bekenstein limit breaks down and requires an extension into a transfinite, fractal version based on E-infinity theory. If the holographic boundary of the universe is a transfinite fractal hyper-surface rather than a finite array of discrete bits, the universe inherently contains "actual infinity." The presence of actual infinity reintroduces the very algorithmic undecidability and Gödelian incompleteness that the Finite State Automaton model attempted to banish.
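The concentration-of-measure effect invoked here is easy to verify numerically. The sketch below illustrates only the generic geometric point (the volume of a high-dimensional ball crowds into a thin shell at its surface), not El Naschie's E-infinity formalism itself:

# Fraction of an n-dimensional ball's volume lying within a thin shell
# of relative thickness eps below the surface: 1 - (1 - eps)**n.
# As the dimension grows, essentially all of the volume concentrates
# near the boundary.

eps = 0.01   # shell thickness = 1% of the radius

for n in (3, 30, 300, 3000):
    shell_fraction = 1 - (1 - eps) ** n
    print(f"dimension {n:5d}: {shell_fraction:.4f} of the volume "
          f"lies in the outer {eps:.0%} shell")

In three dimensions the outer one-percent shell holds about 3% of the volume; in three thousand dimensions it holds essentially all of it.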

Conclusion: Flipping the "Map vs. Territory" Argument
The debate over whether the universe is fundamentally computable ultimately hinges on the philosophical distinction between the mathematical description of reality and reality itself. Karazoupis forcefully accused Faizal et al. of committing a category error, arguing that they mistook the infinite mathematics of their descriptive model (the map) for the strictly finite reality of the physical universe (the territory).

However, by integrating the insights of Planck-scale fractal geometry and high-dimensional measure concentration, it becomes evident that Karazoupis commits the exact same category error in reverse. He mistakes the classic Bekenstein Bound (which is merely a simplified, low-dimensional, macroscopic mathematical map) for the true quantum territory. The physics described by Castro, Granik, and El Naschie suggests that the actual quantum territory is a highly complex, continuous, transfinite fractal space where classic, discrete informational rules break down. By confusing a simplified finite map for an infinitely complex physical territory, the argument that the universe is a simple, decidable Finite State Automaton collapses. Consequently, the universe retains a level of complexity that cannot be fully captured by finite algorithms, reaffirming the likelihood that any ultimate physical theory will remain subject to fundamental undecidability.


References

Castro, C., & Granik, A. (2001). On the quantum aspects of the logarithmic corrections to the black hole entropy. Foundations of Physics, 31(7), 1157-1175.

El Naschie, M. S. (2015). The counterintuitive increase of information due to extra spacetime dimensions of a black hole and Dvoretzky's theorem. Journal of Quantum Information Science, 5(2), 41-45.

Faizal, M., Krauss, L. M., Shabir, A., & Marino, F. (2025). Consequences of Undecidability in Physics on the Theory of Everything. arXiv preprint arXiv:2507.22950.

Karazoupis, M. (2025). Resolving the "Theory of Everything" Paradox via Constructive Immanence and the Boundedness of Physical Information: A Formal Proof of Physical Decidability. Preprints.org, 202512.1495.

Perales-Eceiza, Á., Cubitt, T., Gu, M., Pérez-García, D., & Wolf, M. M. (2025). Undecidability in physics: A review. Physics Reports, 1138, 1-29.

Wednesday, April 15, 2026

Hierarchy of the Incomputable: General and Scale Relativity

We often talk about the universe as a simulation, a grand computer program running the laws of physics. But what if the very fabric of reality is impossible to program? Physicists and logicians have long known that any simulation of a continuous system is only an approximation. However, a fascinating duel between two great theories of gravity—Einstein's General Relativity and Laurent Nottale's Scale Relativity—reveals that there are different levels of impossibility. Some theories are simply "more" non-simulable than others.

To understand this, let's forget about spacetime for a moment and think about a simpler task: simulating a coastline.


Level 1: Simulating a Smooth Beach (The General Relativity Problem)

Imagine a perfect, sweeping, sandy beach. Its curve is smooth and gentle. This is the world of Einstein's General Relativity, where spacetime is described as a smooth, continuous fabric.

Now, you want to create a computer model of this beach. Your computer can't handle a perfect curve; it can only handle a finite set of points. So, you do the obvious: you place a series of points along the beach and connect them with straight lines.

This is an approximation, of course. But it's a good one. If you want a better model, you just add more points. As you increase the resolution, your jagged, digital model gets closer and closer to the smooth reality. You can get arbitrarily close to a perfect representation because the underlying reality is differentiable—it's smooth. The physics between your data points is simple.

This is why we can simulate General Relativity. We create a grid of points in spacetime and calculate the physics on that grid. It’s an approximation of a continuous reality, but it’s a manageable one. The theory is non-simulable in principle (because a continuum contains infinitely many points), but we can get as close as we need for any practical purpose.
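A tiny numerical sketch makes the point concrete; the particular curve below (a sine arc) is just an arbitrary stand-in for any smooth geometry:

import math

def polyline_length(f, a, b, n):
    # Approximate the arc length of y = f(x) on [a, b] by joining
    # n sample points with straight segments.
    xs = [a + (b - a) * i / (n - 1) for i in range(n)]
    pts = [(x, f(x)) for x in xs]
    return sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))

smooth_beach = math.sin   # any differentiable curve will do

for n in (10, 100, 1000, 10000):
    print(n, polyline_length(smooth_beach, 0, math.pi, n))

The estimates converge quickly toward a single value (about 3.8202): adding points only refines an already-good approximation, which is what makes the smooth case manageable.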

Level 2: Simulating a Rocky Coastline (The Scale Relativity Problem)

Now, imagine a different kind of coast: the rugged, rocky coastline of Norway. This is the world of Scale Relativity, where spacetime is described as a fractal fabric.

You start the same way, by placing a series of points along the coast to map it out. But here, something strange happens. You decide to zoom in on a single straight-line segment between two of your points to see how good your approximation is.

You don't find a gently curving, nearly straight line. You find a whole new world of complexity: smaller bays, jagged rocks, and tiny inlets that were completely invisible from a distance. The complexity doesn't smooth out; it increases. If you zoom in again on a tiny piece of this new coastline, the same thing happens. This self-repeating complexity at every level of magnification is the definition of a fractal.

This presents a fundamentally deeper problem for any simulation. Your approximation isn't just a low-resolution version of reality; it's a completely different and far simpler object. The physics between your grid points is not simple—it's infinitely complex. Adding more points doesn't just refine the picture; it reveals entirely new universes of structure you didn't even know were there.
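The textbook stand-in for such a coastline is the Koch curve, and a few lines of arithmetic show the difference. This illustrates the generic fractal behaviour rather than Nottale's specific geometry: each refinement shrinks the ruler by a factor of three and multiplies the measured length by 4/3, so the length never settles down:

# Measured length of the Koch curve as the measurement scale shrinks.
# At refinement level k the curve consists of 4**k segments of length
# (1/3)**k, so the measured length is (4/3)**k: it diverges instead of
# converging as it would for a smooth curve.

for k in range(9):
    ruler = (1 / 3) ** k
    measured_length = (4 / 3) ** k
    print(f"ruler = {ruler:.6f}   measured length = {measured_length:.3f}")

Smaller rulers never converge on a final answer; zooming in keeps revealing new structure, which is exactly the problem described above.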

A Hierarchy of the Impossible

This is why Scale Relativity is "more" non-simulable than General Relativity. It contains two nested layers of impossible infinity:

  1. The Infinity of Points: The classic problem of trying to model a continuous line with a finite number of dots. (Shared by both theories).

  2. The Infinity of Structure: The radical problem that the path between any two of those dots is itself infinitely complex. (Unique to Scale Relativity).

Simulating General Relativity is a challenge of resolution. Simulating Scale Relativity is a challenge of infinite, nested complexity.

This isn't just an abstract mathematical game. It cuts to the heart of what reality might be. Is the universe, at its smallest scales, a smooth and simple place, as Einstein assumed? Or is it an infinitely intricate coastline, a fractal reality whose depths we can explore forever without ever reaching the end? The answer determines not just what our theories look like, but what is fundamentally knowable, what can be computed, and whether the universe can ever be fully captured in the ones and zeros of a simulation.


Universe: More or less simulable?

Formally, in the strict sense of Turing computability theory, this notion does not exist. A given problem is either computable (Turing-decidable) or it is not; decidability itself admits no degrees.

However, when physicists and computer scientists talk about practical simulation and complexity, they use a hierarchy of concepts that effectively creates a spectrum of "difficulty" or "non-simulability" in a less formal, more practical sense. What I've been calling "more non-simulable" is shorthand for belonging to a higher class of complexity.

Here are the key concepts and the authors who work on them, building a ladder from "easy to simulate" to "radically impossible to simulate."

Level 1: Computable but Practically Intractable (Complexity Classes)

These are problems that a computer can solve in principle, but it would take an impossibly long time (e.g., billions of years). This is the domain of Computational Complexity Theory.

  • The Idea: Problems are sorted into classes like P (solvable efficiently), NP (solutions are easy to check but believed hard to find), and PSPACE (solvable with a polynomial amount of memory, though perhaps not in any reasonable time). An NP-hard problem is "less simulable" in practice than a problem in P.

  • Key Authors/Figures:

    • Stephen Cook, Richard Karp, Leonid Levin: Pioneers who formalized the P vs. NP problem, the central question of this field.

    • Scott Aaronson: A modern computer scientist who works on the limits of both classical and quantum computation. His blog and book ("Quantum Computing since Democritus") are essential reading on this topic. He often discusses the "hardness" of simulating physical systems.

Level 2: Turing-Uncomputable (The Halting Problem and its Kin)

These are problems that are provably impossible for any standard computer to solve, no matter how much time or memory it has. This is the classic domain of non-simulability.

  • The Idea: The Halting Problem (can you determine whether an arbitrary program will ever stop?) is the canonical example. Alan Turing proved it is uncomputable. A system whose evolution is equivalent to the Halting Problem is fundamentally non-simulable. (A short sketch of Turing's argument appears after this list.)

  • Key Authors/Figures:

    • Alan Turing: The originator of the entire concept.

    • Charles H. Bennett: A physicist at IBM who wrote a famous paper in Nature titled "Undecidable Dynamics," showing how certain idealized physical systems could be constructed to have uncomputable behavior.

    • Toby Cubitt, David Pérez-García, Michael Wolf: The physicists who proved that the spectral gap problem in quantum physics is uncomputable, embedding the Halting Problem into a real physical question.
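To make the Level 2 idea concrete, here is the textbook diagonalization argument behind the Halting Problem, sketched in Python. The function halts below is a hypothetical oracle; the whole point of the argument is that no such function can exist:

# Suppose, for contradiction, that someone hands us halts(program, argument),
# a function that always answers correctly. (Hypothetical: it cannot exist.)

def halts(program, argument):
    raise NotImplementedError("hypothetical oracle -- cannot be implemented")

def contrarian(program):
    # Do the opposite of whatever 'program' does when fed itself.
    if halts(program, program):
        while True:        # loop forever
            pass
    else:
        return             # halt immediately

# Now ask: does contrarian(contrarian) halt?
#  - If halts says it halts, contrarian loops forever.   Contradiction.
#  - If halts says it loops, contrarian halts at once.   Contradiction.
# So no general halting oracle exists, and any physical system whose
# evolution encodes this question is non-simulable in the Level 2 sense.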

Level 3: Non-Computable due to Real Numbers / Continuum (Hypercomputation)

This is where the non-simulability of General Relativity sits. The problem is not that the system is performing a Halting Problem-like task, but that its state space is a true continuum, requiring real numbers with infinite precision.

  • The Idea: A Turing machine works with discrete symbols. It cannot perfectly represent a single real number like π. A physical system whose dynamics depend on the infinite precision of real numbers could, in principle, perform tasks beyond any Turing machine. This theoretical domain is called hypercomputation. (A small numerical illustration of the precision problem follows this list.)

  • Key Authors/Figures:

    • Martin Davis: A logician who wrote extensively on the limits of Turing machines and what lies beyond.

    • Pour-El and Richards: Mathematicians who studied which differential equations (the language of physics) are computable and which are not. They famously showed that even the ordinary wave equation can evolve computable initial data into a solution that is not computable.

    • Jürgen Schmidhuber: A computer scientist who has written about the hierarchy of computational power, from standard Turing machines to machines that can solve the Halting Problem and beyond.
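A modest numerical illustration of the precision problem mentioned above. It shows only why finite-precision arithmetic cannot track continuum dynamics indefinitely, not hypercomputation itself; the map and the two precisions are arbitrary choices:

from decimal import Decimal, getcontext

# Iterate the chaotic logistic map x -> r*x*(1-x) at two working precisions
# and watch the computed trajectories separate. Any digital simulation must
# truncate real numbers somewhere, and a chaotic continuum system amplifies
# that truncation exponentially.

def logistic_orbit(x0, r, steps, digits):
    getcontext().prec = digits
    x, rate = Decimal(x0), Decimal(r)
    for _ in range(steps):
        x = rate * x * (1 - x)
    return x

low = logistic_orbit("0.1", "3.9", 200, 15)    # roughly double precision
high = logistic_orbit("0.1", "3.9", 200, 60)   # 60 significant digits

print("15-digit arithmetic:", low)
print("60-digit arithmetic:", high)

After a couple of hundred steps the two computed states bear no relation to each other, and neither is the exact real-number trajectory.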

Level 4: Radically Non-Computable (Proposed Physical Theories)

This is the most radical level, where theories like Scale Relativity and Penrose's Objective Reduction would reside. Here, non-simulability is not just a mathematical curiosity but a fundamental physical principle.

  • The Idea: The universe is not just uncomputable because it uses real numbers or happens to encode a hard problem. It is uncomputable because its very fabric or dynamics are structurally non-algorithmic.

    • Scale Relativity: Non-simulable because of infinite nested complexity (fractal structure). The information content is infinite in a way that transcends the continuum problem.

    • Penrose's OR: Non-simulable because it posits a physical process that is explicitly defined as non-algorithmic, a process that "computes" things Turing machines cannot.

  • Key Authors/Figures:
    • Roger Penrose: The most famous proponent of the idea that physics itself must be non-computable to explain consciousness and resolve quantum measurement.

    • Laurent Nottale: While he doesn't use the language of computation, his physical model of a fractal spacetime, when analyzed, falls into this category. The logical consequences of his physics place it here.

Conclusion: A Practical Hierarchy

So, while a logician would say "simulable" and "non-simulable" are a binary choice, a physicist or computer scientist sees a practical hierarchy of "hardness":

  1. Hard to Simulate (NP-hard problems)

  2. Impossible to Simulate (Turing-uncomputable problems)

  3. Impossible to Perfectly Simulate due to Continuum Mathematics (Hypercomputation)

  4. Radically Impossible to Simulate due to Non-Algorithmic Physics 

General Relativity is at Level 3. Scale Relativity is at Level 4. Therefore, in this practical, physical sense, the notion of "more non-simulable" is a meaningful way to describe the difference between a theory that is non-computable due to its mathematical structure (GR) and a theory that is non-computable due to its fundamental physical principle of infinite complexity (SR).

Why Is Relativity Fundamentally Incomputable?

The idea that Einstein's General Relativity (GR), the pillar of our understanding of the cosmos, might be fundamentally non-simulable may seem surprising. Yet, this notion is not mere philosophical speculation; it is grounded in concrete mathematical and physical arguments. It reveals that the theory possesses an infinite richness that the discrete world of algorithms cannot fully capture.

The incomputability of GR does not stem from any "magic" in its equations, but from its very mathematical framework: it is a theory of differential equations operating on the spacetime continuum. Here lies the first barrier. Mathematicians like Marian Pour-El and Ian Richards have demonstrated that even a simple physical equation, the ordinary wave equation, can evolve perfectly well-defined, computable initial data into a solution that no algorithm can compute. Since Einstein's equations operate on the same continuum, they fully admit the possibility that a minute, non-algorithmic feature of the universe's initial state, or of its subsequent evolution, could render its entire future non-simulable as well.

Furthermore, the very structure of GR drives it to form singularities—points where spacetime curvature becomes infinite and the laws of physics break down. These "points of rupture" are potential sources of indeterminism. 

Roger Penrose's famous "Cosmic Censorship Conjecture," which posits that every singularity formed by gravitational collapse must be hidden behind an event horizon, is an attempt to protect the predictability (and thus the computability) of the universe. The very fact that such a conjecture is needed shows that GR, left to its own devices, flirts with the incomputable. It belongs to the realm of hypercomputation, where the laws operate on the infinite precision of real numbers.

This is where Laurent Nottale's Scale Relativity (SR) enters the picture—not to simplify the landscape, but to make it both more coherent and more profoundly complex.

In the framework of Scale Relativity, singularities in the sense of GR no longer exist. The theory begins with the premise that spacetime is never smooth but is fundamentally fractal. As one approaches what would be a singularity in GR, spacetime does not "tear." Instead, its fractal complexity explodes. The number of possible paths (geodesics) diverges to infinity, and the distance to the center, measured along these paths, also becomes infinite. The singularity is replaced by an impenetrable barrier of complexity, thus fulfilling Penrose's wish in a different way: the singularity becomes physically inaccessible.

One might think that by eliminating these pathological points, SR "facilitates" computability. In one sense, this is true: it makes the theory more coherent, without points of breakdown. But here, a magnificent paradox emerges. To solve the problem of singularities (a pathological infinity), SR introduces an even more fundamental and pervasive infinity: the infinite structural complexity of its fractal fabric.

The transition from GR to SR is therefore not a move from the incomputable to the computable. It is a shift in the very nature of infinity:

  • General Relativity is incomputable (Level 3) because its arena is the continuum—a "simple" but fragile infinity, prone to rupture.

  • Scale Relativity is "even more" incomputable (Level 4) because, in addition to the infinity of the continuum, every segment of that space possesses an infinite internal structure.

In conclusion, Scale Relativity, by resolving the problem of GR's singularities, does not make the universe simpler or easier to simulate. On the contrary, it reveals that the reason singularities do not exist is that reality is, at a fundamental level, infinitely more complex and richer than Einstein's smooth geometry could ever capture. It is a solution of extraordinary elegance and depth.


The Timeless Origin in our Incomputable Reality.

Introduction: Three Models of Creation in a Non-Algorithmic Universe

The discovery that our universe may be fundamentally non-algorithmic—meaning it cannot be perfectly simulated by any computer program—forces us to reconsider the nature of its origin. If the universe has a "Creator," this entity cannot be a simple programmer. This leads us to three distinct and increasingly sophisticated models of creation: the Engineer, the Source, and the Genitor. Let's explore each in turn.




1. The Creator-as-Engineer

The Model

This is the most intuitive and mechanistic view of creation. The Engineer is a cosmic architect who designs the universe based on a set of rules and initial conditions. This Creator writes the "source code" of reality—the fundamental laws of physics—and then initiates the "program" by setting the Big Bang in motion. The universe then evolves autonomously, following these pre-established algorithmic instructions. This model is analogous to a computer scientist creating a sophisticated simulation like Conway's Game of Life, where complex, seemingly unpredictable behavior emerges from a simple, finite rule set.
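For a sense of how small such a "source code" can be, here is a minimal sketch of the Conway's Game of Life rule mentioned above, written in Python:

# The entire "physics" of the Game of Life fits in a few lines, yet the
# emergent behaviour is famously rich.

def step(live_cells):
    # Advance one generation; 'live_cells' is a set of (x, y) tuples.
    neighbour_counts = {}
    for (x, y) in live_cells:
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if (dx, dy) != (0, 0):
                    cell = (x + dx, y + dy)
                    neighbour_counts[cell] = neighbour_counts.get(cell, 0) + 1
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live_cells)}

# A "glider" travelling across an unbounded grid:
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for generation in range(4):
    print(generation, sorted(cells))
    cells = step(cells)

The Engineer model assumes the universe's laws are, at bottom, a rule set of this kind, only vastly larger.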

Connection to Non-Algorithmic Physics

This model is fundamentally in conflict with the idea that our universe is intrinsically non-algorithmic. If the Engineer's creation is based on a finite set of laws (an algorithm), then the universe itself must be algorithmic at its core. Any "undecidable" or "incomputable" phenomena we observe would merely be illusions of complexity arising from a deterministic, computable base. The true non-algorithmic reality, if it exists, would belong to the realm of the Engineer, not our own. Our universe would be a simulation, precisely what the Faizal et al. paper argues is logically impossible based on our physical observations.

Relationship with Time

This model is the least compatible with a time-independent Creator. An algorithm is, by its very nature, a temporal process—a sequence of steps that must be executed one after another. The act of "programming" and "running" the simulation implies a timeline in which the Creator operates. This Creator seems bound by time, acting as an agent within a sequence of events. The model fails to provide a satisfactory account of a truly transcendent, timeless origin.


2. The Creator-as-Source

The Model

This is a more abstract and metaphysical model, inspired by philosophical concepts like Spinoza's "Deus sive Natura" (God or Nature). The Source is not an external agent who acts upon the universe but is the immanent, underlying principle of existence itself. It is the logical and metaphysical foundation from which all reality emanates. The laws of physics are not a "code" written by the Source; they are facets of the Source's own timeless and unchanging nature. The Source does not create the universe as a separate act; the universe is the expression of the Source.

Connection to Non-Algorithmic Physics

This model is highly compatible with a non-algorithmic universe. If the universe is the expression of a timeless, self-consistent logical structure, it would naturally be non-algorithmic. Its properties would be based on inherent logical necessity rather than a sequential computational process. The "undecidable truths" of our universe would be direct consequences of the Source's infinitely complex and self-referential nature. In this view, the universe is non-algorithmic because its very foundation is a logical, rather than computational, principle.

Relationship with Time

This model is perfectly time-independent. The Source is conceived as a static, eternal principle, existing outside of any temporal flow. However, this strength is also its greatest weakness: it struggles to explain the origin of a dynamic, evolving universe. How does a static, unchanging "is-ness" give rise to a dynamic, temporal "becoming"? The model provides a powerful account of the universe's logical structure but offers little explanation for its temporal unfolding—the very existence of time, change, and evolution.


3. The Creator-as-Genitor

The Model

This third model is the most subtle and powerful, providing a bridge between the previous two. The Genitor is a Creator whose act is not one of engineering or programming, but of generation or reproduction. Like a living organism giving birth to offspring, the Genitor imparts its own fundamental nature to a new, autonomous reality. The creative act is not the writing of an external set of instructions, but the transmission of an internal, non-algorithmic essence—a "spark" or "seed" of potential. Our universe then unfolds from this seed, developing its own complexity while retaining the inherited non-algorithmic nature.

Connection to Non-Algorithmic Physics

This model offers the most elegant explanation for a non-algorithmic universe with an origin. It posits that non-algorithmic complexity is a fundamental property that can be propagated. Our universe is non-algorithmic because it was "born" from a non-algorithmic parent reality. This resolves the paradox of the simulation: our universe is not a computer program but an autonomous entity, as "real" as its source. The arguments of Faizal et al. apply directly and authentically to our reality.

Relationship with Time

This model masterfully reconciles time independence with a dynamic creation. The Genitor, as the source, can be transcendent and timeless. Its creative act is not a process that unfolds in time but a single, timeless act of logical or metaphysical causation whose result is a time-bound universe. It's analogous to a timeless author conceiving of an entire story whose characters then live and evolve within their own narrative timeline. The Genitor is the timeless principle of "generation" from which our specific, temporal universe logically and necessarily unfolds. It allows for a Creator that is timelessly transcendent (like the Source) yet can be the cause of a dynamic, evolving cosmos (like the Engineer's creation).

The Creator-as-Genitor model: a time-independent Creator?

The Creator-as-Genitor model might be the most coherent way to imagine a time-independent Creator instigating a time-bound universe.

Let's break down why. At first glance, it seems like a contradiction. Words like "genitor," "generation," and "process" are deeply embedded in our experience of time. However, we must distinguish between the nature of the act and the nature of its result.



1. The Act of Creation is Not a Process in Time

For a time-independent being, the act of "creation" would not be a sequence of events. It would not be:
Step 1: Conceive of the universe.
Step 2: Design the laws.
Step 3: Initiate the Big Bang.

This is the thinking of an "Engineer" Creator, who is bound by a process.

For a Genitor-Creator, the act of creation would be a single, timeless, and total act of logical or metaphysical causation. It's not a verb that unfolds in a timeline; it's a state of being whose consequence is our universe.

Analogy: Think of a great novel. The characters within the novel live in a timeline. They are born, they age, they die. The author, however, exists outside of that timeline. The entire story—beginning, middle, and end—may exist simultaneously in the author's mind. The "act of creating the story" is not something that happens on page 50. It is a single, transcendent act whose result is the entire, time-bound narrative.

2. The Genitor Transmits a Nature, Not a Set of Instructions

This is the most crucial point.

  • An Engineer gives a universe a set of instructions (an algorithm) that must be executed sequentially in time. The algorithm itself is a temporal process. This makes it difficult to imagine how a truly time-independent being could operate this way.

  • A Genitor, on the other hand, imparts its fundamental nature. A timeless principle, like a mathematical truth or a fundamental physical law, is not a process. E=mc² does not "happen" in time; it simply is.

The Genitor-Creator would be the timeless source of the non-algorithmic principles that govern our universe. The manifestation and unfolding of these principles create our time-bound reality, but the principles themselves are timeless.

3. The Relationship is Logical, Not Temporal

The link between a time-independent Genitor and our universe is not one of temporal cause-and-effect (a domino hitting another). It is a relationship of logical or metaphysical dependence.

Our universe exists because of the Genitor's nature, in the same way the conclusion of a logical proof exists because of its premises. There is no time delay between premises and conclusion. The dependence is instantaneous and timeless.

The "germination" of our universe is not a biological process that takes time, but the logical unfolding of a potential inherent in the timeless nature of its source.

Conclusion: The Most Coherent Model

When we compare the three models, the Genitor model emerges as the most compatible with time independence:

  • The Engineer: This model is the least compatible. An algorithm is intrinsically a process that unfolds in time. This Creator seems bound by time.

  • The Source (Spinoza's God): This model is perfectly time-independent, as it is a static, eternal principle. However, it struggles to explain the origin of a dynamic, evolving universe. How does a static "is-ness" give rise to a dynamic "becoming"?

  • The Genitor: This model provides the perfect bridge. It allows for a Creator that is timeless and transcendent (like the Source) but can also be the cause of a dynamic, evolving universe (like the Engineer's creation). It achieves this by reframing creation not as a mechanical act of programming, but as a timeless act of generation.

So, yes. A Creator-as-Genitor could absolutely be time-independent. It would not be a cosmic clockmaker who builds a clock that runs in time; it would be the timeless principle of "clockness" from which our specific, time-bound clock necessarily and logically unfolds.