Enter a chronicle of the near future spanning the entirety of the 21st century. As technological breakthroughs cascade and multiply, follow the lives of those trapped within the web of this runaway progress.
Thursday, April 16, 2026
Computability of the Universe: Limits of the Bekenstein Bound
Wednesday, April 15, 2026
Hierarchy of the Incomputable: General and Scale Relativity
We often talk about the universe as a simulation, a grand computer program running the laws of physics. But what if the very fabric of reality is impossible to program? Physicists and logicians have long known that any simulation is just an approximation. However, a fascinating duel between two great theories of gravity—Einstein's General Relativity and Laurent Nottale's Scale Relativity—reveals that there are different levels of impossibility. Some theories are simply "more" non-simulable than others.
To understand this, let's forget about spacetime for a moment and think about a simpler task: simulating a coastline.
Level 1: Simulating a Smooth Beach (The General Relativity Problem)
Imagine a perfect, sweeping, sandy beach. Its curve is smooth and gentle. This is the world of Einstein's General Relativity, where spacetime is described as a smooth, continuous fabric.
Now, you want to create a computer model of this beach. Your computer can't handle a perfect curve; it can only handle a finite set of points. So, you do the obvious: you place a series of points along the beach and connect them with straight lines.
This is an approximation, of course. But it's a good one. If you want a better model, you just add more points. As you increase the resolution, your jagged, digital model gets closer and closer to the smooth reality. You can get arbitrarily close to a perfect representation because the underlying reality is differentiable—it's smooth. The physics between your data points is simple.
This is why we can simulate General Relativity. We create a grid of points in spacetime and calculate the physics on that grid. It’s an approximation of a continuous reality, but it’s a manageable one. The theory is non-simulable in principle (a continuum contains infinitely many points), but we can get as close as we need for any practical purpose.
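A toy calculation makes the point concrete. The sketch below (function names are ours; a sine arch stands in for the "smooth beach") shows that refining a polyline approximation of a differentiable curve converges to a fixed arc length:

```python
import math

def polyline_length(f, a, b, n):
    """Approximate the arc length of y = f(x) on [a, b] with n straight segments."""
    xs = [a + (b - a) * i / n for i in range(n + 1)]
    return sum(
        math.hypot(xs[i + 1] - xs[i], f(xs[i + 1]) - f(xs[i]))
        for i in range(n)
    )

# A smooth "beach": one arch of a sine wave.
coarse = polyline_length(math.sin, 0.0, math.pi, 8)
fine = polyline_length(math.sin, 0.0, math.pi, 1024)

# Refinement converges: both values approach the true arc length (~3.8202),
# and the gap between successive resolutions shrinks toward zero.
print(coarse, fine)
```

Doubling the number of points roughly quarters the error, which is exactly what "arbitrarily close to a perfect representation" means in practice.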
Level 2: Simulating a Rocky Coastline (The Scale Relativity Problem)
Now, imagine a different kind of coast: the rugged, rocky coastline of Norway. This is the world of Scale Relativity, where spacetime is described as a fractal fabric.
You start the same way, by placing a series of points along the coast to map it out. But here, something strange happens. You decide to zoom in on a single straight-line segment between two of your points to see how good your approximation is.
You don't find a gently curving, nearly straight line. You find a whole new world of complexity: smaller bays, jagged rocks, and tiny inlets that were completely invisible from a distance. The complexity doesn't smooth out; it increases. If you zoom in again on a tiny piece of this new coastline, the same thing happens. This self-repeating complexity at every level of magnification is the definition of a fractal.
This presents a fundamentally deeper problem for any simulation. Your approximation isn't just a low-resolution version of reality; it's a completely different and far simpler object. The physics between your grid points is not simple—it's infinitely complex. Adding more points doesn't just refine the picture; it reveals entirely new universes of structure you didn't even know were there.
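The Koch curve, the textbook fractal, shows why refinement fails here. Each level of zoom replaces every segment with four segments one third as long, so the measured length grows without bound (a minimal sketch; the dimension formula is the standard one):

```python
import math

# Each Koch iteration replaces every segment with 4 segments of 1/3 the length,
# so each level of "zoom" multiplies the measured length by 4/3.
def koch_length(depth, base=1.0):
    return base * (4.0 / 3.0) ** depth

# Unlike the smooth beach, refinement never converges: the length diverges.
lengths = [round(koch_length(d), 2) for d in (0, 5, 10, 20)]
print(lengths)  # [1.0, 4.21, 17.76, 315.34]

# The fractal dimension quantifies this: D = log 4 / log 3 ≈ 1.26,
# strictly greater than the dimension of a smooth curve (D = 1).
D = math.log(4) / math.log(3)
```

Contrast this with the smooth beach: there, adding points made the measured length settle down; here, every new level of resolution adds another 33% of coastline that the previous level could not see.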
A Hierarchy of the Impossible
This is why Scale Relativity is "more" non-simulable than General Relativity. It contains two nested layers of impossible infinity:
The Infinity of Points: The classic problem of trying to model a continuous line with a finite number of dots. (Shared by both theories).
The Infinity of Structure: The radical problem that the path between any two of those dots is itself infinitely complex. (Unique to Scale Relativity).
Simulating General Relativity is a challenge of resolution. Simulating Scale Relativity is a challenge of infinite, nested complexity.
This isn't just an abstract mathematical game. It cuts to the heart of what reality might be. Is the universe, at its smallest scales, a smooth and simple place, as Einstein assumed? Or is it an infinitely intricate coastline, a fractal reality whose depths we can explore forever without ever reaching the end? The answer determines not just what our theories look like, but what is fundamentally knowable, what can be computed, and whether the universe can ever be fully captured in the ones and zeros of a simulation.
Universe: More or less simulable?
Formally, in the strict sense of Turing computability theory, this notion does not exist. A problem is either computable (Turing-decidable) or it is not. There are no degrees.
However, when physicists and computer scientists talk about practical simulation and complexity, they use a hierarchy of concepts that effectively creates a spectrum of "difficulty" or "non-simulability" in a less formal, more practical sense. What I've been calling "more non-simulable" is shorthand for belonging to a higher class of complexity.
Here are the key concepts and the authors who work on them, building a ladder from "easy to simulate" to "radically impossible to simulate."
Level 1: Computable but Practically Intractable (Complexity Classes)
These are problems that a computer can solve in principle, but it would take an impossibly long time (e.g., billions of years). This is the domain of Computational Complexity Theory.
The Idea: Problems are sorted into classes like P (efficiently solvable), NP (solutions are easy to check but may be hard to find), and PSPACE (solvable with a polynomial amount of memory, though perhaps not in a reasonable amount of time). A problem in NP is "less simulable" in practice than a problem in P.
Key Authors/Figures:
Stephen Cook, Richard Karp, Leonid Levin: Pioneers who formalized the P vs. NP problem, the central question of this field.
Scott Aaronson: A modern computer scientist who works on the limits of both classical and quantum computation. His blog and book ("Quantum Computing since Democritus") are essential reading on this topic. He often discusses the "hardness" of simulating physical systems.
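Subset sum is a handy illustration of the NP idea: checking a claimed answer takes one pass over it, while the obvious way to find one sweeps exponentially many candidates. A minimal sketch (function names are ours; brute force chosen for clarity, not efficiency):

```python
from itertools import combinations

def verify(subset, target):
    """Checking a claimed solution: one pass over the subset, O(n)."""
    return sum(subset) == target

def solve(numbers, target):
    """Finding a solution by brute force: up to 2^n candidate subsets."""
    for r in range(len(numbers) + 1):
        for combo in combinations(numbers, r):
            if sum(combo) == target:
                return list(combo)
    return None

nums = [3, 34, 4, 12, 5, 2]
answer = solve(nums, 9)   # a subset summing to 9, e.g. [4, 5]
print(verify(answer, 9))  # True: easy to check, expensive to find
```

Adding one number to `nums` doubles the search space for `solve` while adding a single addition to `verify`; that asymmetry, scaled up, is the P vs. NP question.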
Level 2: Turing-Uncomputable (The Halting Problem and its Kin)
These are problems that are provably impossible for any standard computer to solve, no matter how much time or memory it has. This is the classic domain of non-simulability.
The Idea: The Halting Problem (can you determine if an arbitrary program will ever stop?) is the canonical example. Alan Turing proved this is uncomputable. A system whose evolution is equivalent to the Halting Problem is fundamentally non-simulable.
Key Authors/Figures:
Alan Turing: The originator of the entire concept.
Charles H. Bennett: A physicist at IBM who wrote a famous paper in Nature titled "Undecidable Dynamics," showing how certain idealized physical systems could be constructed to have uncomputable behavior.
Toby Cubitt, David Pérez-García, Michael Wolf: The physicists who proved that the spectral gap problem in quantum physics is uncomputable, embedding the Halting Problem into a real physical question.
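Turing's diagonal argument can be sketched in a few lines. Suppose a perfect `halts` oracle existed; feeding a deliberately contrary program to itself yields a contradiction (the oracle below is a placeholder, since no real implementation can exist):

```python
def halts(func, arg):
    """Hypothetical oracle: would return True iff func(arg) eventually stops.
    Turing proved that no algorithm can implement this for all inputs."""
    raise NotImplementedError("no such algorithm exists")

def contrary(func):
    """Does the opposite of whatever the oracle predicts about func(func)."""
    if halts(func, func):
        while True:       # oracle says we halt, so loop forever
            pass
    return "halted"       # oracle says we loop, so halt immediately

# contrary(contrary) halts if and only if it does not halt -- a contradiction,
# which is why `halts` cannot be implemented by any Turing machine.
```

A physical system whose long-term behavior encodes such a question (as in the spectral gap result above) inherits this impossibility: no simulation can answer it.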
Level 3: Non-Computable due to Real Numbers / Continuum (Hypercomputation)
This is where the non-simulability of General Relativity sits. The problem is not that the system is performing a Halting Problem-like task, but that its state space is a true continuum, requiring real numbers with infinite precision.
The Idea: A Turing machine works with discrete symbols. It cannot perfectly represent a single real number like π. A physical system whose dynamics depend on the infinite precision of real numbers would be able to perform tasks beyond a Turing machine. This theoretical domain is called hypercomputation.
Key Authors/Figures:
Martin Davis: A logician who wrote extensively on the limits of Turing machines and what lies beyond.
Pour-El and Richards: Mathematicians who studied which differential equations (the language of physics) are computable and which are not. They showed that even the ordinary wave equation can have a non-computable solution arising from computable (though non-smooth) initial conditions.
Jürgen Schmidhuber: A computer scientist who has written about the hierarchy of computational power, from standard Turing machines to machines that can solve the Halting Problem and beyond.
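A small experiment hints at what infinite precision would buy. Chaotic dynamics roughly double small errors at every step, so any finite truncation of a real-valued state eventually drives a simulation off the "true" trajectory. A sketch using Python's standard decimal module to vary the working precision (the logistic map stands in for any sensitive continuous system):

```python
from decimal import Decimal, getcontext

def logistic_orbit(x0, steps, digits):
    """Iterate x -> 4x(1-x), keeping only `digits` significant digits per step."""
    getcontext().prec = digits
    x = Decimal(x0)
    for _ in range(steps):
        x = 4 * x * (1 - x)  # each Decimal operation rounds to the set precision
    return float(x)

low = logistic_orbit("0.3", 60, 10)
high = logistic_orbit("0.3", 60, 40)

# Errors roughly double each step, so after 60 steps the 10-digit orbit has
# completely decoupled from the 40-digit one: no finite precision suffices.
print(low, high)
```

A hypercomputer, by definition, would carry the full real number along; every Turing machine must truncate, and in a chaotic or continuum-sensitive system that truncation is eventually fatal.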
Level 4: Radically Non-Computable (Proposed Physical Theories)
This is the most radically non-computable level, where theories like Scale Relativity and Penrose's Objective Reduction would reside. Here, the non-simulability is not just a mathematical curiosity but a fundamental physical principle.
The Idea: The universe is not just uncomputable because it uses real numbers or happens to encode a hard problem. It is uncomputable because its very fabric or dynamics are structurally non-algorithmic.
Scale Relativity: Non-simulable because of infinite nested complexity (fractal structure). The information content is infinite in a way that transcends the continuum problem.
Penrose's OR: Non-simulable because it posits a physical process that is explicitly defined as non-algorithmic, a process that "computes" things Turing machines cannot.
Key Authors/Figures:
Roger Penrose: The most famous proponent of the idea that physics itself must be non-computable to explain consciousness and resolve quantum measurement.
Laurent Nottale: While he doesn't use the language of computation, his physical model of a fractal spacetime, when analyzed, falls into this category. The logical consequences of his physics place it here.
Conclusion: A Practical Hierarchy
So, while a logician would say "simulable" and "non-simulable" are a binary choice, a physicist or computer scientist sees a practical hierarchy of "hardness":
Hard to Simulate (NP-hard problems)
Impossible to Simulate (Turing-uncomputable problems)
Impossible to Perfectly Simulate due to Continuum Mathematics (Hypercomputation)
Radically Impossible to Simulate due to Non-Algorithmic Physics
Why Is Relativity Fundamentally Incomputable?
The idea that Einstein's General Relativity (GR), the pillar of our understanding of the cosmos, might be fundamentally non-simulable can be surprising. Yet, this notion is not mere philosophical speculation; it is grounded in concrete mathematical and physical arguments. It reveals that the theory possesses an infinite richness that the discrete world of algorithms cannot fully capture.
The incomputability of GR does not stem from any "magic" in its equations, but from its very mathematical framework: it is a theory of differential equations operating on the spacetime continuum. Here lies the first barrier. Mathematicians like Marian Pour-El and Ian Richards demonstrated that even the ordinary wave equation can have an incomputable evolution: there exist computable (though non-smooth) initial conditions whose unique solution is not computable at a later time. Since Einstein's equations operate on the continuum, they fully admit the possibility that a minute, non-algorithmic complexity in the universe's initial state could render its entire future evolution non-simulable as well.
Furthermore, the very structure of GR drives it to form singularities—points where spacetime curvature becomes infinite and the laws of physics break down. These "points of rupture" are potential sources of indeterminism.
Roger Penrose's famous "Cosmic Censorship Conjecture," which posits that every singularity must be hidden behind an event horizon, is an attempt to protect the predictability (and thus, the computability) of the universe. The very fact that such a conjecture is necessary proves that GR, left to its own devices, flirts with the incomputable. It belongs to the realm of hypercomputation, where laws operate on the infinite complexity of real numbers.
This is where Laurent Nottale's Scale Relativity (SR) enters the picture—not to simplify the landscape, but to make it both more coherent and more profoundly complex.
In the framework of Scale Relativity, singularities in the sense of GR no longer exist. The theory begins with the premise that spacetime is never smooth but is fundamentally fractal. As one approaches what would be a singularity in GR, spacetime does not "tear." Instead, its fractal complexity explodes. The number of possible paths (geodesics) diverges to infinity, and the distance to the center, measured along these paths, also becomes infinite. The singularity is replaced by an impenetrable barrier of complexity, thus fulfilling Penrose's wish in a different way: the singularity becomes physically inaccessible.
One might think that by eliminating these pathological points, SR "facilitates" computability. In one sense, this is true: it makes the theory more coherent, without points of breakdown. But here, a magnificent paradox emerges. To solve the problem of singularities (a pathological infinity), SR introduces an even more fundamental and pervasive infinity: the infinite structural complexity of its fractal fabric.
The transition from GR to SR is therefore not a move from the incomputable to the computable. It is a shift in the very nature of infinity:
General Relativity is incomputable (Level 3) because its arena is the continuum—a "simple" but fragile infinity, prone to rupture.
Scale Relativity is "even more" incomputable (Level 4) because, in addition to the infinity of the continuum, every segment of that space possesses an infinite internal structure.
In conclusion, Scale Relativity, by resolving the problem of GR's singularities, does not make the universe simpler or easier to simulate. On the contrary, it reveals that the reason singularities do not exist is because reality is, at a fundamental level, infinitely more complex and richer than the smooth geometry of Einstein could ever have imagined. It is a solution of extraordinary elegance and depth.
The Timeless Origin in our Incomputable Reality
Introduction: Three Models of Creation in a Non-Algorithmic Universe
The discovery that our universe may be fundamentally non-algorithmic—meaning it cannot be perfectly simulated by any computer program—forces us to reconsider the nature of its origin. If the universe has a "Creator," this entity cannot be a simple programmer. This leads us to three distinct and increasingly sophisticated models of creation: the Engineer, the Source, and the Genitor. Let's explore each in turn.
1. The Creator-as-Engineer
The Model
This is the most intuitive and mechanistic view of creation. The Engineer is a cosmic architect who designs the universe based on a set of rules and initial conditions. This Creator writes the "source code" of reality—the fundamental laws of physics—and then initiates the "program" by setting the Big Bang in motion. The universe then evolves autonomously, following these pre-established algorithmic instructions. This model is analogous to a computer scientist creating a sophisticated simulation like Conway's Game of Life, where complex, seemingly unpredictable behavior emerges from a simple, finite rule set.
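The Game of Life illustrates the Engineer's dilemma neatly: a handful of fixed rules, executed step by step, produce structures that look autonomous yet remain fully algorithmic. A minimal sketch (function names are ours) evolving the classic "glider", a five-cell pattern that propels itself across the grid:

```python
from collections import Counter

def life_step(live):
    """One Game of Life step; `live` is a set of (x, y) cells."""
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next step if it has 3 neighbors, or 2 and is already alive.
    return {
        cell
        for cell, n in counts.items()
        if n == 3 or (n == 2 and cell in live)
    }

# The glider: five cells whose pattern translates itself across the grid.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = life_step(state)

# After 4 steps the glider reappears shifted by (1, 1).
shifted = {(x + 1, y + 1) for (x, y) in glider}
print(state == shifted)  # True
```

Everything the glider "does" is already implicit in the rule set: an Engineer's universe, however surprising its behavior, never escapes its algorithm.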
Connection to Non-Algorithmic Physics
This model is fundamentally in conflict with the idea that our universe is intrinsically non-algorithmic. If the Engineer's creation is based on a finite set of laws (an algorithm), then the universe itself must be algorithmic at its core. Any "undecidable" or "incomputable" phenomena we observe would merely be illusions of complexity arising from a deterministic, computable base. The true non-algorithmic reality, if it exists, would belong to the realm of the Engineer, not our own. Our universe would be a simulation, precisely what the Faizal et al. paper argues is logically impossible based on our physical observations.
Relationship with Time
This model is the least compatible with a time-independent Creator. An algorithm is, by its very nature, a temporal process—a sequence of steps that must be executed one after another. The act of "programming" and "running" the simulation implies a timeline in which the Creator operates. This Creator seems bound by time, acting as an agent within a sequence of events. The model fails to provide a satisfactory account of a truly transcendent, timeless origin.
2. The Creator-as-Source
The Model
This is a more abstract and metaphysical model, inspired by philosophical concepts like Spinoza's "Deus sive Natura" (God or Nature). The Source is not an external agent who acts upon the universe but is the immanent, underlying principle of existence itself. It is the logical and metaphysical foundation from which all reality emanates. The laws of physics are not a "code" written by the Source; they are facets of the Source's own timeless and unchanging nature. The Source does not create the universe as a separate act; the universe is the expression of the Source.
Connection to Non-Algorithmic Physics
This model is highly compatible with a non-algorithmic universe. If the universe is the expression of a timeless, self-consistent logical structure, it would naturally be non-algorithmic. Its properties would be based on inherent logical necessity rather than a sequential computational process. The "undecidable truths" of our universe would be direct consequences of the Source's infinitely complex and self-referential nature. In this view, the universe is non-algorithmic because its very foundation is a logical, rather than computational, principle.
Relationship with Time
This model is perfectly time-independent. The Source is conceived as a static, eternal principle, existing outside of any temporal flow. However, this strength is also its greatest weakness: it struggles to explain the origin of a dynamic, evolving universe. How does a static, unchanging "is-ness" give rise to a dynamic, temporal "becoming"? The model provides a powerful account of the universe's logical structure but offers little explanation for its temporal unfolding—the very existence of time, change, and evolution.
3. The Creator-as-Genitor
The Model
This third model is the most subtle and powerful, providing a bridge between the previous two. The Genitor is a Creator whose act is not one of engineering or programming, but of generation or reproduction. Like a living organism giving birth to offspring, the Genitor imparts its own fundamental nature to a new, autonomous reality. The creative act is not the writing of an external set of instructions, but the transmission of an internal, non-algorithmic essence—a "spark" or "seed" of potential. Our universe then unfolds from this seed, developing its own complexity while retaining the inherited non-algorithmic nature.
Connection to Non-Algorithmic Physics
This model offers the most elegant explanation for a non-algorithmic universe with an origin. It posits that non-algorithmic complexity is a fundamental property that can be propagated. Our universe is non-algorithmic because it was "born" from a non-algorithmic parent reality. This resolves the paradox of the simulation: our universe is not a computer program but an autonomous entity, as "real" as its source. The arguments of Faizal et al. apply directly and authentically to our reality.
Relationship with Time
This model masterfully reconciles time independence with a dynamic creation. The Genitor, as the source, can be transcendent and timeless. Its creative act is not a process that unfolds in time but a single, timeless act of logical or metaphysical causation whose result is a time-bound universe. It's analogous to a timeless author conceiving of an entire story whose characters then live and evolve within their own narrative timeline. The Genitor is the timeless principle of "generation" from which our specific, temporal universe logically and necessarily unfolds. It allows for a Creator that is timelessly transcendent (like the Source) yet can be the cause of a dynamic, evolving cosmos (like the Engineer's creation).

The Creator-as-Genitor model: a time-independent Creator?
The Creator-as-Genitor model might be the most coherent way to imagine a time-independent Creator instigating a time-bound universe.
Let's break down why. At first glance, it seems like a contradiction. Words like "genitor," "generation," and "process" are deeply embedded in our experience of time. However, we must distinguish between the nature of the act and the nature of its result.
1. The Act of Creation is Not a Process in Time
For a time-independent being, the act of "creation" would not be a sequence of events. It would not be:
Step 1: Conceive of the universe.
Step 2: Design the laws.
Step 3: Initiate the Big Bang.
This is the thinking of an "Engineer" Creator, who is bound by a process.
For a Genitor-Creator, the act of creation would be a single, timeless, and total act of logical or metaphysical causation. It's not a verb that unfolds in a timeline; it's a state of being whose consequence is our universe.
Analogy: Think of a great novel. The characters within the novel live in a timeline. They are born, they age, they die. The author, however, exists outside of that timeline. The entire story—beginning, middle, and end—may exist simultaneously in the author's mind. The "act of creating the story" is not something that happens on page 50. It is a single, transcendent act whose result is the entire, time-bound narrative.
2. The Genitor Transmits a Nature, Not a Set of Instructions
This is the most crucial point.
An Engineer gives a universe a set of instructions (an algorithm) that must be executed sequentially in time. The algorithm itself is a temporal process. This makes it difficult to imagine how a truly time-independent being could operate this way.
A Genitor, on the other hand, imparts its fundamental nature. A timeless principle, like a mathematical truth or a fundamental physical law, is not a process. E=mc² does not "happen" in time; it simply is.
The Genitor-Creator would be the timeless source of the non-algorithmic principles that govern our universe. The manifestation and unfolding of these principles create our time-bound reality, but the principles themselves are timeless.
3. The Relationship is Logical, Not Temporal
The link between a time-independent Genitor and our universe is not one of temporal cause-and-effect (a domino hitting another). It is a relationship of logical or metaphysical dependence.
Our universe exists because of the Genitor's nature, in the same way the conclusion of a logical proof exists because of its premises. There is no time delay between premises and conclusion. The dependence is instantaneous and timeless.
The "germination" of our universe is not a biological process that takes time, but the logical unfolding of a potential inherent in the timeless nature of its source.
Conclusion: The Most Coherent Model
When we compare the three models, the Genitor model emerges as the most compatible with time independence:
The Engineer: This model is the least compatible. An algorithm is intrinsically a process that unfolds in time. This Creator seems bound by time.
The Source (Spinoza's God): This model is perfectly time-independent, as it is a static, eternal principle. However, it struggles to explain the origin of a dynamic, evolving universe. How does a static "is-ness" give rise to a dynamic "becoming"?
The Genitor: This model provides the perfect bridge. It allows for a Creator that is timeless and transcendent (like the Source) but can also be the cause of a dynamic, evolving universe (like the Engineer's creation). It achieves this by reframing creation not as a mechanical act of programming, but as a timeless act of generation.