
On this page

  • A Floor You Cannot Engineer Away
  • The Good News: Most Quantum Computers Are Safe
  • The Interesting News: Massive Quantum Systems Hit the Wall
  • Three Unmistakable Signatures
  • What This Means for Quantum Error Correction
  • The Testing Zone
  • The $10^{35}$ Test
  • A Recommended Roadmap

Is Gravity the Ultimate Limit on Quantum Computers?

March 22, 2026 · 6 min read
quantum computing · gravity · decoherence · quantum technology · optomechanics · physics

Paper J — Quantum-Geometric Duality Series

Read PDF · HTML · DOI: 10.5281/zenodo.19181597 · All papers →
Quantum-Geometric Duality · Part 11 of 12


Paper J: Quantum Technology Limits from the Quantum-Geometric Duality Series


A Floor You Cannot Engineer Away

Every quantum technology faces the same enemy: decoherence. Quantum states are fragile, and interactions with the environment cause them to decay into classical states, destroying the quantum advantage.

Decades of extraordinary engineering have pushed decoherence further and further back. Better vacuums, lower temperatures, cleaner materials, more sophisticated error correction---the field has gone from nanoseconds of coherence to seconds.

But what if there is a decoherence floor that no amount of engineering can eliminate?

The Diosi-Penrose hypothesis predicts exactly this. If gravity itself causes decoherence, then any massive object in spatial superposition will lose coherence at a rate

$$\Gamma_{\mathrm{grav}} = \frac{GM^2}{\hbar d}$$

where $M$ is the mass and $d$ is the superposition separation. This rate depends on nothing but mass, separation, and fundamental constants. It cannot be reduced by better shielding, lower temperatures, or purer materials. It is a law of nature.
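The rate is simple enough to evaluate numerically. Here is a minimal sketch; the superconducting-qubit numbers used in the example are the order-of-magnitude values quoted in the table in the next section:

```python
# Diosi-Penrose gravitational decoherence rate:
#   Gamma_grav = G * M^2 / (hbar * d)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34   # reduced Planck constant, J s

def gamma_grav(mass_kg: float, separation_m: float) -> float:
    """Decoherence rate (1/s) for mass M in a superposition of separation d."""
    return G * mass_kg**2 / (HBAR * separation_m)

def tau_grav(mass_kg: float, separation_m: float) -> float:
    """Gravitational coherence time (s): the inverse rate."""
    return 1.0 / gamma_grav(mass_kg, separation_m)

# Superconducting qubit: ~1e-23 kg effective mass, ~1 micron separation
tau = tau_grav(1e-23, 1e-6)
print(f"tau_grav ~ {tau:.1e} s")  # ~1.6e16 s, consistent with the table
```

Because the rate scales as $M^2/d$, the same two-line function covers every platform in the table by swapping in its mass and separation.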

The Good News: Most Quantum Computers Are Safe

Before anyone panics: the gravitational decoherence floor is astronomically low for conventional quantum computing platforms.

| Platform | Effective mass | Separation | $\tau_{\mathrm{grav}}$ |
| --- | --- | --- | --- |
| Photonic qubits | $\sim 10^{-36}$ kg | $\sim 1$ m | $10^{48}$ seconds |
| Trapped ions | $\sim 10^{-25}$ kg | $\sim 10$ μm | $10^{21}$ seconds |
| Neutral atoms | $\sim 10^{-25}$ kg | $\sim 1$ μm | $10^{20}$ seconds |
| Superconducting qubits | $\sim 10^{-23}$ kg | $\sim 1$ μm | $10^{16}$ seconds |
| NV centers | $\sim 10^{-26}$ kg | $\sim 1$ nm | $10^{19}$ seconds |

The age of the universe is about $4 \times 10^{17}$ seconds. Superconducting qubits, the workhorse of current quantum computing, have gravitational decoherence times of order $10^{16}$ seconds---within a factor of forty of the age of the universe. Trapped ions are five orders of magnitude safer still. Photonic qubits, being essentially massless, would need to wait $10^{30}$ times the age of the universe.

For these platforms, gravitational decoherence is not a concern, not now and not ever.

The Interesting News: Massive Quantum Systems Hit the Wall

The story changes dramatically for emerging massive quantum technologies. Optomechanical systems---levitated nanoparticles, electromechanical resonators, acoustic resonators---use much heavier objects to achieve quantum behavior.

The quadratic mass scaling ($\Gamma \propto M^2$) is unforgiving. Double the mass and the decoherence rate quadruples. Cross from picograms to nanograms and you lose six orders of magnitude in coherence time.

For a levitated nanosphere of mass $10^{-12}$ kg (one nanogram) in a 10 micrometer superposition:

$$\tau_{\mathrm{grav}} \approx 16 \text{ microseconds}$$

This is comparable to the best coherence times achieved in current optomechanical experiments. For these systems, gravity may already be the dominant decoherence mechanism---or will become so as environmental noise is further reduced.
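Plugging numbers in reproduces both claims above: the 16-microsecond figure for a one-nanogram sphere, and the six-order-of-magnitude penalty for crossing from a picogram to a nanogram (a sketch with constants truncated to four digits):

```python
# Coherence time from the Diosi-Penrose rate: tau = hbar * d / (G * M^2)
G = 6.674e-11      # m^3 kg^-1 s^-2
HBAR = 1.055e-34   # J s

def tau_grav(mass_kg: float, separation_m: float) -> float:
    """Gravitational coherence time (s)."""
    return HBAR * separation_m / (G * mass_kg**2)

d = 10e-6                        # 10 micrometer superposition
tau_ng = tau_grav(1e-12, d)      # one nanogram
tau_pg = tau_grav(1e-15, d)      # one picogram
print(f"nanogram sphere:   tau ~ {tau_ng * 1e6:.0f} us")
print(f"picogram/nanogram tau ratio: {tau_pg / tau_ng:.0e}")  # six orders
```

The ratio is exactly $(10^3)^2 = 10^6$, which is the "six orders of magnitude" quoted for the picogram-to-nanogram jump.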

Three Unmistakable Signatures

How would you know if you have hit the gravitational floor? Three experimental signatures uniquely distinguish gravitational from environmental decoherence:

1. Inverse separation scaling. This is the sharpest discriminant. Gravitational decoherence predicts $\Gamma \propto 1/d$---larger superpositions decohere slower. This is the opposite of every environmental mechanism (photon scattering, gas collisions, blackbody radiation), which all predict $\Gamma \propto d^2$. Vary the separation at fixed mass: if the rate goes down as separation increases, it is gravitational.

2. Material independence. Prepare superpositions of the same mass and separation using different materials---silica, silicon, diamond. Gravitational decoherence depends only on mass and geometry, not on material properties. Environmental decoherence depends on dielectric constants, absorption cross-sections, and surface properties. Identical rates across materials would point to gravity.

3. Temperature independence. As you cool the system, environmental decoherence drops. Gravitational decoherence does not. Below a critical temperature where environmental contributions fall below the gravitational rate, the total decoherence plateaus:

$$\Gamma_{\mathrm{total}}(T) = \Gamma_{\mathrm{grav}} + \Gamma_{\mathrm{env}}(T) \rightarrow \Gamma_{\mathrm{grav}} \quad \text{as } T \rightarrow 0$$

A temperature-independent decoherence plateau would be powerful evidence for a gravitational origin.
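The plateau is easy to see numerically. The toy model below adds a fixed gravitational rate to an environmental rate assumed, purely for illustration, to fall linearly with temperature; the specific coefficient is an invented placeholder, not a value from the paper:

```python
# Toy model of a temperature-independent decoherence plateau.
GAMMA_GRAV = 6.3e4   # 1/s, e.g. a ~1 ng sphere at 10 um; fixed, T-independent

def gamma_env(T_kelvin: float, coeff: float = 1e6) -> float:
    """Illustrative environmental rate (1/s), vanishing as T -> 0.
    The linear-in-T form and the coefficient are assumptions."""
    return coeff * T_kelvin

def gamma_total(T_kelvin: float) -> float:
    """Total rate: environmental contribution plus the gravitational floor."""
    return GAMMA_GRAV + gamma_env(T_kelvin)

for T in (1.0, 0.1, 0.01, 0.001):
    print(f"T = {T:5} K: Gamma_total = {gamma_total(T):.3g} 1/s")
# Below ~0.06 K (where gamma_env drops under GAMMA_GRAV in this model),
# further cooling stops helping: the total pins to the gravitational floor.
```

Any monotonically decreasing $\Gamma_{\mathrm{env}}(T)$ produces the same qualitative plateau; the linear form just makes the crossover temperature explicit.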

What This Means for Quantum Error Correction

If the gravitational decoherence floor is real, it sets hard limits on quantum error correction for massive systems. The gravitational error rate per gate is

$$\epsilon_{\mathrm{grav}} = \frac{t_{\mathrm{gate}}}{\tau_{\mathrm{grav}}} = \frac{GM^2\, t_{\mathrm{gate}}}{\hbar d}$$

For fault-tolerant quantum computing, this must stay below the error correction threshold ($\sim 10^{-3}$). With gate times of about 1 microsecond, this imposes a maximum mass:

| Separation | Maximum mass |
| --- | --- |
| 1 μm | 40 picograms |
| 10 μm | 130 picograms |
| 100 μm | 400 picograms |
| 1 mm | 1.3 nanograms |

Above these masses, quantum error correction cannot keep up with gravitational decoherence, regardless of all other noise sources. This is a fundamental constraint on the scalability of massive quantum technologies.
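Setting $\epsilon_{\mathrm{grav}}$ equal to the threshold and solving for $M$ gives $M_{\max} = \sqrt{\epsilon_{\mathrm{th}}\,\hbar d / (G\, t_{\mathrm{gate}})}$. A quick sketch, using the threshold and gate time quoted above, reproduces the table to rounding:

```python
import math

G = 6.674e-11      # m^3 kg^-1 s^-2
HBAR = 1.055e-34   # J s
EPS_TH = 1e-3      # error-correction threshold
T_GATE = 1e-6      # gate time, s

def max_mass(separation_m: float) -> float:
    """Largest mass (kg) keeping the gravitational error/gate below threshold."""
    return math.sqrt(EPS_TH * HBAR * separation_m / (G * T_GATE))

for d in (1e-6, 1e-5, 1e-4, 1e-3):
    print(f"d = {d * 1e6:6.0f} um -> M_max = {max_mass(d) * 1e15:.0f} pg")
# ~40, ~130, ~400 pg and ~1.3 ng: the table values, to rounding.
```

Note the weak $\sqrt{d}$ dependence: a thousandfold increase in separation buys only about a factor of thirty in allowed mass.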

The Testing Zone

A beautiful irony emerges: testing gravitational decoherence is self-limiting. You need massive objects to see the effect, but more massive objects decohere faster, making them harder to prepare in superposition.

The experimental window is bounded from both sides:

  • Lower bound: the mass must be large enough that gravitational decoherence exceeds environmental noise
  • Upper bound: the superposition must survive long enough to be prepared and measured

For typical experimental parameters, this yields a "gravitational decoherence testing zone" spanning masses from about $10^{-15}$ to $10^{-12}$ kg---picograms to nanograms. This is precisely the regime that next-generation optomechanical experiments are targeting.

The crossover mass---where gravitational and environmental decoherence are equal---sits at about 4 picograms ($4 \times 10^{-15}$ kg) for a 10 micrometer separation with 1 second environmental coherence time. Above this mass, gravity dominates.
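The crossover follows from setting $\Gamma_{\mathrm{grav}} = 1/\tau_{\mathrm{env}}$, which gives $M_\times = \sqrt{\hbar d / (G\, \tau_{\mathrm{env}})}$. A quick check with the quoted parameters:

```python
import math

G = 6.674e-11      # m^3 kg^-1 s^-2
HBAR = 1.055e-34   # J s

def crossover_mass(separation_m: float, tau_env_s: float) -> float:
    """Mass (kg) at which gravitational decoherence overtakes the environment."""
    return math.sqrt(HBAR * separation_m / (G * tau_env_s))

m = crossover_mass(10e-6, 1.0)   # 10 um separation, 1 s environmental coherence
print(f"crossover mass ~ {m * 1e15:.1f} pg")  # ~4.0 pg
```

Improving the environmental coherence time pushes the crossover down only as $1/\sqrt{\tau_{\mathrm{env}}}$, so even a hundredfold cleaner environment lowers the required test mass by just a factor of ten.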

The $10^{35}$ Test

The most dramatic aspect is the sheer size of the experimental discriminant. The $G^1$ (Diosi-Penrose) and $G^2$ (perturbative QFT) predictions differ by a factor of $\sim 10^{35}$ at the microgram scale. An experiment sensitive enough to see $G^1$ decoherence either confirms Diosi-Penrose or rules it out decisively. There is no ambiguous middle ground.

The Diosi-Penrose model is maximally falsifiable: its single parameter ($G$) is fixed by independent measurements, leaving zero freedom to adjust predictions. It either matches experiment or it does not.

A Recommended Roadmap

The paper proposes a three-phase experimental program:

  • Near-term (1--5 years): Precision decoherence measurements on electromechanical resonators ($M \sim 10^{-13}$ kg). Material-independence tests across different substrates.
  • Medium-term (5--10 years): Levitated nanosphere experiments sweeping mass from picograms to nanograms. Systematic measurements of separation and mass dependence.
  • Long-term (10--15 years): Space-based optomechanical experiments for definitive $G^1$ versus $G^2$ discrimination.

The bottom line: conventional quantum computers have nothing to worry about. But the frontier of massive quantum technologies may be approaching a wall that no engineer can breach---because it is built into the fabric of spacetime itself.


This is Paper J of the Quantum-Geometric Duality series, analyzing gravitational decoherence as a fundamental limit on massive quantum technologies.

Marc Sperzel

Builder and independent researcher. MSci Physics, King's College London. Writing about quantum mechanics, gravity, and information theory.


