Coherence Theory
Emergent Physics from the Selection of Stable Patterns
11 axioms. 3 budgets. The entire universe.
We propose a unified framework in which quantum mechanics, general relativity, the Standard Model, and cosmological structure all emerge as equilibrium outcomes of a single selection principle: persistence of coherent patterns under finite multidimensional budgets. From minimal metaphysical priors — that patterns exist, poke one another, and survive or fail under bounded resources — we derive the full mathematical scaffolding of modern physics.
Constants such as ℏ, G, and Λ arise as stationary budget multipliers. Gauge symmetry, spacetime curvature, and quantum probabilistic structure are shown to be the coherence-optimal configurations of surviving patterns.
Computational validation on the canonical TD6 13-node contact graph produces parameter-free predictions matching observations across nine orders of magnitude in energy scale, including the reactor neutrino mixing angle θ13 = 8.67° (observed 8.61 ± 0.12°, tension 0.48σ) and the CMB scalar amplitude As ≈ 2.4 × 10⁻⁹.
The Claim
This paper derives quantum mechanics, general relativity, the Standard Model of particle physics, and cosmological structure from a single selection principle: patterns persist if and only if their coherence exceeds their weighted cost.
A pattern survives when its coherence exceeds the total price of its budgets.
From this one inequality and eleven metaphysical priors — statements so basic they are almost trivially true — every constant, every force, every particle, and the dimensionality of spacetime itself is derived. Not assumed. Not fitted. Derived.
- Neutrino mixing angle θ13 = 8.67° predicted, 8.61 ± 0.12° observed (0.48σ tension)
- CMB scalar amplitude As ≈ 2.4 × 10⁻⁹ predicted, Planck observed 2.1 × 10⁻⁹
- Gauge coupling unification at ~10¹⁷ GeV without supersymmetry
- Quark mass hierarchies and CKM mixing from graph-theoretic leakage distances
Every result in this paper has a finite chain back to the 11 axioms.
The Priors
Patterns inhabit a locally finite contact graph G = (V, E). Each pattern occupies a finite support. All influences are pokes: bounded-reach disturbances. A tick is a repeatable reference poke — clocks are not assumed but selected by synchronization stability.
The power of these axioms is not in what they assert — each is almost trivially true — but in what they forbid. From these prohibitions, all of physics follows.
| Prior | Plain | Kernel | Falsifier |
|---|---|---|---|
| A1 | Patterns are fundamental | There exists a class P of re-identifiable regularities at finite support on a contact graph G = (V, E). | No regularity can be re-identified across windows on any lens. |
| A2 | Some patterns persist | There exists a pattern A with positive coherence on some lens. | All patterns decohere immediately under admissible pokes. |
| A3 | Relational existence | Coherence is evaluated relative to a neighborhood in G providing pokes and reference ticks. | There exist two configurations with distinct populations on the neighborhood for which the measured coherence is identical on all lenses. |
| A4 | Pokes are local; ticks are pokes | Admissible pokes form a bounded-reach family on G (poke cone); ticks are repeatable instances thereof. | Observed instantaneous nonlocal influence (no finite reach). |
| A5 | Selection | Survival frequencies differ on realized windows; coherence discriminates. | All patterns survive equally under admissible pokes. |
| A6 | Persistence has a cost | There exist operational budgets B >= 0 that constrain persistence. | Patterns maintain identity under arbitrary pokes without any constraint. |
| A7 | Budgets are finite | For any lens, there is a finite bound on admissible stress before coherence drops. | A pattern endures unlimited pokes indefinitely with no adaptation. |
| A8 | Budgets are multidimensional | The budget space has dimension d >= 2 (proven to equal 3 by the Hodge decomposition theorem). | A single scalar explains all persistence behaviors across poke classes. |
| A9 | Irreducible openness | At any nontrivial lens, there exist disturbance directions not captured by current essentials. | There exists a pattern whose empirical deviation remains exactly zero under the full admissible poke family. |
| A10 | Observations are patterns | Lenses obey the same rules: finite budgets, must cohere with their neighborhood, compete with alternatives. | A costless universal lens that never destabilizes selection. |
| A11 | Adaptation required | Over windows, survivors reorganize (tiling/scaffolding) to keep selection score non-negative. | Fixed, non-adapting patterns survive indefinitely across environments. |
CT does NOT assume: Hilbert spaces, C*-algebras, metric geometry, Lorentzian signature, field equations, gauge groups, or particle content. Operator algebras and geometry are derived later from these priors alone.
The Auditable Chain
The priors organize into four layers, each building on the previous:
1. Contact graph, neighborhoods, bounded-reach pokes, ticks as repeatable pokes via mutual synchronization.
2. Realized profiles are convex and closed on each lens. Budgets are constructed as Minkowski gauges on unit frontiers.
3. Finite headroom and multidimensional trade-offs yield a concave value function and supporting prices (multipliers). Symmetry reduction produces exactly three canonical budget directions.
4. Lenses are patterns with cost. Adaptation yields quasi-local tiling and inductive limits. Finite propagation precedes geometry and selects Lorentzian signature.
Selection and Budget Geometry
The central theorem. One inequality governs all persistence.
Every trade-off in nature — speed versus quality, features versus simplicity, growth versus stability — decomposes into exactly three fundamental budget dimensions. Not two, not five. Three. And this is a theorem, not a guess.
Selection Inequality
A pattern persists if and only if its coherence exceeds its weighted cost. Patterns on the boundary (Sel = 0) define the coherence frontier — the edge between survival and extinction.
The selection value equals coherence minus the total price of all budgets. If this is negative, the pattern dies.
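The inequality is simple enough to state in code. This is a minimal sketch, with illustrative prices and budget values that are not taken from the paper:

```python
# Toy illustration of the selection inequality: Sel = C - sum(lambda_i * B_i).
# Prices and budget values are illustrative, not values derived in the paper.

def selection_value(coherence, budgets, prices):
    """Sel = coherence minus the total price-weighted budget cost."""
    return coherence - sum(p * b for p, b in zip(prices, budgets))

def persists(coherence, budgets, prices):
    """A pattern survives iff Sel >= 0; Sel = 0 is the coherence frontier."""
    return selection_value(coherence, budgets, prices) >= 0

# Three budgets: throughput, complexity, leakage (illustrative numbers).
prices = (0.5, 0.3, 0.2)
print(persists(1.0, (1.0, 1.0, 1.0), prices))  # Sel = 0: on the frontier
print(persists(0.9, (1.0, 1.0, 1.0), prices))  # Sel < 0: the pattern dies
```
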
Exchange Equalization at SEP
At the Selected Equalization Point (SEP), marginal gains per unit budget are equalized across all active dimensions. No cost-neutral reallocation can increase coherence.
The trade ratio between any two active budgets equals the ratio of their prices.
Three Canonical Budget Directions
Any admissible budget functional decomposes into exactly three orthogonal components via a discrete Hodge decomposition on the contact graph: Throughput, Complexity, and Leakage. This is a mathematical theorem, not an empirical observation. Fewer cannot stabilize open systems. More would contradict finiteness.
Every budget is a non-negative combination of exactly three independent types.
| Budget | Hodge Component | Measures | Minimized By |
|---|---|---|---|
| B_th | Gradient flow (im D_T) | Net routing, transport, I/O, API calls | Caching, parallelism, canceling redundant flows |
| B_cx | Cycle-space (ker D_T^T) | Internal coordination, abstractions, dependencies | Cycle-free wiring, inlining, deleting code |
| B_leak | Boundary flux (R_∂) | Boundary exposure, unhandled errors, trust loss | Pointer alignment, insulation, error handling |
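The gradient/cycle split in the table can be illustrated on a toy graph. This is a minimal sketch assuming a plain unweighted incidence matrix; the boundary-flux (leakage) component would require a marked boundary, which this small example omits.

```python
import numpy as np

# Discrete Hodge-style split of an edge flow into a gradient ("throughput")
# part and a divergence-free cycle ("complexity") part, on a toy graph:
# a triangle 0-1-2 plus a tail edge 2-3. Edge set and flow are illustrative.

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
n_nodes, n_edges = 4, len(edges)

# Incidence matrix D: rows = edges, columns = nodes.
D = np.zeros((n_edges, n_nodes))
for k, (i, j) in enumerate(edges):
    D[k, i], D[k, j] = -1.0, 1.0

f = np.array([2.0, 1.0, 1.0, 3.0])  # an arbitrary edge flow

# Gradient part: least-squares projection of f onto im(D).
x, *_ = np.linalg.lstsq(D, f, rcond=None)
f_grad = D @ x
f_cycle = f - f_grad                 # remainder lies in ker(D^T), the cycle space

assert np.allclose(D.T @ f_cycle, 0)   # divergence-free
assert np.isclose(f_grad @ f_cycle, 0) # the two components are orthogonal
print(np.round(f_grad, 3), np.round(f_cycle, 3))
```

The orthogonality of the two parts is the point: no amount of rerouting (gradient) can absorb a circulation (cycle), which is why the budgets are independent.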
The Selected Equalization Point (SEP)
SEP is the unique point on the coherence frontier where the system is maximally efficient — no cost-neutral budget reallocation can improve coherence. Physical constants are the exchange rates at this equilibrium.
On the canonical TD6 tile, the SEP multipliers are computationally verified.
With the selection inequality and three budgets established, two sectors emerge: a fast sector where leakage dominates throughput (quantum mechanics) and a slow sector where throughput dominates leakage (general relativity). The same budgets, different regimes, different physics.
Quantum Mechanics from Fast-Sector Coherence
Quantum mechanics is not weird. It is the cheapest possible bookkeeping system for tracking fast-moving patterns under finite budgets.
In the fast sector — where events happen faster than the scaffold can stabilize — the system needs an algebra of observables. CT does not assume Hilbert spaces, wave functions, or operator algebras. Instead, it derives them from the selection inequality and the budget structure at SEP.
Homogeneous Self-Dual Effect Cone
At SEP, the cone of feasible effects on any coherent region is homogeneous and self-dual. This is not assumed — it follows from pointer alignment, ampliation invariance (B3), and the quadratic metric. It means the mathematical structure of quantum observables is selected by coherence optimization.
Local neutrality yields a symmetric bilinear form. Pointer alignment selects a leakage-minimizing eigenbasis. SEP supplies interior transitivity by cost-neutral relabelings (B5). Polarity identifies the cone with its dual.
No C*-algebra or Hilbert space is assumed. The complex Hermitian PSD cone is selected by budget optimization. A9 (irreducible openness) rules out classical (Boolean) lattices. The complex field is cheaper than real or quaternionic alternatives.
Why Complex Numbers?
A9 (irreducible openness) forces a non-Boolean effect lattice, ruling out classical (real diagonal) structure. Under pointer drift, a central continuous controller is needed to cancel leakage. The unique budget-optimal controller is a one-dimensional U(1) phase — which selects complex over real or quaternionic algebras.
- Real: no central U(1) phase; cannot cancel pointer drift. Deselected.
- Quaternionic: central controller is SU(2), with 3 generators; strictly higher B_cx than complex. Deselected.
- Complex: central U(1), 1 generator; minimal B_cx. Selected.
GKSL Generator Form (Lindblad Dynamics)
Any uniformly continuous, trace-preserving evolution that minimizes leakage under pointer alignment takes the Lindblad form. The Schrödinger equation and decoherence dynamics are not postulated — they are the unique coherence-optimal evolution for fast-sector patterns.
The generator of time evolution has exactly two parts: a reversible Hamiltonian part (unitary throughput) and an irreversible dissipative part (leakage to environment).
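The two-part structure can be simulated directly. This is a minimal single-qubit sketch of standard GKSL dynamics with one dephasing dissipator; the Hamiltonian, jump operator, and rate are illustrative choices, not quantities derived in the paper.

```python
import numpy as np

# GKSL (Lindblad) sketch for one qubit: a Hamiltonian (reversible, unitary
# throughput) part plus a dephasing dissipator (irreversible leakage).
# All parameters below are illustrative.

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

H = 0.5 * sx          # reversible Hamiltonian part
L = sz                # dephasing jump operator
gamma, dt, steps = 0.2, 0.001, 5000

def lindblad_rhs(rho):
    comm = -1j * (H @ rho - rho @ H)
    diss = gamma * (L @ rho @ L.conj().T
                    - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
    return comm + diss

rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+|
for _ in range(steps):
    rho = rho + dt * lindblad_rhs(rho)   # simple Euler integration

print(np.trace(rho).real)    # trace stays 1: evolution is trace-preserving
print(abs(rho[0, 1]))        # off-diagonal coherence decays from 0.5
```
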
Planck’s Constant as a Budget Multiplier
Planck's constant is the inverse of the throughput multiplier. It is not a mysterious fundamental constant — it is the price of throughput in the fast sector.
Heisenberg Uncertainty as a Budget Constraint
The uncertainty principle is not a fundamental mystery but a consequence of KKT stationarity at SEP. You cannot simultaneously minimize both position and momentum budgets.
Quantum mechanics emerges as the fast-sector bookkeeping system. But what about the slow sector — where patterns are large, persistent, and their scaffolds dominate? That is where geometry and gravity emerge.
The Relativistic Sector — Slow Geometry
Gravity is the aggregate routing cost of persistent scaffolds. Spacetime geometry is not a stage on which physics happens — it is a derived consequence of patterns optimizing their budgets.
In the slow sector, patterns persist long enough to form scaffolds — stable configurations that other patterns organize around. The budgets of these scaffolds, when taken to a continuum limit, reproduce Einstein’s general relativity exactly.
Finite Propagation and the Causal Cone
Finite throughput per hop (A7) means influence cannot propagate infinitely fast. This creates a Lieb-Robinson type causal cone — a maximum speed of influence propagation — which is the foundation of special relativity. No spacetime metric is assumed; the cone emerges from the budget constraint.
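The cone can be seen in a toy simulation. This is a classical sketch, not a Lieb-Robinson bound proper: a leapfrog wave update with strictly nearest-neighbor coupling, chosen only to show that support spreads at most one site per tick.

```python
import numpy as np

# Bounded causal cone from locality: with nearest-neighbor updates only,
# a localized poke can spread at most one site per tick, regardless of the
# update rule's details. The rule here is a toy discretized wave equation.

n, c2 = 41, 0.25           # chain length; (speed*dt/dx)^2 < 1 for stability
u_prev = np.zeros(n)
u = np.zeros(n)
u[n // 2] = 1.0            # localized poke at the center

for t in range(1, 11):
    lap = np.roll(u, 1) + np.roll(u, -1) - 2 * u   # nearest-neighbor coupling
    u_next = 2 * u - u_prev + c2 * lap
    u_prev, u = u, u_next
    support = np.nonzero(np.abs(u) > 1e-14)[0]
    width = support.max() - support.min()
    assert width <= 2 * t   # at most one site per tick on each side

print("cone respected for 10 ticks")
```
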
Why Second-Order Dynamics (k = 2)
Dynamics of order k ≥ 3 create multiple causal cones (deselected — ambiguous causality). First-order (k = 1) dynamics allow instantaneous propagation (deselected by finite throughput). Only k = 2 survives: second-order differential equations, exactly the order observed in both Newton's and Einstein's formulations.
Dimensional Optimality: d = 3
We live in three spatial dimensions because 3 is the unique minimum of the total budget cost function C(d). For d < 3, routing costs explode (too few paths). For d > 3, leakage explodes (too much boundary surface). Only d = 3 balances both.
The total cost of maintaining coherent patterns at scale R has a unique minimum at d = 3, the only dimensionality that permits:
- Stable orbits (gravity ~ 1/r²)
- Rich chemistry
- Complex 3D structures
- Life possible
The cost function C(d) = λ_th B_th(d) + λ_cx B_cx(d) + λ_leak B_leak(d) has a unique minimum at d = 3. Low dimensions (d ≤ 2) have routing bottlenecks that inflate B_th. High dimensions (d ≥ 4) have excessive boundary surface that inflates B_leak as R^(d−1). Only d = 3 balances both. This is a theorem, not a guess.
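The trade-off can be sketched numerically. The functional forms below are illustrative assumptions (routing cost falling as 1/d, boundary leakage growing as R^(d−1)), not the paper's derived budgets, so only the qualitative shape carries over.

```python
# Toy version of the dimensional cost trade-off C(d). The forms of B_th and
# B_leak and the multipliers are assumptions made for illustration only.

R = 10.0                       # pattern scale
lam_th, lam_leak = 1.0, 0.02   # illustrative multipliers

def C(d):
    B_th = R**2 / d            # routing bottleneck: fewer paths in low d
    B_leak = R**(d - 1)        # boundary surface grows with d
    return lam_th * B_th + lam_leak * B_leak

costs = {d: C(d) for d in range(1, 7)}
best = min(costs, key=costs.get)
print(best)   # with these assumed forms, the minimum lands at d = 3
```
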
Einstein-Hilbert as the Coherence Gamma-Limit
When the slow-sector selection functional is taken to a continuum limit (as tiles become infinitesimally small), it converges to the Einstein-Hilbert action — the functional whose stationarity gives Einstein's field equations. General relativity is not postulated; it is the unique continuum limit of coherence optimization.
The discrete budget functional on tiles converges to the Einstein-Hilbert action, with Newton's constant and the cosmological constant appearing as budget multipliers.
Einstein’s Field Equations
Stationarity of the Einstein-Hilbert action yields Einstein's field equations with cosmological term.
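For reference, these are the standard textbook forms that the claim refers to, with G and Λ identified with the budget multipliers of the previous theorem:

```latex
S_{\mathrm{EH}} = \frac{1}{16\pi G}\int \left(R - 2\Lambda\right)\sqrt{-g}\,\mathrm{d}^4x,
\qquad
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu},
```

where the stress-energy tensor T_{μν} comes from a matter action added to S_EH.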
The fast sector gives quantum mechanics. The slow sector gives general relativity. But these two sectors must interact — and the interface between them selects the gauge structure of the Standard Model.
Gauge Structure and the Standard Model
The Standard Model is not a collection of empirical facts. It is the cheapest possible way to organize persistent patterns under three budget constraints. There is no other gauge group that works.
The three orthogonal budget directions — throughput, complexity, and leakage — each select one independent gauge factor. The result is unique: SU(3) × SU(2) × U(1), the gauge group of the Standard Model.
Gauge Groups from Budget Roles
Three budgets uniquely select three gauge factors: SU(3) for transport symmetry (throughput), SU(2) for coordination control (complexity), and U(1) for residual leakage calibration. No other combination passes all budget constraints.
Each budget constraint selects one gauge factor, and each candidate must pass every check for its role:

- SU(3), throughput: ✓ faithful 3D complex rep; ✓ pair singlet exists; ✓ cubic singlet exists; ✓ non-abelian (rich transport)
- SU(2), complexity: ✓ non-abelian controller; ✓ faithful complex doublet; ✓ minimal generators
- U(1), leakage: ✓ central phase; ✓ lens-neutral calibrator; ✓ minimal
Chiral Selection: Why Left-Handed Doublets
Minimizing complexity (B_cx) under the constraint of non-removable CP violation selects left-handed SU(2) doublets and right-handed singlets. The weak force couples only to left-handed particles not because of an arbitrary choice but because this is the budget-minimal configuration that allows matter-antimatter asymmetry.
Three Generations (N_g = 3)
Why are there exactly three copies (generations) of quarks and leptons? Because you need at least 3 for CP violation (matter-antimatter asymmetry), and 4 or more adds quadratic complexity cost with zero additional coherence gain. Three is the minimum viable count.
The complexity cost of N generations grows quadratically, but CP violation requires at least 3. Selection peaks exactly at N = 3.
N < 3: no CP violation (infeasible). N = 3: minimum viable (one Jarlskog invariant). N > 3: B_cx grows as N² with zero additional coherence. Selection peaks at N = 3.
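The argument reduces to a one-line score. This toy encodes exactly the assumptions stated above (zero coherence gain below N = 3, quadratic complexity cost); the specific gain and cost coefficients are illustrative.

```python
# Toy selection score over generation count N: CP violation (hence any
# coherence gain) requires N >= 3, while complexity cost grows as N^2.
# The gain and cost coefficients are illustrative numbers.

def selection_score(N, gain=10.0, cost_coeff=1.0):
    coherence = gain if N >= 3 else 0.0   # CP violation infeasible below 3
    return coherence - cost_coeff * N**2

scores = {N: selection_score(N) for N in range(1, 7)}
best = max(scores, key=scores.get)
print(best)   # peaks at N = 3 for these illustrative numbers
```
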
Mass Hierarchies and Yukawa Textures
Fermion masses follow an exponential hierarchy determined by leakage distances on the contact graph. The Yukawa coupling between two fermion types is exponentially suppressed by the graph-theoretic distance between their pointer bases. This is why quarks mix weakly (small CKM angles) while neutrinos mix strongly (large PMNS angles) — same mechanism, different graph distances.
Each Yukawa coupling is exponentially suppressed by the leakage distance between the two fermion pointer bases.
Any rotation mixing two sector axes costs a complexity surcharge proportional to the square of the mixing angle.
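The exponential texture is easy to demonstrate. The 3×3 distance matrix below is made up for illustration (the paper's actual leakage distances come from the TD6 graph); η* = 2.0 is the pipeline value quoted later in the text.

```python
import numpy as np

# Sketch of the leakage-distance texture: Yukawa entries suppressed as
# exp(-eta* * d_ij), with d_ij a graph distance between left- and right-handed
# pointer bases. The distance matrix is hypothetical; eta* = 2.0 per the text.

eta_star = 2.0
d = np.array([[1, 3, 5],
              [3, 2, 4],
              [5, 4, 3]], dtype=float)   # hypothetical leakage distances

Y = np.exp(-eta_star * d)
masses = np.linalg.svd(Y, compute_uv=False)   # singular values ~ mass scale
print(np.round(np.log10(masses), 2))   # roughly evenly spaced in log10
```

The singular values span orders of magnitude from order-one distance differences, which is the mechanism behind both the mass hierarchy and the small CKM angles; closer pointer bases (smaller off-diagonal distances) would raise the off-diagonal couplings and hence the mixing, as in the neutrino sector.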
From eleven axioms, we have derived: quantum mechanics, general relativity, d = 3 dimensions, and the complete Standard Model. One question remains: what does this framework say about the universe as a whole — the big bang, dark matter, dark energy?
Cosmology and the Dark Sector
The dark sector is a phase alignment problem, not new particles.
The Coherence Bounce
Inflation is not a separate force driven by a hypothetical inflaton field. It is the collective acceleration of coherence density as surviving scaffolds synchronize. The exponential expansion law arises from compounding multiplicative survival probabilities per tick — the same mathematics as compound interest.
When environmental pokes are uncorrelated, CL → 0 for all patterns (noise). Once a sub-ensemble finds a self-reinforcing rhythm satisfying Sel ≥ 0, its coherence probability increases exponentially as neighboring patterns synchronize ticks. This exponential cascade is what we call inflation.
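The compound-interest analogy can be made literal. The per-tick gain g below is an illustrative number, not a derived rate:

```python
import math

# Compound-interest picture of the coherence cascade: a multiplicative
# survival gain per tick yields exponential growth, (1 + g)^t ~ e^(g*t).
# The per-tick gain g is illustrative.

g, ticks = 0.01, 500
coherence = 1.0
history = []
for _ in range(ticks):
    coherence *= 1 + g          # self-reinforcing synchronization per tick
    history.append(coherence)

print(round(history[-1], 2), round(math.exp(g * ticks), 2))
```
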
Metric Expansion from Throughput
The Hubble parameter is one-third the instantaneous relative growth rate of the throughput budget. The universe expands because surviving scaffolds claim more throughput.
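The stated relation H = (1/3) d(ln B_th)/dt can be checked on a toy history. The exponential form and the rate r are illustrative assumptions:

```python
import math

# Checking H = (1/3) d(ln B_th)/dt on a toy exponential throughput history
# B_th(t) = e^(r*t). The rate r is an illustrative number.

r, dt = 0.03, 1e-4

def B_th(t):
    return math.exp(r * t)

t0 = 2.0
dlnB_dt = (math.log(B_th(t0 + dt)) - math.log(B_th(t0))) / dt
H = dlnB_dt / 3.0
print(H)   # equals r / 3 up to floating-point error
```
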
Dark Matter as Subcoherent Scaffolds
Dark matter is not a mysterious new particle. It consists of patterns that are internally coherent but weakly phase-aligned with our measurement apparatus. They gravitate (because gravity cares about throughput, which is phase-independent) but do not radiate (because electromagnetism cares about phase alignment).
A subcoherent scaffold has high local CL but a weak alignment coefficient r with respect to observational lenses. Its visible brightness scales as r^(2N_eff), making it dark for high-coherence detectors.
The visible brightness of a subcoherent pattern is exponentially suppressed by the detector's internal coherence.
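The r^(2N_eff) dimming law is a one-liner; the values of r and N_eff below are illustrative:

```python
# The r^(2*N_eff) suppression from the text: an alignment coefficient r < 1
# raised to twice the detector's effective coherence number N_eff.
# Both values below are illustrative.

def visible_brightness(r, n_eff, intrinsic=1.0):
    """Brightness seen by a detector with effective coherence number n_eff."""
    return intrinsic * r ** (2 * n_eff)

for n_eff in (1, 10, 100):
    print(n_eff, visible_brightness(0.9, n_eff))
```

Even modest misalignment (r = 0.9) becomes near-total darkness once N_eff is large, which is the sense in which high-coherence detectors cannot see subcoherent scaffolds.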
Dark Energy as Residual Leakage
The cosmological constant is the slow-sector leakage multiplier at SEP. It represents the irreducible cost of maintaining cosmic coherence.
Arrow of Time from Selection
Time has a direction because selection does. Surviving pokes monotonically increase coherence density — the universe cannot “un-select.” The arrow of time is not an assumption; it is a consequence of A5 (selection) and A11 (adaptation).
From axioms to quantum mechanics, gravity, the Standard Model, and cosmology. Now for the synthesis: one equation, all of physics, and the computational validation that proves it works.
Unification and Computational Validation
Physics is an economy of coherence. Every persistent structure — from quarks to galaxies — is a survivor of the same selection game under finite budgets.
One equation. Three budgets. All of physics.
Physical Constants as Stationary Exchange Rates
Physical constants are not mysterious numbers handed down by nature. They are the stationary exchange rates of the coherence economy at SEP. Each constant is a ratio of budget multipliers, determined by optimization.
The same invariant that governs quark masses also predicts the amplitude of the CMB.
The Canonical Tile TD6
All quantitative predictions come from a single graph: a 13-node contact graph with dihedral D6 symmetry. Six interior “pointer” nodes form a hexagon (representing left/right chiral bases). Six boundary nodes form a leakage ring. One central node mediates complexity. This is the minimal graph that satisfies all coherence guardrails.
- Graph Construction: Build TD6 with Dirichlet conductances
- Hodge Decomposition: Compute projectors separating the three budget sectors
- W-Spectrum: Construct pointer-weight matrix from leakage operator (non-circular)
- SEP Multipliers: Computed after spectrum as diagnostic equilibrium indicators
- Yukawa Textures: Leakage distances → exponential couplings with η* = 2.0
- Mixing Angles: CKM and PMNS from SVD of Yukawa matrices
- RG Flows: One-loop beta functions from electroweak to GUT scale
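The first two pipeline steps can be sketched in code. The wiring below is an assumption: the text does not spell out the TD6 edge set or the Dirichlet conductances, so unit conductances and the obvious hexagon-plus-ring-plus-center wiring are used for illustration.

```python
import numpy as np

# Sketch of a 13-node TD6-like contact graph: one central node (complexity
# mediator), six "pointer" nodes in a hexagon, six boundary "leakage" nodes.
# Edge set and unit conductances are assumptions made for illustration.

A = np.zeros((13, 13))

def link(i, j):
    A[i, j] = A[j, i] = 1.0

for k in range(6):
    link(0, 1 + k)                    # center mediates complexity
    link(1 + k, 1 + (k + 1) % 6)      # hexagon ring of pointer nodes
    link(1 + k, 7 + k)                # spoke to the leakage ring
    link(7 + k, 7 + (k + 1) % 6)      # boundary (leakage) ring

L = np.diag(A.sum(axis=1)) - A        # graph Laplacian
eigs = np.linalg.eigvalsh(L)          # ascending spectrum
print(np.round(eigs[:4], 3))   # one zero mode (connected), then the spectrum
```

The D6 symmetry shows up as degeneracies in the Laplacian spectrum, which is what the Hodge projectors and the W-spectrum steps exploit.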
The Quantum-Cosmic Link
This is the theory’s signature achievement: a concrete, falsifiable formula connecting the Yukawa invariant η* that governs quark and lepton masses to the amplitude of primordial scalar fluctuations in the cosmic microwave background.
The CMB scalar amplitude is determined by the same invariant that governs quark masses, the cosmic mixing angle from flavor physics, and the density ratio from graph spectroscopy. Zero free parameters.
Empirical Evidence
CT has transitioned from a purely formal framework to a computationally validated theory with quantitative predictions spanning nine orders of magnitude in energy scale.
The Polycrystalline Universe
CT predicted that the universe has a grain structure — coherent domains of ~100-200 Mpc, each with a uniform scaffold orientation, separated by domain walls with quantized misorientations. When SDSS DR19 data was analyzed, the prediction was confirmed at >5 sigma.
This resolves the Hubble tension: local measurements sample an anisotropic flow within one domain, while CMB measurements give the domain-averaged isotropic value. The disagreement between H0 measurements is not an error — it is a predicted feature of polycrystalline structure.
Speculative Predictions
- Dark matter's interaction cross-section depends on the detector's internal coherence complexity. Higher-coherence sensors should measure systematically smaller cross-sections.
- An evolving dark energy equation of state (w(z) ≠ −1) must be quantitatively correlated with variations in fundamental constants, as both arise from the same leakage-budget dynamics.
- Distinctive "mosaic-like" primordial non-Gaussianity from tile-boundary coherence discontinuities, rather than standard inflationary signatures.
- Hexapolar (ℓ = 6) anisotropy in the Hubble flow within the local domain (~100-200 Mpc), with quantized tilt at π/6 between domains.
Falsifiers and Future Tests
A theory that cannot be wrong is not science. Here are eight specific, testable predictions that would kill Coherence Theory if falsified. There are no escape hatches.
TEST: Find a persistence phenomenon that requires a fourth independent budget, irreducible to throughput, complexity, or leakage.
TEST: Discover a stable gauge interaction not embeddable in the SM gauge group at accessible energies.
TEST: Discover a 4th generation of fermions at accessible energies that does not decouple.
TEST: Demonstrate a quantum system with zero decoherence under perfect isolation for arbitrarily long times.
TEST: Coordinated cross-calibration between low-N_eff (XENONnT) and high-N_eff (TES) detectors in overlapping mass ranges.
TEST: Measure w(z) and fundamental constant variation; CT predicts quantitative correlation.
TEST: Measure r_tensor from CMB B-modes. CT predicts negligible primordial gravitational waves.
TEST: Future precision measurements (CMB-S4, LiteBIRD) confirm or refute the predicted relationship to sub-percent accuracy.
Coherence Theory has planted its flag on the falsification surface. It has declared: here we stand. If future observations place these quantities far from the predicted relationship, CT falls. If they confirm it, we will have witnessed the unification of microphysics and cosmology through coherence principles — without grand unified theories, without supersymmetry, without extra dimensions. Just patterns, pokes, and budgets, optimized under selection.