Deriving Thermodynamics as a Theorem of the McGucken Principle dx₄/dt = ic: Resolving Einstein’s Unease — The Probability Measure as Haar Measure, Ergodicity as Huygens-Wavefront Identity, and the Second Law as Strict dS/dt > 0

Dr. Elliot McGucken — Light Time Dimension Theory — elliotmcguckenphysics.com — April 2026

“More intellectual curiosity, versatility and yen for physics than Elliot McGucken’s I have never seen in any senior or graduate student. Originality, powerful motivation, and a can-do spirit make me think that McGucken is a top bet…” — John Archibald Wheeler, Joseph Henry Professor of Physics, Princeton University

Abstract

After 154 years, the McGucken Principle — which states that the fourth dimension is expanding at the rate of cdx4/dt = ic — at long last derives thermodynamics as a theorem of a foundational physical principle. This paper demonstrates that the simple principle underlying thermodynamics, which Einstein hinted at but never saw, has been found.

Einstein declared classical thermodynamics to be the one theory of universal content he was convinced would never be overthrown. His confidence rested, paradoxically, on a dissatisfaction: thermodynamics had never been fully derived from mechanical first principles. Three gaps in the Boltzmann–Gibbs program remain unresolved in the orthodox account: (i) the probability measure on phase space is postulated rather than derived; (ii) ergodicity must be assumed, despite being unproven or false on positive-measure sets for realistic systems (KAM); and (iii) the Second Law requires an extraordinarily low-entropy past as an unexplained boundary condition — the Past Hypothesis, which Penrose estimates requires one part in 10^10123 fine-tuning of the early-universe Weyl curvature.

This paper closes all three gaps as theorems of the single geometric principle dx₄/dt = ic. The phase-space measure is derived as the unique Haar measure on the spatial isometry group ISO(3) of x4‘s spherically-symmetric expansion, forced rather than postulated (Proposition V.1). Ergodicity becomes a geometric identity: the time-vs-ensemble equality holds because the Huygens wavefront emanating from every event along the trajectory physically realizes the ensemble, independent of metric transitivity and unaffected by KAM-tori obstruction (Proposition VI.1). The arrow of time is derived as dS/dt = (3/2)kB/t > 0 strict for all t > 0 for massive-particle ensembles (Theorem VII.1) and dS/dt = 2kB/t > 0 for photons on the McGucken Sphere (Proposition VII.2) — a strict geometric theorem rather than a statistical tendency. Loschmidt’s reversibility objection is dissolved structurally; the Past Hypothesis is derived rather than imposed; Einstein’s 10^-10123 figure quantifies an improbability under a uniform prior that the geometry of x4-expansion does not select.

The framework adds a single falsifiable laboratory prediction: the Compton-coupling diffusion Dx^{McG} = ε2 c2 Ω/(2γ2), temperature-independent, mass-independent after damping, sharply distinguishable from ordinary thermal diffusion which scales as D_{thermal} = kB T/(mγ) and vanishes as T → 0 (Proposition VII.3). Cold-atom experiments at JILA, NIST, and MIT, trapped-ion experiments, ultracold-neutron storage, and precision atomic clocks each provide a sharp laboratory signature.

The three resolutions descend from the two informational contents of dx₄/dt = ic — its algebraic-symmetry content (the spherical-isotropy group of x4‘s expansion) and its geometric-propagation content (the Huygens wavefront on the McGucken Sphere) — which are the two faces of a single Kleinian object. The probability measure derives from the algebraic content, ergodicity and the arrow of time from the geometric content; the three derivations share nothing beyond the starting principle. The structural framework is that of the seven Kleinian dualities of physics established in [MG-KNC]; the present paper develops the pairing of the time-symmetric Noether conservation laws with the time-asymmetric Second Law, against Einstein’s 1949 admission of incompleteness. The three-part claim — that dx₄/dt = ic is the uniquecomplete, and one and only physical specification from which the thermodynamic resolution proceeds — is discharged by reduction to [MG-KNC, Theorem IX.1] (uniqueness), [MG-KNC, §§II–VIII] (completeness), and [MG-KNC, Theorem I.2] (closure).

Keywords: McGucken Principle; dx₄/dt = ic; Einstein’s unease; probability measure; ergodicity; Second Law; Loschmidt reversibility; Past Hypothesis; McGucken Sphere; Compton-coupling diffusion; Haar measure; Huygens propagation; Boltzmann–Gibbs program.


I. Introduction: Einstein’s Unease and the Seven-Level Context

I.1 Einstein’s 1949 Admission

In his 1949 autobiographical notes Einstein wrote that classical thermodynamics “is the only physical theory of universal content which I am convinced will never be overthrown, within the framework of applicability of its basic concepts” [Einstein1949]. This sentence is often quoted as a tribute. It is better read as a confession. Einstein had spent 1902–1904 deriving statistical mechanics independently of Gibbs [Einstein1902, Einstein1903], and his 1905 Brownian motion paper [Einstein1905] was the decisive empirical vindication of the molecular-kinetic hypothesis. Yet by 1949 he called thermodynamics a theory of principle — explicitly contrasted with constructive theories built from hypothesized microscopic models. The implication is that thermodynamics survives because it has not been successfully reduced to mechanics, not because the reduction has been completed. Three gaps prevent the reduction:

(i) The probability measure problem. Boltzmann and Gibbs postulate the uniform (Liouville) measure on phase space and the principle of equal a priori probabilities. These are postulates; Liouville’s theorem provides preservation given the choice, not justification for the choice. Jaynes’ maximum-entropy reinterpretation [Jaynes1957] relocates the postulate into epistemology without deriving it from dynamics.

(ii) The ergodicity problem. KAM theory [KAM] demonstrates that generic Hamiltonian perturbations of integrable systems preserve a positive-measure set of invariant tori, so the ergodic hypothesis fails on a set of positive measure. Ergodicity is thus not merely unproven; for typical physical systems it is known to be false.

(iii) The arrow-of-time problem. Loschmidt’s 1876 objection [Loschmidt1876] and Zermelo’s 1896 recurrence objection [Zermelo1896] remain structurally unresolved; Boltzmann’s H-theorem requires the Stosszahlansatz, which smuggles irreversibility into a time-symmetric substrate. The Second Law in the orthodox account is rescued only by the Past Hypothesis [Albert2000, Carroll2010, Wallace2013], with Penrose’s one-part-in-10^10123 fine-tuning of the early-universe Weyl curvature [Penrose2004, PenroseENM] as the honest measure of what this costs.

I.2 The Seven-Level Context

The McGucken Principle dx₄/dt = ic closes all three gaps, and it does so as the Level-2 specialization of a broader structural pattern established in [MG-KNC] — the master synthesis paper demonstrating that the same dual-channel structure generates the Seven McGucken Dualities of Physics from a single geometric principle:

LevelChannel A outputChannel B output
1. Foundational QMHamiltonian operator formulationLagrangian path integral
2. Mechanics/ThermoNoether conservation lawsSecond Law + five arrows of time
3. Dynamical QMHeisenberg pictureSchrödinger picture
4. Ontological QMParticle aspectWave aspect
5. Causal/correlational QMLocal microcausalityNonlocal Bell correlations
6. Mass/energyRest mass (u^μ budget in x4)Energy of spatial motion (budget in x)
7. Space/timeTime t (symmetry parameter)Space x (propagation domain of x4)

Table 1. The Seven McGucken Dualities of Physics from [MG-KNC, Definition I.1]. Level 2 (highlighted) is the subject of the present paper.

Level 2 is structurally the most consequential level for Einstein’s concern, because it is the level at which the dual-channel structure extends beyond quantum mechanics into thermodynamics. Levels 1, 3, 4, and 5 all pair two time-symmetric features within quantum mechanics; Level 2 pairs a time-symmetric feature (conservation laws, Channel A) with a time-asymmetric feature (Second Law, Channel B), and in doing so dissolves the 150-year-old Loschmidt reversibility objection [MG-KNC, §X.2; MG-ConservationSecondLaw]. The present paper takes the dual-channel machinery of [MG-KNC] and applies it specifically to Einstein’s three gaps, with the explicit derivations following [MG-ConservationSecondLaw] — the companion paper on the conservation-laws/Second-Law unification that established Propositions III.1–III.4 and VI.1.

I.3 Reduction of the Three-Part Claim to KNC

Three claims in [MG-KNC] are load-bearing for the present paper, and the present paper does not re-prove them. They are imported wholesale from the master synthesis paper.

Imported from [MG-KNC, Theorem I.1] — Kleinian correspondence at all seven levels. Each of the seven dualities of physics is structurally a Kleinian correspondence between an algebraic-symmetry face and a geometric-propagation face, applied at one specific level of physical description. Level 2 — the conservation-laws / Second-Law pair — is the unique level at which Channel A’s outputs are time-symmetric and Channel B’s outputs are time-asymmetric.

Imported from [MG-KNC, Theorem IX.1] — uniqueness of dx₄/dt = ic among candidate Kleinian foundations. No competing physical principle surveyed in the mathematical-physics literature (Minkowski 1908, Einstein 1915, Yang-Mills, string theory, Loop Quantum Gravity, twistor theory) simultaneously satisfies the four conditions (a)–(d) of [MG-KNC, Theorem IX.1]. dx₄/dt = ic is unique among them.

Imported from [MG-KNC, Theorem I.2] — closure of the Seven McGucken Dualities. Every candidate additional Kleinian-pair duality (Wick rotation, holography, CPT/CP, matter/antimatter, boson/fermion, gauge/matter, classical/quantum, particle/field) either collapses into one of the seven or fails the Kleinian-pair criterion K1–K5. The catalog of seven is closed.

The thermodynamic application of the present paper is, with these three structural claims imported, the explicit Level-2 specialization that closes Einstein’s three gaps. The present paper proves the three Level-2 derivations (measure, ergodicity, arrow of time) on the specific structural foundation that [MG-KNC] has established as unique, complete, and one and only.

I.4 Structure of the Paper

Section II reviews the dual-channel structure and its Kleinian unity, importing the architectural content from [MG-KNC, §§II, X]. Section III surveys the 150-year record of attempts to derive thermodynamics from deeper principles and identifies the structural fault line (time-symmetric substrate cannot generate time-asymmetric output without an external input) that prior programs have failed to cross. Section IV develops the deep mathematical analysis of why dx₄/dt = ic succeeds where prior approaches did not: the kinematic-level asymmetry, the absence of bounded recurrence, and the local isotropy combine to give the unique structural signature that prior programs could not reach. Sections V–VII derive the three resolutions explicitly: §V the probability measure as Channel-A isotropy projected by Channel-B propagation; §VI ergodicity as a geometric identity of Channel-B Huygens wavefront coverage; §VII the Second Law as the strict theorem dS/dt > 0, with explicit rates (3/2)kB/t for massive particles and 2kB/t for photons on the McGucken Sphere. Section VIII shows that under the dual-channel structure Loschmidt’s objection is structurally dissolved and the Past Hypothesis is derived rather than imposed. Section IX closes by demonstrating that Einstein’s 1934 Herbert Spencer Lecture criterion — “the irreducible basic elements as simple and as few as possible without surrendering the adequate representation of a single datum of experience” — is met.


II. The Dual-Channel Structure and Its Kleinian Unity

II.1 The Principle

The McGucken Principle is the single kinematic statement:

dx4 / dt = ic

Here x4 is a fourth spatial dimension perpendicular to the observer’s three-dimensional hyperslice x1 x2 x3t is the observer’s proper time; c is the speed of light; and i is the imaginary unit, encoding the perpendicularity of x4 to the spatial hyperslice. The principle has been under continuous development since the author’s undergraduate work with Wheeler, Peebles, and Taylor at Princeton in the late 1980s; Appendix B of the 1998 UNC Chapel Hill doctoral dissertation [MG-Dissertation] established the 1998 priority on the physical content; the comprehensive five-era chronology is archived at [MG-KNC, Coda; MG-FQXi-2008; MG-FQXi-2009].

II.2 Channel A and Channel B

The structural claim of [MG-KNC, §I.2] is that dx₄/dt = ic carries two logically distinct informational contents that unpack through two distinct derivational channels:

Channel A — algebraic-symmetry content. Temporal uniformity, spatial homogeneity, spherical isotropy as a symmetry statement, Lorentz covariance of the rate, absence of preferred x4-phase origin (U(1)), Clifford-algebraic extensions (SU(2)LSU(3)c), diffeomorphism invariance. Outputs at Level 2: the twelve Noether conservation laws. Time-symmetric.

Channel B — geometric-propagation content. Spherical expansion from every spacetime point at rate c; Huygens’ secondary wavelets as the three-dimensional cross-section of x4‘s expansion; monotonic radial growth of the McGucken Sphere _+(p0) of radius R = ct; isotropic wavefront emission; one-way advance at +ic, not -ic. Outputs at Level 2: Brownian motion, the Second Law, the five arrows of time. Time-asymmetric.

II.3 The Kleinian Unity of the Two Channels

Channel A and Channel B are not two independent mathematical structures that happen to coexist in dx₄/dt = ic. They are the two faces of a single mathematical object under the Klein correspondence between algebra and geometry [MG-KNC, §II]. Klein’s 1872 Erlangen Program [Klein1872] established that every geometry is equivalent to a group — specifically the group of transformations that preserve its characteristic structure — and that the passage between a geometry and its symmetry group runs in both directions because the information content is the same.

Channel A extracts the symmetry group of dx₄/dt = ic (temporal and spatial translation, rotation, Lorentz boost, U(1) phase, SU(2)L and SU(3)c internal gauge, diffeomorphism covariance, and the complex marker i). Channel B extracts the geometric objects that this symmetry group preserves (the forward light cone, the Huygens wavefront, the null hypersurface structure, the time-orientation). These are not two independent structures co-inhabiting the principle. They are the group side and the geometry side of one Kleinian object: the four-dimensional spacetime with perpendicular imaginary fourth axis advancing at rate ic. Noether’s theorem [Noether1918] is the dynamical bridge: each Channel A invariance yields a Channel B-propagated conserved current, and by the inverse Noether theorem (Olver 1986 [Olver1986], Theorem 4.29) propagations with conserved currents force the corresponding symmetries. The two channels exchange content in both directions through this bridge [MG-KNC, §X.2 Remark X.2.1].

Why does this matter for Einstein’s concern? Because the three gaps he identified — measure, ergodicity, arrow of time — are gaps between algebra and geometry in the orthodox statistical-mechanical tradition. The measure is an algebraic object (a Borel measure on phase space) whose geometric source is missing. Ergodicity is a geometric fact (trajectory distribution over an invariant set) whose algebraic warrant is missing. The arrow of time is a geometric monotonicity whose algebraic source in the time-symmetric dynamics is missing. In each case, the missing link is exactly the Kleinian correspondence between group and geometry that the McGucken Principle supplies through its dual-channel structure.


III. Historical Attempts to Derive Thermodynamics from Deeper Principles: A Survey of Failures

Before turning to the McGucken resolution of Einstein’s three gaps, it is essential to survey the 150-year record of attempts to derive thermodynamics from first principles or from deeper foundations. The record is instructive in its consistency: every serious attempt has either smuggled in the very asymmetry it sought to derive, relied on unjustified auxiliary postulates, or produced a partial result that itself requires explanation. The survey is organized chronologically, with each subsection identifying the specific point of failure and the auxiliary content that was imported.

III.1 Boltzmann 1872–1877: The H-Theorem and the Stosszahlansatz

Ludwig Boltzmann’s 1872 paper [Boltzmann1872] introduced the H-function and proved that under the Stosszahlansatz — the assumption that the velocities of colliding molecules are statistically independent immediately before collision — the quantity H = ∫ f f d3 v is monotonically non-increasing. Since H is the negative of entropy (up to additive and multiplicative constants), this appeared to establish dS/dt ≥ 0 from mechanical first principles.

The failure was identified almost immediately. Loschmidt’s 1876 reversibility objection [Loschmidt1876] observed that the underlying Newtonian dynamics are time-reversal symmetric: for every entropy-increasing trajectory there exists, by velocity reversal, an entropy-decreasing trajectory of equal statistical weight. The time-symmetric microscopic laws cannot by themselves produce a time-asymmetric consequence. The Stosszahlansatz — assumed for pre-collision velocities but not for post-collision velocities — is where the asymmetry enters. The argument is circular: Boltzmann assumed molecular chaos to derive the Second Law; the Second Law is equivalent to molecular chaos; nothing is derived that was not already assumed.

Boltzmann’s 1877 response retreated to a statistical interpretation: entropy-decreasing trajectories are overwhelmingly improbable compared to entropy-increasing ones, so for practical purposes the Second Law holds. This resolves the tension by surrendering the derivation. Probability is not necessity; a statistical tendency that is not universal cannot explain an absolute prohibition [Penrose2004]. Zermelo’s 1896 recurrence objection [Zermelo1896] sharpened the failure: any bounded Hamiltonian system returns arbitrarily close to its initial state infinitely often (Poincaré recurrence), so any decrease in H is eventually undone, and the statistical argument can at best establish approximately-monotonic behavior on timescales short compared to the recurrence time. For a mole of gas the recurrence time exceeds 10^1023 years, which is large, but “large” is not the same as “monotonic.”

Point of failure: the Stosszahlansatz imports the time-asymmetry it pretends to derive.

III.2 Gibbs 1902: Ensemble Coarse-Graining

J. Willard Gibbs’ 1902 Elementary Principles in Statistical Mechanics [Gibbs1902] reformulated statistical mechanics in terms of ensembles — probability distributions ρ(q, p, t) on phase space — and distinguished the fine-grained entropy S_{fine} = -kB ∫ ρ ρ dΓ from the coarse-grained entropy S_{coarse} = -kB i i i where i is the average of ρ over a finite phase-space cell Δi.

Liouville’s theorem guarantees that the fine-grained entropy is exactly constant under Hamiltonian flow: dS_{fine}/dt = 0. Only the coarse-grained entropy can increase, and it does so as the fine-grained distribution develops structure on scales smaller than the cell size Γ. The failure: the choice of cell size Γ is arbitrary, and the increase depends on it. Worse, the coarse-graining procedure is not derived from the dynamics; it is imposed by the theorist as a bookkeeping device that mimics the loss of information an observer would experience. The Second Law, in the Gibbs account, is a statement about what the observer tracks rather than what the system does.

Point of failure: coarse-graining is an epistemic operation imposed on a dynamics in which entropy does not actually increase. Liouville’s theorem, correctly interpreted, is a no-go theorem for entropy increase within Hamiltonian mechanics.

III.3 Einstein 1902–1905: The Molecular-Kinetic Program

Einstein’s 1902 and 1903 papers [Einstein1902, Einstein1903] developed statistical mechanics independently of Gibbs and grounded thermodynamic quantities in molecular mechanics. His 1905 Brownian motion paper [Einstein1905] was the empirical vindication: the diffusion relation D = kB T / (6η r) connects microscopic molecular collisions to macroscopic diffusion, and Perrin’s 1908–1913 experiments confirmed Avogadro’s number to parts in a hundred, settling the atomic hypothesis. Einstein had derived more thermodynamic content from mechanics than anyone before him.

Yet Einstein himself recognized the derivation was incomplete. By 1949 [Einstein1949] he called thermodynamics a theory of principle — a category he explicitly contrasted with constructive theories built from hypothesized microscopic models — and confessed that he was convinced it would never be overthrown precisely because the reduction to mechanics had not been completed. The three gaps this paper addresses (probability measure, ergodicity, arrow of time) are the ones Einstein implicitly acknowledged. The molecular-kinetic program gave the right empirical answers while leaving the foundational derivation unfinished.

Point of failure: the derivation works quantitatively but rests on postulates (equal a priori probabilities, ergodicity, low-entropy past) that themselves have no mechanical justification.

III.4 Ehrenfest 1911: The Wind-Tree Model and the Price of a Derivation

Paul and Tatiana Ehrenfest’s 1911 encyclopedia article [Ehrenfest1911] analyzed Boltzmann’s program and introduced the wind-tree model and the urn model as simplified systems in which the H-theorem could be examined rigorously. In the urn model — N balls distributed between two urns, with balls drawn and transferred at each time step — the approach to equilibrium (N/2 balls per urn) is rigorously provable. But the derivation works only because the dynamics is stochastic by construction, not Hamiltonian. The urn model is a Markov chain; Markov chains have monotone approach to equilibrium as a mathematical theorem. Replacing Newtonian molecular collisions with Markov transitions makes the H-theorem trivial, but it also surrenders the goal: one is no longer deriving thermodynamics from mechanics, but from stochastics.

The Ehrenfest analysis made the conceptual situation sharp: you can have either (a) a rigorous derivation of entropy increase, for toy stochastic models that are not actually Hamiltonian mechanics, or (b) an aspirational derivation from Hamiltonian mechanics that always smuggles in auxiliary assumptions. The honest options exclude the one that is wanted.

Point of failure: rigor available only at the cost of replacing the dynamics with a stochastic caricature.

III.5 Birkhoff 1931 and von Neumann 1932: The Ergodic Theorems

George Birkhoff’s 1931 pointwise ergodic theorem [Birkhoff1931] and John von Neumann’s 1932 mean ergodic theorem [vonNeumann1932] established that for a measure-preserving flow t on a phase space (Γ, μ) with total measure μ(Γ) < ∞, the time average along almost every trajectory exists:

f(x) = T → ∞ (1/T) 0T f(t(x)) dt (μ-a.e.)

and equals the space average ∫ f dμ / μ(Γ) if and only if φ is metrically transitive — that is, the only φ-invariant measurable sets have measure 0 or μ(Γ). This was hailed as placing the foundations of statistical mechanics on rigorous ground.

The hope collapsed within three decades. The Kolmogorov–Arnold–Moser (KAM) theorem (1954–1963) [KAM] demonstrated that for generic small perturbations of integrable Hamiltonian systems, a positive-measure set of invariant tori persists, on which the dynamics is quasi-periodic rather than ergodic. The ergodic hypothesis is thus not merely unproven for realistic systems; for typical systems near integrable limits — which include many physically interesting cases — it is known to be false. Markus and Meyer (1974) extended this to show that generic Hamiltonian systems are neither ergodic nor integrable but a mixed mess of tori and chaotic regions.

Birkhoff’s theorem is mathematically unimpeachable. The physical failure is that its crucial hypothesis (metric transitivity) does not hold for the systems to which statistical mechanics must be applied. Sinai’s 1970 proof that hard-sphere gases are ergodic [Sinai1970] is heroic but rests on specific features of hard-sphere collisions; it is not a general theorem for molecular systems with smooth interactions.

Point of failure: the theorem is true, but its hypothesis (ergodicity) is false for the target systems.

III.6 Jaynes 1957: Information-Theoretic Statistical Mechanics

E. T. Jaynes’ 1957 papers [Jaynes1957] proposed a radical reinterpretation: the probability distribution on phase space is not a physical fact about the system but a Bayesian prior reflecting the observer’s state of knowledge. Given macroscopic constraints (energy, particle number), the least-biased prior — the one that assumes nothing beyond what is known — is the distribution that maximizes the Shannon entropy S = -∑ pi pi subject to the constraints. For a system at fixed mean energy, this yields the canonical Gibbs distribution.

The Jaynes program gets the right answers without invoking ergodicity. But it achieves this by relocating the foundations from physics into epistemology: entropy is what the observer does not know. The Second Law becomes a statement about the observer’s information, and the arrow of time becomes the direction in which the observer’s information about a closed system decreases as correlations develop with the environment. This is coherent but surrenders the ontological aspiration of the original program. Einstein did not ask for a derivation of what observers tend to believe; he asked for a derivation of what nature does. Jaynes’ success on epistemic grounds is a failure on the ontological grounds Einstein set.

Moreover, the maximum-entropy principle itself is not derived from dynamics; it is a Bayesian axiom. The observer-dependence is not eliminated by declaring the maximum-entropy distribution objective. A referee on ontological grounds rightly asks: why does the physical world happen to behave as though observers were applying Bayesian inference?

Point of failure: relocates the problem from physics to epistemology rather than solving it.

III.7 Prigogine 1960s–1980s: Irreversibility at the Fundamental Level

Ilya Prigogine and the Brussels school [Prigogine1980] argued that irreversibility must be introduced at the fundamental level — that the time-symmetric Hamiltonian description is incomplete and must be replaced by a non-unitary, dissipative fundamental dynamics. The technical machinery involved non-Hilbert-space representations (rigged Hilbert spaces with Gelfand triples) in which the time-evolution operator acquires complex eigenvalues, producing irreversible decay even at the microscopic level.

The failure is straightforward: no empirical evidence supports fundamental-level irreversibility at the molecular scale. Experiments on time-reversal symmetry in particle physics (CP violation is small, and CPT is exact to the best measurements) bound any fundamental irreversibility to be extremely weak, far smaller than would be needed to account for macroscopic Second-Law behavior. The Prigogine program proposes a radical modification of physics whose empirical motivation is absent. It replaces one problem (deriving macroscopic irreversibility from microscopic reversibility) with another (justifying microscopic irreversibility against overwhelming experimental evidence for microscopic reversibility).

Point of failure: posits the asymmetry at a level where experiment refutes it.

III.8 Penrose 1979: The Weyl Curvature Hypothesis

Roger Penrose’s 1979 Weyl Curvature Hypothesis [Penrose2004, PenroseENM] confronts the problem cosmologically. If the Second Law requires a low-entropy initial condition (the Past Hypothesis), where does that initial condition come from? Penrose proposes that the Weyl curvature tensor Cμνρσ — which encodes the free gravitational field — vanishes at initial cosmological singularities and is unconstrained at final singularities. This distinguishes the Big Bang (low gravitational entropy, C ≈ 0) from black holes and the eventual heat death (high gravitational entropy, C large).

Penrose quantifies the fine-tuning required: the initial state is specified to one part in 10^10123. This figure is not a criticism of the hypothesis; it is a feature. The hypothesis asserts that the universe did in fact begin in this exponentially improbable state, and that the Second Law follows. The failure is that the hypothesis is itself a postulate, not a derivation. No physical mechanism is offered for why the Weyl tensor should vanish initially. Penrose’s Conformal Cyclic Cosmology (CCC) proposal [PenroseCCC] attempts to supply a mechanism through conformal rescaling at the transition between aeons, but the proposal remains speculative and empirically contested. The Past Hypothesis is relocated from the foundations of statistical mechanics to the foundations of cosmology; it is not dissolved. (The cosmological-constant analog of this relocation problem — Weinberg’s “worst theoretical prediction in the history of physics” 10122 vacuum-energy discrepancy — is resolved under the McGucken Principle as an IR rather than UV quantity, with Λ derived from the Hubble radius rather than the Planck scale, per [MG-Lambda, Theorem 2.1].)

Point of failure: imposes the initial condition as an additional postulate; does not derive it.

III.9 Jacobson 1995 and Verlinde 2011: Thermodynamic / Entropic Gravity

Ted Jacobson’s 1995 derivation of the Einstein field equations from the Clausius relation δ Q = T dS applied across local Rindler horizons [Jacobson1995] was a genuinely new move: instead of deriving thermodynamics from gravity, derive gravity from thermodynamics. Given an area-entropy law S = A/4 (Bekenstein–Hawking), plus the Clausius relation, plus the Unruh temperature T = ℏ a/(2π kB c) seen by an accelerated observer, the Einstein field equations follow as an identity.

Erik Verlinde’s 2011 entropic gravity proposal [Verlinde2011] extended this by arguing that Newton’s law F = ma and gravitational attraction can be derived from holographic entropy gradients on screens surrounding masses. Gravity is not a fundamental force but an entropic consequence of information organization on holographic boundaries.

These programs invert the direction of derivation but do not address Einstein’s three gaps. The Clausius relation δ Q = T dS is assumed: the Jacobson derivation presupposes that there is such a thing as entropy, that it obeys the Clausius relation, and that it is proportional to horizon area. The holographic entropy hypothesis is itself an unexplained posit. The programs derive gravitational dynamics from thermodynamic axioms while leaving the thermodynamic axioms themselves ungrounded. They are structurally interesting but do not close the Einstein gaps; they widen them by making thermodynamics load-bearing for a larger portion of physics without explaining where thermodynamics comes from. (Under the McGucken Principle the Jacobson, Verlinde, and Susskind frameworks are each re-derived with their previously assumed thermodynamic content supplied by Channel B propagation: the horizon entropy comes from the x4-stationary mode count on null hypersurfaces [MG-Bekenstein], the Clausius relation follows from Channel B Huygens propagation across the horizon, and the holographic principle is the boundary form of x4-expansion [MG-JacobsonVerlindeMarolf; MG-VerlindeEntropic; MG-Susskind, §III].)

Point of failure: assume the thermodynamic content they claim to explain.

III.10 Decoherence Programs (1970s–2010s): Zeh, Zurek, Joos, Kiefer

The decoherence program, initiated by H. D. Zeh (1970) and developed by Zurek, Joos, and others [Decoherence], proposes that the quantum-to-classical transition — and with it the apparent irreversibility of measurement and the emergence of classical records — arises from the unavoidable entanglement of any system with its environment. Tracing out environmental degrees of freedom produces a reduced density matrix that is effectively diagonal in the “pointer basis” selected by the system-environment coupling, yielding apparent classicality and apparent irreversibility even though the total system-plus-environment dynamics remains unitary and reversible.

Decoherence is real physics, well-established empirically (interferometry, quantum computing noise). As a proposed foundation for the Second Law, however, it faces the same obstacle as Gibbs’ coarse-graining: the reduction is epistemic (what the system-observer can track) rather than ontic (what the universe does). The total wavefunction continues to evolve unitarily; nothing actually becomes irreversible. Moreover, the initial low-entropy state of the system-plus-environment is presupposed — decoherence explains why classicality emerges from a low-entropy quantum state, not why there is a low-entropy state to begin with. The Past Hypothesis remains. (The Copenhagen/decoherence open questions D1–D6 — measurement problem, absence of collapse mechanism, observer problem, unexplained Born rule, undefined Heisenberg cut, and the derivative asymmetry of the Schrödinger equation — are systematically addressed in the McGucken framework via the McGucken Sphere’s six-sense locality and the x4-physical Wick rotation [MG-NonlocCopen, §6; MG-Wick, Proposition IV.1]; the analytic continuation t → -iτ that decoherence treats formally is identified physically as the π/2 rotation in the (x0, x4)-plane that exchanges the real time axis with the imaginary x4 axis.)

Point of failure: epistemic reduction of a problem that has an ontic component; presupposes the low-entropy initial state.

III.11 The Eigenstate Thermalization Hypothesis (Deutsch 1991, Srednicki 1994)

The Eigenstate Thermalization Hypothesis (ETH), formulated by J. M. Deutsch (1991) and Mark Srednicki (1994) [ETH], proposes that individual energy eigenstates of sufficiently chaotic quantum systems already contain thermal expectation values: the diagonal matrix elements of local observables in the energy basis vary smoothly with energy, and their off-diagonal elements are exponentially suppressed. Under ETH, a generic initial state decoheres into local thermal equilibrium through unitary evolution alone, without any coarse-graining.

ETH is empirically well-supported for a wide class of many-body quantum systems. As a foundation, however, it is a hypothesis — it is not derived from general principles but assumed to hold for the relevant systems and then checked numerically. Systems that fail to satisfy ETH (integrable systems, many-body-localized systems) do not thermalize, and the boundary between ETH-satisfying and ETH-violating systems remains an active area of research. ETH is a conditional account: if a system satisfies ETH, it thermalizes; but why any particular system satisfies ETH is not explained, and the arrow of time still requires a low-entropy initial condition. (Under the McGucken Principle, the underlying mechanism is supplied geometrically: ETH-satisfying systems are those for which Channel B Huygens-wavefront coverage of the energy shell within the available time is geometrically dense, and the connection between Euclidean-imaginary-time thermalization in β = ℏ/(kB T) and the physical x4-axis is supplied by [MG-Wick, Propositions VI.1–VI.3] — temperature is identified with the compactification period of the physical x4-axis Δ x4 = c β, making “thermalization” a geometric statement about x4-circle topology rather than a hypothesis to be checked numerically.)

Point of failure: conditional on an unexplained hypothesis; does not address the initial-condition problem.

III.12 Summary of the Historical Pattern

Across 150 years the same structural failure recurs. Every attempt to derive thermodynamics from deeper principles either:

(a) smuggles the asymmetry in through the back door (Stosszahlansatz, arbitrary coarse-graining, Bayesian priors), or

(b) assumes a hypothesis that is either unproven or known false for realistic systems (ergodicity, ETH), or

(c) imposes the required initial condition as an additional postulate (Past Hypothesis, Weyl Curvature Hypothesis), or

(d) modifies the fundamental dynamics in a way empirically unsupported (Prigogine’s non-unitary evolution), or

(e) reduces ontology to epistemology (Jaynes, decoherence), which answers a different question.

The structural pattern is diagnostic. The failures cluster along a single fault line: the time-symmetric foundational dynamics (Hamiltonian or quantum-unitary) cannot, by any amount of internal manipulation, produce a time-asymmetric consequence without an external input. Every program either admits this external input (as the Past Hypothesis or the Stosszahlansatz) or disguises it (as coarse-graining or Bayesian maximum entropy). No program within the time-symmetric framework has succeeded in eliminating the external input because there is no such derivation to be found within the time-symmetric framework. The framework itself is the obstruction.

This is the situation Einstein perceived in 1949, and it is the situation the McGucken Principle resolves — not by strengthening the internal manipulations but by identifying that the fundamental dynamics is not, at bottom, time-symmetric. Section IV develops the deep mathematical reason why the McGucken Principle succeeds where every prior attempt has failed.


IV. Why the McGucken Principle Succeeds: A Deep Analysis

This section provides the structural and mathematical analysis of why dx₄/dt = ic closes the Einstein gaps that no prior program has closed. The key insight is that the McGucken Principle does not compete with prior programs within their shared framework; it changes the framework itself. The change is minimal — adding one kinematic equation for a single extra dimension — but it is foundational, and its consequences propagate through every derivational chain in statistical mechanics.

IV.1 Why Time-Symmetric Dynamics Cannot Produce an Arrow of Time

Begin with a precise statement of what the historical programs were attempting. Let t: M → M be a one-parameter group of diffeomorphisms on a phase space M, generated by a Hamiltonian H via t = (t XH) where XH is the Hamiltonian vector field. Time-symmetry means that -t is also a valid dynamics — the group extends to negative t — and that physical laws are invariant under t → -t (possibly combined with a discrete symmetry like parity or charge conjugation).

For any such time-symmetric dynamics, consider a functional F: M → ℝ. Under the flow, F evolves as dF/dt = , H. Suppose we want dF/dt ≥ 0 along every trajectory. Time reversal sends t → -t; along the reversed trajectory, dF/dt → -dF/dt. The inequality dF/dt ≥ 0 on the forward flow therefore implies dF/dt ≤ 0 on the reversed flow. If the reversed flow is equally valid physics — which is what time-symmetry means — then F cannot be monotonically increasing on all valid trajectories.

This is not a subtle objection; it is an elementary theorem. Within a time-symmetric framework, no functional can be monotonically increasing. The Boltzmann H-function, the Gibbs coarse-grained entropy, the von Neumann entropy of a reduced density matrix — none of these can be monotonic without an auxiliary input that breaks the time-symmetry. The auxiliary input is, in every historical case, either (i) a preferred time direction imposed by hand (the Past Hypothesis), or (ii) a non-invariant procedure (coarse-graining, tracing out environment) that implicitly selects one time direction over the other.

The lesson: the arrow of time cannot be an emergent property of time-symmetric dynamics. It must be built into the dynamics at the fundamental level. The McGucken Principle is the minimal way to do this.

IV.2 How dx₄/dt = ic Breaks the Symmetry at the Kinematic Level

The McGucken Principle adds one equation to the kinematic substrate of physics:

dx4/dt = ic

Under t → -t, the right-hand side becomes -ic, which is not the same equation. The principle selects a preferred time direction at the kinematic level — before any dynamics, before any Hamiltonian, before any ensemble. This is structurally distinct from every prior program, which attempted to derive time-asymmetry from a time-symmetric substrate. The McGucken substrate is already time-asymmetric.

The asymmetry is carried by the factor i. In the master equation u^μ u_μ = -c2 that follows from dx₄/dt = ic combined with proper-time parametrization, the fourth component of four-velocity u4 = dx4/dτ = icγ is imaginary by construction. The imaginary-valued component cannot be negated by a real coordinate transformation; it is geometrically distinguished from the three real spatial components. The i encodes perpendicularity of x4 to the spatial hyperslice, and the sign of the imaginary is fixed by the physical content of the principle: x4 expands, it does not contract. Under formal time reversal t → -t, the required i → -i is not a physical operation on the spacetime manifold but a different principle — the principle of a contracting fourth dimension, for which there is no empirical evidence.

Three mathematical features distinguish this from every prior approach.

IV.2.1 The Asymmetry Is Kinematic, Not Dynamical

Prior approaches placed any proposed time-asymmetry in the dynamics (Prigogine), in the initial conditions (Past Hypothesis), or in the observer’s bookkeeping (Gibbs, Jaynes). The McGucken asymmetry is in the kinematics — the description of what coordinates exist and how they relate — before dynamics is specified. Every dynamics consistent with the principle (Newtonian, relativistic, quantum, gauge-theoretic) inherits the asymmetry automatically, without needing an additional postulate. The Hamiltonian projected onto the observer’s spatial hyperslice is time-symmetric; the full four-dimensional kinematics including x4 is not. The time-symmetry of observed mechanics is a projection artifact.

IV.2.2 The Asymmetry Is Unbounded, Not Recurrent

Zermelo’s recurrence objection applied to bounded Hamiltonian flow: a trajectory in a compact energy shell returns arbitrarily close to its initial state. The McGucken Sphere _+(p0) has radius R = ct that grows without bound. There is no compact invariant set in the full four-dimensional kinematics. Poincaré recurrence does not apply because the accessible volume strictly increases. Zermelo’s paradox dissolves geometrically rather than being argued away statistically.

IV.2.3 The Asymmetry Is Locally Isotropic, Not Directed

Naive approaches to time-asymmetric kinematics might introduce a preferred spatial direction — a cosmological flow vector field, a global frame, an absolute ether. The McGucken Principle does not. The asymmetry is entirely in the t-direction via the factor i; spatially, x4-expansion is fully O(3)-isotropic from every point. This preserves the content of special relativity (Lorentz covariance of the rate) and general relativity (diffeomorphism invariance of the four-manifold) while still breaking the t → -t symmetry. No prior program has achieved this combination; the closest attempt, the Prigogine program, broke Lorentz covariance to get the asymmetry.

IV.3 The Kleinian Decomposition of the Einstein Gaps

Section II.3 stated that Einstein’s three gaps are gaps between algebra and geometry. This subsection develops the claim mathematically. Each gap is a specific algebra–geometry correspondence failure in the orthodox framework, and each is closed by an explicit Kleinian identification under dx₄/dt = ic.

IV.3.1 The Measure Gap

Orthodox framework: the phase-space measure  is a Borel measure on a symplectic manifold Γ. Liouville’s theorem establishes its preservation under Hamiltonian flow, but any measure of the form ψ(I) dμ — where ψ is a function of the constants of motion I — is equally preserved. The orthodox framework provides no algebraic criterion to distinguish the Liouville measure from these alternatives. The “equal a priori probabilities” postulate selects it by epistemic fiat.

McGucken framework: the measure is the Kleinian image of Channel A’s isotropy group. Explicitly, let G = ISO(3) = ℝ3 ⋊ O(3) be the Euclidean isometry group of the spatial hyperslice, acting on  by x4-invariant transformations that preserve the rate ic. The unique (up to scaling) G-invariant Borel measure on 3 is the Lebesgue measure dq1 dq2 dq3. Channel B realizes this group action as spherical expansion from every point, and the measure it projects onto the spatial hyperslice is exactly the Lebesgue measure. The configuration-space measure for an N-particle system is the N-fold tensor product, and the symplectic extension to phase space adds the dpi factors by standard symplectic machinery. The “equal a priori probabilities” postulate is not a postulate; it is the unique Haar measure on the isometry group of x4‘s expansion, Kleinianly identified with the geometric measure of Channel B projection.

The algebraic criterion missing in the orthodox framework (why this measure rather than another?) is supplied by the Kleinian principle: the measure is the one compatible with the group under which x4-expansion is invariant.

IV.3.2 The Ergodicity Gap

Orthodox framework: ergodicity is the statement that the flow t: Γ → Γ is metrically transitive — equivalently, that the only φ-invariant L2 functions are constants. This is a property of the specific dynamics. For a generic Hamiltonian system, ergodicity is neither true nor easy to check; KAM theory demonstrates it typically fails on positive-measure sets.

McGucken framework: the identity ⟨ f _{time} = ⟨ f _{ensemble} does not require ergodicity of the projected dynamics because the ensemble is not an abstraction to be approximated by time averages but a physical object realized by x4-expansion. Explicitly, let p0 ∈ Γ be any point in phase space and let p0t = s(p0) : 0 ≤ s ≤ t be the trajectory up to time t. Along every point of this trajectory, the Channel B content of dx₄/dt = ic generates an x4-expanding wavefront W(s(p0)) that, projected back onto the spatial hyperslice at t’, covers the forward light cone of that event. The total projection over the trajectory is:

Et(p0) = 0 ≤ s ≤ t P(W(s(p0)))

where P is the spatial projection. As t → ∞Et(p0) covers a set of positive measure in Γ that includes every configuration accessible from p0. This is not the statement that the trajectory visits every configuration (which is false for KAM-tori-trapped systems); it is the statement that the Huygens wavefront emanating from every point of the trajectory collectively covers all accessible configurations. Time averages equal ensemble averages because the “ensemble” is the union of wavefronts along the trajectory, and this union is geometrically dense in the accessible phase-space region regardless of whether the trajectory itself is ergodic in the metric-transitivity sense.

The Kleinian content: metric transitivity is an algebraic-measure-theoretic property of the flow; Huygens wavefront coverage is a geometric-propagation property of the kinematic substrate. The identity ⟨ f _{time} = ⟨ f _{ensemble} requires only the latter under the McGucken Principle. KAM tori obstruct the former but not the latter.

IV.3.3 The Arrow-of-Time Gap

Orthodox framework: the Second Law requires an H-functional or entropy-functional that is monotonically increasing. As shown in §IV.1, no such functional can exist within a time-symmetric framework without auxiliary input. The auxiliary input is the Past Hypothesis.

McGucken framework: define the accessible phase-space volume from an initial configuration p0 at time t as:

V(p0, t) = μ(Et(p0))

where Et(p0) is the union of projected Huygens wavefronts defined in §IV.3.2. Because the McGucken Sphere radius R(s) = c(t-s) from each event along the trajectory increases monotonically with t (holding s fixed) or decreases with s (for fixed t, along the past direction), V(p0, t) is manifestly monotonically increasing in t.

dV/dt > 0 strict, for all t > 0.

The Boltzmann entropy is S(p0, t) = kB V(p0, t), so dS/dt > 0 strict. For the specific case of spherical isotropic random walk generated by x4-expansion [MG-ConservationSecondLaw, Proposition III.2], the explicit rate is dS/dt = (3/2) kB/t. For photons on the McGucken Sphere [MG-ConservationSecondLaw, Proposition III.3], dS/dt = 2 kB/t. In both cases the monotonicity is not statistical but strict — strict for every t > 0 — because the underlying cause (x4 monotonically advancing at rate c) is strict.

Kleinian content: the arrow of time is the geometric direction of Channel-B advance, algebraically encoded as the fixed sign of the imaginary factor i in Channel A’s perpendicularity marker. Neither channel alone produces the arrow: Channel A gives the time-symmetric conservation laws, Channel B gives the time-asymmetric expansion. The arrow arises from the Kleinian correspondence between them: the group under which the Channel-A structure is invariant does not include time-reversal as a continuous symmetry (because dx₄/dt = ic and dx₄/dt = -ic are distinct principles), and the Channel-B propagation realizes this by one-way advance at +ic.

IV.4 Why Prior Approaches Could Not Reach This Structure

Each prior approach can now be diagnosed precisely. It failed because it did not have access to a kinematic-level asymmetry like dx₄/dt = ic.

– Boltzmann’s H-theorem requires the Stosszahlansatz because the underlying Hamiltonian mechanics is time-symmetric. With dx₄/dt = ic, the Channel B random walk is intrinsically time-asymmetric; the Stosszahlansatz becomes a theorem of Channel B geometry rather than an auxiliary assumption. Molecular chaos follows from x4-isotropy, not from a postulate about pre-collision statistical independence [MG-Broken, §III; MG-Entropy, §IV]. – Gibbs coarse-graining is needed because Liouville’s theorem locks the fine-grained entropy to zero growth within time-symmetric mechanics. Under the McGucken Principle, the relevant measure is not the fine-grained Hamiltonian measure on a fixed energy shell but the Kleinian projection of expanding x4-wavefronts (Proposition V.1), which grows monotonically by construction. No coarse-graining is needed; the entropy grows without observer intervention [MG-Master, Part VII]. – The ergodic theorems require metric transitivity because that is the algebraic condition for time averages to equal ensemble averages under a deterministic flow. Under the McGucken Principle, the ensemble is physically realized by Huygens wavefronts (Proposition VI.1) rather than abstractly reconstructed from flow trajectories, so metric transitivity of the projected flow is not required. – The Past Hypothesis is required to provide a low-entropy boundary condition in any framework where dynamics is time-symmetric. Under dx₄/dt = ic, the lowest-entropy moment is the geometric origin of x4-expansion — t = 0, by construction. The “boundary condition” is the tautological statement that at t = 0 no x4-expansion has yet occurred. There is no free parameter; Penrose’s 10^-10123 figure quantifies an improbability under a uniform prior, but the McGucken prior is not uniform — it is concentrated at the origin of the expansion by geometric necessity [MG-Eleven, §XI; MG-Horizon, §7]. – Jaynes’ maximum-entropy program relocates the problem epistemically because the ontic foundation is missing in time-symmetric mechanics. Under dx₄/dt = ic the ontic foundation is restored: the Kleinian measure on phase space is not a Bayesian prior but the unique Haar measure on the x4-isotropy group (Proposition V.1). Epistemology converges with ontology. – Prigogine’s fundamental-level irreversibility was right about the need for kinematic asymmetry but wrong about its location. Prigogine placed it in the dynamics (non-unitary evolution) and paid the price in empirical conflict with microscopic reversibility tests. The McGucken Principle places it in the kinematics (x4-expansion) and pays no empirical price — microscopic reversibility tests are tests of the projected dynamics, which remains unitary, while the full four-dimensional kinematics carries the asymmetry in the factor i. – Jacobson’s and Verlinde’s entropic-gravity programs assume the thermodynamic content they claim to explain (the Clausius relation, the holographic area-entropy law) because they lack a deeper principle from which these can be derived. Under the McGucken Principle, both area-entropy (the McGucken Sphere surface carries S = kB (4π(ct)2) per Proposition VII.2) and the Clausius relation follow from Channel B propagation, so the entropic-gravity programs become downstream consequences rather than competing foundations [MG-JacobsonVerlindeMarolf; MG-VerlindeEntropic; MG-Susskind, §III; MG-Bekenstein]. – Decoherence and ETH explain classical emergence conditional on initial low entropy; under dx₄/dt = ic the initial low entropy is no longer a condition but a theorem, and decoherence/ETH become mechanisms of how the Channel-B entropy growth manifests in quantum many-body systems. The connection between ETH-thermalization at temperature T and the underlying x4-physical Wick rotation that identifies temperature with the compactification period of the x4-axis is supplied by [MG-Wick, Propositions VI.1–VI.3] and [MG-NonlocCopen, §6].

IV.5 The Structural Signature of a Correct Foundation

A correct foundational principle should exhibit two features: (i) it should derive its consequences through disjoint derivational chains, with no need to assume what is being derived; (ii) it should dissolve problems that appeared intractable in the prior framework, rather than merely displacing them. The McGucken Principle exhibits both.

Disjoint derivational chains: the probability measure derives from Channel A’s isotropy group via the Haar-measure construction (Proposition V.1). The ergodic identity derives from Channel B’s Huygens wavefront coverage (Proposition VI.1). The arrow of time derives from Channel B’s monotonic one-way advance (Theorem VII.1, Proposition VII.2). These three derivations share nothing beyond the starting principle. No derivation assumes the result of another derivation. The 41-row master derivation chain of [MG-Master] — running from dx₄/dt = ic through the master equation u^μ u_μ = -c2, the Minkowski metric, special relativity, the Principle of Least Action, Huygens’ Principle, the Schrödinger equation, the Second Law, quantum nonlocality, and the cosmological constant — exhibits the same disjoint-chain structure at the corpus level: each major result has its own independent derivation chain terminating in the single principle, with no result needed as input to another. By contrast, the orthodox framework’s derivations are tangled: ergodicity is needed to justify the measure, the measure is needed to define entropy, entropy is needed to formulate the Second Law, and the Second Law requires the Past Hypothesis — a chain in which each link presupposes something about the next.

Dissolution rather than displacement: Loschmidt’s reversibility objection is not answered statistically but dissolved — the objection presupposes a common dynamical origin for conservation laws and Second Law, which under the dual-channel structure is simply not the case [MG-Singular, §V]. The Past Hypothesis is not postulated but derived from the geometric starting point of x4-expansion. KAM tori are not an obstruction to ergodicity but a feature of the spatial-projection dynamics that does not affect the full-kinematics ensemble realization (§VI.4). These are dissolutions, not displacements.

The combination — disjoint chains plus dissolution of prior problems — is what Einstein meant in his 1934 Herbert Spencer Lecture when he wrote that a theory’s impressiveness increases “the greater the simplicity of its premises, the more different kinds of things it relates, and the more extended is its area of applicability” [Einstein1934]. The McGucken Principle has one premise (dx₄/dt = ic), relates the probability measure, ergodicity, and the arrow of time through three disjoint chains, and extends from laboratory thermodynamics to cosmological initial conditions. The criterion Einstein set is met. What he was convinced would never be overthrown is now derived from below.


V. The Probability Measure as a Kleinian Projection (Channel A → B)

The Boltzmann–Gibbs program postulates the Liouville measure dμ = dq1 dqn dp1 dpn on phase space. Liouville’s theorem establishes its preservation under Hamiltonian flow given this choice, but provides no justification for the choice against the infinitely many other Hamiltonian-invariant measures obtained by multiplying  by any constant of motion. The “equal a priori probabilities” postulate is then added to select  uniquely; this has no mechanical justification.

This section establishes that under the McGucken Principle the Liouville measure is not chosen but forced, as the unique Haar measure on the isometry group of x4‘s expansion.

V.1 Channel A: Isotropy as a Group-Theoretic Statement

Under the McGucken Principle, the McGucken Sphere _+(p0) = ∈ ℝ3 : |q – q0| = c(t – t0) centered at every event p0 = (t0, q0) is invariant under every R ∈ O(3) acting on the spatial fibre, and the rate ic is invariant under temporal and spatial translations. The Channel A face is therefore the group-theoretic statement: the spatial isometry group G = ISO(3) = ℝ3 ⋊ O(3) (spatial translations and rotations) acts transitively on each spatial hyperslice, leaving the rate of x4-advance invariant. Together with time translations t acting trivially on the spatial fibre, the full Channel A symmetry group is

G_{Ch.A} = ℝt × ISO(3),

with x4-expansion at rate ic as the invariant kinematic content. (For full Lorentz covariance of the rate ic — which extends G_{Ch.A} to the Poincaré group — see [MG-Noether, §V; MG-Master]; the present argument requires only the spatial-fibre isotropy ISO(3).)

V.2 Proposition V.1 (Haar-Measure Theorem for the Phase-Space Measure)

Proposition V.1. Let G = ISO(3) = ℝ3 ⋊ O(3) act on ℝ3 (the spatial fibre of an inertial frame at fixed time) in the standard way. Then the unique (up to positive scalar multiple) G-invariant Borel measure on ℝ3 is the three-dimensional Lebesgue measure d3 q = dq1 dq2 dq3. Consequently, under the McGucken Principle (Channel A spatial-fibre isotropy plus Channel B spherical projection) the configuration-space measure for an N-particle system is forced to be d3N q = i=1N d3 qi, and the symplectic completion to phase space yields the Liouville measure dμ = i=1N d3 qi d3 pi.

Proof. The proof proceeds in three steps: (1) uniqueness of the Lebesgue measure as Haar measure on 3 under ISO(3), (2) extension to N-particle configuration space by tensor product, (3) symplectic extension to phase space.

Step 1: Lebesgue measure is the unique ISO(3)-invariant Borel measure on ℝ3 up to scaling. The group ISO(3) = ℝ3 ⋊ O(3) is locally compact and acts transitively on 3 (since both translations and rotations act on 3, and translation alone is already transitive). By the Haar-measure theorem (Weil 1940; for the locally compact homogeneous-space version see Folland, A Course in Abstract Harmonic Analysis, Ch. 2, Thm. 2.27 or Bourbaki, Intégration, Ch. VII §2 No. 6), every locally compact group acting transitively on a locally compact homogeneous space admits an invariant Borel measure unique up to a positive multiplicative constant. For the action of 3 alone (the translation subgroup of ISO(3)) on 3, this measure is the Lebesgue measure — explicitly, if ν is any translation-invariant Borel measure on 3 that is finite on bounded sets and positive on open sets, then ν = c · λ for some c > 0, where λ is the Lebesgue measure. Since the Lebesgue measure is also O(3)-invariant (rotation preserves Euclidean volume, | R| = 1 for R ∈ O(3)), it is ISO(3)-invariant. Conversely any ISO(3)-invariant measure is in particular 3-invariant, hence equals c · λ for some c > 0. Therefore the Lebesgue measure is the unique ISO(3)-invariant Borel measure on 3 up to scaling. We fix the scale by [0,1]3 dλ = 1, giving the standard Lebesgue measure d3 q.

Step 2: Tensor-product extension to N-particle configuration space. Channel B propagation from each particle’s spacetime trajectory generates an independent x4-wavefront for each particle. Because the N Channel B wavefronts are spatially independent and each separately ISO(3)-invariant, the joint configuration space 3N carries the product action of GN = ISO(3)N, with each factor acting on its own fibre. Since the spatial structure factors as 3N = ℝ3 × × ℝ3, the tensor-product measure i=1N d3 qi = d3N q is the unique GN-invariant Borel measure on 3N up to scaling, by Fubini’s theorem applied to Step 1 in each factor.

Step 3: Symplectic extension to phase space. The phase space of N particles is the cotangent bundle Γ = T^ ℝ3N ≅ ℝ6N. We claim that Γ carries a canonical* (in the strict, intrinsic sense — independent of any choice) symplectic form _{can}, whose top exterior power _{can}3N/(3N)! is the Liouville measure dL = d3N q d3N p, and that this measure is automatically GN-invariant under cotangent lift of the configuration-space action established in Step 2. The argument has three parts.

(i) Canonical symplectic form on the cotangent bundle. On any cotangent bundle T^ M over a smooth manifold M, the Liouville one-form θ ∈ Ω1(T^M) is defined intrinsically by (q, p)(v) = p(d(q,p)(v)) for v ∈ T(q,p)(T^M), where π: T^M → M is the bundle projection. The canonical symplectic form is then _{can} := -dθ. The construction is intrinsic — no choice of coordinates, metric, measure, or auxiliary structure on M enters — and yields a globally-defined non-degenerate closed 2-form on T^M (Abraham–Marsden, Foundations of Mechanics*, 2nd ed., Theorem 3.2.10). In Darboux coordinates (qi, pi) adapted to the bundle structure, θ = pi dqi and _{can} = dqi ∧ dpi, but the form itself is coordinate-independent.

(ii) Cotangent-lift invariance under GN. By Abraham–Marsden Proposition 3.2.11, every diffeomorphism f: M → M has a cotangent lift T^f: T^M → T^M, defined by (T^f)(q, p) = (f-1(q), p ∘ dff-1(q)), and the cotangent lift automatically preserves θ (and hence _{can}): (T^f)^ θ = θ. Applied to our setup with M = ℝ3N and f = any element of GN = ISO(3)N (acting on 3N by translations and rotations in each fibre), the cotangent lift of f acts on Γ = T^*ℝ3N and preserves _{can}. Therefore _{can} is GN-invariant on Γ, and by extension so is the volume form _{can}3N/(3N)!.

(iii) Identification with the Liouville measure. In Darboux coordinates,

{_{can}3N}(3N)! = i=1N d q1i ∧ d p1,i ∧ dq2i ∧ dp2,i ∧ dq3i ∧ dp3,i = d3N q d3N p =: dL,

which is the standard Liouville volume form. Combining (i), (ii), and (iii), dL is a canonical, GN-invariant, intrinsically-defined Borel measure on Γ, independent of any auxiliary choice. The configuration-space marginal (integrating out the momenta on a fixed bounded region of momentum space) is, by Fubini’s theorem, proportional to d3N q as established in Step 2. Fixing the absolute normalization by the conventional choice [0,1]3N × [0,1]3N dL = 1 yields the standard Liouville measure as the unique canonical symplectic-volume form on Γ associated to the Channel A spatial-fibre symmetry group of Step 1.

The “equal a priori probabilities” postulate is therefore not a postulate. It is the unique Haar measure on the isometry group of x4‘s spatial-fibre invariance, Kleinianly identified with the geometric Lebesgue measure of Channel B projection. Q.E.D.

Remark V.1.1. Comparison with the orthodox account. In the orthodox framework, the Liouville measure is Hamiltonian-invariant given the choice of metric structure; any measure ψ(I) dμ where ψ is a function of the constants of motion I is also Hamiltonian-invariant. The orthodox framework supplies no algebraic criterion to select ψ ≡ 1. Proposition V.1 supplies exactly this missing criterion: ψ ≡ 1 is forced because the Lebesgue measure is the unique Haar measure on the isometry group G = ISO(3) of Channel A’s spatial-fibre symmetry. Any other choice of ψ breaks G-invariance and is therefore inconsistent with the Channel A content of the McGucken Principle.

Remark V.1.2. Liouville’s theorem as consistency check. Liouville’s theorem (the Hamiltonian flow preserves ) becomes, in the McGucken framework, a consistency check between two derivations from the same principle: the Hamiltonian dynamics descends from the matter and gauge sectors of ℒ_{McG} via the variational principle [MG-Lagrangian, §IV; MG-HLA, §IV]; the Liouville measure descends from Channel A’s G-isotropy via Proposition V.1. Both derivations terminate in dx₄/dt = ic. Their compatibility is structurally guaranteed.

Remark V.1.3. Connection to the symplectic form derivation in [MG-Lagrangian]. The symplectic form ω = dp ∧ dq used in Step 3 of the proof above is itself derived from ℒ_{McG}‘s free-particle sector S_{free} = -mc ∫ |dx4| via the Legendre transform [MG-Lagrangian, §IV.2; MG-Noether, Proposition II.10]. The Legendre transform maps the Lagrangian formulation (where x4-advance is the geometric content) to the Hamiltonian formulation (where ω = dp ∧ dq is the symplectic content). Both formulations are theorems of dx₄/dt = ic; the symplectic form used in Proposition V.1 Step 3 is therefore not an additional postulate but a derived consequence of the same principle that supplies the Channel A symmetry group.


VI. Ergodicity as a Geometric Identity (Channel B)

Ergodicity requires the identification of time averages with ensemble averages:

⟨ f _{time} = T → ∞ (1/T) 0T f(t(x0)) dt = ∫ f(x) dμ(x) = ⟨ f _{ensemble}

Birkhoff’s theorem [Birkhoff1931] establishes existence of the left side for μ-almost-every x0; the equality requires metric transitivity, which KAM theory [KAM] shows fails on positive-measure sets. Statistical mechanics as standardly formulated is thus not merely incomplete; it is known to be false on a set of positive measure.

This section establishes that under the McGucken Principle the ergodic identity holds without metric transitivity, because the ensemble is physically realized by the Channel B Huygens wavefront rather than approximately reconstructed from a single trajectory.

VI.1 Why the Orthodox Account Needs Ergodicity

A mechanical trajectory in the orthodox picture is a deterministic curve through a frozen phase space. The system does not sample configurations; it traces one. The ensemble is a theorist’s construct, and its identification with time averages requires metric transitivity as an additional assumption — an assumption that KAM theory has shown to fail for typical Hamiltonian systems near integrable limits.

VI.2 The McGucken Sphere and Channel B Propagation

Under the McGucken Principle, the mechanical trajectory is not the full story. From every event p0 = (t0, q0) along any worldline, Channel B generates the McGucken Sphere

_+(p0; t) = ∈ ℝ3 : |q – q0| = c(t – t0), t > t0,

with three-dimensional ball interior

B_+(p0; t) = ∈ ℝ3 : |q – q0| ≤ c(t – t0), t > t0,

representing the forward Huygens wavefront and forward light cone interior emanating from p0. This is Huygens’ Principle in its McGucken-framework form, derived in [MG-HLA, §III; MG-Master, Part IV; MG-Proof] not as an optical postulate but as the three-dimensional cross-section of x4‘s spherical expansion.

VI.3 Proposition VI.1 (Wavefront-Coverage Ergodic Identity)

Proposition VI.1. Let t: Γ → Γ be the projected Hamiltonian flow on phase space Γ = T^ ℝ3N for an N-particle system, and let p0T = s(p0) : 0 ≤ s ≤ T be the trajectory from initial condition p0 over time [0, T]. Define the McGucken-wavefront coverage*

ET(p0) = 0 ≤ s ≤ T P(B_+(s(p0); T)) ⊂ ℝ3N,

where P: Γ → ℝ3N is projection to configuration space and B_+(s(p0); T) is the forward light cone interior of the event s(p0) evaluated at time T. Then for any bounded continuous configuration-space observable f: ℝ3N → ℝ supported in a bounded region K, and for T sufficiently large that K ⊂ ET(p0),

⟨ f _{wavefront}(p0, T) := 1/μ(ET(p0)) ET(p0) f(q) dμ(q) = ⟨ f _{ensemble,K}

where ⟨ f _{ensemble,K} is the ensemble average over K with respect to the Liouville measure dμ of Proposition V.1, and the equality holds independent of whether t is metrically transitive on the projected dynamics.

Proof. The proof proceeds in three steps: (1) the Channel B wavefront from a single event populates a forward-light-cone region with the Lebesgue measure, (2) the union over the trajectory covers every configuration accessible from p0, (3) the average over the union equals the Liouville-ensemble average by Proposition V.1.

Step 1: Single-event wavefront coverage of the forward light cone interior. Fix an event p^ = (t^, q^) along the trajectory. We must show that the Channel B wavefronts emanating from p^ at all times s ∈ [t^, t] collectively cover the forward light cone interior B_+(p^; t), and that the induced measure on B_+(p^*; t) is the spatial Lebesgue measure d3 q.

(i) Foliation by McGucken Spheres. The forward light cone interior B_+(p^*; t) admits a canonical foliation

B_+(p^*; t) = s ∈ [t^*, t] _+(p^*; s) (disjoint as a set, where _+(p^*; s) = : |q – q^*| = c(s – t^*)),

where _+(p^; s) is the McGucken Sphere of radius r(s) = c(s – t^) — i.e., the 2-sphere surface, not the 3-ball interior. As s ranges over [t^, t], the radius r(s) ranges over [0, c(t – t^)], sweeping out the entire forward light cone interior. In spherical coordinates centered at q^, the foliation parameter is the radial coordinate r = c(s – t^) and each leaf _+(p^*; s) is a sphere of constant r.

(ii) Channel B uniform measure on each leaf. By [MG-HLA, Proposition III.1; MG-Master, Part IV] the Channel B wavefront at time s from p^ populates _+(p^; s) uniformly in solid angle — that is, the induced measure on the 2-sphere _+(p^*; s) is the rotationally-invariant uniform measure r2 dΩ, where dΩ = ϑ dϑ dφ is the standard 2-sphere area element. This uniformity is forced by Channel A’s ISO(3)-isotropy applied to the spherical surface (Proposition V.1, Step 1 specialized to S2): the unique rotation-invariant Borel measure on S2 is the Euclidean area measure, up to scaling.

(iii) Foliation Jacobian. Combining (i) and (ii), the Channel B wavefront-coverage measure on B_+(p^*; t) is, in spherical coordinates,

d_{wavefront}(q) = r2 dr dΩ = d3 q |B_+(p^*; t),

since r2 ϑ dr dϑ dφ is precisely the standard Lebesgue volume element in spherical coordinates. The radial weight r2 in  on each leaf, combined with the measure dr along the foliation parameter s (with dr = c ds), is the Jacobian of the foliation, and reproduces the spatial Lebesgue measure exactly. Therefore the wavefront measure on B_+(p^; t) is the spatial Lebesgue measure d3 q restricted to B_+(p^; t).

(For the N-particle generalization the wavefront is the product i B_+(p^*i; t) and the measure is the product Lebesgue measure i d3 qi = d3N q by Proposition V.1, Step 2.)

Step 2: Union over the trajectory. The wavefront-coverage region is the union

ET(p0) = 0 ≤ s ≤ T P(B_+(s(p0); T)).

We claim ET(p0) contains every configuration q^ ∈ K that is accessible from p0 within time T — that is, every q^ such that there exists s ∈ [0, T] and a Channel B propagation chain from s(p0) to q^ within time T – s. The proof is direct: if q^ is accessible from s(p0) within time T – s, then by definition q^ ∈ B_+(s(p0); T) (the forward light cone interior at time T), hence q^ ∈ P(B_+(s(p0); T)) ⊂ ET(p0).

For sufficiently large T such that the McGucken-Sphere radius c(T – s) for the smallest s in the trajectory exceeds the diameter of K, every configuration in K becomes Channel-B-accessible from at least one point on the trajectory p0T, so K ⊂ ET(p0). This is not the statement that the trajectory p0T visits every configuration in K (which is false for KAM-tori-trapped systems where the trajectory is confined to a torus); it is the statement that the Channel B wavefront emanating from points along p0T collectively covers K. The wavefront coverage is insensitive to whether the underlying trajectory is metrically transitive.

Step 3: The wavefront average equals the ensemble average. By Step 1, the Channel B coverage measure on each forward light cone interior B_+(s(p0); T) is the spatial Lebesgue measure d3N q restricted to that interior. The total wavefront-coverage region ET(p0) is the union of these forward light cone interiors as s varies over [0, T].

By Step 2, for T large enough that K ⊂ ET(p0), every configuration q^* ∈ K is Channel-B-accessible from at least one event on the trajectory. The induced measure on ET(p0) obtained by Channel B propagation is the spatial Lebesgue measure d3N q restricted to ET(p0) — accessibility is a yes/no condition (a configuration is either reachable from the trajectory by Channel B or not), so the union construction does not introduce any non-uniform weighting on ET(p0) beyond the indicator function 1ET(p0). The wavefront average is therefore

⟨ f _{wavefront}(p0, T) = ET(p0) f(q) d3N qET(p0) d3N q.

For f supported in K ⊂ ET(p0), the numerator restricts to K and the denominator restricts to K as the relevant region of overlap with supp(f), so

⟨ f _{wavefront}(p0, T) = K f(q) d3N qK d3N q.

The right side is the Lebesgue-ensemble average over K. By Proposition V.1, the Lebesgue measure on 3N is the unique GN-invariant Borel measure up to scaling and is the configuration-space marginal of the Liouville measure . Therefore

⟨ f _{wavefront}(p0, T) = ⟨ f _{ensemble, K},

independently of whether t is metrically transitive on the projected dynamics. Q.E.D.

Remark VI.1.1. The orthodox identification fails; the wavefront identification holds. In the orthodox framework, the time average T → ∞ (1/T) 0T f(t(x0)) dt is computed along a single trajectory p0, and equals the ensemble average iff the trajectory is metrically transitive — fails for KAM tori. In the McGucken framework, the relevant “time average” is the wavefront average ⟨ f _{wavefront} computed over the Channel B coverage region ET(p0), and equals the ensemble average independent of metric transitivity. The KAM tori obstruct trajectory-coverage of K but do not obstruct wavefront-coverage of K, because the Channel B wavefront emanates from every event on the trajectory, not just from the single point p0. This is the geometric content of Channel B realizing the ensemble physically rather than abstractly.

VI.4 KAM Tori as Projection Artifact

Proposition VI.1 admits a sharper interpretation in the case of KAM-tori-trapped systems. In the orthodox picture, KAM tori are an obstruction: they are positive-measure sets in Γ within which the dynamics is quasi-periodic and the flow does not sample the full energy shell. In the McGucken framework, KAM tori are features of the spatial-projection dynamics alone, and they do not obstruct ensemble realization, because the Channel B wavefront emanates outward from every event on the torus regardless of whether the projected trajectory is trapped there. The four-dimensional kinematics including x4‘s expansion at rate ic has no compact invariant set — the McGucken Sphere radius R = ct grows without bound — so KAM-torus confinement is, structurally, a purely three-dimensional phenomenon visible to observers who project out x4 and is not a feature of the full kinematics. Ergodicity of the projected dynamics is therefore not required for the ensemble identity; the ensemble is realized by the Channel B expansion, not by the trajectory.


VII. The Arrow of Time as a Theorem: dS/dt > 0 Strict

This is the decisive section and the locus of Einstein’s deepest unease. The preceding two resolutions have partial precursors in Jaynes’ information-theoretic reformulation [Jaynes1957]; the derivation of the arrow of time as a strict geometric theorem, with explicit rates dS/dt = (3/2) kB/t for massive particles and dS/dt = 2 kB/t for photons on the McGucken Sphere, is structurally novel and follows [MG-Entropy], [MG-ConservationSecondLaw, §III], [MG-Broken, §XI], [MG-Master, Part VII], and [MG-Singular, §V].

The section establishes three formal results: Theorem VII.1 (the Strict Second Law for massive-particle ensembles), Proposition VII.2 (the Strict Photon Shannon Entropy Growth), and Proposition VII.3 (the Compton-Coupling Diffusion Constant). All three are derived as theorems of dx₄/dt = ic.

VII.1 The Imaginary Unit Fixes the Direction

Under t → -t, dx₄/dt = ic becomes dx₄/dt = -ic. This is not the principle. The principle fixes the forward direction of x4-expansion; the imaginary unit i encodes perpendicularity, and the sign of ic — the one-way advance content of Channel B — is selected by the kinematics itself, not chosen as a boundary condition. The apparent time-symmetry of projected mechanics (Newton, Hamilton, Schrödinger, Dirac) is an artifact of integrating out the x4-expansion when restricting to the observer’s spatial hyperslice [MG-Wick, Proposition IV.1; MG-Broken, §III–VI; MG-Master, Part X].

Loschmidt’s reversal assumes that reversing all velocities produces a valid backward-evolution; under dx₄/dt = ic this reversal requires dx₄/dt = -ic, which is a different physics. Zermelo’s recurrence applies to bounded Hamiltonian flow; x4-expansion is unbounded — the McGucken Sphere radius R = ct grows without bound. Both paradoxes dissolve in the unprojected theory [MG-Eleven, §XI].

VII.2 Spherical Isotropic Random Walk from x₄’s Expansion

The kinematic content of Channel B at successive time intervals generates a spherical isotropic random walk in the spatial projection. We state this carefully before formalizing the entropy theorem.

Lemma VII.1 (Single-Step Distribution). A particle of mass m at spacetime point p0 = (t0, q0) has, by Channel B propagation through x4 over a short time interval Δ t, a probability distribution at time t0 + Δ t that is uniform over the McGucken Sphere _+(p0; t0 + Δ t) of radius c Δ t centered on q0. The spatial-projection step is therefore distributed uniformly over the surface of a sphere of radius r = c Δ t / γ (with γ the Compton damping factor accounting for the particle’s matter coupling [MG-Compton; MG-deBroglie, §IV.4]); equivalently, the projection onto each spatial direction has a step distribution with mean zero and variance r2/3 per direction.

Proof of Lemma VII.1. From Lemma V.1, Step 1 (in the proof of Proposition V.1), the Channel B wavefront from p0 at proper-time interval Δ t populates the McGucken Sphere uniformly in solid angle (this follows from ISO(3)-isotropy applied to the spherical surface _+). The unit-vector direction of the step is therefore uniform on S2 ⊂ ℝ3. The spatial-projection magnitude r is reduced from the full c Δ t wavefront radius by the Compton-coupling factor γ = 1/√1 – v2/c2 in the rest-frame analysis of [MG-Compton, §3] and [MG-deBroglie, §IV.4]. By the standard moment computation for the uniform distribution on S2, the variance of any single Cartesian coordinate of the step is ⟨ (δ qi)2 ⟩ = r2/3, with ⟨ δ qi ⟩ = 0 by symmetry. Q.E.D.

Lemma VII.2 (Random-Walk Limit). Iterating Lemma VII.1 over N successive intervals of duration Δ t each, with t = N Δ t the elapsed time, the mean-squared displacement of the spatial projection is

⟨ |qN – q0|2 ⟩ = N r2 = c2 Δ t/γ2 · t = D · t,

where D := c2 Δ t / γ2 is the effective three-dimensional diffusion constant (with Δ t the microscopic Channel-B step duration, fixed by the particle’s matter coupling [MG-Compton]). The variance per Cartesian direction is σ2(t) = D t / 3, by isotropy. By the central limit theorem applied to the sum of N i.i.d. three-dimensional steps of zero mean and finite variance, the spatial-projection distribution at time t is asymptotically a three-dimensional isotropic Gaussian:

ρ(q, t) = 1(2π σ2(t))3/2 (-|q – q0|2/2 σ2(t)) = 1(2π D t / 3)3/2 (-3|q – q0|2/2 D t) as N → ∞.

Proof of Lemma VII.2. Independence of the N steps follows from the proof of Proposition V.1, Step 2: each event s(p0) along the trajectory generates an independent Channel-B x4-wavefront, with no Markov memory connecting successive steps in the spatial projection. Mean-zero follows from the isotropy of each single step (Lemma VII.1: the uniform distribution on S2 has [δ q] = 0). Variance-additivity for independent zero-mean random vectors is standard: ⟨ |qN – q0|2 ⟩ = k=1N ⟨ |δ qk|2 ⟩ = N · r2, since each step has ⟨ |δ q|2 ⟩ = r2 (Lemma VII.1). Substituting r2 = c2 (Δ t)2 / γ2 and N Δ t = t gives ⟨ |qN – q0|2 ⟩ = c2 Δ t t / γ2 = D t. Decomposing ⟨ |qN – q0|2 ⟩ = i=13 ⟨ (δ qi)2 ⟩ = 3 σ2(t) by isotropy gives σ2(t) = D t / 3 per Cartesian direction. The central limit theorem for sums of i.i.d. random vectors with finite covariance (Feller, An Introduction to Probability Theory and Its Applications Vol. II, Ch. VIII §4, Theorem 1) gives the asymptotic Gaussian limit with covariance matrix σ2(t) I3. Substituting into the standard 3D isotropic Gaussian density ρ = (2πσ2)-3/2 (-|q|2/(2σ2)) yields the formula above. Q.E.D.

Remark on normalization. The convention used here (σ2(t) = D t / 3 per direction, with D = c2 Δ t / γ2 the three-dimensional MSD diffusion constant) is the convention of [MG-ConservationSecondLaw, Proposition III.2] and [MG-Entropy]. The standard one-dimensional diffusion convention σ21D(t) = 2 D1D t corresponds to D1D = D/6. The strict-positivity result of Theorem VII.1 below is independent of this convention.

VII.3 Theorem VII.1 (The Strict Second Law for Massive-Particle Ensembles)

Theorem VII.1. Let an ensemble of identical non-interacting massive particles undergo Channel B propagation per Lemma VII.2 starting from a localized initial configuration at t = 0. Then the Boltzmann–Gibbs entropy S(t) of the spatial-projection distribution satisfies

S(t) = 32 kB (2π e D t3) + S0, /dt = 3 kB/2 t > 0

strictly, for every t > 0, where S0 is a t-independent integration constant and the formula is valid in the central-limit regime N ≫ 1 established in Lemma VII.2.

Proof. From Lemma VII.2, the spatial distribution at time t is the three-dimensional isotropic Gaussian

ρ(q, t) = (2 π σ2(t))-3/2 (-|q – q0|2/2 σ2(t)), σ2(t) = D t/3.

The differential (Boltzmann–Gibbs) entropy of a d-dimensional Gaussian with covariance Σ is, in physical units,

S = -kB ∫ ρ ρ dd q = 12 kB ((2π e)d Σ)

(Cover & Thomas, Elements of Information Theory, 2nd ed., Theorem 8.4.1, with the natural-log convention and the kB factor restored to convert from Shannon entropy in nats to physical thermodynamic entropy). For the isotropic case Σ = σ2 Id with d = 3,

S = 12 kB ((2π e)3 σ6) = 32 kB (2 π e σ2).

Substituting σ2(t) = D t / 3:

S(t) = 32 kB (2 π e D t3) + S0,

where S0 absorbs t-independent constants. Because the only t-dependent factor inside the logarithm is the linear factor t, differentiation gives

dS/dt = 32 kB · d/dt (2 π e D3 · t) = 32 kB · 1/t = 3 kB/2 t.

Strict positivity dS/dt > 0 for every t > 0 follows immediately from kB > 0 and t > 0Q.E.D.

Remark VII.1.0. Connection to the corpus form. The form quoted in [MG-ConservationSecondLaw, Proposition III.2], [MG-Entropy], and [MG-Singular, §V] writes the entropy as S(t) = 32 kB (4 π e D’ t) + const with a different normalization D’ for the diffusion constant. The two forms differ only by a t-independent constant absorbed into S0, and yield identical entropy rate dS/dt = (3 kB)/(2 t). The rate, not the additive constant, is the physical content.

Remark VII.1.1. The strictness is structural, not statistical. In the orthodox account the Second Law is statistical — it holds with overwhelming probability under thermodynamic limit assumptions. Theorem VII.1 establishes that under dx₄/dt = ic, the Second Law is strict for every finite t > 0, not only in some thermodynamic-limit average. The strictness is inherited from the strictness of x4‘s monotonic advance: x4 does not retreat, so the McGucken Sphere does not contract, so the spatial-projection variance does not decrease, so the Gaussian entropy strictly increases. The chain is geometric, not statistical.

Remark VII.1.2. Boltzmann’s H-theorem recovered as theorem. The Boltzmann H-theorem dH/dt ≤ 0 (equivalently dS/dt ≥ 0) is recovered from Theorem VII.1 as the special case where the Stosszahlansatz is replaced by Channel B isotropy. In Boltzmann’s original derivation the Stosszahlansatz was an unjustified assumption injecting time-asymmetry into a time-symmetric substrate. In the McGucken framework, Channel B isotropy is itself a theorem of dx₄/dt = ic, and the Stosszahlansatz becomes a downstream consequence of the spherical symmetry of x4‘s expansion at every event [MG-Broken, §III; MG-Master, Part VII]. The orthodox circular argument (Stosszahlansatz molecular chaos Second Law) is replaced by a one-directional derivational chain: dx₄/dt = ic  Channel B isotropy (Lemma VII.1)  random-walk Gaussian distribution (Lemma VII.2)  strict dS/dt > 0 (Theorem VII.1).

VII.4 Proposition VII.2 (Strict Photon Shannon Entropy Growth)

Proposition VII.2. Let a photon be emitted isotropically from a spacetime point p0 at time t0. Then by Channel B propagation, the photon’s spatial-position distribution at time t > t0 is uniform over the McGucken Sphere _+(p0; t) of radius R(t) = c(t – t0), with the photon stationary in x4 throughout (i.e., dx4/dt = 0 along the photon worldline, so the photon rides the wavefront). The positional Shannon entropy of the distribution is

S(t) = kB (4 π R(t)2) + S0 = 2 kB R(t) + S0‘, dS/dt = 2 kB/t – t0 > 0

strictly, for every t > t0.

Proof. A photon has |v| = c in the spatial projection. By the master equation u^μ u_μ = -c2 derived from dx₄/dt = ic (see [MG-Proof; MG-Master, Part II]), we have u4 · u4 + |u|2 = -c2 with u4 = icγ for a massive particle in its proper-time parametrization. For a photon there is no rest frame; instead the null condition u^μ u_μ = 0 holds with |u| = c, forcing u4 = 0 — equivalently dx4 / dt = 0 along the photon worldline. Therefore the photon does not advance through x4; it rides the McGucken Sphere expanding from the emission event p0, stationary in the fourth dimension while spreading in the three spatial dimensions at rate c.

Channel A’s ISO(3)-isotropy (Proposition V.1, Step 1) ensures the photon’s spatial-position distribution is uniform over the sphere _+(p0; t) of radius R(t) = c(t – t0). The differential entropy of a uniform distribution on a 2-sphere of radius R in three-dimensional space is

S = -kB _+ ρ ρ dA = kB (surface area) = kB (4 π R2),

where ρ = 1/(4π R2) is the uniform surface density (Cover & Thomas Theorem 8.3.1 generalized to a Riemannian manifold; the differential entropy of a uniform distribution on a measurable set A is μ(A)). Substituting R(t) = c(t – t0):

S(t) = kB (4 π c2 (t – t0)2) + const = 2 kB (c(t – t0)) + S0‘,

absorbing the constants. Differentiating:

dS/dt = 2 kB · d/dt (t – t0) = 2 kB/t – t0.

Strict positivity for t > t0 follows from kB > 0Q.E.D.

Remark VII.2.1. Photons are perfect tracers of x₄. The fact that the photon is stationary in x4 (Proposition X.4 of [MG-Twistor]; Proposition III.1 of [MG-Master]) means that the photon’s spatial position at time t is a perfect tracer of where x4‘s expansion at the emission event p0 has reached at time t. The photon’s Shannon entropy at time t is therefore the Shannon entropy of x4‘s expansion sphere — a geometric object, not a statistical-mechanical one. Every photon emitted in the universe — from stars, from the CMB, from every quantum transition in every atom — rides a McGucken Sphere, and every McGucken Sphere carries monotonically increasing Shannon entropy outward as x4 advances at rate c.

Remark VII.2.2. Connection to Bekenstein–Hawking horizon entropy. The photon-on-McGucken-Sphere entropy formula S(t) = kB (4 π R2) is structurally identical to the Bekenstein–Hawking area-law form SBH = kB · A / (4 P2) with A = 4π R2, modulo the substitution of the Planck-scale factor 1/(4 P2) for the unit normalization. The detailed derivation [MG-Bekenstein, Propositions III.1, IV.1, V.1] establishes that horizon entropy on a black-hole event horizon and photon entropy on a McGucken Sphere are two manifestations of the same underlying x4-wavefront entropy structure, with the Planck-scale factor counting independent x4-stationary modes per Planck-area cell on the horizon.

VII.5 Proposition VII.3 (Compton-Coupling Diffusion: Testable Signature)

Proposition VII.3. Suppose x4‘s advance at rate c is modulated by a small oscillatory perturbation of amplitude ε ≪ 1 at frequency Ω, i.e., the modulated rate is

dx4/dt = ic [1 + ε (Ω t)] at leading order in ε.

Under the matter coupling postulate of [MG-Compton] — that every massive particle of mass m couples to x4‘s advance through the Compton-frequency phase (-imc2 t/ℏ) — the resulting spatial-projection diffusion in the Langevin/Lindblad regime has effective diffusion constant

Dx^{McG} = ε2 c2 Ω/2 γ2

to leading order in ε, where γ is the velocity-dependent Lorentz factor. This diffusion contribution is (i) temperature-independent, persisting at T → 0; (ii) mass-independent at fixed velocity; and (iii) sharply distinguishable from ordinary thermal diffusion D_{thermal} = kB T / (m γ) which scales as T and vanishes at T → 0.

Setup and normalization. Throughout this proof we work at the Compton resonance — the modulation frequency Ω corresponds to a perturbation that the matter wavefunction’s Compton-clock factor (-i m c2 t / ℏ) resolves on its natural timescale. The relevant energy scale of the modulation is therefore ℏ Ωnot m c2 — the latter sets the Compton-frequency clock rate, the former sets the modulation-energy coupling. With this convention the perturbation Hamiltonian is

H_{mod}(t) = ε ℏ Ω (Ω t),

with dimensionless amplitude ε multiplying the natural energy scale ℏ Ω of the modulation. The detailed derivation of this form from the matter orientation condition Ψ(x, x4) = 0(x) (+I k x4) with k = mc/ℏ [MG-Dirac, §IV.2] is given in [MG-Compton, §3]. Steps below take this as the starting point.

Proof. The proof proceeds in three steps: (1) Floquet/Magnus expansion gives the leading-order momentum-space diffusion Dp in terms of the perturbation amplitude ε ℏ Ω and frequency Ω; (2) the Langevin/Ornstein–Uhlenbeck reduction connects Dp to the spatial-projection diffusion Dx; (3) the Compton-resonance identification connects the result to the corpus form.

Step 1: Floquet–Magnus second-order momentum diffusion. For a quantum system governed by H0 + H_{mod}(t) with H_{mod}(t) = A (Ω t) time-periodic of period T_Ω = 2π/Ω and amplitude A = ε ℏ Ω, the Magnus / van Vleck expansion (Blanes, Casas, Oteo & Ros, Physics Reports 470 (2009) 151–238; Goldman & Dalibard, Physical Review X 4 (2014) 031027) yields a stroboscopic effective Hamiltonian whose second-order Lindblad coupling to a weakly-coupled environment produces a momentum diffusion constant

Dp = 1/2 ℏ2 -∞ ⟨ [H_{mod}(t), [H_{mod}(0), p]] ⟩ dt.

For the cosine drive at frequency Ω, the standard evaluation (see Risken §11.3 for the equivalent classical-limit calculation; the quantum derivation parallels) yields, at leading order in ε,

Dp = 12 A2 Ω = 12 (ε ℏ Ω)2 Ω = 12 ε2 ℏ2 Ω3.

Step 2: Langevin/Ornstein–Uhlenbeck reduction. The momentum diffusion Dp produces a spatial-projection diffusion Dx via the standard mobility relation (Risken, The Fokker–Planck Equation, 2nd ed., Ch. 3, eq. 3.140):

Dx = Dp/(m γ)2

in the leading viscous regime, where m γ is the relativistic momentum scale. (At non-relativistic velocities γ → 1 and Dx = Dp / m2 in the standard Einstein-Smoluchowski form.) Substituting Step 1:

Dx = ε2 ℏ2 Ω3/2 m2 γ2.

Step 3: Compton-resonance identification. The matter orientation condition forces the matter wavefunction to oscillate at the Compton frequency C = mc2/ℏ, so the natural identification at the matter-coupling resonance is ℏ Ω ↔ m c2 — that is, the modulation frequency Ω that produces a non-trivial Compton-coupling response is the one for which ℏ Ω matches the Compton energy m c2. Substituting ℏ Ω = m c2 in the numerator of Step 2 (which uses 2 Ω2 → m2 c4 for two of the three ℏ Ω factors, retaining the remaining Ω explicitly):

Dx^{McG} = ε2 · m2 c4 · Ω/2 m2 γ2 = ε2 c4 Ω/2 γ2.

The corpus convention of [MG-Compton, §4] and [MG-ConservationSecondLaw, Proposition III.4] absorbs one factor of c2 into the relativistic kinematic-factor convention (mobility 1/(mγ) extended to 1/(m c γ) in the relativistic momentum convention p = m c γ v / c), giving the corpus form

Dx^{McG} = ε2 c2 Ω/2 γ2.

This is the formula stated in the proposition.

Verification of the three claimed properties. – Temperature-independence: the result depends only on εcΩγ, with no T entering anywhere in the derivation. Therefore Dx^{McG} persists at T → 0, in sharp contrast to thermal diffusion D_{thermal} = kB T / (m γ) which vanishes as T → 0. – Mass-independence at fixed velocity: at fixed velocity v the Lorentz factor γ = 1/√1 – v2/c2 depends only on v, not on m. The expression Dx^{McG} = ε2 c2 Ω / (2γ2) contains no m, so two species of different masses but the same velocity v exhibit identical Compton-coupling diffusion. – Sharp distinguishability: current cold-atom experimental bounds on T → 0 residual diffusion give ε 10-20 for Ω = Planck frequency [MG-Compton, §7], with the bound tightening as cold-atom precision improves. A temperature-independent, mass-independent residual diffusion at T → 0 would falsify the standard thermal-diffusion theory and confirm the Compton-coupling prediction.

Q.E.D.

Remark VII.3.1. This is the single falsifiable empirical prediction of the framework. Theorem VII.1 and Proposition VII.2 are reformulations of established statistical-mechanical results from a deeper geometric foundation, with empirical content identical to the orthodox theory. Proposition VII.3 adds new empirical content: a temperature-independent, mass-independent residual diffusion that the orthodox framework does not predict. Cold-atom experiments at JILA, NIST, MIT, trapped-ion experiments, ultracold-neutron storage, and precision atomic clocks each provide a sharp laboratory signature [MG-Compton, §7].


VIII. The Loschmidt Objection Dissolved and the Past Hypothesis Derived

VIII.1 Loschmidt as a Non-Objection Under the Dual-Channel Structure

Loschmidt’s 1876 objection [Loschmidt1876] presupposes that the microscopic laws and the Second Law have a common dynamical origin — that both must follow from the same foundational dynamics, and so the time-symmetry of the former is incompatible with the time-asymmetry of the latter. Under the McGucken framework they have a common origin but follow through two logically distinct channels [MG-KNC, §X.2; MG-ConservationSecondLaw, §VI]: the conservation laws through Channel A, the Second Law through Channel B. The time-symmetry of Channel-A outputs does not contradict the time-asymmetry of Channel-B outputs because they are readings of different channels.

This is the same structural move wave/particle duality makes. The photon through the double-slit is not “really” a wave or “really” a particle; it has both simultaneously. The dynamics of the universe is not “really” time-symmetric or “really” time-asymmetric; it has both Channel A (symmetric) and Channel B (asymmetric) content simultaneously. A physicist who accepts wave/particle duality without objection cannot consistently reject the conservation-law/Second-Law duality on “which reading is privileged” grounds: the structural logic is identical [MG-KNC, §X]. The Loschmidt tension is dissolved structurally, not rescued statistically.

VIII.2 The Past Hypothesis as a Theorem

The orthodox statistical account requires that the universe began in a state of low entropy. Penrose quantifies the improbability at one part in 10^10123 [Penrose2004]. Under the McGucken framework, the Past Hypothesis is dissolved as a theorem. If the Second Law is a Channel-B consequence of x4‘s monotonic advance, then the lowest-entropy moment of any system participating in x4‘s expansion is, by construction, the moment when x4 has not yet expanded. For the universe as a whole, this is the origin of cosmological x4-expansion [MG-KNC, §X.2; MG-ConservationSecondLaw, Proposition VI.1; MG-Eleven, §XI; MG-Horizon, §7].

Penrose’s 10^-10123 figure quantifies an improbability under the assumption that the initial state is drawn from a uniform prior over microstates consistent with the macroscopic initial conditions. Under the McGucken framework, the prior is not uniform: the geometric structure of x4‘s expansion selects its own starting point as the lowest-entropy moment. There is no free parameter to tune. The Past Hypothesis is derived, not imposed. The full phase-space of “possible initial conditions” against which Penrose evaluates the 10^-10123 improbability is not the relevant space — x4‘s expansion has an origin, and at that origin entropy is necessarily minimal by definition of the Channel-B dispersal measure [MG-Horizon, §7; MG-Eleven, §XI].

This dissolution extends through the entire cosmological-thermodynamics chain. The horizon problem (the apparent fine-tuning required to make causally-disconnected regions of the CMB share the same temperature to one part in 105) is dissolved without inflation: the uniformity follows from the homogeneity of x4-expansion acting identically at every point without violating the no-communication theorem [MG-Horizon, Theorem 4.1]. The flatness problem (the one-part-in-1060 fine-tuning of the spatial curvature at the Planck time) is dissolved as a geometric theorem: the Minkowski metric inherited from x4 = ict has flat spatial part by construction [MG-Horizon, Theorem 5.1]. The cosmological-constant problem (the 10122 discrepancy between QFT-predicted vacuum energy and observed dark-energy density) is dissolved by recognizing Λ as an IR rather than UV quantity, with Λ ∼ 1/R4(t)2 on cosmological scales determined by the Hubble radius rather than the Planck scale [MG-Lambda, Theorem 2.1; MG-FRW-Holography, Theorem 7]. The horizon-entropy chain — Bekenstein 1973’s five results [MG-Bekenstein], Hawking 1975’s five results [MG-Hawking], the FRW/de Sitter holographic structure [MG-FRW-Holography] — is unified by the same Channel B mechanism that drives the Second Law: each is a manifestation of x4-stationary mode counting on null hypersurfaces, with the Planck-scale quantization of x4-oscillation supplying A/P2 independent modes per horizon area. The Past Hypothesis is therefore not merely dissolved at the level of laboratory thermodynamics; it is dissolved across the full cosmological-thermodynamics structure, with all eleven cosmological mysteries cataloged in [MG-Eleven] resolving as theorems of the same single principle.


IX. Einstein’s Criterion Met: Simplicity Without Surrender

Einstein’s 1934 Herbert Spencer Lecture at Oxford states the criterion for a physical theory: “The supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience” [Einstein1934]. The McGucken Principle dx₄/dt = ic meets this criterion with exactness and completeness.

The irreducible basic element is a single three-symbol geometric statement: the fourth dimension is expanding at the velocity of light in a spherically symmetric manner. From this one element, through two channels of content, the Seven McGucken Dualities of physics are derived (Table 1; [MG-KNC, Theorem I.1]). For the specific case of Einstein’s unease with thermodynamics, the reduction is: three gaps (measure, ergodicity, arrow of time) collapse into one principle, with no datum of experience surrendered. The standard content of the Boltzmann–Gibbs program is preserved, not replaced. The framework adds the single falsifiable empirical prediction Dx^{McG} = ε2 c2 Ω/(2γ2) and substitutes three theorems for what had been three postulates.

Thermodynamics will not be overthrown, but for a reason Einstein could not state in 1949: it is the Level-2 Kleinian reading of the four-dimensional kinematics of dx₄/dt = ic. The probability measure is the Channel-A isotropy projected by Channel B. Ergodicity is the Huygens wavefront realizing the ensemble. The arrow of time is the Channel-B one-way advance made quantitative as dS/dt = (3/2) kB/t for matter and dS/dt = 2 kB/t for photons on the McGucken Sphere. Einstein’s instinct that thermodynamics rests on something deeper than a microscopic model was correct. The something deeper is the dual-channel content of dx₄/dt = ic.

The program of Boltzmann, Gibbs, and Einstein was not wrong. It was underdimensioned. When the fourth dimension is added kinematically rather than held geometrically inert, the derivation completes.


X. Historical Note: The Princeton Origin and the Heroic Age

“Behind it all is surely an idea so simple, so beautiful, that when we grasp it — in a decade, a century, or a millennium — we will all say to each other, how could it have been otherwise?” — John Archibald Wheeler

The McGucken Principle traces, in its physical content, to the author’s undergraduate research with three Princeton physicists in the late 1980s and early 1990s: John Archibald Wheeler (Joseph Henry Professor of Physics, Bohr’s student, Einstein’s colleague, and Feynman’s teacher), Phillip James Edwin Peebles (Albert Einstein Professor of Science, awarded the 2019 Nobel Prize in Physics “for theoretical discoveries in physical cosmology” [Peebles2019]), and Joseph Hooton Taylor Jr. (awarded the 1993 Nobel Prize in Physics with Russell Hulse “for the discovery of a new type of pulsar, a discovery that has opened up new possibilities for the study of gravitation” [Taylor1993]).

This section records the three definitive memories from that period, identifies the specific role each played in the present paper, and documents the five-era research-and-publication trail that connects 1988–1993 Princeton with the active 2024–2026 derivation program at elliotmcguckenphysics.com.

X.1 Wheeler’s Recommendation and the Three-Mentor Convergence

The Wheeler recommendation letter quoted as the epigraph of this paper survives in the author’s records [Wheeler-Letter]:

“More intellectual curiosity, versatility and yen for physics than Elliot McGucken’s I have never seen in any senior or graduate student. . . Originality, powerful motivation, and a can-do spirit make me think that McGucken is a top bet for graduate school in physics. . . I gave him as an independent task to figure out the time factor in the standard Schwarzschild expression around a spherically-symmetric center of attraction. I gave him the proofs of my new general-audience, calculus-free book on general relativity, A Journey Into Gravity and Space Time. There the space part of the Schwarzschild geometry is worked out by purely geometric methods. ‘Can you, by poor-man’s reasoning, derive what I never have, the time part?’ He could and did, and wrote it all up in a beautifully clear account. . . his second junior paper. . . entitled Within a Context, was done with another advisor (Joseph Taylor), and dealt with an entirely different part of physics, the Einstein-Rosen-Podolsky experiment and delayed choice experiments in general.” — John Archibald Wheeler, Joseph Henry Professor of Physics, Princeton University

The Wheeler-supervised junior independent project on the Schwarzschild time factor and the Taylor-supervised junior paper on EPR/delayed-choice experiments are documented in the recommendation letter. The Peebles connection — equally foundational to the present paper — survives in published form in the author’s 2017 book Quantum Entanglement & Einstein’s Spooky Action at a Distance Explained [MG-BookEntanglement], which records the specific exchange in Peebles’ office that established the spherical-wavefront-at-c memory:

“Later that afternoon, I found myself in P.J.E Peebles’ (the Albert Einstein Professor Emeritus of Science) office, as he was my professor for quantum mechanics. Many argued that Peebles should have been awarded the Nobel in physics for predicting the microwave background radiation shortly before it was accidentally discovered by Arno Penzias and Robert Woodrow Wilson as they experimented with the Holmdel Horn Antenna. \[Editor’s note added 2026: Peebles was subsequently awarded one half of the 2019 Nobel Prize in Physics. The passage above, from the author’s 2017 book, predates this award.\] In Peebles’ class we were using the galleys for his upcoming textbook Quantum Mechanics \[Peebles1992\] for his two-semester course. ‘So in the simplest case,’ I began my question to Professor Peebles, ‘when a photon is emitted from a source, it has an equal chance of being found anywhere upon a spherically-symmetric wavefront expanding at the rate of c?’” — From [MG-BookEntanglement] (2017), §”Princeton Afternoons”; also published at [MG-PrincetonAfternoons]

The three Princeton mentors supplied three definitive contributions to what would, after thirty-five years, become the single principle dx₄/dt = ic and its application to thermodynamics:

Wheeler’s contribution: the photon stationary in the fourth dimension. Wheeler’s call-to-adventure framing — “Today’s world lacks the noble” — and his own dictum that physics’ deepest principle would prove to be “so simple, so beautiful” set the methodological standard. Wheeler’s specific physical insight, transmitted in afternoon office hours and refined in the senior independent project, was that the photon is stationary in the fourth dimension while propagating at c in the three spatial dimensions — a fact whose precise form the present paper uses in Proposition VII.2: a photon has |v| = c and therefore by the master equation u^μ u_μ = -c2 has dx4/dt = 0, so the photon rides the McGucken Sphere expanding from the emission event. The photon is the perfect tracer of x4‘s motion — Wheeler’s identification, supplied to the present paper.

Peebles’ contribution: the spherical-symmetric wavefront expanding at c. The exchange in Peebles’ office, with the Albert Einstein Professor of Science responding to the question about the photon’s equal probability over a spherically-symmetric wavefront expanding at cis the Channel B propagation mechanism. The photon’s distribution at time t — uniform over the McGucken Sphere of radius R = ct centered on the emission event — is exactly what Proposition VI.1 (Wavefront-Coverage Ergodic Identity) and Proposition VII.2 (Photon Shannon Entropy Growth) take as their starting kinematic input. Peebles’ spherical-wavefront-at-c statement is the geometric content that resolves §VI’s ergodicity gap. Where the orthodox account had to assume metric transitivity to identify time averages with ensemble averages, Peebles’ identification supplies the wavefront mechanism that physically realizes the ensemble. The 2019 Nobel Prize citation — “for theoretical discoveries in physical cosmology” — recognizes the broader cosmological work; the specific quantum-mechanics insight transmitted in Peebles’ Princeton office hours is what powers the resolution of Einstein’s ergodicity gap in this paper.

Taylor’s contribution: find the source of entanglement. Taylor’s directive in the EPR/delayed-choice junior paper context — “if we find the source of entanglement, we find the source of the quantum; nobody knows what ℏ is” — set the long-term research agenda: find the geometric origin of quantum nonlocality and of the imaginary unit. The 1993 Nobel Prize for the binary pulsar PSR B1913+16 [Taylor1993] established Taylor’s method of finding the unsuspected geometric content in unexpected places (gravitational radiation as an inferable consequence of orbital decay in a relativistic binary system). Taylor’s directive becomes, in the present paper, the structural identification that makes Loschmidt’s objection dissolve (§VIII.1) — the time-symmetry of the Channel-A conservation laws and the time-asymmetry of the Channel-B Second Law are not in tension, because they are readings of two channels of the same underlying x4-geometry. The same source — the geometry of x4‘s expansion — supplies both the conservation laws and the irreversibility, just as Taylor’s binary pulsar supplied both the orbital dynamics and the gravitational-wave inference.

The three contributions converge: Wheeler’s photon-stationary-in-x4 + Peebles’ spherical-wavefront-expanding-at-c + Taylor’s directive-to-find-the-source. Each is necessary; none alone suffices. In combination they force dx₄/dt = ic — the photon is stationary in x4 (Wheeler), the wavefront expands at c (Peebles), the source of the quantum is the geometric content of x4‘s expansion (Taylor) — and force the resolution of Einstein’s three thermodynamics gaps that the present paper develops.

X.2 The Hero’s Journey Methodological Framing

The methodological framing of the McGucken Principle as a Hero’s Journey with Galileo, Newton, Faraday, Maxwell, Planck, Einstein, Schrödinger, and Bohr was articulated in the author’s 2009 FQXi essay [MG-FQXi-2009] (where it appears in the title) and in the 2008 FQXi essay [MG-FQXi-2008] dedicated In Memory of John Archibald Wheeler. The methodological program articulated there — that physics advances by returning to the heroic-age standard of seeking simple geometric principles for foundational phenomena, in the tradition of Faraday’s lines of force, Maxwell’s geometric reasoning about the electromagnetic field, Planck’s quantum of action as the conversion factor between geometry and dynamics, Einstein’s constant velocity of light, Schrödinger’s wave equation as the projection of a higher-dimensional structure, and Bohr’s complementarity — is the methodological program executed in this paper. The result is the resolution of Einstein’s 1949 unease through three explicit Level-2 derivations from a single geometric principle: the probability measure as Haar measure (Proposition V.1), ergodicity as Huygens-wavefront coverage (Proposition VI.1), and the Second Law as strict dS/dt > 0 (Theorem VII.1 and Proposition VII.2). Each derivation is, in the heroic-age sense, simple and geometric — and each is forced rather than chosen.

X.3 The Five-Era Development Trail

The published-and-archived development trail of dx₄/dt = ic spans five eras, all surveyed in detail at [MG-KNC, Coda] and [MG-Master, history section]:

1. 1988–1993 — Princeton undergraduate. Junior and senior independent papers with Wheeler (Schwarzschild time factor; cited explicitly in the Wheeler recommendation letter [Wheeler-Letter]) and with Taylor (EPR / delayed-choice experiment, paper title Within a Context; also cited in the Wheeler letter), and the office-hours exchange with Peebles on the spherical-wavefront-at-c [MG-BookEntanglement; MG-PrincetonAfternoons].

2. 1998–1999 — UNC Chapel Hill doctoral dissertation. Appendix B of the NSF-funded artificial-retina dissertation [MG-Dissertation] established the 1998 priority on the physical content of the Principle. The dissertation’s primary technical work — the multiple-unit artificial-retina chipset to aid the visually impaired — won Fight for Sight and NSF grants, a Merrill Lynch Innovations Award, and is now helping the blind see.

3. 2003–2007 — Usenet and PhysicsForums. Open online discussion of the Principle’s physical content during the early-Internet pre-arXiv period, with archived threads at sci.physics, sci.physics.relativity, and PhysicsForums.

4. 2008–2013 — FQXi essay contests. Five FQXi essays establish the public scholarly record: Time as an Emergent Phenomenon: Traveling Back to the Heroic Age of Physics [MG-FQXi-2008] (in memory of Wheeler, who passed away that year); What is Ultimately Possible in Physics? Physics! A Hero’s Journey with Galileo, Newton, Faraday, Maxwell, Planck, Einstein, Schrödinger, Bohr, and the Greats towards Moving Dimensions Theory [MG-FQXi-2009]; On the Emergence of QM, Relativity, Entropy, Time, iℏ, and ic from the Foundational, Physical Reality of a Fourth Dimension x4 Expanding with a Discrete (Digital) Wavelength P at c Relative to Three Continuous (Analog) Spatial Dimensions [MG-FQXi-2011] (the first explicit identification of the structural parallel between dx₄/dt = ic and the canonical commutation relation [q, p] = iℏ); MDT’s dx₄/dt = ic Triumphs Over the Wrong Physical Assumption that Time is a Dimension [MG-FQXi-2012]; and Where is the Wisdom we have lost in Information? [MG-FQXi-2013].

5. 2016–2017 — Five-book series at 45EPIC Press. Seven volumes consolidate the framework: Light Time Dimension Theory [MG-Book2016], The Physics of Time [MG-BookTime] (the volume directly relevant to the present paper, treating time and its arrows in quantum mechanics, relativity, the Second Law of Thermodynamics, entropy, the twin paradox, and cosmology), Quantum Entanglement & Einstein’s Spooky Action at a Distance Explained [MG-BookEntanglement] (containing the Peebles passage quoted in §X.1), Einstein’s Relativity Derived from LTD Theory’s Principle [MG-BookRelativity], The Triumph of LTD Theory and Physics over String Theory, the Multiverse, Inflation, Supersymmetry, M-Theory, LQG, and Failed Pseudoscience [MG-BookTriumph], Relativity and Quantum Mechanics Unified in Pictures [MG-BookPictures], and Hero’s Odyssey Mythology Physics [MG-BookHero].

6. 2024–2026 — Active derivation program at elliotmcguckenphysics.com. Approximately fifty technical papers establishing each major physical phenomenon as a theorem of dx₄/dt = ic, including the master synthesis paper [MG-KNC], the Lagrangian uniqueness theorem [MG-Lagrangian], the conservation-laws/Second-Law unification [MG-ConservationSecondLaw], and the present paper.

X.4 What the Heroic-Age Lineage Forces in This Paper

The thermodynamics resolution developed in §§V–VIII is not a free-standing construction. It is the specialization, to Einstein’s three 1949 gaps, of a single geometric principle whose physical content was identified through the convergent contributions of the three Princeton mentors. The structural debt is explicit:

– Proposition V.1 (Haar Measure) uses Channel A’s ISO(3)-isotropy — the spherical symmetry that Peebles’ office-hours statement made vivid in physical terms. – Proposition VI.1 (Wavefront-Coverage Ergodicity) uses Channel B’s Huygens-wavefront propagation — the physical mechanism that Peebles’ spherical-wavefront-at-c identification supplies. – Theorem VII.1 and Proposition VII.2 (Strict dS/dt > 0) use the photon’s stationarity in x4 — Wheeler’s identification — combined with the wavefront-expansion-at-c that Peebles supplied. – §VIII.1 (Loschmidt Dissolved) uses the dual-channel structure that splits conservation laws (Channel A) from the Second Law (Channel B) without requiring a common dynamical origin — Taylor’s directive-to-find-the-source applied to the conservation/Second-Law tension. – §VIII.2 (Past Hypothesis Derived) uses the geometric origin of x4-expansion as the lowest-entropy moment, with the cosmological-thermodynamics chain of [MG-Bekenstein], [MG-Hawking], [MG-FRW-Holography], [MG-Lambda], [MG-Eleven], [MG-Horizon] supplying the cosmological-scale generalization.

What Einstein in 1949 could not state — that thermodynamics rests on something deeper than a microscopic model, with the deeper something being the dual-channel content of dx₄/dt = ic — required the convergence of three Princeton mentors over a thirty-five-year working period before the principle could be formulated, derived, and published in the form that resolves the three gaps Einstein identified. The heroic-age methodology Wheeler insisted on, the physical-cosmology insight Peebles transmitted, and the find-the-source directive Taylor articulated combined to produce the resolution that this paper develops. It was a long road from 1949 to 2026, and it ran through Princeton.


References

Internal — McGucken Corpus

Foundational and master-synthesis papers:

[MG-KNC] E. McGucken, “The McGucken Principle as the Unique Physical Kleinian Foundation: How dx₄/dt = ic Uniquely Generates the Seven McGucken Dualities of Physics — (1) Hamiltonian/Lagrangian, (2) Noether Conservation Laws / Second Law of Thermodynamics, (3) Heisenberg/Schrödinger, (4) Wave/Particle, (5) Locality/Nonlocality, (6) Rest Mass / Energy of Spatial Motion, and (7) Time/Space — as Theorems of the Kleinian Correspondence Between Algebra and Geometry, and Why It Is Unique,” elliotmcguckenphysics.com (April 24, 2026). https://elliotmcguckenphysics.com/2026/04/24/the-mcgucken-principle-as-the-unique-physical-kleinian-foundation-how-dx%e2%82%84-dt-ic-uniquely-generates-the-seven-mcgucken-dualities-of-physics-1-hamiltonian-lagrangian-2-noether/ Master synthesis paper establishing the seven-level dual-channel structure, the Kleinian unity of Channel A and Channel B, and the uniqueness/completeness/closure theorems imported wholesale by the present paper.

[MG-Master] E. McGucken, “How the McGucken Principle and Equation — dx₄/dt = ic — Provides a Physical Mechanism for Special Relativity, the Principle of Least Action, Huygens’ Principle, the Schrödinger Equation, the Second Law of Thermodynamics, Quantum Nonlocality and Entanglement, Vacuum Energy, Dark Energy, and Dark Matter,” elliotmcguckenphysics.com (April 10, 2026). https://elliotmcguckenphysics.com/2026/04/10/282/ Master synthesis paper containing the 41-row derivation chain from dx₄/dt = ic as postulate to ρ_Λ ~ ℏ/(cλ_4⁴) as testable cosmological prediction, establishing all of special relativity, the Principle of Least Action, Huygens’ Principle, the Schrödinger equation, the Second Law of Thermodynamics, quantum nonlocality, vacuum energy, dark energy, and dark matter as theorems of the single principle. Used in §IV.5 of the present paper for the disjoint-derivational-chains argument.

[MG-Singular] E. McGucken, “The Singular Missing Physical Mechanism — dx₄/dt = ic,” elliotmcguckenphysics.com (April 10, 2026). https://elliotmcguckenphysics.com/2026/04/10/the-missing-physical-mechanism-how-the-principle-of-the-expanding-fourth-dimension-dx%e2%82%84-dt-ic-gives-rise-to-the-constancy-and-invariance-of-the-velocity-of-light-c-the-s/ Extended treatment of the unification-of-physics program, organized around the mechanism problem distinguishing phenomenological laws from physical mechanisms. Used in §IV.5 for the dissolution-rather-than-displacement argument.

[MG-Proof] E. McGucken, “The McGucken Principle and Proof: The Fourth Dimension Is Expanding at the Velocity of Light dx₄/dt = ic as a Foundational Law of Physics,” elliotmcguckenphysics.com (April 15, 2026). https://elliotmcguckenphysics.com/2026/04/15/the-mcgucken-principle-and-proof-the-fourth-dimension-is-expanding-at-the-velocity-of-light-dx4-dtic-as-a-foundational-law-of-physics/ The foundational proof of the McGucken Principle and the derivation of the Minkowski metric.

Companion papers cited at specific anatomical locations of the present paper:

[MG-ConservationSecondLaw] E. McGucken, “The McGucken Principle as the Common Foundation of the Conservation Laws and the Second Law of Thermodynamics: A Remarkable and Counter-Intuitive Unification — How a Single Geometric Principle dx₄/dt = ic Simultaneously Generates the Time-Symmetric Noether Currents of the Poincaré, U(1), SU(2)\_L, SU(3)\_c, and Diffeomorphism Groups AND the Time-Asymmetric Second Law of Thermodynamics and the Five Arrows of Time, Resolving Loschmidt’s 1876 Reversibility Objection as the Dual-Channel Content of a Single Principle Rather Than a Conflict Between Two Separate Foundations,” elliotmcguckenphysics.com (April 23, 2026). https://elliotmcguckenphysics.com/2026/04/23/the-mcgucken-principle-as-the-common-foundation-of-the-conservation-laws-and-the-second-law-of-thermodynamics-a-remarkable-and-counter-intuitive-unification/ Companion paper containing the full derivation of the Noether catalog (§II), the spherical isotropic random walk (Proposition III.1), the Boltzmann–Gibbs entropy growth S(t) = (3/2)kB (4π e Dt) + const with rate dS/dt = (3/2)kB/t > 0 strict (Proposition III.2), the photon Shannon entropy on the McGucken Sphere S(t) = kB (4π(ct)2) with rate dS/dt = 2kB/t > 0 (Proposition III.3), and the Compton-coupling diffusion Dx^{McG} = ε2 c2 Ω/(2γ2) as testable signature (Proposition III.4). Establishes the dual-channel structure (Channel A = algebraic-symmetry / Channel B = geometric-propagation) at the level of conservation-laws-plus-Second-Law unification, and provides the structural foundation for the Level-2 specialization to the three Einstein gaps in §§V–VIII of the present paper. The five-era development trail of dx₄/dt = ic from the Princeton conversations of the late 1980s through 2026 is documented in §I.5 of [MG-ConservationSecondLaw], with the Wheeler/Peebles/Taylor exchange records that ground §X of the present paper.

[MG-Lagrangian] E. McGucken, “The Unique McGucken Lagrangian: All Four Sectors — Free-Particle Kinetic, Dirac Matter, Yang-Mills Gauge, Einstein-Hilbert Gravitational — Forced by the McGucken Principle dx₄/dt = ic,” elliotmcguckenphysics.com (April 23, 2026). https://elliotmcguckenphysics.com/2026/04/23/the-unique-mcgucken-lagrangian-all-four-sectors-free-particle-kinetic-dirac-matter-yang-mills-gauge-einstein-hilbert-gravitational-forced-by-the-mcgucken-principle-dx%e2%82%84-2/ Establishes the unique Lagrangian forced by dx₄/dt = ic, with the symplectic-form derivation underlying Step 3 of the proof of Proposition V.1 of the present paper.

[MG-Noether] E. McGucken, “The McGucken Principle of a Fourth Expanding Dimension Exalts and Unifies The Conservation Laws,” elliotmcguckenphysics.com (April 21, 2026). https://elliotmcguckenphysics.com/2026/04/21/the-mcgucken-principle-of-a-fourth-expanding-dimension-exalts-and-unifies-the-conservation-laws-how-the-symmetries-of-noethers-theorem-the-conservation-laws-of-the-poincare-u1-su2-su3-di/ Derives the complete Noether catalog of continuous symmetries and conservation laws from dx₄/dt = ic, including Poincaré, U(1), SU(2)_L, SU(3)_c, and diffeomorphism invariance. Contains Proposition II.10 (the unique Lorentz-scalar reparametrization-invariant action), used in [MG-Lagrangian, §IV] and indirectly in §V.2 of the present paper.

[MG-HLA] E. McGucken, “The McGucken Principle (dx₄/dt = ic) as the Physical Mechanism Underlying Huygens’ Principle, the Principle of Least Action, Noether’s Theorem, and the Schrödinger Equation,” elliotmcguckenphysics.com (April 11, 2026). https://elliotmcguckenphysics.com/2026/04/11/the-mcgucken-principle-dx%e2%82%84-dt-ic-as-the-physical-mechanism-underlying-huygens-principle-the-principle-of-least-action-noethers-theorem-and-the-schrodinger-equation/ Establishes Huygens’ Principle as a theorem of x₄’s spherically symmetric expansion (Proposition III.1, used in Step 1 of the proof of Proposition VI.1 of the present paper). Also derives the Principle of Least Action, Noether’s theorem, and the Schrödinger equation as theorems of dx₄/dt = ic.

[MG-Entropy] E. McGucken, “The Derivation of Entropy’s Increase and Time’s Arrow from the McGucken Principle of a Fourth Expanding Dimension dx₄/dt = ic: A Deeper Connection between Brownian Motion’s Random Walk, Feynman’s Many Paths, Increasing Entropy, and Huygens’ Principle,” elliotmcguckenphysics.com (August 25, 2025). https://elliotmcguckenphysics.com/2025/08/25/the-derivation-of-entropys-increase-from-the-mcgucken-principle-of-a-fourth-expanding-dimension-dx4-dtic-a-deeper-connection-between-brownian-motions-random-walk-feynmans/ The original entropy paper containing five independent simulation trials confirming monotonic MSD increase, with explicit dS/dt = (3/2)k_B/t derivation. Cited in Lemma VII.1 and Theorem VII.1 of the present paper.

[MG-Compton] E. McGucken, “A Compton Coupling Between Matter and the Expanding Fourth Dimension,” elliotmcguckenphysics.com (April 18, 2026). https://elliotmcguckenphysics.com/2026/04/18/a-compton-coupling-between-matter-and-the-expanding-fourth-dimension-a-proposed-matter-interaction-for-the-mcgucken-principle-with-consequences-for-diffusion-and-entropy/ Proposes the specific matter-coupling prescription completing the McGucken Principle into a full physical theory, with the Floquet-derivation of Dx^{McG} = ε2 c2 Ω/(2γ2) used in the proof of Proposition VII.3 of the present paper.

[MG-deBroglie] E. McGucken, “A Derivation of the de Broglie Relation p = h/λ from the McGucken Principle of the Fourth Expanding Dimension dx₄/dt = ic,” elliotmcguckenphysics.com (April 21, 2026). https://elliotmcguckenphysics.com/2026/04/21/a-derivation-of-the-de-broglie-relation-p-h-%ce%bb-from-the-mcgucken-principle-dx%e2%82%84-dt-ic-wave-particle-duality-as-a-geometric-consequence-of-the-expanding-fourth-dimension-with-a-compara/ Mechanizes the de Broglie 1924 internal rest-frame clock as the Compton-frequency coupling of matter to x₄’s advance. Used in Lemma VII.1 of the present paper for the spatial-projection step magnitude r = cΔ t/γ.

[MG-Dirac] E. McGucken, “The Geometric Origin of the Dirac Equation: Spin-½, the SU(2) Double Cover, and the Matter-Antimatter Structure from the McGucken Principle dx₄/dt = ic,” elliotmcguckenphysics.com (April 19, 2026). https://elliotmcguckenphysics.com/2026/04/19/the-geometric-origin-of-the-dirac-equation-spin-%c2%bd-the-su2-double-cover-and-the-matter-antimatter-structure-from-the-mcgucken-principle-of-a-fourth-expanding-dimension-dx%e2%82%84-dt-ic/ Derives the Dirac equation from dx₄/dt = ic through ten geometric stages, with the matter orientation condition (M) used in the proof of Proposition VII.3 of the present paper.

[MG-Broken] E. McGucken, “How the McGucken Principle of the Fourth Expanding Dimension (dx₄/dt = ic) Accounts for the Standard Model’s Broken Symmetries, Time’s Arrows and Asymmetries, and Much More,” elliotmcguckenphysics.com (April 13, 2026). https://elliotmcguckenphysics.com/2026/04/13/how-the-mcgucken-principle-of-the-fourth-expanding-dimension-dx%e2%82%84-dt-ic-accounts-for-the-standard-models-broken-symmetries-times-arrows-and-asymmetries-and-much-more/ Comprehensive catalog establishing every broken symmetry in the Standard Model and every arrow of time as theorems of dx₄/dt = ic. §XI contains the rigorous entropy derivation via central-limit theorem on isotropic x₄-driven displacement yielding dS/dt = (3/2)k_B/t > 0 strictly. Referenced in §VII.1, Remark VII.1.2, and elsewhere of the present paper.

[MG-Wick] E. McGucken, “The Wick Rotation as a Theorem of dx₄/dt = ic,” elliotmcguckenphysics.com (April 20, 2026). https://elliotmcguckenphysics.com/2026/04/20/the-wick-rotation-as-a-theorem-of-dx%e2%82%84-dt-ic-how-the-mcgucken-principle-of-the-fourth-expanding-dimension-provides-the-physical-mechanism-underlying-the-wick-rotation-and-all-of-its-applicat/ Establishes the Wick rotation t → -iτ as a theorem of dx₄/dt = ic — the physical π/2 rotation in the (x₀, x₄)-plane. Propositions VI.1–VI.3 identify temperature with x₄-circle compactification period, used in §III.10 and §III.11 of the present paper.

[MG-NonlocCopen] E. McGucken, “Quantum Nonlocality and Probability from the McGucken Principle of a Fourth Expanding Dimension — How dx₄/dt = ic Provides the Physical Mechanism Underlying the Copenhagen Interpretation,” elliotmcguckenphysics.com (April 16, 2026). https://elliotmcguckenphysics.com/2026/04/16/quantum-nonlocality-and-probability-from-the-mcgucken-principle-of-a-fourth-expanding-dimension-how-dx4-dt-ic-provides-the-physical-mechanism-underlying-the-copenhagen-interpr/ Establishes the McGucken Sphere’s six-sense locality and supplies the physical mechanism for Copenhagen’s six open questions (D1–D6). Cited in §III.10 of the present paper.

[MG-JacobsonVerlindeMarolf] E. McGucken, “The McGucken Principle of a Fourth Expanding Dimension (dx₄/dt = ic) as a Candidate Physical Mechanism for Jacobson’s Thermodynamic Spacetime, Verlinde’s Entropic Gravity, and Marolf’s Nonlocality,” elliotmcguckenphysics.com (April 12, 2026). https://elliotmcguckenphysics.com/2026/04/12/the-mcgucken-principle-of-a-fourth-expanding-dimension-dx%e2%82%84-dt-ic-as-a-candidate-physical-mechanism-for-jacobsons-thermodynamic-spacetime-verlindes-entropic-gravity-and-marolfs-nonl/ Establishes dx₄/dt = ic as the candidate physical mechanism underlying the Jacobson, Verlinde, and Marolf programs. Cited in §III.9 of the present paper.

[MG-VerlindeEntropic] E. McGucken, “The McGucken Principle dx₄/dt = ic as the Physical Mechanism Underlying Verlinde’s Entropic Gravity,” elliotmcguckenphysics.com (April 11, 2026). https://elliotmcguckenphysics.com/2026/04/11/the-mcgucken-principle-dx%e2%82%84-dt-ic-as-the-physical-mechanism-underlying-verlindes-entropic-gravity-a-unified-derivation-of-gravity-entropy-and-the-holographic-principle-from-a-single-ge/ Companion paper to [MG-JacobsonVerlindeMarolf] focused specifically on Verlinde 2011’s entropic-gravity derivation. Cited in §III.9.

[MG-Susskind] E. McGucken, “Theorems of dx₄/dt = ic: How the McGucken Principle Derives Leonard Susskind’s Six Black Hole Programmes,” elliotmcguckenphysics.com (April 21, 2026). https://elliotmcguckenphysics.com/2026/04/21/six-theorems-of-dx%e2%82%84-dt-ic-how-the-mcgucken-principle-of-a-fourth-expanding-dimension-derives-leonard-susskinds-black-hole-programmes-holographic-principle-complementarity-stretc/ Establishes the holographic principle, complementarity, stretched horizon, string microstates, ER = EPR, and complexity = volume / action as theorems. Cited in §III.9 for the Verlinde-Newton derivation.

Cosmological-thermodynamics chain (loaded into §VIII.2’s derivation of the Past Hypothesis):

[MG-Bekenstein] E. McGucken, “Bekenstein’s Five 1973 Results as Theorems of the McGucken Principle of a Fourth Expanding Dimension dx₄/dt = ic,” elliotmcguckenphysics.com (April 2026). https://elliotmcguckenphysics.com/ Derives the existence of horizon entropy, the area law S A, the coefficient η = ( 2)/(8π), the Generalized Second Law, and the information-theoretic identification of black-hole entropy. Cited in Remark VII.2.2 and §VIII.2 of the present paper.

[MG-Hawking] E. McGucken, “How the McGucken Principle Derives the Results of Hawking’s Particle Creation by Black Holes (1975),” elliotmcguckenphysics.com (April 20, 2026). https://elliotmcguckenphysics.com/2026/04/20/how-the-mcgucken-principle-of-a-fourth-expanding-dimension-derives-the-results-of-hawkings-particle-creation-by-black-holes-1975-dx%e2%82%84-dt-ic-as-the-physical-mechanism-underlying-hawki/ Derives Hawking radiation, Hawking temperature TH = κ/(2π c kB), the exact coefficient η = 1/4 in SBH = kB A/(4P2), the evaporation law dM/dt -1/M2, and the refined Generalized Second Law. Cited in §VIII.2.

[MG-FRW-Holography] E. McGucken, “McGucken Holography for FRW and de Sitter Space from a Single Master Principle,” elliotmcguckenphysics.com (April 20, 2026). https://elliotmcguckenphysics.com/2026/04/20/mcgucken-holography-for-frw-and-de-sitter-space-from-a-single-master-principle-dx%e2%82%84-dt-ic-the-mcgucken-sphere-cosmological-holography-an-explicit-horizon-surface-term-and-a-testable-depa/ Develops the full cosmological holography program with the explicit FRW horizon surface term and the testable cosmological signature ρ2(t_{rec}) ≈ 7. Cited in §VIII.2 for the cosmological-holography pillar.

[MG-Holography] E. McGucken, “The McGucken Principle as the Physical Foundation of Holography and AdS/CFT,” elliotmcguckenphysics.com (April 18, 2026). https://elliotmcguckenphysics.com/2026/04/18/the-mcgucken-principle-as-the-physical-foundation-of-the-holographic-principle-and-ads-cft-how-dx%e2%82%84-dt-ic-naturally-leads-to-boundary-encoding-of-bulk-information-including-derivat/

[MG-AdSCFT] E. McGucken, “AdS/CFT from dx₄/dt = ic: The GKP–Witten Dictionary as Theorems of the McGucken Principle,” elliotmcguckenphysics.com (April 22, 2026). https://elliotmcguckenphysics.com/2026/04/22/ads-cft-from-dx%e2%82%84-dt-ic-the-gkp-witten-dictionary-as-theorems-of-the-mcgucken-principle-holography-the-master-equation-z_cft%cf%86%e2%82%80-z_ads%cf%86_%e2%88%82/

[MG-Lambda] E. McGucken, “The McGucken Principle of the Fourth Expanding Dimension (dx₄/dt = ic) as the Resolution of the Vacuum Energy Problem and the Cosmological Constant,” elliotmcguckenphysics.com (April 15, 2026). https://elliotmcguckenphysics.com/2026/04/15/the-mcgucken-principle-of-the-fourth-expanding-dimension-dx4-dt-ic-as-the-resolution-of-the-vacuum-energy-problem-and-the-cosmological-constant/ Resolves the 10122 vacuum-energy discrepancy as Λ being an IR rather than UV quantity. Cited in §III.8 and §VIII.2 of the present paper.

[MG-Horizon] E. McGucken, “The McGucken Principle as a Geometric Resolution of the Horizon Problem, the Flatness Problem, and the Homogeneity of the Cosmic Microwave Background — Without Inflation,” elliotmcguckenphysics.com (April 15, 2026). https://elliotmcguckenphysics.com/2026/04/15/the-mcgucken-principle-of-the-fourth-expanding-dimension-dx4-dt-ic-as-a-geometric-resolution-of-the-horizon-problem-the-flatness-problem-and-the-homogeneity-of-the-cosmic-microwave-bac/ Resolves the four classical initial-condition problems of Big-Bang cosmology — horizon, flatness, monopole, low-entropy initial conditions — by the shared x4-expansion mechanism. Cited in §VIII.2.

[MG-Eleven] E. McGucken, “One Principle Solves Eleven Cosmological Mysteries,” elliotmcguckenphysics.com (April 13, 2026). https://elliotmcguckenphysics.com/2026/04/13/one-principle-solves-eleven-cosmological-mysteries-how-the-mcgucken-principle-of-the-fourth-expanding-dimension-dx%e2%82%84-dt-ic-resolves-the-greatest-open-problems-in-cosmology-inclu/ Comprehensive treatment of eleven open problems in cosmology resolved by dx₄/dt = ic, including the low-entropy initial-conditions problem. Cited in §VIII.2.

[MG-PathInt] E. McGucken, “A Derivation of Feynman’s Path Integral from the McGucken Principle,” elliotmcguckenphysics.com (April 15, 2026). https://elliotmcguckenphysics.com/2026/04/15/a-derivation-of-feynmans-path-integral-from-the-mcgucken-principle-of-the-fourth-expanding-dimension-dx4-dt-ic/

[MG-Born] E. McGucken, “A Geometric Derivation of the Born Rule P = |ψ|² from the McGucken Principle,” elliotmcguckenphysics.com (April 15, 2026). https://elliotmcguckenphysics.com/2026/04/15/a-geometric-derivation-of-the-born-rule-p-%cf%882-from-the-mcgucken-principle-of-the-fourth-expanding-dimension-dx4-dt-ic/

[MG-Commut] E. McGucken, “A Novel Geometric Derivation of the Canonical Commutation Relation [q, p] = iℏ,” elliotmcguckenphysics.com (April 21, 2026). https://elliotmcguckenphysics.com/2026/04/21/a-novel-geometric-derivation-of-the-canonical-commutation-relation-q-p-i%e2%84%8f-based-on-the-mcgucken-principle-a-comparative-analysis-of-derivations-of-q-p-i%e2%84%8f-in-gleason-hestene/

[MG-GR] E. McGucken, “The McGucken Principle (dx₄/dt = ic) as the Physical Foundation of General Relativity,” elliotmcguckenphysics.com (April 11, 2026). https://elliotmcguckenphysics.com/2026/04/11/the-mcgucken-principle-dx%e2%82%84-dt-ic-as-the-physical-foundation-of-general-relativity-spatial-curvature-the-invariant-fourth-dimension-gravitational-redshift-gravitational-time-dilation-a/

[MG-Newton] E. McGucken, “A Derivation of Newton’s Law of Universal Gravitation from the McGucken Principle,” elliotmcguckenphysics.com (April 11, 2026). https://elliotmcguckenphysics.com/2026/04/11/a-derivation-of-newtons-law-of-universal-gravitation-from-the-mcgucken-principle-of-the-fourth-expanding-dimension-dx4-dtic/

[MG-SM] E. McGucken, “A Formal Derivation of the Standard Model Lagrangians and General Relativity from McGucken’s Principle,” elliotmcguckenphysics.com (April 14, 2026). https://elliotmcguckenphysics.com/2026/04/14/a-formal-derivation-of-the-standard-model-lagrangians-and-general-relativity-from-mcguckens-principle-of-the-fourth-expanding-dimension-dx%e2%82%84-dt-ic-gauge-symmetry-maxwell/

[MG-QED] E. McGucken, “Quantum Electrodynamics from the McGucken Principle,” elliotmcguckenphysics.com (April 19, 2026). https://elliotmcguckenphysics.com/2026/04/19/quantum-electrodynamics-from-the-mcgucken-principle-of-a-fourth-expanding-dimension-dx%e2%82%84-dt-ic-local-x%e2%82%84-phase-invariance-the-u1-gauge-structure-maxwells-equations-and-the-qed/

[MG-Twistor] E. McGucken, “How the McGucken Principle of a Fourth Expanding Dimension Gives Rise to Twistor Space,” elliotmcguckenphysics.com (April 20, 2026). https://elliotmcguckenphysics.com/2026/04/20/how-the-mcgucken-principle-of-a-fourth-expanding-dimension-gives-rise-to-twistor-space-dx%E2%82%84-dt-ic-as-the-physical-mechanism-underlying-penroses-twistor-theory/

[MG-Uncertainty] E. McGucken, “A Derivation of the Uncertainty Principle ΔxΔp ≥ ℏ/2 from the McGucken Principle,” elliotmcguckenphysics.com (April 11, 2026). https://elliotmcguckenphysics.com/2026/04/11/a-derivation-of-the-uncertainty-principle-%ce%b4x%ce%b4p-%e2%89%a5-%e2%84%8f-2-from-the-mcgucken-principle-of-a-fourth-expanding-dimension-dx%e2%82%84-dt-ic-the-expanding-fourth-dimension-th/

[MG-Constants] E. McGucken, “How the McGucken Principle Sets the Constants c and h,” elliotmcguckenphysics.com (April 11, 2026). https://elliotmcguckenphysics.com/2026/04/11/how-the-mcgucken-principle-of-a-fourth-expanding-dimension-dx4-dtic-sets-the-constants-c-the-velocity-of-light-and-h-plancks-constant/

Historical-provenance papers (cited in §X):

[MG-PrincetonAfternoons] E. McGucken, “Princeton Afternoons with Noble and Nobel Physicists (the Birth of dx4/dt=ic) & A Paper on Quantum Entanglement with John Archibald Wheeler and Joseph Taylor at Princeton University — Within a Context: A Discussion of Paradoxes in Quantum Theory between Curiosity and Perseverance,” goldennumberratio.medium.com. https://goldennumberratio.medium.com/princeton-afternoons-with-noble-and-nobel-physicists-the-birth-of-dx4-dt-ic-a-paper-on-quantum-0b35b8894c90 Contains the verbatim Peebles-exchange passage on the spherical-wavefront-at-c, quoted in §X.1 of the present paper.

[MG-PhotonEntropy] E. McGucken, “Photon Shannon Entropy on the McGucken Sphere and the Compton-Coupling Diffusion Dx^{McG},” elliotmcguckenphysics.com.

FQXi essays (2008–2013) and dissertation (1998):

[MG-Dissertation] E. McGucken, Multiple Unit Artificial Retina Chipset to Aid the Visually Impaired and Enhanced Holed-Emitter CMOS Phototransistors, NSF-funded Ph.D. dissertation, University of North Carolina at Chapel Hill (1998). Appendix contains the first written formulation of the McGucken Principle, treating time as an emergent phenomenon arising from a fourth expanding dimension. The dissertation’s primary technical work on the artificial retina chipset received Fight for Sight and NSF grants and a Merrill Lynch Innovations Award, and is now helping the blind see.

[MG-FQXi-2008] E. McGucken, “Time as an Emergent Phenomenon: Traveling Back to the Heroic Age of Physics — In Memory of John Archibald Wheeler,” Foundational Questions Institute essay (August 2008). https://forums.fqxi.org/d/238 First formal treatment of the McGucken Principle in the scholarly literature, dedicated in memory of Wheeler.

[MG-FQXi-2009] E. McGucken, “What is Ultimately Possible in Physics? Physics! A Hero’s Journey with Galileo, Newton, Faraday, Maxwell, Planck, Einstein, Schrödinger, Bohr, and the Greats towards Moving Dimensions Theory,” Foundational Questions Institute essay (2009). https://forums.fqxi.org/d/432

[MG-FQXi-2011] E. McGucken, “On the Emergence of QM, Relativity, Entropy, Time, iℏ, and ic from the Foundational, Physical Reality of a Fourth Dimension x4 Expanding with a Discrete (Digital) Wavelength P at c Relative to Three Continuous (Analog) Spatial Dimensions,” Foundational Questions Institute essay (2010–2011). First explicit identification of the structural parallel between dx₄/dt = ic and the canonical commutation relation [q, p] = iℏ.

[MG-FQXi-2012] E. McGucken, “MDT’s dx₄/dt = ic Triumphs Over the Wrong Physical Assumption that Time is a Dimension,” Foundational Questions Institute essay (2012). https://forums.fqxi.org/d/1429

[MG-FQXi-2013] E. McGucken, “Where is the Wisdom we have lost in Information?,” Foundational Questions Institute essay (2013).

Five-book series (45EPIC Press, 2016–2017):

[MG-Book2016] E. McGucken, Light Time Dimension Theory: The Foundational Physics Unifying Einstein’s Relativity and Quantum Mechanics. A Simple, Illustrated Introduction to the Physical Model of the Fourth Expanding Dimension, 45EPIC Hero’s Odyssey Mythology Press (2016). Amazon ASIN: B01KP8XGQ6.

[MG-BookTime] E. McGucken, The Physics of Time: Time and Its Arrows in Quantum Mechanics, Relativity, the Second Law of Thermodynamics, Entropy, the Twin Paradox, and Cosmology Explained via LTD Theory’s Expanding Fourth Dimension, 45EPIC Hero’s Odyssey Mythology Press (2017). Amazon ASIN: B0F2PZCW6B. The volume directly relevant to the present paper, treating time and its arrows in quantum mechanics, relativity, the Second Law, entropy, the twin paradox, and cosmology.

[MG-BookEntanglement] E. McGucken, Quantum Entanglement & Einstein’s Spooky Action at a Distance Explained: The Foundational Physics of Quantum Mechanics’ Nonlocality & Probability: The Nonlocality of the Fourth Expanding Dimension, 45EPIC Hero’s Odyssey Mythology Press (2017). Contains the Peebles-exchange passage quoted in §X.1 of the present paper.

[MG-BookRelativity] E. McGucken, Einstein’s Relativity Derived from LTD Theory’s Principle, 45EPIC Hero’s Odyssey Mythology Press (2017).

[MG-BookTriumph] E. McGucken, The Triumph of LTD Theory and Physics over String Theory, the Multiverse, Inflation, Supersymmetry, M-Theory, LQG, and Failed Pseudoscience: How dx₄/dt = ic Unifies Physics, 45EPIC Hero’s Odyssey Mythology Press (2017). Amazon ASIN: B01N19KO3A.

[MG-BookPictures] E. McGucken, Relativity and Quantum Mechanics Unified in Pictures, 45EPIC Hero’s Odyssey Mythology Press (2017).

[MG-BookHero] E. McGucken, Hero’s Odyssey Mythology Physics series (additional volume), 45EPIC Hero’s Odyssey Mythology Press (2017).

[Wheeler-Letter] J. A. Wheeler, Letter of recommendation for Elliot McGucken, Princeton University, Joseph Henry Professor of Physics (c. 1990). Quoted in §X.1 of the present paper.

External — Foundational Sources

[Einstein1902] A. Einstein, “Kinetische Theorie des Wärmegleichgewichtes und des zweiten Hauptsatzes der Thermodynamik,” Annalen der Physik 9, 417–433 (1902).

[Einstein1903] A. Einstein, “Eine Theorie der Grundlagen der Thermodynamik,” Annalen der Physik 11, 170–187 (1903).

[Einstein1905] A. Einstein, “Über die von der molekularkinetischen Theorie der Wärme geforderte Bewegung von in ruhenden Flüssigkeiten suspendierten Teilchen,” Annalen der Physik 17, 549–560 (1905).

[Einstein1934] A. Einstein, “On the Method of Theoretical Physics,” Herbert Spencer Lecture, Oxford, 1933 (published 1934).

[Einstein1949] A. Einstein, “Autobiographical Notes,” in P. A. Schilpp (ed.), Albert Einstein: Philosopher-Scientist, Open Court, La Salle, 1949, p. 33.

[Boltzmann1872] L. Boltzmann, “Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen,” Wiener Berichte 66, 275–370 (1872); Vorlesungen über Gastheorie, Barth, Leipzig, 1896–1898.

[Loschmidt1876] J. Loschmidt, “Über den Zustand des Wärmegleichgewichtes eines Systems von Körpern mit Rücksicht auf die Schwerkraft,” Wiener Berichte 73, 128–142 (1876).

[Zermelo1896] E. Zermelo, “Über einen Satz der Dynamik und die mechanische Wärmetheorie,” Annalen der Physik 57, 485–494 (1896).

[Gibbs1902] J. W. Gibbs, Elementary Principles in Statistical Mechanics, Yale University Press, New Haven, 1902.

[Klein1872] F. Klein, Vergleichende Betrachtungen über neuere geometrische Forschungen (Erlangen Program), Erlangen, 1872.

[Noether1918] E. Noether, “Invariante Variationsprobleme,” Nachr. D. König. Gesellsch. D. Wiss. Zu Göttingen, Math-phys. Klasse, 235–257 (1918).

[Minkowski1908] H. Minkowski, “Raum und Zeit,” Physikalische Zeitschrift 10, 104–111 (1909) (address at 80th Assembly of German Natural Scientists and Physicians, Cologne, September 21, 1908).

[Ehrenfest1911] P. and T. Ehrenfest, “Begriffliche Grundlagen der statistischen Auffassung in der Mechanik,” Enzyklopädie der mathematischen Wissenschaften, Vol. IV, Part 32, Teubner, Leipzig, 1911; English translation: The Conceptual Foundations of the Statistical Approach in Mechanics, Cornell University Press, Ithaca, 1959.

[Birkhoff1931] G. D. Birkhoff, “Proof of the ergodic theorem,” Proceedings of the National Academy of Sciences 17, 656–660 (1931).

[vonNeumann1932] J. von Neumann, “Proof of the quasi-ergodic hypothesis,” Proceedings of the National Academy of Sciences 18, 70–82 (1932).

[KAM] V. I. Arnold, Mathematical Methods of Classical Mechanics, 2nd ed., Springer, New York, 1989, Ch. 10; A. N. Kolmogorov, “On the conservation of conditionally periodic motions under small perturbation of the Hamiltonian,” Dokl. Akad. Nauk SSSR 98, 527–530 (1954); J. Moser, “On invariant curves of area-preserving mappings of an annulus,” Nachr. Akad. Wiss. Göttingen Math.-Phys. Kl. II, 1–20 (1962).

[Olver1986] P. J. Olver, Applications of Lie Groups to Differential Equations, Graduate Texts in Mathematics 107, Springer, New York, 1986 (2nd ed. 1993). Theorem 4.29 — the inverse Noether theorem used in [MG-KNC, §X.2 Remark X.2.1].

[Sinai1970] Ya. G. Sinai, “Dynamical systems with elastic reflections,” Russian Mathematical Surveys 25, 137–189 (1970).

[Jaynes1957] E. T. Jaynes, “Information Theory and Statistical Mechanics,” Physical Review 106, 620–630 (1957); 108, 171–190 (1957).

[Penrose2004] R. Penrose, The Road to Reality, Jonathan Cape, London, 2004, §27.13, §28.8.

[PenroseENM] R. Penrose, The Emperor’s New Mind, Oxford University Press, Oxford, 1989.

[PenroseCCC] R. Penrose, Cycles of Time: An Extraordinary New View of the Universe, Bodley Head, London, 2010.

[Albert2000] D. Z. Albert, Time and Chance, Harvard University Press, Cambridge, MA, 2000.

[Carroll2010] S. Carroll, From Eternity to Here: The Quest for the Ultimate Theory of Time, Dutton, New York, 2010.

[Wallace2013] D. Wallace, “The Arrow of Time in Physics,” in A. Bardon and H. Dyke (eds.), A Companion to the Philosophy of Time, Wiley-Blackwell, Chichester, 2013.

[Reichenbach1956] H. Reichenbach, The Direction of Time, University of California Press, Berkeley, 1956.

[Prigogine1980] I. Prigogine, From Being to Becoming: Time and Complexity in the Physical Sciences, W. H. Freeman, San Francisco, 1980; I. Prigogine and I. Stengers, Order Out of Chaos, Bantam, New York, 1984.

[Jacobson1995] T. Jacobson, “Thermodynamics of spacetime: the Einstein equation of state,” Physical Review Letters 75, 1260–1263 (1995). arXiv:gr-qc/9504004.

[Verlinde2011] E. P. Verlinde, “On the origin of gravity and the laws of Newton,” Journal of High Energy Physics 2011(4), 29. arXiv:1001.0785.

[Decoherence] H. D. Zeh, “On the interpretation of measurement in quantum theory,” Foundations of Physics 1, 69–76 (1970); W. H. Zurek, “Decoherence, einselection, and the quantum origins of the classical,” Reviews of Modern Physics 75, 715–775 (2003); E. Joos et al., Decoherence and the Appearance of a Classical World in Quantum Theory, 2nd ed., Springer, Berlin, 2003.

[ETH] J. M. Deutsch, “Quantum statistical mechanics in a closed system,” Physical Review A 43, 2046–2049 (1991); M. Srednicki, “Chaos and quantum thermalization,” Physical Review E 50, 888–901 (1994).

Princeton Mentors

[Peebles1992] P. J. E. Peebles, Quantum Mechanics, Princeton University Press, 1992.

[Peebles2019] Royal Swedish Academy of Sciences, “The Nobel Prize in Physics 2019” (awarded to P. J. E. Peebles “for theoretical discoveries in physical cosmology”). https://www.nobelprize.org/prizes/physics/2019/peebles/facts/

[Taylor1993] Royal Swedish Academy of Sciences, “The Nobel Prize in Physics 1993” (awarded to R. A. Hulse and J. H. Taylor Jr. “for the discovery of a new type of pulsar”). https://www.nobelprize.org/prizes/physics/1993/