The Singular Missing Physical Mechanism, dx₄/dt = ic: How the Principle of the Expanding Fourth Dimension Gives Rise to the Constancy and Invariance of the Velocity of Light c; the Second Law of Thermodynamics; Time, Its Flow, Its Arrows and Asymmetries; Quantum Nonlocality, Entanglement, and the McGucken Equivalence; the Principle of Least Action; Huygens’ Principle; the Schrödinger Equation; the McGucken Sphere and the Law of Nonlocality; Vacuum Energy, Dark Energy, and Dark Matter; and the Deeper Physical Reality from Which All of Special Relativity Naturally Arises

Physics advances through unification — the recognition that phenomena previously thought to be distinct are in fact expressions of a single deeper reality. The McGucken Principle, encoded in the equation dx₄/dt = ic, proposes the deepest unification yet attempted: a single geometric postulate about the physical character of the fourth dimension, from which the constancy and invariance of the velocity of light c; the second law of thermodynamics and the irreversible increase of entropy; time itself, its flow, its arrows, and all its asymmetries — thermodynamic, radiative, cosmological, causal, and psychological; quantum nonlocality, entanglement, and the McGucken Equivalence; the Principle of Least Action and Newton’s second law as its corollary; Huygens’ Principle and the propagation of waves; the Schrödinger equation and the foundations of quantum mechanics; the McGucken Sphere and the Law of Nonlocality; wave-particle duality and quantum probability; the unification of Feynman’s path integral, Brownian motion, and Huygens’ wavelets; candidate physical mechanisms for vacuum energy, dark energy, and dark matter; the origin of both fundamental constants c and ℏ from the foundational motion and wavelength of x₄’s oscillatory expansion; the dissolution of the block universe and the emergence of time as a real and flowing phenomenon; and the full kinematics of special relativity — time dilation, length contraction, mass-energy equivalence E = mc², and the Lorentz transformation — all follow as mathematical theorems rather than independent axioms.


I. The Pattern of Physical Unification

The history of theoretical physics is, at its deepest level, a history of unification. Each major conceptual revolution has revealed that phenomena previously understood as separate are in fact manifestations of a single underlying reality, and that the laws governing them are projections of a more fundamental law onto a more restricted domain.

Newton’s unification of terrestrial and celestial mechanics — demonstrating that the force pulling an apple toward the Earth and the force governing the Moon’s orbit are one and the same — was the founding achievement of modern physics [1]. Maxwell’s unification of electricity, magnetism, and optics into a single field theory not only resolved apparent conflicts among three bodies of experimental knowledge but predicted a new phenomenon — electromagnetic radiation — and identified light as an electromagnetic wave [2]. Einstein’s special relativity unified space and time into a single four-dimensional manifold, revealing length contraction, time dilation, and mass-energy equivalence as consequences of the geometry of that manifold [3]. General relativity then unified gravity with the geometry of spacetime itself [4]. Each unification did not merely summarize prior knowledge; it revealed a deeper structure from which prior laws emerged as limiting cases, and it extended physics into new domains the prior laws could not reach.

Quantum mechanics brought a further unification — of particle and wave descriptions of matter — through the de Broglie relation and the Schrödinger equation [5, 6]. The standard model of particle physics subsequently unified the electromagnetic, weak, and strong interactions within a single gauge-theoretic framework [7]. In each case, the unification was achieved not by patching together existing formalisms but by identifying a more fundamental mathematical or geometric structure from which the existing formalisms descended as special cases.

Against this historical backdrop, the McGucken Principle [8] advances a claim that is at once conservative and radical: it proposes no new mathematics, introduces no new particles or fields, and modifies no existing experimental predictions. What it does is supply what every previous unification has left absent — a single, physically motivated geometric postulate from which the deepest and most empirically robust laws of physics follow as theorems. The postulate is this:

The McGucken Principle. The fourth dimension of spacetime x₄ is a genuine geometric axis advancing at the fixed imaginary rate ic per unit of coordinate time:

dx₄/dt = ic

where x₄ = ict is the imaginary fourth coordinate of Minkowski spacetime, c is the speed of light in vacuo, and i is the imaginary unit.

The equation follows from direct differentiation of Minkowski’s own notation x₄ = ict, introduced in 1907–1908 [9]. McGucken’s dx₄/dt = ic reads this not as a mere notational convenience but as a description of physical reality: the fourth dimension is genuinely expanding. The sections that follow demonstrate systematically that this physical interpretation yields the entire architecture of physical law.


II. The Mechanism Problem: What Prior Theories Left Unanswered

A recurring feature of theoretical physics is the distinction between a phenomenological law — a compact description of what happens — and a physical mechanism — an account of the underlying process that makes it happen. The history of physics is punctuated by the conversion of the former into the latter through deeper theory.

Kepler’s laws of planetary motion were phenomenological: they described orbital shapes and periods with remarkable precision, but the mechanism behind them was unknown until Newton demonstrated that they follow from universal gravitation and the laws of motion [1]. Similarly, the ideal gas law PV = nRT described the behavior of gases without explaining why molecules behave as they do; the kinetic theory of gases provided the mechanism [10]. Ohm’s law described electrical resistance before the Drude model and later quantum mechanics explained it in terms of electron-lattice interactions [11].

By this standard, several of the most fundamental laws of contemporary physics remain in a pre-mechanistic state — known to hold with extraordinary precision, but lacking a physical account of why they must hold.

The Constancy of the Speed of Light

Einstein’s special relativity rests on two postulates: the principle of relativity and the invariance of the speed of light [3]. The second postulate is empirically unimpeachable — the Michelson-Morley experiment [12] and a century of subsequent precision measurements confirm it without exception. But postulating the invariance of c is not the same as explaining it. Why is there a universal speed limit? Why must it be c specifically? Why is it the same in all inertial frames? These questions lie outside the scope of special relativity, which treats c’s invariance as a brute empirical fact elevated to a foundational axiom.

The Second Law of Thermodynamics

Boltzmann’s statistical mechanics provided a partial mechanism for entropy increase, showing that states of higher entropy are exponentially more probable than states of lower entropy [13]. But probability is not necessity, and the second law as empirically observed is absolute: entropy never decreases in an isolated system, not merely very rarely. A purely statistical account cannot in principle explain an absolute prohibition. As Penrose has emphasized, the problem of time’s arrow — why entropy increases in one direction but not the other — requires a physical explanation that statistical mechanics alone cannot provide [14].

Quantum Nonlocality and Entanglement

Bell’s theorem [15] and its experimental confirmation by Aspect, Dalibard, and Roger [16], and subsequently by Hensen et al. [17] and many others with progressively closed loopholes, establish rigorously that the correlations exhibited by entangled quantum systems cannot be explained by any local hidden-variable theory. The quantum mechanical formalism predicts and correctly accounts for these correlations, but it does so without providing a physical mechanism. Einstein’s dissatisfaction with this state of affairs — expressed in the Einstein-Podolsky-Rosen paper [18] and maintained throughout his life — was not a failure to accept the empirical facts; it was a demand for the kind of mechanistic account that the quantum formalism does not supply. No subsequent interpretation of quantum mechanics has provided one that achieves broad consensus.

The Arrows of Time

Physics recognizes at least five distinct temporal asymmetries: the thermodynamic arrow (entropy increases toward the future), the radiative arrow (radiation expands outward from sources, never converges), the cosmological arrow (the universe expands), the causal arrow (causes precede effects), and the psychological arrow (we remember the past, not the future). These are empirically well established and intuitively compelling, but they are not derived from the time-symmetric fundamental laws of physics. The laws of classical mechanics, electromagnetism, and quantum mechanics are all invariant (or CPT-invariant) under time reversal. The arrows of time are imposed, not derived. As Price [19] and others have argued, their common origin remains one of the deepest unsolved problems in the philosophy of physics.

The Principle of Least Action and Huygens’ Principle

The Principle of Least Action — from which Newton’s second law, Lagrangian mechanics, and the equations of motion of all classical and quantum fields are derived — is typically introduced as a postulate whose deep justification is left unstated [20]. Huygens’ Principle — that every point on a wavefront acts as the source of a secondary spherical wavelet — is presented as an empirical rule of wave propagation, not as a theorem of anything more fundamental [21]. That these two principles are in fact the same equation viewed from opposite sides of the semiclassical limit ℏ → 0 is known from the eikonal formalism, but why both hold, and why they are unified, has not been derived from a physical mechanism. The McGucken framework derives both as theorems and proves their unification explicitly.

The Schrödinger Equation

The Schrödinger equation iℏ∂tψ = Ĥψ is the central dynamical equation of quantum mechanics. In every standard textbook treatment it is introduced as a postulate — motivated by analogy, by the de Broglie relation, or by the correspondence principle, but never derived from a more fundamental physical statement [5]. The McGucken framework derives it as a theorem: the non-relativistic limit of the Klein-Gordon equation, which is itself the quantized mass-shell condition, which follows from the master equation, which follows from dx₄/dt = ic.

The McGucken Principle proposes mechanisms for all of these simultaneously — not as separate explanations, but as a single geometric account from which each follows as a distinct facet of one underlying reality.


III. The Geometric Foundation: x₄ as a Physical Axis

Minkowski’s great contribution in 1907–1908 was to show that Einstein’s special relativity could be understood geometrically: the Lorentz transformation, which mixes space and time coordinates, is a rotation in a four-dimensional spacetime manifold [9]. This insight made the mathematics of special relativity transparent and opened the path to general relativity. But Minkowski’s formulation came with a choice of notation whose physical significance was subsequently set aside.

In Minkowski’s original formulation, the four coordinates of a spacetime event are (x, y, z, x₄), where x₄ = ict. The imaginary factor i is what produces the correct sign in the spacetime interval:

ds² = dx² + dy² + dz² + dx₄² = dx² + dy² + dz² − c²dt²

The negative sign on the temporal term — which encodes the entire causal structure of relativity — is a direct consequence of the imaginary character of x₄. Later authors preferred to make the metric signature explicit through sign conventions rather than imaginary coordinates, and Minkowski’s x₄ = ict came to be treated as an archaic notational device rather than a statement about a physical coordinate [20].

The McGucken Principle reverses this dismissal. It takes Minkowski’s equation seriously as a physical statement: x₄ is a genuine geometric axis of the universe, and the equation x₄ = ict, differentiated with respect to coordinate time, yields dx₄/dt = ic. This is not a new equation — it is Minkowski’s equation, read as a description of motion. The content of the McGucken Principle lies not in the algebra but in the assertion that x₄ is physically advancing at this rate, that it is a real geometric axis and not a mathematical convenience, and that this physical reality is the source of the laws it generates.

From this single postulate, the four-velocity norm follows immediately. If every object moves through the four-dimensional manifold with four-velocity uμ = dxμ/dτ, then the Minkowski metric and the constraint that all motion is motion through the same four-dimensional space yield:

uμuμ = −c²

This is the master equation. It asserts that every object’s total rate of traversal through four-dimensional spacetime is the fixed constant c. No assumption about the invariance of c is required; it emerges from the geometry of a four-dimensional space whose fourth axis advances at ic.
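The budget reading of the master equation can be checked numerically. The short sketch below is an illustration under the stated conventions, not part of the source; the helper names four_velocity and minkowski_norm are mine. It builds the four-velocity (γv, γc) for an arbitrary subluminal speed and confirms that its Minkowski norm is −c² regardless of v:

```python
import math

C = 299_792_458.0  # speed of light in vacuo, m/s

def four_velocity(vx, vy, vz):
    """Four-velocity (γ·vx, γ·vy, γ·vz, γ·c) of a particle with spatial velocity v."""
    v2 = vx*vx + vy*vy + vz*vz
    gamma = 1.0 / math.sqrt(1.0 - v2 / C**2)
    return (gamma*vx, gamma*vy, gamma*vz, gamma*C)

def minkowski_norm(u):
    """u·u with signature (+, +, +, −), as implied by x4 = ict."""
    ux, uy, uz, ut = u
    return ux*ux + uy*uy + uz*uz - ut*ut

# the norm is −c² for every subluminal speed, not just for special cases
for v in (0.0, 0.5*C, 0.99*C):
    print(minkowski_norm(four_velocity(v, 0.0, 0.0)) / C**2)  # each ≈ −1.0
```

Algebraically the result is exact: γ²v² − γ²c² = (v² − c²)/(1 − v²/c²) = −c² for any v < c.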


IV. The Constancy of Light’s Speed as a Geometric Theorem

The first major consequence of the master equation is the mechanism for the invariance of the speed of light — the result that has stood as an axiom since Einstein and that no prior theory has explained from a deeper principle.

The master equation uμuμ = −c² partitions an object’s fixed four-speed budget between spatial motion and advance along x₄. A particle at rest in three spatial dimensions directs its entire four-speed budget into the x₄ direction. As its spatial velocity v increases, progressively more of the four-speed budget is redirected into spatial motion and less remains for x₄. At exactly v = c — the case of a photon — the x₄ component is exactly zero: dτ/dt = √(1 − v²/c²) → 0 as v → c. A photon does not advance along x₄ at all.

This is the physical mechanism for the invariance of c. The speed of light is not a law imposed on dynamics; it is the geometric budget constraint of a four-dimensional space whose fourth axis advances at rate c. An object cannot travel faster than c for the same reason a right triangle cannot have a hypotenuse shorter than either of its legs: it would require a negative contribution to the four-speed budget, which the geometry of the space does not permit. The invariance of c across all inertial frames is not an empirical postulate elevated to a foundational axiom; it is the Pythagorean theorem, applied to four dimensions.

Time dilation and length contraction follow from the same budget constraint. A moving clock advances less along x₄ per unit of coordinate time — the factor 1/γ = √(1 − v²/c²) is precisely the fraction of the four-speed budget remaining along x₄. Length contraction is the Pythagorean projection of a four-dimensionally extended object onto three-dimensional space when it is oriented at an angle through the (x, x₄) plane. Mass-energy equivalence E = mc² follows from the identification of the x₄ component of four-momentum with E/c: rest energy is the energy associated with advance along x₄, and c² is the square of the invariant four-speed set by the expansion of x₄ [8].
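The Pythagorean character of the budget split can be made concrete. In this minimal sketch (c = 1 units; the helper name budget_split is mine, for illustration only), v = 0.6c gives an x₄-rate of 0.8c, a 3-4-5 right triangle, while a photon at v = c has no budget left for x₄:

```python
import math

# work in units where c = 1
def budget_split(v):
    """Split the fixed four-speed budget c between spatial motion and x4 advance.

    Returns (spatial rate, x4 rate); their squares sum to c² = 1.
    """
    x4_rate = math.sqrt(1.0 - v*v)  # = dτ/dt, the fraction left for x4
    return v, x4_rate

# a 3-4-5 right triangle: at v = 0.6c, the x4 rate is 0.8c and γ = 1/0.8 = 1.25
v, x4 = budget_split(0.6)
print(v, x4, 1.0 / x4)

# a photon (v = c) has no budget left for x4: dτ/dt = 0
print(budget_split(1.0)[1])  # 0.0
```

The x₄-rate here is the quantity the text calls the remaining budget fraction; its reciprocal is the Lorentz factor γ.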

The derivation thereby achieves what Einstein’s formulation of special relativity explicitly did not attempt: it provides a physical mechanism for the postulates from which special relativity is derived, making them consequences of a deeper geometric reality rather than irreducible axioms.


V. The Second Law of Thermodynamics as Geometric Necessity

Boltzmann’s statistical mechanics [13] established the connection between thermodynamic entropy and the phase-space volume available to a macroscopic system. Entropy increase is, in this account, the tendency of systems to evolve toward more probable macrostates. This is a profound insight, but it is not a mechanism for the absolute prohibition of entropy decrease. As Boltzmann himself was aware, the statistical account predicts only that entropy-decreasing fluctuations are improbable — not that they are impossible. The observed absolute irreversibility of the second law demands something more.

The McGucken framework provides it. Because x₄ expands at rate c in a perfectly spherically symmetric manner — no preferred spatial direction — the spatial projection of each particle’s x₄-driven displacement at each moment is isotropic. This is not an additional assumption but a direct consequence of the spherical symmetry of x₄’s expansion: a sphere has no preferred direction, and the displacement it induces in space is therefore equally likely to point in any direction.

Iterated over successive instants, this isotropic displacement is precisely the random walk underlying Brownian motion [21]. The central limit theorem then yields a Gaussian spreading of any particle ensemble over time, with diffusion coefficient D = v²δt/6. The Boltzmann-Gibbs entropy of a Gaussian distribution is:

S(t) = −kB ∫ P ln P d³x = (3/2)kB ln(4πeDt)

Therefore dS/dt = (3/2)kB/t, which is strictly positive for all t > 0 — not probably positive, but necessarily positive. The second law is not a statistical tendency; it is a geometric necessity. Entropy cannot decrease because x₄ cannot retreat. The irreversibility of thermodynamics is the irreversibility of x₄’s expansion, expressed in the three-dimensional language of statistical mechanics.
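Both pieces of this derivation can be checked directly. The sketch below (illustrative units with kB = 1 and an arbitrary choice of D, not part of the source) differentiates S(t) numerically against (3/2)kB/t, and verifies by simulation that iterated isotropic steps produce the linear variance growth that the Gaussian spreading assumes:

```python
import math
import random

kB = 1.0
D = 0.25  # illustrative diffusion coefficient, arbitrary units

def entropy(t):
    """S(t) = (3/2) kB ln(4πeDt), the entropy of the 3D Gaussian spread."""
    return 1.5 * kB * math.log(4.0 * math.pi * math.e * D * t)

# numerical dS/dt matches (3/2) kB / t (strictly positive for every t > 0)
for t in (1.0, 2.0, 10.0):
    h = 1e-6
    numeric = (entropy(t + h) - entropy(t - h)) / (2.0 * h)
    print(t, numeric, 1.5 * kB / t)

# isotropic steps: the x-component of a random unit vector is uniform on [-1, 1],
# so after n steps the variance per axis is n/3, i.e. linear in time (Brownian)
random.seed(0)
n_walkers, n_steps = 20_000, 100
var = 0.0
for _ in range(n_walkers):
    x = sum(random.uniform(-1.0, 1.0) for _ in range(n_steps))
    var += x * x
var /= n_walkers
print(var, n_steps / 3)  # both close to 33.3
```

The linear variance growth is what licenses identifying the spread with 2Dt per axis in the entropy formula above.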

Furthermore, through the Wick rotation t → −iτ — which replaces the imaginary fourth axis x₄ = ict with a real Euclidean axis x₄ = cτ — Feynman’s path integral [22] for quantum mechanical propagation becomes the diffusion kernel summing over Brownian paths. Quantum mechanical propagation in real time and thermal diffusion in imaginary time are analytically related facets of the same underlying geometric process: the spherically symmetric expansion of x₄ at rate c. Huygens’ wavelets, Brownian diffusion, and Feynman path integrals are unified as a single geometric phenomenon [8].


VI. All Five Arrows of Time from a Single Source

The problem of time’s arrows has a long pedigree in physics and philosophy. Boltzmann, Reichenbach [23], Gold [24], Penrose [14], Price [19], and many others have analyzed the thermodynamic, radiative, cosmological, causal, and psychological asymmetries of time and have noted that they all point in the same direction. But explaining why they do — and why that direction is the one it is, rather than the opposite — has resisted complete solution. The standard account appeals ultimately to the special initial conditions of the Big Bang, a cosmological boundary condition rather than a dynamical explanation [14].

The McGucken framework derives all five arrows from the single geometric fact that x₄ expands in one direction, irreversibly, at rate c.

The thermodynamic arrow is entropy’s increase toward the future, derived above as a direct consequence of the spherical symmetry and irreversibility of x₄’s expansion. Entropy increases in the direction of x₄’s advance because the isotropic displacement associated with x₄ spreading is a one-way process: x₄ does not retreat.

The radiative arrow — radiation expands outward from sources and never spontaneously converges onto them — is the direct expression of the retarded Green’s function of the wave equation. The retarded solution describes outward-expanding spherical shells; the advanced solution (inward-converging) is mathematically valid but physically unrealized. The McGucken framework explains why: the advanced solution corresponds to a contracting McGucken Sphere, which would require x₄ to retreat. Since x₄ does not retreat, the advanced solution is not physically realized. The radiative arrow is enforced by the same geometric asymmetry as the thermodynamic arrow [8].

The causal arrow — causes precede their effects — is the statement that causal influence propagates only into the forward light cone of any event. The McGucken Sphere expands outward from any event, carrying causal influence with it. Since x₄ does not retreat, the sphere does not contract, and causal influence cannot propagate backward. The causal structure of spacetime is the forward expansion of x₄.

The cosmological arrow — the universe expands — is, in the McGucken framework, the large-scale collective manifestation of x₄’s advance. Every object’s fundamental motion through four-dimensional spacetime contributes to the universal tendency toward expansion; the cosmological expansion is the macroscopic expression of the same geometric process that drives entropy and radiation.

The psychological arrow — we remember the past and anticipate the future — follows from the causal arrow. Memory is the physical record of events that have already influenced a system through the forward light cone. Anticipation is inference about states that have not yet done so. The psychological asymmetry is the causal asymmetry, instantiated in neural systems.

The unification achieved here has no precedent in the literature on time’s arrows. Reichenbach’s account [23] unified thermodynamic and causal arrows but left the cosmological arrow separate and the psychological arrow derivative. Penrose’s Weyl curvature hypothesis [14] attempts to explain the thermodynamic arrow through cosmological boundary conditions but does not derive the remaining arrows from the same source. The McGucken framework derives all five from a single equation.


VII. Quantum Nonlocality: The Hidden Variable Was x₄

Bell’s theorem [15] is a landmark result in the foundations of physics. It establishes that any theory reproducing the predictions of quantum mechanics for entangled systems must either abandon locality (the principle that distant events cannot influence each other instantaneously) or abandon realism (the principle that physical quantities have definite values independent of measurement) or both. Subsequent experiments — Aspect et al. [16], Hensen et al. [17], and the loophole-free tests of Giustina et al. [25] and Shalm et al. [26] — have confirmed the quantum mechanical predictions with increasing precision and closed successive experimental loopholes.

Einstein, Podolsky, and Rosen’s original paper [18] argued that the completeness of quantum mechanics was in question: if two spatially separated particles exhibit correlated measurement outcomes without any causal connection between the measurements, either quantum mechanics is incomplete (there are hidden variables not represented in the formalism) or the principle of locality must be abandoned. Bell’s theorem showed that local hidden variable theories are incompatible with quantum mechanics; the experiments confirmed that nature agrees with quantum mechanics and not with local hidden variables.

What has not been resolved — in any of the major interpretations of quantum mechanics, including Copenhagen, many-worlds, de Broglie-Bohm, or relational interpretations — is the question of what physical mechanism produces the nonlocal correlations. The quantum mechanical formalism predicts them correctly; it does not explain what connects the two particles.

The McGucken framework identifies the connection: x₄.

Because photons travel at exactly v = c, the master equation uμuμ = −c² gives dτ = 0 for photons throughout their propagation. Since x₄ = ict and dτ = 0, it follows that dx₄ = ic dτ = 0: a photon’s x₄ coordinate does not change from emission to absorption. The x₄ coordinate at detection is identical to the x₄ coordinate at creation, regardless of the spatial distance traveled. The four-dimensional interval between emission and absorption is exactly zero: ds² = |Δx|² − c²Δt² = 0 — the null interval — and this is a frame-independent statement.

Two photons created at a common event O — as in spontaneous parametric down-conversion, the standard source of entangled photon pairs in Bell-inequality experiments — begin with identical x₄ coordinates. As they travel spatially, neither advances in x₄. Their x₄ coordinates at any later moment remain identical to each other and to their value at creation. The four-dimensional interval between them is always zero, however large their spatial separation becomes.
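The null separation of each photon from the common source, and its frame independence, can be verified in a few lines. This sketch works in a 1+1-dimensional slice with c = 1; the helper names interval2 and boost are mine, for illustration:

```python
import math

def interval2(a, b):
    """Squared interval ds² = Δx² − Δt² between events a = (t, x) and b (c = 1)."""
    dt, dx = b[0] - a[0], b[1] - a[1]
    return dx*dx - dt*dt

def boost(event, v):
    """Standard Lorentz boost of (t, x) by velocity v (c = 1)."""
    g = 1.0 / math.sqrt(1.0 - v*v)
    t, x = event
    return (g*(t - v*x), g*(x - v*t))

O = (0.0, 0.0)        # common creation event of the photon pair
right = (5.0, 5.0)    # photon absorbed after flying +x at speed c
left = (5.0, -5.0)    # partner absorbed after flying -x at speed c

print(interval2(O, right), interval2(O, left))  # 0.0 0.0 (both null from the source)

# the null interval to the source survives any boost: a frame-independent statement
for v in (0.3, 0.9):
    print(interval2(boost(O, v), boost(right, v)))  # ≈ 0.0
```

The boost leaves the interval at zero because the Lorentz transformation preserves ds² exactly; the numerical check simply exhibits that invariance.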

The McGucken Equivalence [8]: Quantum nonlocality is the three-dimensional shadow of four-dimensional x₄-coincidence, arising from the null interval ds² = 0 on every photon worldline, which follows from dτ = 0 at v = c, which follows from dx₄/dt = ic.

The mechanism is not action at a distance. There is no signal traveling between the particles, no instantaneous influence, no violation of relativistic causality. The particles were never separated in the geometric dimension that matters: x₄. Their spatial separation is real; their four-dimensional separation is always null. The correlations we call nonlocal are, in the geometry of four-dimensional spacetime, perfectly local in x₄.

This resolves Einstein’s demand [18] without contradicting Bell’s theorem. Bell proved that no local hidden variable in three-dimensional space can reproduce the quantum correlations. The McGucken Equivalence identifies a hidden variable in four-dimensional spacetime — x₄ itself — that is local in x₄ while appearing nonlocal when projected onto three-dimensional space. The hidden variable Einstein sought was not a particle property, not a pilot wave, not a pre-assigned measurement outcome. It was the geometric axis that photons do not traverse.


VIII. McGucken’s Law of Nonlocality and the McGucken Sphere

The analysis of quantum nonlocality within the McGucken framework yields a further result that can be stated as a formal theorem [8]:

McGucken’s Law of Nonlocality: All nonlocality begins as locality. Two particles can share x₄-locality (null interval ds² = 0) only if they are connected by a null worldline. A null worldline exists between two events if and only if they lie on the same McGucken Sphere — the surface of the forward light cone of a common event. Therefore, entanglement requires prior causal contact.

The McGucken Sphere — the sphere of radius R = ct centred on any event, representing the surface swept out by x₄’s expansion at rate c in time t — is simultaneously the surface from which Huygens’ secondary wavelets emanate, the support of the retarded Green’s function, and the locus of x₄-coincident events available for entangled correlations. These are not three separate objects. They are one object — the McGucken Sphere — expressing the same geometry of dx₄/dt = ic in three different physical regimes: wave optics, classical field theory, and quantum entanglement.

The Law of Nonlocality has a striking empirical consequence: no experiment has ever produced genuine quantum entanglement between two particles that share no prior causal history. Every confirmed Bell-inequality violation in the literature traces to a shared local origin — a common source event, a common beam splitter, a common production vertex. The Law predicts this without exception.

The double-slit experiment, delayed-choice experiments, and quantum eraser experiments are all explained within the McGucken Sphere framework. Every photon event in these experiments is null-separated from the source — all participating photons share the same x₄ coordinate as the emission event — and the apparent temporal paradoxes dissolve when this x₄-coincidence is recognized. What appears to be a measurement in the future affecting the past is, in the geometry of x₄, a set of events that are all simultaneous in the only four-dimensional sense that matters for photons: they all share x₄(O) [8].


IX. The Principle of Least Action and Huygens’ Principle as Theorems

The Principle of Least Action — δ∫L dt = 0, from which Newton’s second law, Lagrangian mechanics, and the equations of motion of all classical fields are derived — follows from the observation that the relativistic action for a free particle is S = −mc² ∫dτ: the unique Lorentz-invariant quantity associated with a worldline, proportional to the proper time elapsed. The master equation fixes the four-speed at c, making proper time the natural measure of the worldline’s geometric length. Varying this action with fixed endpoints yields the geodesic equation duμ/dτ = 0. In the non-relativistic limit v ≪ c, this reduces to the Principle of Least Action with Lagrangian L = ½mv² − V. The Principle of Least Action is not a postulate; it is the non-relativistic shadow of the geometric fact that free particles in four-dimensional spacetime traverse paths of extremal proper time [8].
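The non-relativistic reduction in the last step can be checked numerically. The sketch below (illustrative, in units with m = c = 1; the helper names are mine) compares the exact free-particle Lagrangian −mc²√(1 − v²/c²) against −mc² + ½mv² and shows the error vanishing at fourth order in v:

```python
import math

m, c = 1.0, 1.0

def lagrangian_exact(v):
    """Relativistic free-particle Lagrangian: -mc² · dτ/dt = -mc²√(1 - v²/c²)."""
    return -m * c*c * math.sqrt(1.0 - v*v / (c*c))

def lagrangian_nr(v):
    """Non-relativistic limit: the constant -mc² plus the familiar ½mv²."""
    return -m * c*c + 0.5 * m * v*v

for v in (0.1, 0.01):
    err = abs(lagrangian_exact(v) - lagrangian_nr(v))
    print(v, err)  # leading error is (1/8) m v⁴ / c², so it shrinks very fast
```

The constant −mc² does not affect the variation, which is why the classical action principle carries only ½mv² − V.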

Huygens’ Principle — that every point on a wavefront acts as the source of a secondary spherical wavelet — follows from the retarded Green’s function of the wave equation □ψ = 0, which is itself the quantized light-cone geometry of dx₄/dt = ic promoted to a field equation. The Green’s function G ∝ δ(t − t′ − |x − x′|/c) / |x − x′| enforces propagation exactly on the light cone at speed c. Huygens’ Principle is the statement that any field is a superposition of such spherical shells centred on every point of a prior wavefront — a theorem of the wave equation, which is a theorem of the mass-shell condition, which is a theorem of dx₄/dt = ic [8].

The eikonal equation (∇S)² − (∂tS)²/c² = −m²c² — derived from the Klein-Gordon equation in the limit ℏ → 0, with the sign on the right-hand side fixed by the norm pμpμ = −m²c² — is simultaneously the Hamilton-Jacobi equation of classical mechanics (whose solutions are the classical trajectories extremizing action) and the eikonal equation of wave optics (whose solutions are the wavefronts of Huygens’ construction). The Principle of Least Action and Huygens’ Principle are therefore the same partial differential equation, encountered from opposite sides of the semiclassical limit. Both are theorems of dx₄/dt = ic.
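A plane-wave phase makes the shared equation concrete. With S = p·x − Et and E taken from the mass-shell relation, both sides can be evaluated directly (c = 1 units, illustrative values of m and p chosen by me; note the minus sign on the right-hand side, inherited from pμpμ = −m²c²):

```python
import math

m, p = 2.0, 3.0              # illustrative mass and momentum (c = 1)
E = math.sqrt(p*p + m*m)     # mass-shell relation: E² = p² + m²

# for the phase S(t, x) = p·x - E·t:
grad_S = p                   # ∇S
dS_dt = -E                   # ∂S/∂t
lhs = grad_S**2 - dS_dt**2   # (∇S)² - (∂tS)²/c²
print(lhs, -m*m)             # both ≈ -m²c² = -4
```

The same phase S, read as Hamilton’s principal function, generates the classical trajectory; read as an optical phase, its level sets are the Huygens wavefronts.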


X. The Schrödinger Equation as a Theorem

The Schrödinger equation iℏ∂tψ = Ĥψ is the central dynamical equation of quantum mechanics. In every standard textbook treatment it is introduced as a postulate — motivated by analogy with classical wave mechanics or the de Broglie relation, but never derived from a more fundamental physical statement. The McGucken derivation chain makes it a theorem [8].

The chain runs as follows. The master equation uμuμ = −c², multiplied by m², yields the four-momentum norm pμpμ = −m²c², which expands to the energy-momentum relation E² = |p|²c² + m²c⁴. Applying the canonical quantization correspondence pμ → iℏ∂μ yields the Klein-Gordon equation. Factoring out the rapid rest-mass oscillation e^(−imc²t/ℏ) and taking the non-relativistic limit v ≪ c — in which the second time-derivative term is suppressed by (v/c)² — the rest-mass terms cancel exactly and what remains is:

iℏ ∂tφ = −(ℏ²/2m)∇²φ

With an external potential V(x, t) added through minimal coupling, this is the full Schrödinger equation. Every textbook presents it as an irreducible postulate. Within the McGucken framework it is a theorem — the non-relativistic, low-energy limit of a chain of consequences that begins with the single statement that the fourth geometric axis of spacetime advances at rate ic [8].
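The (v/c)² suppression invoked in the reduction can be exhibited numerically. This sketch (m = c = 1, illustrative only) compares the exact kinetic energy E − mc² from the energy-momentum relation against the Schrödinger-limit value p²/2m and shows the relative error falling as (v/c)²:

```python
import math

m, c = 1.0, 1.0

for v in (0.1, 0.01, 0.001):
    gamma = 1.0 / math.sqrt(1.0 - v*v / (c*c))
    p = gamma * m * v
    E = math.sqrt(p*p * c*c + m*m * c**4)  # E² = |p|²c² + m²c⁴
    kinetic_exact = E - m * c*c            # subtract the rest-mass term
    kinetic_nr = p*p / (2.0 * m)           # Schrödinger-limit kinetic energy
    rel_err = abs(kinetic_exact - kinetic_nr) / kinetic_exact
    print(v, rel_err)  # relative error shrinks like (v/c)²
```

The cancelled rest-mass term is exactly the rapid oscillation e^(−imc²t/ℏ) factored out in the derivation above.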


XI. The Deeper Physical Reality from Which Relativity Arises

Special relativity, in Einstein’s original formulation [3], is a theory of the relationships between physical measurements made in different inertial reference frames. It derives these relationships from two postulates and demonstrates their logical consistency and empirical adequacy. It does not, and does not claim to, provide a physical account of why the postulates hold.

The McGucken Principle reopens that question — not by reintroducing the ether, but by identifying a different and more fundamental geometric structure. The deeper reality it proposes is not a medium through which light propagates, but the geometric character of the four-dimensional spacetime manifold itself: the fact that its fourth axis is a physical axis advancing at a definite rate. From this fact, the Lorentz transformation is not a set of empirically motivated transformation rules but a rotation in a four-dimensional space — a change of orientation within the manifold, not a change of the manifold itself. Different inertial frames are different orientations in the (x, x4) plane; the Lorentz factor γ is the cosine of the rotation angle (an imaginary angle, since γ ≥ 1); and the mixing of space and time in the transformation is the three-dimensional projection of a four-dimensional rotation.
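This geometric reading can be made concrete. The sympy sketch below (1+1 dimensions; an illustrative check, not the original derivation) takes x4 = ict, sets cos θ = γ and sin θ = iγβ — an imaginary angle, consistent with γ ≥ 1 — and confirms that an ordinary rotation in the (x, x4) plane is exactly the Lorentz transformation and preserves x² + x4²:

```python
import sympy as sp

x, t = sp.symbols('x t', real=True)
v, c = sp.symbols('v c', positive=True)

beta = v / c
gamma = 1 / sp.sqrt(1 - beta**2)

x4 = sp.I * c * t                             # the fourth axis, x4 = i c t
cos_th, sin_th = gamma, sp.I * gamma * beta   # imaginary rotation angle

# Ordinary Euclidean rotation in the (x, x4) plane
xp  =  cos_th * x + sin_th * x4
x4p = -sin_th * x + cos_th * x4

# The rotation reproduces the Lorentz transformation ...
print(sp.simplify(xp - gamma * (x - v * t)))                     # -> 0
print(sp.simplify(x4p - sp.I * c * gamma * (t - v * x / c**2)))  # -> 0
# ... and preserves the Euclidean quadratic form x^2 + x4^2
print(sp.simplify(xp**2 + x4p**2 - (x**2 + x4**2)))              # -> 0
```

Note that cos²θ + sin²θ = γ² − γ²β² = γ²(1 − β²) = 1, so the transformation is a genuine rotation: nothing is stretched, only reoriented.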

This is, in the precise sense that the history of physics recognizes, a deeper level of reality: a geometric structure from which the previously axiomatic structure of special relativity emerges as a set of consequences. The Minkowski metric is not the foundational object; the expansion of x4 at rate ic is. The metric is what the expansion of x4 looks like when its geometry is expressed in coordinates.


XII. The Block Universe and the Reality of Time’s Flow

The block universe view, articulated by Weyl [31] and Rietdijk [32] and more recently defended by Sider [33] and Hawking [34], holds that all moments of time — past, present, and future — are equally real and equally present within the four-dimensional spacetime manifold. On this view, time does not flow; the sense of temporal passage is a feature of conscious experience with no correlate in objective physical reality.

This view rests on the confusion of x4 with t. The McGucken Principle asserts that x4 is a physical axis that is advancing: the equation dx4/dt = ic is a statement about motion, not about a static relationship. The four-dimensional manifold is not laid out in advance; it is continuously being generated by the forward expansion of x4. The present moment is the advancing surface of x4’s expansion — the McGucken Sphere of radius ct, the locus at which causality is being enacted at this moment. Time flows because x4 expands, and x4 expands because that is its physical nature. The block universe dissolves not through philosophical argument but through physics: a static block has no dx4/dt; the physical universe does.


XIII. Vacuum Energy, Dark Energy, Dark Matter, and the Origin of the Constants

The McGucken framework’s implications extend, in a more speculative direction, into several of the deepest unsolved problems in contemporary physics: the vacuum energy and cosmological constant problem, the nature of dark energy, and the composition of dark matter.

Quantum field theory predicts that the vacuum contains zero-point oscillations of every field mode, contributing a vacuum energy density of order 10¹¹³ J/m³. The observed value is approximately 5 × 10⁻¹⁰ J/m³ — a discrepancy of roughly 120 orders of magnitude, often described as “the worst theoretical prediction in the history of physics” [35]. The standard model of cosmology, ΛCDM, inserts a cosmological constant Λ to account for the observed accelerated expansion of the universe [36, 37], but provides no physical mechanism for Λ’s value or existence.
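The size of the discrepancy can be reproduced from a back-of-the-envelope estimate. The sketch below (plain Python with hard-coded approximate CODATA values; the Planck-energy cutoff is the standard textbook assumption, not part of the original text) compares the Planck-cutoff vacuum energy density ρ ~ c⁷/ℏG² with the observed value:

```python
import math

# Physical constants (approximate CODATA values, assumed here)
c    = 2.998e8     # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s
G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2

# Vacuum energy density with a Planck-scale cutoff: rho ~ c^7 / (hbar G^2)
rho_qft = c**7 / (hbar * G**2)

# Observed dark-energy density from Lambda-CDM fits (approximate)
rho_obs = 5e-10    # J/m^3

print(f"QFT estimate: {rho_qft:.2e} J/m^3")   # of order 10^113
print(f"observed:     {rho_obs:.2e} J/m^3")
print(f"ratio:        ~10^{math.log10(rho_qft / rho_obs):.0f}")
```

The exact exponent depends on the cutoff chosen; a Planck cutoff gives roughly 123 orders of magnitude, of the same order as the 120 quoted above.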

The McGucken framework suggests that the oscillatory, wave-like character of x4’s expansion provides a candidate physical mechanism for vacuum energy: the zero-point oscillations of quantum fields are the response of matter and radiation to the underlying oscillatory expansion of x4. If x4 expands in discrete, wavelength-scale increments λ4, then Planck’s constant ℏ = E4λ4/c is determined by the foundational wavelength of x4’s expansion, and the speed of light c is the rate of that expansion. Both fundamental constants would then be shadows of a single geometric process: dx4/dt = ic. The structural parallel between dx4/dt = ic and the canonical commutation relation qp − pq = iℏ — both placing a differential or commutator on the left and an imaginary unit i on the right — points toward x4’s expansion as the geometric origin of quantization itself [8].

Dark matter accounts for approximately 27% of the universe’s energy budget. As x4 expands spherically from every point in space, it preferentially carries particles outward. At the scale of a galaxy, the distribution of matter swept outward by x4’s isotropic expansion over cosmic time would create a diffuse halo of displaced particles — consistent with the flat rotation curves that require a mass distribution ρ(r) ∝ 1/r² at large radii. Whether this mechanism can quantitatively reproduce the observed dark matter distribution is a specific prediction of the McGucken framework that merits detailed calculation [8].
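The link between a ρ ∝ 1/r² halo and flat rotation curves is pure Newtonian dynamics and can be checked numerically. The sketch below (plain Python; the normalization ρ₀ and scale r₀ are illustrative placeholders, not fitted to any galaxy) integrates the enclosed mass M(r) for ρ(r) = ρ₀(r₀/r)² and shows that the circular speed v = √(GM(r)/r) plateaus at large radii:

```python
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.086e19           # one kiloparsec in metres

rho0, r0 = 1e-21, KPC    # illustrative halo normalization (assumed values)

def v_circ(r, n=200000):
    """Circular speed from the mass enclosed inside radius r."""
    r_in = r0 * 1e-3                    # small inner cutoff
    dr = (r - r_in) / n
    M = 0.0
    for i in range(n):                  # midpoint rule for M(r)
        ri = r_in + (i + 0.5) * dr
        rho = rho0 * (r0 / ri) ** 2
        M += 4.0 * math.pi * ri**2 * rho * dr
    return math.sqrt(G * M / r)

speeds = [v_circ(k * KPC) for k in (5, 10, 20, 40)]
print([f"{v/1e3:.1f} km/s" for v in speeds])   # nearly identical values
```

Because 4πr²ρ(r) is constant for this profile, M(r) grows linearly with r, so GM(r)/r is constant — which is exactly why the rotation curve flattens.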


XIV. The McGucken Principle in the Context of Unification

To situate the McGucken Principle within the historical arc of physical unification is to appreciate both its ambition and its logical structure. Each major unification in physics has shared certain characteristic features: it has identified a more fundamental structure from which previously separate laws emerge as limiting cases; it has explained why the prior laws hold in the domains they were established for; and it has extended physics into new domains beyond the reach of the prior laws.

Newton’s unification explained Kepler’s laws as consequences of universal gravitation and predicted new phenomena including the tides, the precession of the equinoxes, and the shape of the Earth. Maxwell’s unification explained the laws of electricity and magnetism as aspects of a single field and predicted electromagnetic radiation at the speed of light. Einstein’s special relativity unified space and time and predicted the equivalence of mass and energy; general relativity extended this to gravity and predicted gravitational waves and the expansion of the universe.

The McGucken Principle follows this pattern precisely. Its foundational structure — a physical geometric axis advancing at rate ic — generates the laws of special relativity, classical mechanics, wave optics, quantum mechanics, and thermodynamics as theorems. It explains why these laws hold: because they are geometric consequences of the structure of a four-dimensional space whose fourth axis advances at a fixed rate. And it extends physics toward mechanisms for phenomena that the existing formalisms describe but do not explain: the invariance of c, the irreversibility of entropy, the arrows of time, the nonlocality of entanglement, the Principle of Least Action, Huygens’ Principle, the Schrödinger equation, and candidate mechanisms for vacuum energy, dark energy, and dark matter.

The logical structure of the derivation chain — forty-one steps from a single postulate, covering five major domains of physics — is precisely the structure of a unifying theory in the tradition of Newton, Maxwell, and Einstein. The question it poses to physics is the one that every unification has posed in its time: is this the deeper reality from which the existing structure emerges?

The speed of light is invariant not because experiments say so, but because c is the rate at which x4 advances — a geometric budget, not a dynamical law. Entropy increases not because disorder is more probable than order, but because x4 moves forward and never retreats — a geometric necessity, not a statistical tendency. Entangled photons are correlated not through any signal or action at a distance, but because they share an x4 coordinate and never leave it — a geometric coincidence, not a spooky force. The Schrödinger equation is not a postulate; it is a theorem. The Principle of Least Action is not a postulate; it is a theorem. Huygens’ Principle is not a postulate; it is a theorem. These are mechanisms, not re-descriptions. They answer the question “why,” not merely “how much.” That is what a deeper physical reality looks like.


References

  1. Newton, I. (1687). Philosophiæ Naturalis Principia Mathematica. London: Royal Society.
  2. Maxwell, J. C. (1865). A dynamical theory of the electromagnetic field. Philosophical Transactions of the Royal Society of London, 155, 459–512.
  3. Einstein, A. (1905). Zur Elektrodynamik bewegter Körper. Annalen der Physik, 17, 891–921.
  4. Einstein, A. (1916). Die Grundlage der allgemeinen Relativitätstheorie. Annalen der Physik, 49, 769–822.
  5. Schrödinger, E. (1926). Quantisierung als Eigenwertproblem. Annalen der Physik, 79, 361–376.
  6. de Broglie, L. (1924). Recherches sur la théorie des quanta. Ph.D. thesis, Université de Paris.
  7. Weinberg, S. (1967). A model of leptons. Physical Review Letters, 19, 1264–1266.
  8. McGucken, E. (2026). The fourth dimension is expanding at the speed of light: dx4/dt = ic. Light Time Dimension Theory, April 2026. https://elliotmcguckenphysics.com
  9. Minkowski, H. (1908). Raum und Zeit. Lecture, 80th Assembly of German Natural Scientists and Physicians, Cologne. Published in Physikalische Zeitschrift, 10, 104–111 (1909).
  10. Maxwell, J. C. (1867). On the dynamical theory of gases. Philosophical Transactions of the Royal Society of London, 157, 49–88.
  11. Drude, P. (1900). Zur Elektronentheorie der Metalle. Annalen der Physik, 1, 566–613.
  12. Michelson, A. A., & Morley, E. W. (1887). On the relative motion of the Earth and the luminiferous ether. American Journal of Science, 34, 333–345.
  13. Boltzmann, L. (1872). Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen. Sitzungsberichte der Kaiserlichen Akademie der Wissenschaften, 66, 275–370.
  14. Penrose, R. (1989). The Emperor’s New Mind. Oxford: Oxford University Press.
  15. Bell, J. S. (1964). On the Einstein–Podolsky–Rosen paradox. Physics, 1(3), 195–200.
  16. Aspect, A., Dalibard, J., & Roger, G. (1982). Experimental test of Bell’s inequalities using time-varying analyzers. Physical Review Letters, 49, 1804–1807.
  17. Hensen, B., et al. (2015). Loophole-free Bell inequality violation using electron spins separated by 1.3 kilometres. Nature, 526, 682–686.
  18. Einstein, A., Podolsky, B., & Rosen, N. (1935). Can quantum-mechanical description of physical reality be considered complete? Physical Review, 47, 777–780.
  19. Price, H. (1996). Time’s Arrow and Archimedes’ Point: New Directions for the Physics of Time. Oxford: Oxford University Press.
  20. Misner, C. W., Thorne, K. S., & Wheeler, J. A. (1973). Gravitation. San Francisco: W. H. Freeman.
  21. Brown, R. (1828). A brief account of microscopical observations made in the months of June, July, and August, 1827. Philosophical Magazine, 4, 161–173.
  22. Feynman, R. P., & Hibbs, A. R. (1965). Quantum Mechanics and Path Integrals. New York: McGraw-Hill.
  23. Reichenbach, H. (1956). The Direction of Time. Berkeley: University of California Press.
  24. Gold, T. (1962). The arrow of time. American Journal of Physics, 30, 403–410.
  25. Giustina, M., et al. (2015). Significant-loophole-free test of Bell’s theorem with entangled photons. Physical Review Letters, 115, 250401.
  26. Shalm, L. K., et al. (2015). Strong loophole-free test of local realism. Physical Review Letters, 115, 250402.
  27. Jozsa, R., & Linden, N. (2003). On the role of entanglement in quantum-computational speed-up. Proceedings of the Royal Society of London A, 459, 2011–2032.
  28. Klein, O. (1926). Quantentheorie und fünfdimensionale Relativitätstheorie. Zeitschrift für Physik, 37, 895–906.
  29. Gordon, W. (1926). Der Comptoneffekt nach der Schrödingerschen Theorie. Zeitschrift für Physik, 40, 117–133.
  30. Lorentz, H. A. (1904). Electromagnetic phenomena in a system moving with any velocity smaller than that of light. Proceedings of the Royal Netherlands Academy of Arts and Sciences, 6, 809–831.
  31. Weyl, H. (1922). Space–Time–Matter (trans. H. L. Brose). London: Methuen.
  32. Rietdijk, C. W. (1966). A rigorous proof of determinism derived from the special theory of relativity. Philosophy of Science, 33, 341–344.
  33. Sider, T. (2001). Four-Dimensionalism: An Ontology of Persistence and Time. Oxford: Oxford University Press.
  34. Hawking, S. W. (1988). A Brief History of Time. London: Bantam Press.
  35. Weinberg, S. (1989). The cosmological constant problem. Reviews of Modern Physics, 61, 1–23.
  36. Riess, A. G., et al. (1998). Observational evidence from supernovae for an accelerating universe and a cosmological constant. The Astronomical Journal, 116, 1009–1038.
  37. Perlmutter, S., et al. (1999). Measurements of Ω and Λ from 42 high-redshift supernovae. The Astrophysical Journal, 517, 565–586.
  38. Burikham, P., et al. (2013). Can dark energy emerge from quantum effects in a compact extra dimension? Astronomy & Astrophysics, 554, A89.
  39. Josset, T., Perez, A., & Sudarsky, D. (2020). Can the quantum vacuum fluctuations really solve the cosmological constant problem? European Physical Journal C, 80, 21.
  40. Huygens, C. (1690). Traité de la Lumière. Leyden: Pieter van der Aa.
  41. Hamilton, W. R. (1834). On a general method in dynamics. Philosophical Transactions of the Royal Society, 124, 247–308.
