The seeming coordination between entangled quanta endures as a mystery of physics, much as accounting for the teleology of adaptation has bedeviled evolutionary biologists. The two conundrums have the same answer.
Quantum entanglement is the idea that multiple quanta – fundamental particles of matter – may be linked across some distance such that measuring one entangled quantum’s state determines the states of the other entangled quanta. Quantum entanglement has been observed in innumerable experiments.
French physicist Louis de Broglie speculated in 1924 that all particles in motion might exhibit wavelike behavior. Fascinated by this, Austrian physicist Erwin Schrödinger described an electron as a wave function rather than as a particle at a particular point in time. This became known as Schrödinger’s equation.
In 1926, German physicist and mathematician Max Born interpreted Schrödinger’s wave function as a probability amplitude for an electron’s location. Born’s interpretation formally introduced wave/particle duality: an electron had properties of both a particle and a wave, thus supposedly reconciling opposite views.
Einstein had essentially come to the same conclusion in 1905, when he argued that radiant energy consisted of quanta. But Einstein did not appreciate the implications of his discovery. It would not be the last time that Einstein failed to fathom the import of his own conclusions.
At the heart of quantum field theory (QFT) is wave/particle duality. QFT stuffs the basic bits of Nature into quanta while acknowledging their wavy properties as paramount.
A quantum is not a particle, like some itty-bitty billiard ball, but instead a little localized chunk of ripple in a field that deceptively looks like a particle. Understanding a particle is more about wave interactions than about particulate properties. It is a story of field behavior, not characterizing fantastically fleeting fragments. “Particles are epiphenomena arising from fields. Unbounded fields, not bounded particles, are fundamental,” stated American theoretical physicist Art Hobson.
A most significant consequence of describing electrons as waveforms, as Schrödinger had done, was to make it mathematically impossible to state both the position and the momentum of an electron at any point in time. This observation was first made by German theoretical physicist Werner Heisenberg in 1927. It became known as the uncertainty principle: a measurement may be made to get a sense of either a quantum particle’s position or its momentum, but not both at the same time.
Quantum particles cannot be described as point-like objects with well-defined velocities because they inherently behave as waves. For a wave, momentum and position cannot both be defined accurately at any instant.
Mathematically, uncertainty between position and momentum arises because the expressions of the wave function for these supposedly independent variables are actually Fourier transforms of one another. Position and momentum are conjugate variables, which means they are symplectic: interdependent, not independent.
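This Fourier duality can be illustrated numerically. The following sketch – purely illustrative, in natural units with ħ = 1 – builds Gaussian wave packets of varying width, obtains the momentum-space amplitude by Fourier transform, and shows that the product of the position spread and the momentum spread stays pinned at 1/2: narrowing one fattens the other.

```python
import numpy as np

# Position grid for a Gaussian wave packet psi(x) of width sigma_x.
N = 4096
L = 200.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

def spreads(sigma_x):
    """Return (std in position x, std in wavenumber k) for a Gaussian packet."""
    psi = np.exp(-x**2 / (4 * sigma_x**2))
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)          # normalize in x

    # Momentum-space amplitude via FFT; k is angular wavenumber (2*pi*freq).
    phi = np.fft.fftshift(np.fft.fft(psi)) * dx
    k = np.fft.fftshift(np.fft.fftfreq(N, d=dx)) * 2 * np.pi
    dk = k[1] - k[0]
    phi /= np.sqrt(np.sum(np.abs(phi)**2) * dk)          # normalize in k

    sx = np.sqrt(np.sum(x**2 * np.abs(psi)**2) * dx)
    sk = np.sqrt(np.sum(k**2 * np.abs(phi)**2) * dk)
    return sx, sk

for s in (0.5, 1.0, 2.0):
    sx, sk = spreads(s)
    print(f"sigma_x = {sx:.3f}, sigma_k = {sk:.3f}, product = {sx * sk:.3f}")
```

Whatever width is chosen, the product of the two spreads comes out at 0.5 – the Heisenberg bound ħ/2 in these units – because the two distributions are Fourier transforms of each other, not independent quantities.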
Niels Bohr interpreted the uncertainty principle holistically: the universe is basically an unanalyzable whole, in which the notion of separation of particle and environment is a vacuous abstraction except as an approximation.
To extract information about a certain wave, interferometry superimposes one wave upon another. Interferometry is also used to measure subatomic particles.
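The principle behind interferometric readout can be sketched in a few lines (the numbers here are illustrative, not from any particular instrument): superposing two unit-amplitude waves yields an intensity of 2 + 2·cos(Δφ), so the combined intensity encodes the relative phase of the target wave.

```python
import numpy as np

# Superpose a unit reference wave with a unit target wave at relative
# phase delta. The resulting intensity |1 + e^{i*delta}|^2 = 2 + 2*cos(delta)
# swings between 4 (constructive) and 0 (destructive), encoding delta.
delta = np.linspace(0, 2 * np.pi, 9)
intensity = np.abs(np.exp(1j * 0) + np.exp(1j * delta))**2
for d, I in zip(delta, intensity):
    print(f"phase difference {d:.2f} rad -> intensity {I:.2f}")
```

Reading the fringe intensity thus extracts phase information about the target wave that no measurement of either wave alone would reveal.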
During interferometry, the measurement wave and the subatomic target wave become coherently entangled. The measurement itself decoheres the particle to a definite, observable outcome: the local state.
Meanwhile, entanglement continues. The global measurement state is the context surrounding the local state. Observation of the measurement state creates uncertainty. Art Hobson explains: “As a consequence of nonlocality, the states we actually observe are the local states. These actually observed local states collapse, whereas the global measurement state, which can be “observed” only after the fact by collecting coincidence data from both subsystems, continues its unitary evolution. This conclusion implies a refined understanding of the eigenstate principle: following a measurement, the actually observed local state instantly jumps into the observed eigenstate.” An eigenstate is a measured state of an object with quantifiable characteristics, such as position and momentum.
Even if a measurement is attempted but no result is read, entanglement ensues in such a way that the target quantum system is disturbed, and information of the event – as a lingering resonance in the affected quantum field – is retained.
Measurement is immaterial to uncertainty. The quantum level exhibits uncertainty regardless of observation.
Uncertainty is a form of potentiality, which is inherent in the nature of the particle field itself owing to unobservable entanglements. Uncertainty appears phenomenal, not merely mathematical. Physicist Andrew Zimmerman Jones conveys the now-conventional concept: “in quantum physics, the original uncertainty about the particle’s quantum state isn’t just a lack of knowledge. A fundamental property of quantum theory is that prior to the act of measurement, the particle really doesn’t have a definite state, but is in a superposition of all possible states.”
Thinking that uncertainty is inherent became the conventional view only after several decades. Schrödinger first conceived wave/particle duality as a reality. That was cast aside as fantastic in what emerged at the end of the 1920s as the Copenhagen interpretation – a term Heisenberg applied in a 1955 series of lectures. The Copenhagen interpretation considered the wave function a computational tool, giving good results, but not to be taken literally. The Copenhagen interpretation lost its popularity owing to its intrinsic inconsistency, sputtering out as the predominant dogma in the late 1950s.
Uncertainty at the originating level of existence suggests a deeper reality. As Dutch theoretical physicist Gerard ‘t Hooft noted, “The particles and fields are very, very crude statistical descriptions. Those particles and those fields are not true representatives of what’s really going on.” ‘t Hooft’s insight is treated as an obscene footnote by most quantum theorists, who seem untroubled by the prospect that their characterization is quixotic.
Accepting inherent quantum uncertainty ushers in a seminal question when it comes to entanglement: how do distant particles communicate so that they appear to dance synchronously? A recent mathematical proof – at a stem-winding 165 pages – concludes: “no way to know.” American mathematician Thomas Vidick, one of the authors of the proof: “There is no algorithm that is going to tell you the maximal violation [of locality] you can get in quantum mechanics.”
The proof can only be considered valid if mathematics is a matter of discovery rather than invention. If mathematics is itself an invention, then any proof is equally a product of imagination. If instead the principles from which mathematics emanates represent a system correspondent with Nature, then there are mathematical discoveries. As mathematical principles have evolved in complexity and depth, yet never been contradicted once proven, there is every reason to consider mathematics a natural science of discovery.
In other words, mathematics can be considered the language of Nature rather than a human invention to describe Nature. That other animals and life forms seemingly embrace the same mathematical conventions as we do fortifies this impression.
That is not to say that math is foolproof. Indeed, whereas the wheat of math is theories which have held up, the volume of mathematical chaff is orders of magnitude larger. Math can be used to fabricate fictions as easily as sentences can be spun to tell lies.
More precisely, this proof of unpredictability about entanglement is applicable only if the underlying quantum theory is correct. This is problematic.
The central problem with the conventional interpretation of the uncertainty principle is that it merely provides a statistical convenience rather than a representation of Nature (shades of the Copenhagen interpretation, yes, but a distinctive coloration). As a tool rather than a characterization, the uncertainty principle explains nothing, while leaving the universe inherently nondeterministic.
Quantum uncertainty appears an oxymoron in a world of physics which presupposes predictability. If quantum uncertainty is true, then the very idea of “laws of Nature” appears laughable. Further, what would be the point of the razzle-dazzle of indeterminism for an existence that resolves into a mundane order?
That quanta are superpositioned until they make their appearance is not the only theoretical possibility. Comprehending the uncertainty inherent in particles manifesting from waves, de Broglie in 1927 theorized that each particle is guided by a background wave, which he later called a pilot wave. A pilot wave steers its particle by nonlinear environmental interaction.
Phase harmony between a wave and its particle, as well as synchrony between particles, is provided by a periodic process, equivalent to a clock. Existence is quantized in both space and time, emergently appearing on a regular beat. This was the logical conclusion German physicist Max Planck came to with his quantum of action. Planck ushered in the recognition that physical action cannot take an arbitrary value. In other words, there is a fundamental order to Nature, which begins with the infinitesimal.
Physical action must be a multiple of the Planck constant. Even thermal energy (heat) is quantized.
Planck’s quantum of action essentially states that only certain energy levels may manifest, while values in between are forbidden to do so. Physics cannot explain why.
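A minimal numeric sketch of Planck’s rule, using an illustrative optical-range frequency: for an oscillator at frequency ν, the only energies Nature permits are integer multiples of h·ν; every value in between is forbidden.

```python
# Allowed energies of a Planck oscillator come only in integer multiples
# of h*nu; intermediate energies are forbidden.
h = 6.62607015e-34    # Planck constant, J*s (exact by SI definition)
nu = 5.0e14           # an optical-range frequency in Hz (illustrative choice)

quantum = h * nu      # the indivisible energy step, about 3.31e-19 J
levels = [n * quantum for n in range(4)]   # E_n = n * h * nu, n = 0, 1, 2, 3

for n, E in enumerate(levels):
    print(f"n = {n}: E = {E:.3e} J")
```

The ladder has rungs but no space between them: an energy of, say, 1.5·h·ν simply does not occur, and physics cannot explain why.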
The pilot wave theory provides a deterministic system that characterizes existence with a cynosure and casts off uncertainty. In its developed form, the theory is also consistent with classical physics, quantum mechanics, and relativity.
Despite its ostensible appeal, the pilot wave theory was repudiated by physicists at the 1927 Solvay Conference. Einstein’s failure to speak up for the theory led to its rejection.
A strident rejector of the idea that uncertainty held sway over Nature, Einstein liked the theory’s determinism. His objection was the implication that the entire universe was entangled, affording nonlocal interactions between particles. Einstein held to his religious rejection of quantum entanglement until he died, derisively damning it as “spooky action at a distance.” Einstein’s closed-mindedness hampered his contributions to physics, especially as he struggled to conceive of a unified field theory in the latter part of his life.
The pilot wave theory requires the potential for interaction between any and all particles in a system. Distance does not drive interactivity to zero. The instantaneous state of a particle depends upon its overall environment. “In the long run, only the entire universe can be regarded as self-determinate, while any part may be independent in general only for some limited period of time. The very mode of interaction between constituent parts depends on the whole, in a way that cannot be specified without first specifying the state of the whole,” concluded American theoretical physicist David Bohm and British quantum physicist Basil Hiley.
Determinism as an emergent spatial holism is a statement of coherence: that the universe is interactively ordered. Quantum uncertainty and determinism are metaphysically antithetical.
Energy wave fronts make their way forward knowing exactly what they are doing: incorporating all relevant information of their environment as they propagate. This proven principle of least action plumps not only for determinism, but for a mystical omniscience.
The universe knows itself. This conclusion is embedded within all physics equations via their employment of Lagrangian mechanics. Lagrangian equations provide that any motion may be calculated by incorporating all the information about the dynamics of the system. Because they consume Lagrangian mechanics, quantum theories which produce uncertainty are oxymoronic.
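The principle of least action behind Lagrangian mechanics can be demonstrated in a few lines. The sketch below (a toy model: a free particle of unit mass, with a hand-picked sinusoidal perturbation) discretizes the action S = ∫ ½mv² dt over paths sharing the same endpoints; the straight-line path yields the least action, and any wiggle added to it raises S.

```python
import numpy as np

def action(path, dt=0.01):
    """Discretized action S = sum (1/2) * v^2 * dt for a free particle, m = 1."""
    v = np.diff(path) / dt
    return np.sum(0.5 * v**2) * dt

t = np.linspace(0, 1, 101)                       # time grid, dt = 0.01
straight = t.copy()                              # x(t) = t: straight line from 0 to 1
wiggle = straight + 0.1 * np.sin(np.pi * t)      # same endpoints, perturbed path

print(f"straight path: S = {action(straight):.4f}")
print(f"wiggled path:  S = {action(wiggle):.4f}")   # always larger
```

The straight path scores S = 0.5; every perturbed path with the same endpoints scores higher. The actual trajectory is the one that, in effect, already accounts for all the alternatives.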
Motion begins at the quantum level with wavelike behavior which instantaneously begets the appearance of particles. To say that emergent motion is inherently uncertain is belied by the consistent regularity with which particles appear. Mathematically characterizing wave/particle duality as generating uncertainty is the fatal flaw of conventional quantum field theory, which can’t possibly be correct.
Superposition is an unsupported supposition. But then, deception is the lingua franca of Nature, the appearance of which is really a ruse. Existence is a coherent fabrication of interlaced symbolic constructs, emergent from the quantum level up, which manifest as matter, albeit always in motion. Bear in mind that “manifestation” is only what appears in the mind. Matter itself is a mirage, for all is process.
As British musician Pete Townshend once sang, “The simple things you see are all complicated.” If you want to comprehend reality, take ‘t Hooft’s suggestion: dig deeper.
Ishi Nobu, Clarity: The Path Inside, BookBaby (2019).
Ishi Nobu, Unraveling Reality: Behind the Veil of Existence, BookBaby (2019).
Ishi Nobu, The Science of Existence, BookBaby (2019).
Andrew Zimmerman Jones, “Quantum entanglement in physics,” ThoughtCo. (10 July 2017).
Zhengfeng Ji et al., “MIP*=RE,” arXiv (13 January 2020).
Davide Castelvecchi, “How ‘spooky’ is quantum physics? The answer could be incalculable,” Nature (16 January 2020).
Ishi Nobu, “Energy transmission optimality,” (17 December 2019).
Ishi Nobu, “The mechanics of existence,” (10 December 2019).