“All matter originates and exists only by virtue of a force which brings the particles of an atom to vibration and holds this most minute solar system of the atom together. We must assume behind this force the existence of a conscious and intelligent mind. This mind is the matrix of all matter.” ~ German physicist Max Planck
Max Planck was musically gifted, but he decided to study physics against the advice of his physics professor, Philipp von Jolly, who told Planck in 1878: “in this field, almost everything is already discovered, and all that remains is to fill a few holes.” Planck replied that he had no desire for discovery; he only wanted to understand the fundamentals.
Planck soon became fascinated with thermodynamics, whose classical laws he viewed as absolute laws of Nature. His later discovery to the contrary birthed quantum mechanics.
In 1859, German physicist Gustav Kirchhoff, under whom Planck later studied, coined the term black body for an object which absorbs all the electromagnetic radiation which falls upon it (in other words, an utterly opaque object). He posed the inquiry: how does the intensity of the electromagnetic radiation emitted by a black body depend on the frequency of the radiation and the temperature of the body?
Kirchhoff’s question had been explored experimentally, but there was a serious problem with the answer that classical mechanics provided.
When a black body is heated, it emits electromagnetic waves in a broad spectrum. Experimentally, emitted black body radiation always drops off sharply on the short wavelength side. Further, as the temperature increases, peak wavelength grows shorter: visibly bluer rather than redder.
Based upon the mathematical assumption that everything is infinitely divisible, classical mechanics predicts that the energy emitted in thermal radiation is evenly distributed across all wavelengths. But that does not happen with a black body. This failure of classical thermodynamics to accurately predict the spectral characteristics of black-body radiation came to be called the “ultraviolet catastrophe.”
In 1894, electricity companies commissioned Planck to discover how to generate the greatest luminosity from light bulbs with the minimum energy. To get to a solution, Planck turned his attention to the black-body radiation problem.
In 1900, Planck had a theoretical answer. With great distaste, he had borrowed ideas from statistical mechanics that had been introduced earlier by Austrian physicist Ludwig Boltzmann.
Planck had a strong aversion to treating thermodynamics’ laws as statistical rather than absolute gospel. Being compelled to apply statistics to get an agreeable solution he considered “an act of despair.”
Planck achieved concordance with experimental results via a simple formula: E = hν, where E is the energy of a wave, ν (nu) is the frequency of the radiation, and h is a very small number that came to be known as Planck’s constant (aka Planck’s action quantum).
To his consternation, what Planck found was that energy absorption and radiation were not continuous. Energetic work instead happens in discrete amounts: quanta of energy, with the Planck constant (h) as the quantum. What was supposedly entirely wavelike manifested in particulate form.
(The elementary quantum of action, known as Planck’s constant, is 6.626 × 10⁻³⁴ joule-seconds in meter/kilogram/second units, with just a bit of uncertainty.)
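Planck’s relation is simple enough to evaluate numerically. A minimal sketch (the CODATA value of h and a green-light frequency of roughly 540 THz are assumed):

```python
# Planck's relation E = h * nu: the energy of a single quantum at frequency nu.
h = 6.62607015e-34  # Planck's constant, in joule-seconds

def quantum_energy(nu_hz: float) -> float:
    """Energy in joules of one quantum at frequency nu_hz."""
    return h * nu_hz

# Green light, roughly 540 THz: a few times 1e-19 joules per quantum.
E = quantum_energy(5.4e14)
print(f"{E:.3e} J")
```

The tininess of that number is why energy quantization escaped notice for so long: everyday energies involve astronomical numbers of quanta.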
“It seemed so incompatible with the traditional view of the universe provided by physics.” ~ Max Planck
At first, Planck considered quantization only “a purely formal assumption” which he “did not think much about.” But then he tried to stuff the quantum genie back into the bottle and found that he could not.
“My unavailing attempts to somehow reintegrate the action quantum into classical theory extended over several years and caused me much trouble.” ~ Max Planck
◊ ◊ ◊
Statistical classical mechanics requires the existence of the Planck constant, but does not define its value. Planck ushered in the recognition that physical action cannot take an arbitrary value. In other words, there is a fundamental order to Nature, which begins with the infinitesimal.
Physical action must be a multiple of the Planck constant. Planck’s quantum of action essentially states that only certain energy levels may manifest, while values in between are forbidden to do so. Physics cannot explain why.
(Planck’s constant represents the limit of empirical existence. The least possible distance is Planck length. Minimal matter has Planck mass. The shortest duration is measured in Planck time.)
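The Planck scales mentioned here follow directly from just three constants. A sketch combining the reduced Planck constant, Newton’s gravitational constant, and the speed of light (approximate CODATA values assumed):

```python
import math

# Fundamental constants (approximate SI values)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

# The natural units of length, mass, and time built from hbar, G, and c.
planck_length = math.sqrt(hbar * G / c**3)  # ~1.6e-35 meters
planck_mass = math.sqrt(hbar * c / G)       # ~2.2e-8 kilograms
planck_time = math.sqrt(hbar * G / c**5)    # ~5.4e-44 seconds

print(planck_length, planck_mass, planck_time)
```

These are the only combinations of the three constants with the dimensions of length, mass, and time, which is why they mark the natural floor of physical description.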
Existence consists of interacting fields which necessarily manifest in particulate form. Even thermal energy (heat) quantizes. We’ll see that this duality is both illusory and necessary for existence.
“The dynamics of quantum systems are encoded in the amplitude and phase of wave packets.” ~ French quantum physicist V. Gruson et al
The science of the quantum world is alternately called quantum mechanics (accenting the statistical nature of the study), quantum field theory (QFT) (emphasizing that quanta are merely manifest fields), or simply quantum theory (which points out that all the packets under discussion are entirely theoretical, and not to be confused with reality).
Packets of Light
Einstein instantly appreciated Planck’s discovery of quantization. He later called it “the basis of all 20th century research in physics.”
“Without this discovery, it would not have been possible to establish a workable theory of molecules and atoms and the energy processes that govern their transformations. Moreover, it has shattered the whole framework of classical mechanics and electrodynamics and set science a fresh task: that of finding a new conceptual basis for all of physics.” ~ Albert Einstein
In 1905, addressing classical physics’ inability to explain the photoelectric effect, Einstein argued that radiant energy consisted of quanta.
“The energy of a light ray is not continuously distributed over an increasing space but consists of a finite number of energy quanta which are localized at points in space, which move without dividing, and which can only be produced and absorbed as whole units.” ~ Albert Einstein
(The photoelectric effect is the ejection of electrically charged particles, either electrons or ions, from a body when it absorbs electromagnetic radiation.)
(American chemist Gilbert N. Lewis termed these packets of light photons in 1926.)
Echoing Planck’s equation, Einstein’s formula for photonic energy was: E = hf, where E is the energy of light at frequency f, tempered by Planck’s action quantum (h).
Einstein generalized the quantum hypothesis in 1907 by using it to interpret the temperature-dependence of the specific heats of solids. As a follow-on, Einstein treated thermodynamic fluctuations in two 1909 papers. Though he did not use the word duality or make any assertion of principle, Einstein introduced wave/particle duality into physics. This was one of several instances where Einstein failed to appreciate the implications of his assertions.
“The theory of relativity has changed our view of the nature of light insofar as it does not conceive of light as a sequence of states of a hypothetical medium, but rather as something having an independent existence just like matter.” ~ Albert Einstein in 1909
In the 5th century BCE Empedocles conceptualized Nature as comprising atomic elements. A century later Aristotle elaborated that these elements comprised a physical substrate which emerged from ethereal “forms” – an idea originally espoused by Plato, Aristotle’s teacher. Forms comprised the essences which begat Nature: the exhibition of existence. Forms took form as elements.
Atoms were considered the most minuscule particle of matter until 1897, when English physicist J.J. Thomson found something smaller, which he called corpuscles. What Thomson discovered was the subatomic particle now called the electron.
Experimenting with cathode ray emissions, Thomson concluded that atoms were divisible into constituent corpuscles. From this he concocted a plum-pudding model for atoms. To explain the overall neutral charge of an atom, as contrasted to the corpuscle (electron) negative charge, Thomson proposed that corpuscles floated in a sea of positive charges, with electrons embedded like plums in a pudding; though Thomson’s model posited rapidly moving corpuscles instead of plopped plums.
One of Thomson’s pupils, English physicist and chemist Ernest Rutherford, disproved Thomson’s atomic plum pudding in 1909. In its place, Rutherford imagined in 1911 a planetary atomic model: a cloud of negatively charged electrons swirling in orbits over a compact positively charged nucleus.
Rutherford was working with Danish physicist Niels Bohr, who conjectured in 1913 that electrons moved in specific orbits, which were regulated by Planck’s quantum of action. By 1921, Rutherford and Bohr had come up with an atomic model comprising protons, neutrons, and electrons. This model was validated in the 1950s, when atomic nuclei were disassembled by newly developed subatomic particle accelerators and detectors.
In 1924, French physicist Louis de Broglie turned Einstein’s quantified light inside-out, by wondering whether electrons and other elementary particles exhibit wavelike behavior. A fascinated Austrian physicist, Erwin Schrödinger, took the idea and ran with it. He unknowingly injected uncertainty into quantum mechanics with his 1926 publication, which described an electron as an ongoing wave function, rather than a particle at any point in time. This became known as Schrödinger’s equation.
A most significant consequence of describing electrons as waveforms, as Schrödinger had done, was to make it mathematically impossible to state both the position and momentum of an electron at any point in time. Werner Heisenberg’s 1927 observation of this became known as the uncertainty principle: a measurement may be made to get a sense of either a quantum’s position or momentum, but not both at the same time.
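The uncertainty principle can be checked numerically for the minimum-uncertainty case: a Gaussian wave packet. The sketch below (natural units with ħ = 1, and a grid size chosen for illustration) discretizes the packet and verifies that Δx·Δp comes out at the Heisenberg minimum of ħ/2:

```python
import math

hbar = 1.0   # natural units assumed
sigma = 1.0  # width parameter of the Gaussian packet

# Discretize a real Gaussian wave packet psi(x) on a grid.
N, L = 4000, 20.0
dx = 2 * L / N
xs = [-L + i * dx for i in range(N + 1)]
psi = [math.exp(-x * x / (4 * sigma**2)) for x in xs]

# Normalize so the integral of |psi|^2 dx equals 1.
norm = math.sqrt(sum(p * p for p in psi) * dx)
psi = [p / norm for p in psi]

# Position spread: <x^2> (here <x> = 0 by symmetry).
var_x = sum(x * x * p * p for x, p in zip(xs, psi)) * dx

# Momentum spread for a real psi: <p^2> = hbar^2 * integral of (psi')^2 dx,
# with psi' estimated by central differences.
dpsi = [(psi[i + 1] - psi[i - 1]) / (2 * dx) for i in range(1, N)]
var_p = hbar**2 * sum(d * d for d in dpsi) * dx

product = math.sqrt(var_x) * math.sqrt(var_p)
print(product)  # approaches hbar/2 = 0.5, the Heisenberg minimum
```

Any non-Gaussian packet yields a strictly larger product; the Gaussian merely saturates the bound.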
In 1927, English physicist George Thomson, son of J.J., passed a beam of electrons through a thin metal film and observed interference patterns (electron diffraction), proving that subatomic quanta were simultaneously particles and waves.
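The de Broglie relation λ = h/p makes clear why a thin metal film diffracts electrons: an electron accelerated through about 100 volts has a wavelength comparable to atomic spacing. A non-relativistic sketch (the accelerating voltage is an illustrative choice):

```python
import math

h = 6.62607015e-34     # Planck's constant, J*s
m_e = 9.1093837e-31    # electron mass, kg
q_e = 1.602176634e-19  # elementary charge, C

def de_broglie_wavelength(volts: float) -> float:
    """Wavelength in meters of an electron accelerated through `volts` (non-relativistic)."""
    p = math.sqrt(2 * m_e * q_e * volts)  # momentum from kinetic energy e*V
    return h / p

lam = de_broglie_wavelength(100.0)
print(f"{lam:.3e} m")  # about 1.2e-10 m: roughly one atomic diameter
```

Because the wavelength matches the lattice spacing of a metal, the film acts as a diffraction grating for electrons, producing the interference patterns Thomson observed.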
Quantum particles cannot be described as a point-like object with a well-defined velocity, because quanta inherently behave as a wave; and for a wave, momentum and position cannot both be defined accurately for any instant. This is true both in Nature and mathematically.
What to make of this inherent uncertainty? Physicists heatedly disagreed about what it meant.
Considering wave/particle duality a reality, Schrödinger at first took uncertainty literally. He later recanted, declaring himself utterly confused.
“I do not like it, and I am sorry I ever had anything to do with it.” ~ Erwin Schrödinger
Thinking that God “does not play dice,” Einstein fell back on faith to deride the uncertainty of the quantum particle/wave duality which he himself discovered.
“Quantum mechanics is very impressive. But an inner voice tells me that it is not yet the real thing.” ~ Albert Einstein
◊ ◊ ◊
“Every particle must be transported by a wave into which it is incorporated.” ~ Louis de Broglie in 1927
Louis de Broglie, who brought the matter up in the first place, came up with the pilot wave theory, which rendered local events determinate by virtue of a coherent force that provides every wave with its own guidance. The price was acknowledging that the entire universe was entangled: affording nonlocal interactions between particles. Though pilot wave theory is largely ignored, de Broglie was essentially correct.
Niels Bohr interpreted the uncertainty principle holistically: the universe is basically an unanalyzable whole, in which the idea of separation of particle and environment is an abstraction, except as an approximation.
“In the long run, only the entire universe can be regarded as self-determinate, while any part may be independent in general only for some limited period of time. The very mode of interaction between constituent parts depends on the whole, in a way that cannot be specified without first specifying the state of the whole.” ~ American physicist David Bohm & British quantum physicist Basil Hiley
◊ ◊ ◊
Whether uncertainty is actuality remains controversial. But uncertainty certainly looks like the real thing.
“The statistical view is not compatible with the predictions of quantum theory.” ~ English particle physicists Terry Rudolph, Matthew Pusey, & Jonathan Barrett
Quantum theory is founded upon the premise that so-called particles are fields with anomalies. Fields are, by definition, a synchrony of waves. Denying the reality of the wave function, and its inherent uncertainty, eviscerates quantum mechanics by denying the existence of the foundation upon which the theory is built.
“The linearity of quantum mechanics is intimately connected to the strong coupling between the amplitude and phase of a quantum wave.” ~ German physicist Wolfgang Schleich
The entanglement of wave/particle duality and inherent uncertainty at the originating level of Nature suggests a deeper reality.
“The particles and fields are very, very crude statistical descriptions. Those particles and those fields are not true representatives of what’s really going on.” ~ Dutch theoretical physicist Gerard ‘t Hooft, who believes the universe is a deterministic but immaterial information system.
Since antiquity, the properties of light have fascinated physicists. In 1658, French mathematician Pierre de Fermat proposed that light always travels most efficiently: from one point to another in the least time, even when refracted: traveling through different media with distinct velocities, such as moving through air and then into water. This inscrutable optimality of light is a well-established fact.
Consider a ray of light going from a to b: the hypothetical straight line would cover the least distance. Instead, light actually traverses a longer distance that takes less time, as light moves more slowly through water than air – the straight-line path would incur a longer, sluggish passage in water.
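Fermat’s least-time path can be found numerically. In this sketch, light crosses from air into water (a refractive index of 1.33 and illustrative source/target positions are assumed), and the crossing point that minimizes travel time is located by ternary search; the result reproduces Snell’s law, sin θ₁ / sin θ₂ = v₁ / v₂:

```python
import math

c = 2.99792458e8  # speed of light in vacuum, m/s
v1 = c            # speed in air (approximated as vacuum)
v2 = c / 1.33     # speed in water (refractive index ~1.33 assumed)

# Source at (0, a) above the surface, target at (d, -b) below it.
a, b, d = 1.0, 1.0, 2.0

def travel_time(x: float) -> float:
    """Total travel time via surface crossing point (x, 0)."""
    return math.hypot(x, a) / v1 + math.hypot(d - x, b) / v2

# Ternary search for the minimum of the (unimodal) travel-time function.
lo, hi = 0.0, d
for _ in range(200):
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if travel_time(m1) < travel_time(m2):
        hi = m2
    else:
        lo = m1
x_min = (lo + hi) / 2

# The least-time path obeys Snell's law: sin(t1)/v1 == sin(t2)/v2.
sin1 = x_min / math.hypot(x_min, a)
sin2 = (d - x_min) / math.hypot(d - x_min, b)
print(sin1 / sin2, v1 / v2)  # the two ratios agree
```

The bend at the surface falls out of pure minimization: no ray needs to “know” Snell’s law in advance, yet the optimal path always satisfies it.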
Fermat’s principle was broadened to encompass all propagating energy waves by Dutch physicist Christiaan Huygens in 1678. In 1827, Irish physicist William Hamilton took wave transmission optimality as a universal for all dynamics in any physical system.
Via mathematics, Hamilton’s principle pervades all of physics. Such matchless motion necessitates omniscience: always knowing all the information in the universe.
This profundity is no casual conclusion. For any energy wave to behave as it does, all information about actuality must be instantaneously incorporated.
Optimal propagation clearly indicates a unified, coherent intelligence from the Planck level on up, and strongly suggests teleology: that the game afoot which we call Nature has intention.
The Standard Model
“Whether you can observe a thing or not depends on the theory which you use. It is the theory which decides what can be observed.” ~ Pakistani physicist Abdus Salam
Early particle accelerators and their detectors explored the subatomic world. What they found was a multitudinous zoo. Nature’s fondness for diversity became abundantly apparent at the quantum level.
Energetic collisions of protons and neutrons in atomic nuclei produced smaller, more “elemental” particles: hadrons. Particle accelerators proliferated hadrons into such a prodigious variety that it prompted Austrian physicist Wolfgang Pauli to remark: “had I foreseen this, I would have gone into botany.”
Hadrons were found to be comprised of even tinier constituents, called quarks, of which there are different flavors, determined by their spin and symmetry.
Going bottom-up: quarks combine to form families of hadrons, which join together in threesomes to form protons and neutrons (2 types of baryons), which are enslaved by the nuclear force to create atomic nuclei, which combine with electrons, bound together by the electromagnetic force, to create atoms. Via a variety of attractions, atoms congregate into molecules, which make up everyday matter. But bear in mind that matter is ultimately nothing more than intense interactions of localized coherent energy fields, posing as something solid.
Physicists understand matter and energy in terms of kinematics and the interactions of elementary particles. Kinematics, which is a classical-mechanics construct, characterizes the motions of bodies, without considering the forces that cause movement. This conceptual bifurcation – between matter and the forces that move matter – would live on in quantum physics’ Standard Model.
The Standard Model (SM) is a myth about how Nature constitutes itself from basic quantum building blocks of matter, which are constantly caressed by carriers of Nature’s fundamental forces. SM proposes a set of elementary particles which compose existence at the quantum level.
The Standard Model was formulated in the 1970s. From the early 1980s, experiments verified various facets of SM. SM cannot explain many observed quantum phenomena, nor does it include all subatomic particles discovered.
As observations have often differed from SM theory, the Standard Model has undergone repeated patchwork; so much so that SM is now a set of theorems cobbled together to render an approximate fit to what has been observed, with a lot left out.
Under the Standard Model, there are 2 elementary particle types: fermions and bosons. Whereas fermions are the particles that comprise matter, bosons are force carriers.
The SM particle zoo has 17 main characters: 12 fermions and 5 bosons, complemented by an equivalent set of anti-particles, and a hypothetical hanger-on: the graviton, which is the elusive particle representing gravity.
There is also an ancillary mob to cover various observed oddities. To conclude the parade are virtual particles, which are assumed to flit in and out of existence in Planck time to lend their support to the proceedings.
Photons are the best-known bosons. They are the quanta of light, which manifests as electromagnetic waves. Despite not interacting among themselves, and remaining utterly unaffected by electrical and magnetic fields, photons mystically porter the force of electromagnetism. Except for gravity, electromagnetism is the interaction responsible for practically all phenomena encountered in everyday life.
Electrons carry the field of electricity; but they are fermions, and so it is beyond their purview to act as a fundamental force. Such a lordly task is restricted to bosons. Hence bosonic photons, despite being decidedly standoffish, magically manage to insinuate electromagnetism everywhere, even in the dark, where no such light-hearted quanta putatively lurk. Photons finagle this fantastic feat by the same means of all quantum interactions: sheer mathematics.
Magnets supposedly attract each other because they exchange virtual photons. Each photon has its own frame of reference. In their supposed interaction, virtual photons exchange momentum, thereby producing electromagnetism as a relativistic effect.
“These particles do not have a pebble-like reality, but are rather the quanta of corresponding fields, just as photons are the quanta of the electromagnetic field. They are elementary excitations of a moving substratum: minuscule moving wavelets.” ~ Italian physicist Carlo Rovelli
◊ ◊ ◊
3 properties are typically used to characterize quantum particles: mass, charge, and spin. These properties are not what someone with a knowledge of classical physics would intuitively expect. Let’s look at mass, which gets a lot of attention in discussions of quanta.
In the everyday world, mass is considered a measure of an object at rest. Special relativity shows that rest mass and rest energy are essentially equivalent. But rest mass (invariant mass) does not apply to subatomic particles, as they are never at rest.
For a quantum, mass is a euphemism. Subatomic particle mass is a mathematical representation of its isotropy (uniformity), not a measurement of anything actual. In other words, quantum mass measures the level of oddity in the wave from which a particle appears.
A practical conception of quantum mass is the threshold energy at which a certain subatomic particle may appear; put another way, the energy required for a specific quantum species to make an appearance.
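This threshold-energy notion of mass is just E = mc². An electron’s mass, for instance, corresponds to about 0.511 MeV, so a process needs at least twice that (~1.022 MeV) to materialize an electron-positron pair. A sketch (standard SI values assumed):

```python
m_e = 9.1093837e-31    # electron mass, kg
c = 2.99792458e8       # speed of light, m/s
eV = 1.602176634e-19   # joules per electronvolt

def rest_energy_mev(mass_kg: float) -> float:
    """Rest energy E = m * c^2, expressed in MeV."""
    return mass_kg * c**2 / eV / 1e6

electron = rest_energy_mev(m_e)  # ~0.511 MeV
pair_threshold = 2 * electron    # ~1.022 MeV to create an e-/e+ pair
print(electron, pair_threshold)
```

This is why particle physicists habitually quote masses in energy units (MeV or GeV): the mass of a quantum species is operationally the energy at which it can appear.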
A quantum is not literally a particle, like some itty-bitty billiard ball. It is instead a little localized chunk of ripple in a field that puts on a particle costume. Comprehending quanta is more about wave interactions than about particle properties. Quantum mechanics is a story of coherent field behavior, not a fable of fantastically fleeting fragments.
“Particles are epiphenomena arising from fields. Unbounded fields, not bounded particles, are fundamental.” ~ American physicist Art Hobson
◊ ◊ ◊
All this is what any physicist will relate as indisputable: all matter is made of atoms formed from more elementary quanta. The actuality of quanta is that they are fundamentally coherent, localized fields of energy.
However convincing, the physicality of quanta is nothing more than an illusory appearance. Which leads us to the crucial takeaway point: that what we take for physical existence, by way of objects, is a mirage.
“Although quantum field theory tells us what we can measure, it speaks in riddles when it comes to the nature of whatever entities give rise to our observations. The theory accounts for our observations in terms of quarks, muons, photons, and sundry quantum fields, but it does not tell us what a photon or a quantum field really is.” ~ German physicist Meinard Kuhlmann
There are many observations in the quantum world that do not fit into the Standard Model, which may be generously characterized as incomplete. Other physics theories can account for what SM cannot. But they too have flies in their ointment of exposition.
In 1928, Paul Dirac described his relativistic approach to characterizing a fermion field. He had in mind electrons, which have both mass and charge.
Within his mathematical solutions, Dirac found the positron, which is the electron’s antiparticle. The positron has the same mass as the electron, but the opposite charge: positive rather than negative. Positrons were experimentally confirmed in 1932, four years after the Dirac equation appeared, becoming the first antiparticle found.
In 1929, German mathematician and physicist Hermann Weyl showed that Dirac’s equation could be simplified for massless fermions.
The next year, Wolfgang Pauli proposed neutrinos to explain the continuous energy spectrum coming out of radioactive decay. To respect the law of energy conservation, neutrinos had to be chargeless.
Neutrinos were first detected in 1956. Early experimental data suggested that neutrinos lacked mass. From that, it was assumed that Dirac’s neutrinos were merely massless Weyl fermions. Later investigation of neutrino oscillations established that neutrinos do have mass, albeit slight; the exact values remain unknown.
In 1937, Italian physicist Ettore Majorana took neutrinos to an even more ethereal state, by proposing a class of quanta that was both massless and chargeless. Majorana-like particles were first reportedly glimpsed in 2012.
Whereas Dirac fermions have an antiparticle counterpart, such as electrons and positrons, Majorana fermions are their own antiparticle.
3 distinct classes of fermions have been identified: Dirac (with mass and charge), Weyl (massless, charged), and Majorana (massless, chargeless). What all fermions have in common is half-integer spin: an intrinsic, internal angular momentum. Spin is the property that distinguishes fermions from bosons.
The half-integer spin of fermions explains why they cannot occupy the same quantum state at the same time; but bosons, with integer spin, can. This fermionic limitation is termed the Pauli exclusion principle, which Wolfgang Pauli discovered in 1925.
Don’t think for a Planck moment that fermions always mind their manners. It all depends upon the environment they are in. Fermions might go bosonic when stressed.
The mathematics of existence can be quite slippery. Nature’s fondness for diversity often rides roughshod over textbook behaviors. Such is the case when fermions find themselves in the tight confines of a crystalline space.
Crystals are highly ordered solids which may form any one of 230 distinct lattice structures. Figuring out the extent of lattice space groups was a tour de force of 19th-century crystallography.
In 1930, Werner Heisenberg wondered what would happen if space itself was quantized instead of continuous. Heisenberg was inspired to speculate about a Planck Gitterwelt (lattice world) out of a desire to rid quantum mechanics of the vexatious infinities that kept arising in equations.
(Quantum mechanics’ mathematics was never able to shake off infinity. So, the beautiful symmetries and inscrutable infinities that appear everywhere are purposely broken by spontaneous symmetry breaking, which is a statistical abuse to force equations to behave so that physicists feel comfortable with them.)
What Heisenberg got in Gitterwelt was inexplicably peculiar: electrons might lose their mass, or morph into protons. This strangeness drove him to abandon “this completely crazy idea.”
But Gitterwelt happens. An electron moving through a honeycomb lattice of graphene carbon atoms loses its mass, transforming itself from a Dirac fermion to a Weyl one. If the lattice is superconductive, the electron may drop its charge and change into a Majorana.
Lattices offer even stranger transformations. A Weyl fermion trapped in a lattice world might alter its spin to that of a boson, while still being fermionic in obeying the Pauli exclusion principle. Other quantum oddities of lattice worlds are still being explored.
Prime numbers can only be divided by 1 and themselves. The first few primes are 2, 3, 5, 7 and 11. The pattern of primes becomes more sporadic higher in the number line. Though seemingly random in their distribution along the number line, primes have a deeply hidden order.
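The thinning of primes along the number line is easy to see with a sieve of Eratosthenes; the prime-counting function π(n) grows roughly like n/ln n, so the density of primes near n falls off as 1/ln n. A sketch:

```python
import math

def primes_up_to(n: int) -> list[int]:
    """Sieve of Eratosthenes: return all primes <= n."""
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, math.isqrt(n) + 1):
        if is_prime[p]:
            # Cross off multiples of p, starting at p*p.
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [i for i, flag in enumerate(is_prime) if flag]

primes = primes_up_to(100_000)
print(primes[:5])   # [2, 3, 5, 7, 11]
print(len(primes))  # pi(100000) = 9592, vs the estimate n/ln(n) ~ 8686
```

Individually the primes look random, yet their count tracks a smooth law: the same tension between local disorder and global order that hyperuniformity describes.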
“Primes have multiscale order characterized by dense Bragg peaks.” ~ Italian American scientist Salvatore Torquato et al
Ionizing radiation is an energy transmission strong enough to rip electrons from their atoms. Such energy loses its oomph as it traverses matter. This retarding force is called stopping power. In certain materials, stopping power cumulates, reaching an apex – Bragg peak – before the radiative energy precipitously drops to nothing. English physicist and chemist William Henry Bragg discovered this pattern of energy deposition in 1903.
Crystals scatter energy in an orderly way characterized by Bragg peaks. Systems exhibiting order at large distances follow a pattern known as hyperuniformity. Besides primes and crystals, hyperuniformity is found in supercooled liquids and glasses, the arrangement of avian vision color-receptor (cone) cells, certain rare meteorites, and in the large-scale structure of the universe.
The Ground State
“There is no such thing as a real void.” ~ Carlo Rovelli
The ground state is the lowest-energy state of a quantum-mechanical system, with supposedly zero-point energy. In quantum field theory, the ground state is called the vacuum state, or simply vacuum.
Yet the ground state is far from grounded. Vacuum energy is calculated to be 10¹¹³ joules per cubic meter: an unimaginably enormous energy density. And this is a comedown from the early universe, when the ground state was even more energetic.
According to classical thermodynamics, the ground state is supposed to be at absolute zero temperature (0 K). That is a theoretical fiction, as the ground state is not a void, or empty space. Virtual particles testify to that.
“Quantum mechanics teaches us that the vacuum is not empty at all, but, on small scales, contains virtual particles, their anti-particles, and the quanta of their interactions.” ~ Italian particle physicist Alessandro Bettini
The Dance of Spacetime
In 2017, Chinese physicist Qingdi Wang and his colleagues investigated the gravitational nature of the ground state. What they found was a dynamo of emergence.
“Spacetime is not as static as it appears; it’s constantly moving. This happens at very tiny scales, billions and billions of times smaller even than an electron.” ~ Qingdi Wang
“It’s similar to the waves we see on the ocean. They are not affected by the intense dance of the individual atoms that make up the water on which those waves ride.” ~ Canadian physicist William Unruh
From a fluctuating fabric of spacetime emerges the illusion of a stable cosmos.
Our 4-dimensional (4D) existence emerges from a holistic dimensionality (HD) which includes extra dimensions (ed). Only a tiny fraction of the mass found in atomic nuclei comes from the quarks within. Over 99% comes from being bound into a hadron.
The glue that holds hadrons together is a swarming stew of virtual particles: subatomic mystery matter that is ed, with only transient 4d appearance. Their energetic interactions multiply the mass that makes up everyday matter.
In adding mass and other attributes to virtually all quantum bits, virtual particles from vacuum are a paradox which physicists cannot explain. The conventional comprehension is that these fleeting waveforms pop in and out of “existence” from the ground state; a rather ridiculous notion. Instead, the virtuosity of virtuality is a demonstration of dimensional phase-shifting.
The ground state is simply the limit boundary to perceptible existence (4D). Its incredible energy, and the virtual horde which incessantly emerges to add heft and vitality to phenomena, is further proof that existence is a chimera.
In 1834, Scottish engineer John Scott Russell saw a solitary wave in a canal travel for over 8 miles without changing shape or amplitude. Fascinated, Russell reproduced solitons in a wave tank.
Soliton dynamics vary depending upon the medium in which they appear. Solitons arise at both the macroscopic and quantum scales, in both matter (fermions) and pure energy (bosons). Solitons may occur in light beams, magnets, DNA molecules, proteins, and cell membranes.
Superfluidity readily happens in Bose-Einstein condensate (BEC), which is a supercooled dilute gas of bosons. Solitons can arise in a BEC. Atomic BEC always act in a coherent wavelike manner, even when chilled near absolute zero to create a quantum phase transition.
“In no moment do atoms of BEC become classical particles; they always behave as waves that evolve in synchrony with each other.” ~ Chinese quantum physicist Cheng Chin
Under certain conditions, fermions may also experience frictionless flow. But, to attain superfluidity, fermions must first turn into bosons. They do so by forming entangled pairs which adopt the requisite spin.
What all solitons exhibit is startling robustness in their coherence. Solitons can encounter each other and still maintain their integrity.
“Equations with soliton solutions have a profound mathematical structure.” ~ English mathematician Mason Porter
The Stability of Existence
Under the Standard Model, the Higgs boson is a grainy chunk of the Higgs field, which permeates all space. Elementary particles gain their mass by bathing in the ubiquitous Higgs field; a constant process called the Higgs mechanism.
The relationship is circular in its entanglement. While other particles gather mass by their immersion in the Higgs field (the Higgs mechanism), the Higgs boson depends upon those particles for its own existence. Unsurprisingly, the heaviest fermion – the top quark – has the largest impact on the mass of the Higgs boson.
Physicists use the measurement of particle masses, and properties of the Higgs field, to deduce the stability of existence; a conclusion related to the energy of the ground state. What they conclude, according to the set of equations that define the Standard Model, is that spacetime is in a precarious predicament. The stability of the universe is at risk from a treacherous vacuum, which may at any time move to a lower energy state, instantaneously wiping out existence.
“If the Higgs mass and the top quark mass were a little bit different, we would either be in a completely stable vacuum or in an unstable vacuum that would have decayed a long time ago. The world seems to be on an edge. We don’t have enough precision to say whether our vacuum is stable or not.” ~ American astrophysicist Sean Carroll
In the Standard Model, the masses of bosons are modified via interrelations with other bosons and fermions. This creates what are called ghost fields. Continually perturbing the ground state, matter radiates over these ghost fields, stirring up what has been termed quantum foam.
Boson-fermion interactions are called ghost fields because they are presumed not to exist. Ghost fields are instead treated as a computational tool: a mathematical necessity to maintain consistency in the Standard Model.
But then, ghost fields originate the virtual particles that appear out of the ground state, which comprises only vacuum energy. Virtual particles are now taken for granted as existing.
There is a paradox in granting virtual particles existence but considering their creator – ghost fields – to be a fictional construct. It portrays the Standard Model as purely mathematical mumbo-jumbo, despite SM’s many points of convergence with actuality.
Ghost fields play a critical role in producing a loopy hierarchy of particles in SM, thus creating considerable complexity in the Standard Model construct. This hierarchy problem prompted theoretical physicists to derive a more elegant mathematical solution, called supersymmetry (SUSY).
Alas, several SUSY predictions are contradicted by evidence. For example, supersymmetry predicts that electrons have a slightly oval deformation, owing to their having an electric dipole moment, which has yet to be found. Instead, as far as experiments can measure, electrons are perfectly spherical.
Supersymmetry is not the only alternative to the Standard Model. Another – string theory – predates SM. While not mainstream, string theory and its offshoot, brane theory, have many adherents.
Before winging into strings, a brief digression on quanta that aren’t, but act as if they are.
“These particles are just smoke and mirrors, handy mathematical tricks and nothing more. Or are they?” ~ English physicist Andrea Taroni
Quasiparticles are emergent phenomena that behave as quanta but are illegitimate in the sense of being a fermion or boson per se. Quasiparticles are to quantum mechanics what epigenetics is to genetics: potent, but not quite kosher. Both illustrate the deep, entangled intricacy that characterizes Nature.
Quasiparticles are employed to explain oscillations, which are the fluctuations in fields. Phonons explain mechanical vibrations. Plasmons are quantized plasma oscillations. Magnons quantize the waveform which personifies the spin property of all quanta. Excitons are bound electron-hole pairs: quantized excitations pairing an electron with the hole (the nothingness) it leaves behind.
In 1907, Einstein suggested that the thermal behavior of solids arises from vibrating particles, whose quantized vibrations are now termed phonons. Einstein was guessing. The structure of atoms was not discovered until 1911.
Yet Einstein’s phonon serendipity hit a note that resonated. Phonons are relevant to characterizing several exotic thermodynamic phenomena. A phonon is a quasiparticle representing the excited state which brings electrons together into an entangled Cooper pair; such pairs synchronously perform miraculous feats like superconductivity.
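Einstein's 1907 model treats each atom in a solid as an independent quantized oscillator with a single frequency. A minimal sketch of its heat-capacity prediction (the Einstein temperature used is an illustrative assumption):

```python
import math

def einstein_heat_capacity_ratio(T, theta_E):
    """Heat capacity of an Einstein solid divided by the classical
    Dulong-Petit value 3*N*k_B:  x^2 * e^x / (e^x - 1)^2,  x = theta_E/T."""
    x = theta_E / T
    return x**2 * math.exp(x) / (math.exp(x) - 1.0) ** 2

THETA_E = 300.0  # Einstein temperature in kelvin (illustrative assumption)

high_T = einstein_heat_capacity_ratio(3000.0, THETA_E)  # hot: classical limit
low_T = einstein_heat_capacity_ratio(30.0, THETA_E)     # cold: vibrations freeze out
print(high_T, low_T)
```

At high temperature the ratio approaches 1, recovering the classical Dulong-Petit law; at low temperature quantization freezes the vibrations out and heat capacity plummets, as experiments showed and classical physics could not explain.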
“Phonons are not actually real. They are really just a way of simplifying a very complicated problem.” ~ English physicist Jon Goff
Phonons were formally conceptualized by Russian physicist Igor Tamm in 1932 as the particle aspect of vibrational waves at a specific frequency.
The first string theory was proposed in 1926, during the swirl of the quantum revolution. The idea was lost, only to be rediscovered decades later.
In 1968, Italian physicist Gabriele Veneziano was working with the Euler Beta function: an equation used to characterize scattering amplitudes. He noticed that it could explain particle reactions involving the strong nuclear force, which binds protons and neutrons together in atomic nuclei. Others then realized that the equation made sense to them when they thought of subatomic particles as connected by infinitesimal strings, vibrating their tiny hearts out.
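The Beta function Veneziano reached for is built from Gamma functions; his amplitude for two-particle scattering takes the form below, where $\alpha(s)$ is the linear Regge trajectory relating a particle's spin to its mass-squared:

```latex
A(s,t) = B\bigl(-\alpha(s),\, -\alpha(t)\bigr)
       = \frac{\Gamma(-\alpha(s))\, \Gamma(-\alpha(t))}{\Gamma(-\alpha(s) - \alpha(t))},
\qquad
\alpha(s) = \alpha(0) + \alpha' s
```

The poles of this single compact formula reproduce an infinite tower of strongly interacting particles, which is what later suggested vibrating strings.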
The concept was controversial. Shortly thereafter, the Standard Model swept aside strings as the great explainer of particle interactions. That did not deter physicists from continuing to pluck at string theory.
String theory conceptualizes subatomic particles as infinitesimally thin strings, vibrating through a holistic dimensionality (HD) that has more than 4 dimensions.
The “string” in string theory seems somewhat misleading, as the significance is that quantized fields have resonances at distinct frequencies, harmonically interacting with their brethren. Vibe theory sounds more appropriate.
In 1995, American particle theorist Edward Witten, who had been fiddling strings for over a decade, had a vision of unifying the variant quantum field theories. The result was M-theory, which postulates 11 dimensions of spacetime: 10 of space and 1 of time. ‘M’ stood for membrane.
M-theory is naturally extensible in the number of dimensions. In M-theory, a single string may be a membrane of greater dimensions.
String theorists Petr Horava and Joseph Polchinski independently extended M-strings into higher-dimensional objects: D-branes (a Horava term). Among other things, D-brane theory attempts to characterize string endpoints.
D-branes add rich mathematical texture to M-theory, paving the way for constructing more intricate cosmological models with greater explanatory power. Numerous braneworld (brane cosmology) models have emerged.
String theory has been derided by partisans for its lack of track record. But the theory has been able to explain liquidity experimentally found at trillion-Kelvin conditions, and near absolute zero. Meantime, the Standard Model stood mute. The hot quark soup and ultracold lithium broth exhibited collective behavior, flowing with the lowest possible viscosity. String theory successfully modeled these phenomena as strongly coupled particles, linked by ripples traveling extra-dimensionally.
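The string-theoretic (holographic) calculation behind these successes yields the Kovtun-Son-Starinets result: a conjectured lower bound on the ratio of shear viscosity η to entropy density s, which both the quark-gluon plasma and the ultracold atomic gas are observed to approach:

```latex
\frac{\eta}{s} \;\gtrsim\; \frac{\hbar}{4\pi k_B}
```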
“The statistical predictions of quantum mechanics are incompatible with separable predetermination.” ~ John Stewart Bell
Quantum mechanics has an obvious deficiency: its mechanics. Quantifying quantum phenomena is the elephant in the room of interpreting quantum theory.
Measuring fundamental particles is an existential oxymoron. Watching a wave function collapse is a probabilistic event. The math itself is nontrivial, and the appropriateness of the bandied equations contentious.
But some quantum field phenomena have been seen. The most inexplicable is nonlocality: what Einstein called “spooky action at a distance.”
Our world works on the principle of locality: that an object can only be affected by its immediate surroundings. In contrast, nonlocality is the notion that distance is ultimately an illusion.
A 1935 paper by Einstein and 2 other physicists (Boris Podolsky and Nathan Rosen) posited a paradox over quantum uncertainty: that either locality or uncertainty must be true. Empirically minded Einstein opted for locality (and certainty), thereby concluding that the wave function must be an incomplete description.
Despite upsetting the apple cart of classical physics with his relativity theories, Einstein still preferred the cosmos as comfortably commonplace. After all, relativity only applied when traveling near the speed of light, or at the scale of galactic expanse. These realms were practically abstract.
In response to Einstein’s 1935 paper, Northern Irish physicist John Stewart Bell tackled the quantum measurement problem in 1964; whence arose Bell’s theorem.
Science in general, and physics in particular, long assumed that locality and objectivity were both true.
Locality means that distance affects the probability of interactions. Locality is colloquially codified in the everyday concept of cause and effect.
Objectivity insists that reality is ultimately objective, and therefore independent of observation. With special relativity, Einstein suggested that existence was subjective.
Bell’s theorem stated that either locality or objectivity had to go. In opting for the uniformity of objective reality, Bell pitched locality.
Ironically striving to spite his own theory of special relativity, Einstein struggled to the end of his days for a theory to uphold causality, protesting the view that there is no objective physicality other than that which is revealed through quantum-mechanical formalism.
In squaring off the principle of locality against counterfactual definiteness, Bell’s theorem went the other way: stating that some quantum effects travel faster than light ever can, thus violating locality.
Bell’s theorem painted special relativity into a corner; rendering it applicable only at the macro scale, and irrelevant at the quantum level. Then even that corner was painted over in the 21st century, by nonlocality showing up in the ambient world.
◊ ◊ ◊
Cause and effect is how we understand physics in the everyday (ambient) sense. In physics, causality as predictable is called counterfactual definiteness (CFD).
CFD goes to measurement repeatability: whether what has happened in the past is a statistical indicator of the future. Locality goes along with sequential causality: cause resulting in effect.
At the quantum level, CFD butts heads with locality, by stating that past probability as indicative of the future is a chimera. Instead, uncertainty always reigns.
Here we have a basic conflict. In the physics of existence, either certainty or uncertainty is true. The two are mutually exclusive.
Bell’s theorem showed that quantum uncertainty was a certainty: the principle of locality breaks down at the quantum level. Einstein was appalled: “No reasonable definition of reality could be expected to permit this.”
Later findings demonstrated that nonlocality functions at the macroscopic level too. With spooky-action-at-a-distance a reality, superluminal (faster-than-light) effects exist. Bell’s theorem of nonlocality/entanglement is considered a fundamental principle of quantum mechanics, having been supported by a substantial body of evidence.
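Bell's theorem is sharpest in its CHSH form: any local hidden-variable theory requires a certain combination S of spin correlations to satisfy |S| ≤ 2, while quantum mechanics predicts up to 2√2. A minimal sketch using the standard quantum prediction for a singlet pair, E(a, b) = −cos(a − b):

```python
import math

def E(a, b):
    """Quantum correlation of spin measurements at detector angles a and b
    (radians) on an entangled singlet pair: E(a, b) = -cos(a - b)."""
    return -math.cos(a - b)

# Standard CHSH measurement angles.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination of the four correlations.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.83: beyond any local theory's limit of 2
```

Experiments since the 1980s have measured values of |S| above 2, matching the quantum prediction and ruling out local hidden variables.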
“Nonlocality is so fundamental and so important for our worldview of quantum mechanics.” ~ Swiss quantum physicist Nicolas Gisin
The supposed trade-off between locality and objective reality is a false one. While strict quantum locality has been disproven, there is no proof that existence is actually objective. It just appears that way as a social convention: we consider the world “objective” when others agree with us; and so, objectivity is taken axiomatically, just as locality was for so long.
“If quantum physics hasn’t profoundly shocked you, you haven’t understood it yet.” ~ Niels Bohr
“That one body may act upon another at a distance through a vacuum, without the mediation of anything else, by and through which their action and force may be conveyed from one to another, is to me so great an absurdity that I believe no man who has in philosophical matters a competent faculty of thinking can ever fall into it.” ~ Isaac Newton
Basic notions in physics depend upon a time continuum: cause preceding effect. The principle of locality must hold for cause and effect to work. If causality is kicked aside, as with simultaneous (“spooky”) action at a distance, locality is violated. Quantum entanglement has repeatedly been demonstrated, making nonlocality a well-established fact.
The fundamental properties of chemistry rely upon entanglement. Solids form, and retain their solidity, via quantum entanglement of the electrons in the material. Superconductivity works through entangled electron pairs.
Superluminal communication presents a challenge to theoretical physics that has not been resolved. It is a dilemma that can never be met by insisting upon the universe as a 4D closed system: an axiom of which Newton and Einstein were so confident, but which simply is not so.
A practical pointer to time as an emergent property occurs by entangling particles that don’t exist at the same time. In other words, nonlocality can also be nontemporal.
A scheme termed entanglement swapping – chaining entanglement through time between subatomic particle pairs – has been demonstrated, using 4 photons.
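The logic of entanglement swapping can be checked numerically: start with two independent Bell pairs (1,2) and (3,4), project the middle particles 2 and 3 onto a Bell state, and particles 1 and 4 - which never interacted - come out entangled. A toy state-vector sketch (pure Python; an illustration of the principle, not a model of the actual four-photon experiment):

```python
import math

# 4-qubit amplitudes indexed by (b1, b2, b3, b4).
# Initial state: |Phi+>_{12} (x) |Phi+>_{34}
#   = 1/2 (|0000> + |0011> + |1100> + |1111>)
psi = {}
for b1, b2 in ((0, 0), (1, 1)):
    for b3, b4 in ((0, 0), (1, 1)):
        psi[(b1, b2, b3, b4)] = 0.5

# Bell measurement on qubits 2 and 3: project onto
# |Phi+>_{23} = (|00> + |11>)/sqrt(2), keeping qubits 1 and 4.
out = {}
for b1, b4 in ((0, 0), (0, 1), (1, 0), (1, 1)):
    amp = sum(psi.get((b1, m, m, b4), 0.0) for m in (0, 1)) / math.sqrt(2)
    out[(b1, b4)] = amp

# Normalize the post-measurement state of qubits 1 and 4.
norm = math.sqrt(sum(a * a for a in out.values()))
out = {k: a / norm for k, a in out.items()}
print(out)  # qubits 1 and 4 form |Phi+>: amplitude 1/sqrt(2) on |00> and |11>
```

The surviving amplitudes show particles 1 and 4 in a maximally entangled Bell state, despite never having shared a common origin or interaction.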
Entanglement demonstrates that time, as well as space, is emergent: constantly coming into being, as contrasted to preexisting and incrementally evolving, as it appears to us.
“Space and time will end up being emergent concepts; i.e. they will not be present in the fundamental formulation of the theory and will appear as approximate semiclassical notions in the macroscopic world.” ~ Israeli physicist Nathan Seiberg