Tag Archives: Quantum theory

The Elephant In The Room

Coincidentally with the previous post here defending Pilot Wave Theory, a pair of articles appeared defending the standard interpretation of Quantum Mechanics. Both are dismissive of PWT in a way that underscores the basic incoherence of the modern, math-first approach to doing physics. The first of these articles is a typical screed from Ethan Siegel in which he marshals a lot of known scientific facts and then assembles them into a circular argument that assumes the “standard model” to be correct, thus “proving” once again that the “standard model,” as described by Siegel and acclaimed by all right-thinking scientists, is the One True Model that all must revere. I’m not going to waste time picking apart this particular circular argument. It is the arguments against PWT that are the subject of this post.

The other article, by Philip Ball, appeared in Quanta Magazine. In 2018 Ball published an even-handed account of quantum physics, Beyond Weird. This current work is primarily a discussion of a theory of decoherence developed by the physicist Wojciech Zurek. Decoherence is an attempt to explain how a particle evolves from the indeterminate “superposition of states” condition described by the standard interpretation to the “deterministic” state of everyday reality and standard physics, where particles are not smeared out but always locally distinct. As with the Siegel article, I’m only going to focus on the arguments used to dismiss Pilot Wave Theory from consideration as an alternative to the preferred dogma.

Both authors take the standard interpretation of quantum physics as a given. That standard interpretation can be characterized as an offshoot of Niels Bohr’s claim that the wavefunction represented all that could be known about the quantum state – it was fundamentally indeterminate and nothing could be known about the properties (location, spin, etc.) of a quantum particle prior to its observation (detection, measurement). To this was subsequently added the “explanation” that the indeterminacy was caused by the fact that the particle was not in any particular location prior to detection but was, in fact, smeared out in a “superposition of states”.

It was further stipulated that the smeared-out state could not be observed because the very act of observing, or detecting, it would cause a mathematical formalism, the wavefunction, to collapse instantaneously, ensuring that the particle would always be found at some particular location. That particular location, however, could not be predicted except as a probability given by the wavefunction. It is necessary at this point to step back and consider what a breathtakingly absurd, not to mention utterly unscientific, account of physical reality that is.

Absurd the standard interpretation may be, but it is nonetheless an unquestioned premise of both Ball and Siegel in their respective articles. The reason it is so casually accepted is not merely because of what might be called the groupthink, dogma, and inertia that pervades the scientific academy. There is, at root, the underlying premise of mathematicism – the scientifically baseless belief that mathematics somehow underlies and determines the nature of physical reality. Mathematicism is what allows otherwise rational people to believe that a particle must be smeared out in a superposition of states – because some mathematical wavefunction does not describe where the particle is.

In distinct contrast to this widely accepted wavefunction-only account of quantum behavior there stands Pilot Wave Theory, which is not merely an alternative interpretation of the wavefunction-only story but constitutes a mathematically and physically distinct model of the quantum realm. In addition to the wavefunction, PWT has a guiding equation which describes the interaction between a particle and a pilot wave, an interaction that produces the statistical outcomes described by the wavefunction. The statistical outcomes of PWT are the same because it has the same statistical formalism, the wavefunction.
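For reference, the guiding equation in its standard textbook form (the de Broglie–Bohm version, for a single particle of mass m at position Q; this is the generic presentation, not anything specific to the articles under discussion) reads:

```latex
\frac{d\mathbf{Q}}{dt} = \frac{\hbar}{m}\,\operatorname{Im}\!\left(\frac{\nabla\psi}{\psi}\right)\Bigg|_{\mathbf{Q}(t)}
```

Here ψ evolves under the ordinary Schrödinger equation, and when the initial particle positions are distributed as |ψ|², the theory reproduces the standard quantum statistics exactly, which is why the two models cannot be told apart by those statistics alone.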

This brings us to the objections raised against PWT by those who prefer the standard interpretation. Ethan Siegel has this to say on the matter:

Contained in these non-local hidden variable theories are the hopes of everyone who seeks to make deterministic sense of the quantum Universe, and the hope that somewhere, somehow, there’s a way to extract more information about reality and what outcomes will occur than standard, conventional quantum mechanics permits.

This is basically a strawman argument that seeks to characterize PWT as representing a “hope” to extract information or something about reality that the standard model does not permit. This deliberately sidesteps the fact that PWT produces exactly the same statistical results as the standard interpretation while providing a physically realistic description of the behavior of quantum particles as arising from a typical interaction of a particle with a wave.

For his part Philip Ball is a little more nuanced but his objection ultimately boils down to an aesthetic choice:

Others invoke the description postulated by Louis de Broglie and later developed by David Bohm, in which a particle does have well-defined properties, but it is steered by a mysterious “pilot” wave that produces the strange wavelike behavior of quantum objects, such as interference…

All this has always struck me as fanciful. Why not just see how far we can get with conventional quantum mechanics? If we can explain how a unique classical world arises out of quantum mechanics using just the formal, mathematical framework of the theory, we can dispense with both the unsatisfactory and artificial cut of Bohr’s “Copenhagen interpretation” and the arcane paraphernalia of the others.

Ball apparently finds the idea of a pilot wave “mysterious” and “arcane” while seeming rather sanguine about the empirically baseless idea that a particle can be in a “superposition of states”. An absurd metaphysical proposition like that is somehow more palatable than a “mysterious” but physically plausible one.

Like Siegel, Ball makes no reference to the fact that PWT is not simply a different interpretation of the wavefunction-only model, like Many-Worlds, but constitutes a separate and distinct qualitative and quantitative model of quantum reality, one that produces the exact same statistical outcomes as the standard, reality-challenged version preferred by both authors. PWT does this while providing a plausible physical mechanism that is consistent with the physics of larger scales.

What is most striking about these rather casual dismissals is the seeming preference for mathematical mysticism over the open-ended nature of the scientific endeavor which seeks to understand the physical cause of observations that might at first seem inconsistent and mysterious. Modern theoretical physicists prefer to explain “mysterious” observations with imaginary things that cannot be observed, measured, or detected, rather than do the hard work of investigating the physical cause of those observations that only seem, at first encounter, to be mysterious.

Which brings us to the elephant in the room. The elephant in the room is indicated by a straightforward scientific question: does the pilot wave of PWT refer to something physical as well as mathematical? A reasonable answer to that question is that there is a good deal of theoretical and observational evidence indicating that the pilot wave is a real physical entity that arises at the interface between a charged particle and the Ambient Electromagnetic Radiation that permeates the Cosmos. There is AER present in every laboratory in which quantum experiments are performed.

Modern Theoretical Physics pays no attention to the AER it swims in. MTP knows that the mass-energy content of an electron is equivalent to the energy of gamma radiation. MTP knows that a charged particle has a polarizing effect on nearby radiation. Despite this, MTP does not incorporate the AER present in a typical laboratory environment into its analysis of the quantum behaviors it observes. Instead we get metaphysical prattling about superposition of states and wave-particle duality.

MTP knows all about the individual frequencies and the linear rays of light by which we observe distant cosmological objects. The totality of that omnidirectionally emitted radiation, however, makes no appearance in the standard model of cosmology. In its place MTP has substituted the scientifically inert concept of a substantival spacetime. The resulting Big Bang model bears no resemblance to empirical reality.

An elucidation, then, of Pilot Wave Theory beyond its mathematical formalisms to a robust qualitative description of quantum scale behavior points directly at the AER as a causally interacting component of physical reality, and that analysis holds on all scales. The pilot wave of the theory is produced, in this conception, by the interaction of a charged particle (electron) with the AER of the laboratory. This qualitative description is made visually compelling by the work on Pilot Wave Hydrodynamics done at MIT by John Bush and his colleagues.

All of these considerations lead inexorably back to a question raised forty years ago by John S. Bell:

Why is the pilot wave picture ignored in text books? Should it not be taught, not as the only way, but as an antidote to the prevailing complacency? To show us that vagueness, subjectivity, and indeterminism, are not forced on us by experimental facts, but by deliberate theoretical choice?

—— Quoted in the Stanford Encyclopedia of Philosophy article Bohmian Mechanics.

The ALER And Quantum Theory (revised)

Note to Subscribers: This is a substantial modification of yesterday’s post. I’ve published it separately so that subscribers would receive a notification of the revision. I also substantially revised the preceding post on the ACER but directly modified the original, which did not produce a notification to subscribers. I apologize for any confusion and will henceforth publish all substantive revisions to existing posts separately. Thank you for your patience.

The preceding post discussed the ACER (Ambient Cosmic Electromagnetic Radiation) and some of its implications. The ACER is present and detectable, at least in part, at the surface of the Earth. The Cosmic Microwave Background was discovered by a ground-based antenna, and in fact all pre-satellite-era cosmological observations were made at the surface of the Earth and involved the detection of various component frequencies of the ACER.

The situation is somewhat different in a closed room. Most of the ACER that does penetrate the atmosphere will not penetrate the walls and ceiling of the room. There will nonetheless be considerable ambient electromagnetic radiation, some of it internally generated (artificial lighting, thermal radiation) and some of it externally sourced penetrating radiation such as radio frequencies. So in an enclosed room there will be an Ambient Local Electromagnetic Radiation (ALER) analogous to the ACER but having a somewhat different frequency distribution.

Some measure of ALER is present in every room humans might typically occupy, including rooms dedicated to scientific experiments. Consider the room you are in as you read this. Not only is it suffused with visible light, there is also thermal radiation, along with radio frequencies and possibly some stray high-energy UV, X-rays, and gamma rays.

We live our lives immersed in a sea of electromagnetic radiation whether sitting in a room or sitting out under a clear/cloudy, day/night sky. In the aggregate electromagnetic radiation provides the underlying 4-Dimensional framework of the Cosmos — at all scales.

Of particular interest with regard to Quantum Theory are the rooms where quantum experiments such as the double-slit are performed. Those rooms are also steeped in electromagnetic radiation, the ALER. 

In one version of the double-slit experiment electrons are slowly fired at the double-slit screen and are subsequently detected at a second screen. As the electrons accumulate they form an interference pattern, which is said to prove, or at least demonstrate, that electrons are both waves and particles. This would be a reasonable conclusion except that this explanation takes no notice of the ALER through which the electron is traveling.
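The wave arithmetic behind the fringe pattern is worth making concrete. The sketch below is not a model of the ALER or of any actual apparatus (the wavelength, slit spacing, and screen distance are illustrative values only); it simply superposes two path amplitudes and squares the magnitude, which is all any wave account of the fringes amounts to:

```python
import numpy as np

# Illustrative parameters only, not taken from any actual experiment
wavelength = 50e-12      # ~50 pm, a plausible electron de Broglie wavelength
d = 1e-6                 # slit separation
L = 1.0                  # distance from slits to the detection screen
k = 2 * np.pi / wavelength

# Positions along the detection screen
x = np.linspace(-2e-3, 2e-3, 4001)

# Path length from each slit to each screen position
r1 = np.sqrt(L**2 + (x - d / 2) ** 2)
r2 = np.sqrt(L**2 + (x + d / 2) ** 2)

# Superpose the two path amplitudes and square the magnitude
amplitude = np.exp(1j * k * r1) + np.exp(1j * k * r2)
intensity = np.abs(amplitude) ** 2

# Bright-fringe spacing on the screen is approximately wavelength * L / d
print(f"predicted fringe spacing: {wavelength * L / d:.2e} m")
```

The fringe spacing λL/d that falls out is the same for water waves, light, or electrons; nothing in the arithmetic itself requires a particle to be in two places at once.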

An electron affects an electromagnetic field and is in turn affected by the field. This is not news; it is well established physics. Standard quantum theory pays no attention to the presence of the ALER and its interaction with the free electrons that are the subject of the experiment.

The ALER is a chaotic field, streaming from all directions and composed of many frequencies, and yet the theoretical explanation of the double-slit experiment takes no account of the inescapable fact of the ALER’s presence and its unavoidable interaction with the electrons. Instead we get a feckless story about a wave-particle duality state that cannot be verified by observation, only believed in.

Such is the state of Quantum Theory — belief is obligatory because reality is irrelevant. In this it is of a piece with the rest of Modern Theoretical Physics. It presents an absurd, incoherent and at root irrational account of physical reality, one that bears almost no resemblance to the physical reality our lying eyes and instruments actually reveal to us. As with Ptolemaic cosmology, the math can be said to “work” but the physics is simply wrong.

Surprisingly, this inability to grapple with the nature of physical reality is a matter of choice. An alternative mathematical treatment of quantum mechanics has been known for more than 70 years. Bohmian mechanics models the double-slit experiment as an interaction between a particle and a guiding or pilot wave.

The mathematics of Bohmian mechanics does not explicitly describe the ALER, only an abstract mathematical wave, but the implications are clear. It is entirely within the reach of current mathematics to model physical reality without veering off into unrealistic metaphysical conjectures about wave-particle duality or particles that exist in a “superposition of states”.

The price of this realistic framework is that the math is a bit more complicated than the “wavefunction only” model preferred by a “consensus” of the members of the scientific academy. The high priests of this consensus are quite obviously willing to sacrifice a coherent, logical and realistic account of quantum phenomena on the altar of mathematical convenience. Such is the state of Modern Theoretical Physics.


Photons are not particles

In 1900, the German physicist Max Planck was studying black-body radiation, and he suggested that the experimental observations, specifically at shorter wavelengths, would be explained if the energy stored within a molecule was a “discrete quantity composed of an integral number of finite equal parts”, which he called “energy elements”. In 1905, Albert Einstein published a paper in which he proposed that many light-related phenomena—including black-body radiation and the photoelectric effect—would be better explained by modelling electromagnetic waves as consisting of spatially localized, discrete wave-packets. He called such a wave-packet a light quantum.
https://en.wikipedia.org/wiki/Photon (20Jun24)

Photon energy is the energy carried by a single photon. The amount of energy is directly proportional to the photon’s electromagnetic frequency and thus, equivalently, is inversely proportional to the wavelength. The higher the photon’s frequency, the higher its energy. Equivalently, the longer the photon’s wavelength, the lower its energy… The photon energy at 1 Hz is equal to 6.62607015×10⁻³⁴ J
https://en.wikipedia.org/wiki/Photon_energy (20Jun24)

The SI units are defined in such a way that, when the Planck constant is expressed in SI units, it has the exact value h = 6.62607015×10⁻³⁴ J⋅Hz⁻¹.
https://en.wikipedia.org/wiki/Planck_constant (20Jun24)
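Taken together, the quoted relations are simple arithmetic: the energy of a single quantum is the exact SI value of h multiplied by the frequency. A quick check (the 540 THz figure is just an illustrative green-light frequency, not a value from the quoted sources):

```python
h = 6.62607015e-34  # Planck constant in J·Hz⁻¹, exact by SI definition

def photon_energy(frequency_hz: float) -> float:
    """Energy in joules of a single quantum at the given frequency (E = h·f)."""
    return h * frequency_hz

# A 1 Hz quantum carries exactly h joules, matching the quoted value
print(photon_energy(1.0))        # 6.62607015e-34 J

# Illustrative: green light at roughly 540 THz
print(photon_energy(5.40e14))    # ≈ 3.58e-19 J
```

Note that nothing in this calculation requires the quantum to be a localized particle; in the wave-quantum reading argued in this post, h·f is simply the energy of one complete wave cycle at frequency f.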

The meaning of the foregoing should be clear. Despite the claims of particle physicists that the photon is a particle, it was in its original conception, and is in its current quantitative description, a wave phenomenon. A photon is not a particle like a proton or a billiard ball. A photon is never at rest with respect to any material body; it is always moving at the local speed of light with respect to all material bodies. A photon does not behave like a three-dimensional particle; it is a wave quantum.

A wave quantum is the smallest possible wave; it is a wave of one wavelength as defined above. The illustration below is of a wave consisting of two consecutive photons. This is a poor representation of the reality of electromagnetic radiation, which on astronomical and cosmological scales is emitted omnidirectionally by stars, galaxies and other luminous bodies. To a local observer, though, light arriving from a distance seems to consist of streams or rays of light. This image is of a subsection of such a stream.

Electromagnetic radiation does not consist of a stream of tiny particles simply because Max Planck was forced to treat the emission of light as having a discrete minimum. What is described by the math is a single wavelength which is the minimum for a wave of any frequency. Half waves and quarter waves etc. don’t exist.

That does not mean a wave with half the wavelength of a longer wave cannot exist, just that for any given frequency a single complete wave cycle of one wavelength defines the minimum wave energy. Converting this wave minimum to a “particle” was a categorical error, and the error has a formal name: QED, or Quantum Electrodynamics.

In Richard Feynman’s 1985 book QED, based on four popular lectures he had given a few years earlier, he makes this rather odd case for the light-as-particle “theory”:

The strange phenomenon of partial reflection by two surfaces can be explained for intense light by a theory of waves, but the wave theory cannot explain how the detector makes equally loud clicks as the light gets dimmer. Quantum electrodynamics “resolves” this wave-particle duality by saying that light is made of particles (as Newton originally thought) but the price of this great advancement of science is a retreat by physics to the position of being able to calculate only the probability that a photon will hit a detector, without offering a good model of how it actually happens.

So to summarize that last sentence, saying light is made of particles was a great advancement for science that represented a retreat by physics into incoherence. I can’t argue with that. It is also not clear why the particle theory is superior to the wave quantum understanding of Einstein. Surely wave mechanics could have been modified to accommodate the fact that one wavelength is the minimum wave.

Instead Feynman goes on to describe a strange system resembling vector addition, where the direction of “arrows” representing possible particle paths is determined by an imaginary clock ticking at the light’s frequency – a backdoor maneuver to introduce wave-like behavior into the particle model so it can mimic wave interference patterns. This fits nicely with the standard quantum babble about a superposition of states, the condition where a particle’s position cannot be predicted except as a probability distribution, which is interpreted to mean that the particle is in many positions at once. Thus the retreat into incoherence.
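Feynman’s “arrows” are, mathematically, unit complex phasors, and his add-the-arrows, square-the-length rule is ordinary wave superposition. A minimal sketch (the two-path phases used here are made up purely for illustration):

```python
import cmath

def arrow(phase: float) -> complex:
    """A unit 'arrow' (phasor) at the given phase angle in radians."""
    return cmath.exp(1j * phase)

def probability(phases: list[float]) -> float:
    """Squared length of the summed arrows: Feynman's rule for the
    relative probability of detection."""
    total = sum(arrow(p) for p in phases)
    return abs(total) ** 2

# Two paths in phase: the arrows align and the probability quadruples (2² = 4)
print(probability([0.0, 0.0]))        # 4.0

# Two paths half a cycle apart: the arrows cancel
print(probability([0.0, cmath.pi]))   # ~0.0
```

Aligned arrows reinforce and opposed arrows cancel, which is exactly the interference arithmetic of the wave picture, smuggled back into the particle story.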

The particle theory of light is just another screwup of 20th century theoretical physics (there were quite a few). It should be put on the historical-curiosity shelf along with the Big Bang next to geocentric cosmology. Future historians can point to these physically absurd dead-end theories as textbook examples of how not to do science. Theory driven physics always winds up as empirically baseless metaphysical nonsense; the human imagination has never been a good guide to physical reality.

Bohmian Mechanics & A Mathematicism Rant

26JUN22 Posted as a comment to this Quanta article. Probably won’t be published there until tomorrow.

Contextuality says that properties of particles, such as their position or polarization, exist only within the context of a measurement.

This is just the usual Copenhagen mush, which substitutes a feckless by-product of the pseudo-philosophy of mathematicism for scientific rigor. The rather irrational, anthropocentric view that a particle doesn’t have a position until it is measured is entirely dependent on the belief that the Schrödinger wavefunction is a complete and sufficient description of physical reality at the quantum scale. It is not.

The Schrödinger equation only provides a statistical distribution of the possible outcomes of a measurement, without specifying the underlying physical processes that cause the observed outcomes. The only scientifically reasonable conclusion would seem to be that the equation constitutes an incomplete description of physical reality. The Copenhagen interpretation of QM – that the wavefunction is all that can be known – is not a rational scientific viewpoint. In physical reality things exist whether humans observe them or not, or have models of them or not.
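The statistical content in question is the Born rule in its standard textbook form: for a normalized wavefunction ψ, the probability of finding the particle in a region A at time t is

```latex
P(x \in A) = \int_A |\psi(x,t)|^2 \, dx
```

This is a recipe for outcome frequencies, with no mechanism specified for how any individual outcome comes about.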

There has long been a known alternative to the wavefunction-only version of QM, one that was championed by John Bell himself. In Bohmian mechanics, in addition to the wavefunction, there is a particle and a guiding wave. In that context the wavefunction provides the outcome probabilities for the particle/guiding-wave interactions. Bohmian mechanics constitutes a sound scientific hypothesis; the Copenhagen interpretation (however defined) offers nothing but incoherent metaphysical posturing. As in:

The physicists showed that, although making a measurement on one ion does not physically affect the other, it changes the context and hence the outcome of the second ion’s measurement.

So what is that supposed to mean exactly, in physical terms? The measurement of one ion doesn’t affect the second ion but it does alter the outcome of the second ion’s measurement? But what is being measured if not the state of the second ion? The measurement of the second ion has changed but the ion itself is unaltered, because the “context” changed? What does that mean in terms of physics? Did the measurement apparatus change but not the second particle? It’s all incoherent and illogical, which is consistent with the Copenhagen interpretation I suppose, but that’s not saying much.

Bohmian mechanics makes short work of this matter. There are two charged particles, each with a guiding wave; those guiding waves interact in typical wave-like fashion in the 4-dimensional frame of electromagnetic radiation. The charged particles are connected in, by, and across that 4D frame. By common understanding, such a 4D frame has no time dimension. That is what accounts for the seemingly instantaneous behavior. That’s physics; it could be wrong, but it is physics. Contextuality is math; math is not physics.

Theoretical physics in its current guise is an unscientific mess because physical reality has been subordinated to mathematical and metaphysical considerations. And so we have the ongoing crises in physics in which the standard models are so discordant with physical reality in so many different ways that it seems difficult to say what precisely is wrong.

The simple answer is that you can’t do science that way. Attempting to build physics models outward from the mathematical and metaphysical realms of the human imagination is wrong. Basing those models on 100 year old assumptions and holding those outdated assumptions functionally inviolable is wrong.

Science has to be rooted in observation (direct detection) and measurement. Mathematics is an essential modeling tool of science but it is only a tool; it is not, of itself, science.

Theoretical physics, over the course of the 20th century, devolved into the study of ever more elaborate mathematical models, invoking ever more elaborate metaphysical conjectures of an invisible realm knowable only through the distorted lenses of those standard models.

Physical reality has to determine the structure of our theoretical models. Currently theorists with elaborate models dictate to scientists (those who observe, measure, and experiment) that they must search for the theoretical components of their models, or at least some signs and portents thereof. Failure to find the required entities and events does not constitute a failure of the model however, only a failure, thus far, of detection (dark matter, etc.) or the impossibility of detection (quarks, etc.). In any event the models cannot be falsified; they need only be modified.

Modern theoretical physics is an exercise in mathematicism and it has come to a dead end. That is the crisis in physics, and it will continue until the math-first approach is abandoned. It is anybody’s guess when that will happen. The last dark age of western science persisted for a millennium.