Adventures in Theoretical Physics I – Entropy

Updated 3Jun21

Entropy is one of those slippery concepts floating around in theoretical physics that defies easy explication. The following quote is from a small subsection of an extensive Wikipedia entry on entropy:

The following is a list of additional definitions of entropy from a collection of textbooks:

- a measure of energy dispersal at a specific temperature.

- a measure of disorder in the universe or of the availability of the energy in a system to do work.

- a measure of a system’s thermal energy per unit temperature that is unavailable for doing useful work.

In Boltzmann’s definition, entropy is a measure of the number of possible microscopic states (or microstates) of a system in thermodynamic equilibrium. Consistent with the Boltzmann definition, the second law of thermodynamics needs to be re-worded as such that entropy increases over time, though the underlying principle remains the same.
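For concreteness, the Boltzmann definition quoted above is usually written S = k·ln(W), where W is the number of microstates. A minimal numerical sketch (the function name and example counts are illustrative, not from the article):

```python
import math

# Boltzmann constant in J/K (the exact CODATA value fixed in 2019)
K_B = 1.380649e-23

def boltzmann_entropy(microstates: int) -> float:
    """Entropy S = k_B * ln(W) for a system with W equally likely microstates."""
    return K_B * math.log(microstates)

# A system with a single accessible microstate has zero entropy;
# more microstates mean higher entropy.
print(boltzmann_entropy(1))       # 0.0
print(boltzmann_entropy(10**23) > boltzmann_entropy(10**20))  # True
```

Note that nothing in this formula is measured directly; W is a counted (or modeled) quantity, which is part of the point made below.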

Note that this is a “list of additional definitions”; there are seemingly innumerable others peppered throughout the article. It would be hard to caricature the overall impression of incoherence, which oozes like quicksand, sucking under any rational attempt to say concisely what, exactly, entropy is, beyond the usual handwaving about a vaguely specified measure of disorder.

The first sentence of the Wiki article states unequivocally that “Entropy is… a measurable physical property…”. That statement is false. In section 7.8 it is made clear that entropy cannot be measured directly as a property. Rather, discrete energy transfers are dropped into an equation that ostensibly “describes how entropy changes dS when a small amount of energy dQ is introduced into the system at a certain temperature T.” The only measurements made are of energy and temperature. Entropy is a made-up mathematical concoction; it is not a real property of physical reality.
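The calculation the article describes is just the Clausius relation dS = dQ/T. A minimal sketch makes the point above explicit: the only measured inputs are heat and temperature, and the “entropy change” is simply their computed ratio (the function name is illustrative):

```python
def entropy_change(heat_joules: float, temperature_kelvin: float) -> float:
    """Clausius relation dS = dQ/T for a small, reversible heat transfer
    at an (approximately) constant absolute temperature T."""
    if temperature_kelvin <= 0:
        raise ValueError("absolute temperature must be positive")
    return heat_joules / temperature_kelvin

# 100 J of heat transferred reversibly at 300 K:
dS = entropy_change(100.0, 300.0)  # about 0.333 J/K
```

Nothing here measures entropy itself; it is derived, after the fact, from the two quantities that actually are measured.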

The reader will also be pleased to note that the same article informs us that “In 1865, Clausius named the concept of ‘the differential of a quantity which depends on the configuration of the system,’ entropy…”.

The real problem with the concept of entropy is not that it may be useful in doing some thermodynamic calculations; clearly, it has some mathematical value. What it does not have is any clear explanatory power or physical meaning, and, well, math just isn’t physics. Nonetheless, entropy has been elevated to a universal physical law – the Second Law of Thermodynamics – where the situation doesn’t get any less opaque. In fact, the opacity doesn’t get any more opaque than this gem:

Nevertheless, this principle of Planck is not actually Planck’s preferred statement of the second law, which is quoted above, in a previous sub-section of the present section of this present article, and relies on the concept of entropy.

Section 2.9 of the 2nd Law wiki entry above

Here is Planck’s quote from “…a previous sub-section of the present section of this present article…”:

Every process occurring in nature proceeds in the sense in which the sum of the entropies of all bodies taking part in the process is increased. In the limit, i.e. for reversible processes, the sum of the entropies remains unchanged.

Of course, the only question that remains then is, what exactly is entropy?

The final word on the nature of entropy should be this comment by John von Neumann which appears in a sidebar of this section, wherein he advises Claude Shannon to call his information uncertainty concept entropy because “… In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.” Can’t argue with that logic. It appears that the definition of entropy is entropic. It is getting ever more disordered with the passage of time.
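Shannon’s “information uncertainty concept” that von Neumann advised naming entropy is the quantity H = −Σ p·log₂(p). A minimal sketch (function name and example distributions are illustrative):

```python
import math

def shannon_entropy(probabilities) -> float:
    """Shannon entropy H = -sum(p * log2(p)) in bits,
    skipping zero-probability outcomes by convention."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin flip carries 1 bit of uncertainty:
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A certain outcome carries none:
print(shannon_entropy([1.0]) == 0.0)  # True
```

Note that this is a measure of uncertainty over a probability distribution – formally analogous to Boltzmann’s expression, but not obviously the same physical thing, which rather supports von Neumann’s quip.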

So what’s the problem? This is, after all, a relatively trivial matter compared to the Grand Delusions of Theoretical Physics, like Dark Matter, Dark Energy, Quarks, etc. Or is it? Well, actually the Entropy Paradigm represents a major blind spot in fundamental physics. Theoretical physics has this misbegotten, laboratory-based idea that all ordered systems are in some sense “running down”, or losing the ability to do work, or becoming more disordered or, or… something, called Entropy.

Entropy is all downhill, all the time. In the context of a designed laboratory experiment, the original, closed-system condition is some prearranged, ordered state that is allowed to “run down” or “become disordered” or whatever, over time, and that, more or less, is Entropy! Fine so far, but when you make Entropy a universal law – the Second Law of Thermodynamics – all is lost, or more exactly, half of reality is lost.

Why? Because there are no closed systems outside the laboratory. What you get when you ignore that fact is this kind of bloated nonsense:

The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations – then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation – well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation. 

Sir Arthur Stanley Eddington, The Nature of the Physical World (1927)

Entropy is correct if all it boils down to is that ordered, closed systems will become disordered. But entropy doesn’t tell you how the systems came to be ordered in the first place. If you extrapolate from a closed-system experiment a Universal Law that says, or implies, that all systems run down, but you have no theory of how those systems came to be ordered in the first place, you have exactly half a picture of reality – because over here in the real world, ordered systems do arise from disorder.

It’s hard to do good physics if you have this half-blind view of reality. You start babbling about the Arrow of Time, the heat death of the Universe, not to mention whatever this is. You may have to invoke an unobservable Creation Miracle like the Big Bang to get the ball rolling.

Babble Update (3Jun21)

“If you want your clock to be more accurate, you’ve got to pay for it,” study co-author Natalia Ares, a physicist at the University of Oxford, told Live Science. “Every time we measure time, we are increasing the universe’s entropy.”

Livescience.com

Meanwhile, physical reality doesn’t work like that. Physical reality is locally cyclic – like life itself. There is, for instance, a considerable body of observational evidence suggesting that galaxies reproduce themselves. That body of evidence has been dismissed with spurious statistical arguments, and the well-established astronomer who made those observations, Halton Arp, was denied telescope time to continue them. No one has since been permitted to pursue that line of observational research.

Meanwhile, there is no comparable body of observational evidence for the Big Bang-based model of galaxy formation via accretion from a primordial, post-Big Bang cloud. Like the rest of the Big Bang’s defining features, galaxy formation by accretion is not an observed phenomenon in the Cosmos.

So, an over-generalization (to the point of incoherence) of some simplistic closed-system laboratory experiments done in the 18th and 19th centuries has come to aid and abet the absurd Big Bang model of the Cosmos. Modern theoretical physics is an incoherent, unscientific mess.
