The equation S = k log W engraved on the Boltzmann tombstone in Vienna Central Cemetery.


Some consider S = k ln W to be the second most important formula in physics, next to E = mc², or on par with it. [1] This formula is sometimes called the "Boltzmann formula" (or Boltzmann entropy formula), and entropy calculated from it is sometimes called Boltzmann entropy.
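For reference in what follows, the formula in modern notation:

```latex
S = k \ln W, \qquad k = 1.380649 \times 10^{-23}\ \mathrm{J/K},
```

where S is the entropy of the body, k is the Boltzmann constant, and W is the number of "complexions" (microstates) by which the state of the body can be realized.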

Einstein

In 1901, German physicist Max Planck, in his "On the Law of Distribution of Energy in the Normal Spectrum", introduced the S = k log W + c formula, in semi-derived form.

In 1904, and thereafter, German physicist Albert Einstein began to criticize the formula repeatedly:

“The equation S = k log W + const appears without an elementary theory — or however one wants to say it — devoid of any meaning from a phenomenological point of view.” — Albert Einstein (1910), popular 2007+ re-quote version

“Usually W is set equal to the number of ways (complexions) in which a state, which is incompletely defined in the sense of a molecular theory (i.e. coarse grained), can be realized. To compute W one needs a complete theory (something like a complete molecular-mechanical theory) of the system. For that reason it appears to be doubtful whether Boltzmann's principle alone, i.e. without a complete molecular-mechanical theory (Elementary theory), has any real meaning. The equation S = k log W + const. appears [therefore] without an Elementary theory—or however one wants to say it—devoid of any meaning from a phenomenological point of view.” — Albert Einstein (1910), Ezechiel Cohen 2005 abbreviated translation

“Usually W is put equal to the number of complexions… In order to calculate W, one needs a complete (molecular-mechanical) theory of the system under consideration. Therefore it is dubious whether the Boltzmann principle has any meaning without a complete molecular-mechanical theory or some other theory which describes the elementary processes. The equation S = k log W + const. seems without content, from a phenomenological point of view, without giving in addition such an Elementartheorie.” — Albert Einstein (1910), Abraham Pais 1982 abbreviated translation

(add discussion)

Derivation

The actual full rigorous step-by-step derivation of the logarithmic formulation or interpretation of entropy (S = k ln W) is a bit difficult to track down, for a number of reasons, the first being that it was originally done in German and readily available English translations are wanting.

The start of the derivation traces to Austrian physicist Ludwig Boltzmann's 1872 H-theorem, a formula to approximate the velocity distributions of the particles of a body of gas, as described in his "Further Studies on the Thermal Equilibrium of Gas Molecules".

The "famous expression" that equates entropy with the logarithm of the state probability, supposedly in the form of S = k ln W, was first presented, according to science historians Helge Kragh and Stephen Weininger, in 1877 (fact check); possibly in Boltzmann's article “On the Relation of a General Mechanical Theorem to the Second Law of Thermodynamics”. [13]

An absorbing (left) and radiating (right) "resonator", which is a sphere with a hole in it (originally a type of iron stove with a hole in it, sooted black on the inside) that absorbs electromagnetic radiation when cold, acting like a black body; or, conversely, emits electromagnetic radiation when hot. This is termed "black body radiation" and is the physical model on which the S = k ln W equation was derived by German physicist Max Planck in 1901. [1]

The modern formulation, S = k ln W (or S = k log W, with base e assumed), now a staple of science, was proposed, or rather stated matter-of-factly, by German physicist Max Planck as the standard formula for the measure of the entropy of black bodies, as discussed in his 1901 "On the Law of Distribution of Energy in the Normal Spectrum", albeit without derivation and without clear justification as to why this formulation applies to the measurement of the entropy of what Planck defined as an "irradiated, monochromatic, vibrating resonator".

A crude approximate derivation is as follows, as given in 1946 by Belgian-born English thermodynamicist Alfred Ubbelohde. [8] First we start with a body of ideal gas containing an Avogadro number N of molecules, which we allow to expand from an initial volume V1 to a final volume V2.

The first condition we stipulate is that the process is done very slowly, on the model of the infinitely slow rate, idealizing the view that the temperature of the gas never differs from that of the surroundings; the process is thus defined as an isothermal process. A second assumption behind the idealization "infinitely slow" is that we model the process as reversible, and can thus use the equals sign "=" instead of the Clausius inequality in our expression of the first law for this process. Therefore, according to Joule's second law, which states that the internal energy of an ideal gas is solely a function of temperature:

The Boltzmann entropy equation tattooed on the stomach of a man, circa 2006, who says “he’s not very good at math”, done by Kimsu at Body Graphics, New Jersey. [12]

the internal energy change for this process will be zero and, thus, according to the first law:

dU = dQ − dW = 0

the heat change dQ occurring in the process will be converted completely into the work dW of expansion:

dQ = dW

To then calculate this work W we use French physicist Emile Clapeyron's 1834 expression for pressure-volume work:

W = ∫ P dV  (integrated from the initial volume V1 to the final volume V2)

Classical ideal gas law

The pressure function for this body will be the ideal gas equation, which in classical ideal gas law notation is:

PV = nRT

and with substitution:

W = nRT ∫ dV/V  (integrated from V1 to V2, the temperature T being constant)

Statistical ideal gas law

Alternatively, as was introduced in the 1900 work of German physicist Max Planck, the statistical version of the ideal gas law can be written as:

PV = NkT

where N is the number of molecules in the body of gas and k is the Boltzmann constant, such that Nk = nR.

Definite integral of 1/x

The above integral has the form of what is called a "definite integral", one with upper and lower limits, which integrates according to the following rule (a rule to note, which seems to have been the route by which the logarithm found its way into thermodynamics): [9]

∫ dx/x = ln b − ln a  (integrated from x = a to x = b)

For more on the graphical nature of this rule, see the WordPress blog: “why is the integral of 1/x equal to the natural logarithm of x?”. [14]
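As a quick numerical sanity check on this rule (my own illustration, not from the cited sources), a midpoint Riemann sum for the integral of 1/x reproduces ln b − ln a:

```python
import math

def integrate_reciprocal(a, b, steps=200_000):
    """Approximate the definite integral of 1/x from a to b
    using a midpoint Riemann sum."""
    h = (b - a) / steps
    total = 0.0
    for i in range(steps):
        x = a + (i + 0.5) * h  # midpoint of the i-th slice
        total += h / x
    return total

a, b = 1.0, 5.0
numeric = integrate_reciprocal(a, b)
exact = math.log(b) - math.log(a)  # ln b - ln a = ln(b/a)
print(numeric, exact)  # the two agree to roughly 9 decimal places
```

The same check works for any positive limits a < b, which is exactly why the logarithm appears whenever 1/V is integrated over a volume change.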

Integration

Therefore, using the above rule for the definite integral of 1/x, we have:

W = nRT (ln V2 − ln V1)

This can be reduced, using the rule that the difference of two logarithms is the logarithm of their quotient, to the following form:

W = nRT ln(V2/V1)

With substitution of this into the reduced first law, step three (above), we have:

Q = nRT ln(V2/V1)

Then, bringing the temperature over, we have:

Q/T = nR ln(V2/V1)

And, by definition (Rudolf Clausius, 1865), this is thus the entropy change of the body of gas during the expansion:

ΔS = Q/T = nR ln(V2/V1)
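To put numbers to this result (an illustrative check, not part of Ubbelohde's text): for one mole of ideal gas doubling its volume isothermally, the entropy change is nR ln 2, and the same value follows from the per-molecule form Nk ln 2, since Nk = nR:

```python
import math

R = 8.314462618      # gas constant, J/(mol*K)
k = 1.380649e-23     # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro number, 1/mol

n = 1.0              # one mole of ideal gas
V1, V2 = 1.0, 2.0    # isothermal doubling of volume (units cancel in the ratio)

dS_molar = n * R * math.log(V2 / V1)            # Delta S = nR ln(V2/V1)
dS_molecular = n * N_A * k * math.log(V2 / V1)  # Delta S = Nk ln(V2/V1)

print(dS_molar)      # ~5.763 J/K
print(dS_molecular)  # same value, since N_A * k = R
```

The agreement of the two lines is just the identity Nk = nR, the bridge between the classical and statistical forms of the gas law used above.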

It is at this point that the derivation becomes a leap of interpretation: the probability of finding a given molecule in the larger volume, relative to the smaller, is V2/V1, and hence for all N molecules together the relative probability of the expanded state is W = (V2/V1)^N, so that ln(V2/V1) = (1/N) ln W,

and with substitution:

ΔS = (nR/N) ln W = k ln W

Here, we have introduced Boltzmann's notion of the probability W of a state, the number of ways, or "complexions", in which the state can be realized, along with the relation k = nR/N (equal to R/NA for one mole of gas).

To note, if our integral had the form of what is called an "indefinite integral", one without upper and lower limits, an additive constant would be used: [10]

∫ dx/x = ln x + C

and our formulation of entropy for our expanding gas body would be:

S = Nk ln V + const

or, if the volume term is absorbed into the state probability, W being proportional to V^N:

S = k ln W + const

This version, although it seems we are skipping a few steps in arriving at it, is in essence the famous formula:

SN = k ln W

and where SN is the entropy of a system or black body composed of N molecules or resonators.
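Collecting the steps of the crude derivation above into one chain (with W taken proportional to V^N, the relative probability of the expanded state):

```latex
\Delta S \;=\; \frac{Q}{T}
\;=\; nR \ln\frac{V_2}{V_1}
\;=\; Nk \ln\frac{V_2}{V_1}
\;=\; k \ln\!\left(\frac{V_2}{V_1}\right)^{N}
\;=\; k \ln W ,
```

so that, up to the additive constant of the indefinite integral, the entropy of the state is S = k ln W + const.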

This is the formula that is famously displayed on the Boltzmann tombstone (pictured above), which was erected in the 1930s, at the Central Cemetery (Zentralfriedhof), Vienna, Austria. [2]

In any event, in this "crude derivation", as Ubbelohde calls it, somewhere we are missing a few end steps and details to arrive at the exact formulation introduced (without proof) by Planck.

Note

In the decades to follow, Planck's probabilistic "disorder-dependent" model of entropy seems to have been adopted with abandon as the universal measure of entropy, often being applied in biothermodynamics and in human thermodynamics. Ubbelohde even seems to corroborate this in his concluding statement that "the probability increase accompanying any process is quite general, and holds for all the diversity of spontaneous happenings for which the measurement of entropy has any meaning."

The discerning theorist, however, should hold reserve in using this formulation as an absolute universal measure of entropy, for a number of obvious reasons, the foremost being that the equation contains the gas constant R hidden in its body:

k = R/NA

where NA is the Avogadro number, R being a constant defined by, and strictly valid for, ideal gas behavior.
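The hidden constant can be made explicit numerically (a quick check, using the SI values of the constants): the Boltzmann constant is simply the gas constant scaled down by the Avogadro number:

```python
R = 8.314462618      # gas constant, J/(mol*K)
N_A = 6.02214076e23  # Avogadro number, 1/mol

k = R / N_A          # Boltzmann constant, J/K
print(k)             # ~1.380649e-23
```

Every use of k in S = k ln W therefore carries the ideal gas law along with it, whether or not the system being modeled is anything like an ideal gas.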

When we thus see entropy calculations using this equation in non-ideal-gas situations, e.g. the rubber band model of entropy, entropy measures in cells, entropy measures of systems of humans, etc., one begins to question the validity of such calculations.

ln vs log

The use of "log" in this formula, in the historical sense, refers to the natural logarithm (base e), which seems to be the case when in 1900 German physicist Max Planck first wrote down this version of the statistical entropy formula, as displayed on Boltzmann's tomb. It was only sometime afterwards (date needed) that "log" came to denote the base-10 logarithm and "ln" the base-e logarithm.

Discussion

The statement that Boltzmann defined entropy as a function of the number of complexions of a state is, historically, only loosely accurate: the formula in its modern S = k ln W form was first written down by Planck in 1901, Boltzmann's own treatment having run in terms of the probabilities of molecular velocity distributions. [15]

The keen scientist should be aware that American electronics researcher Ralph Hartley's 1927 H-formula, quantifying the information content or Boolean logic content of telegraphy messages (depicted above), has absolutely nothing to do with Austrian physicist Ludwig Boltzmann's 1872 H-theorem, a kinetic theory-based definition of heat, quantifying the movements of the particles of an ideal gas system, or with German physicist Max Planck's 1901 logarithmic definition of entropy, an attempt to quantify heat in terms of the energy states of the particles of bodies.

Warning: Information theory

In 1927, American electronics researcher Ralph Hartley published his "Transmission of Information", in which he outlined a way to mathematically quantify telegraph signals being sent down a telegraph wire using the following formula:

H = n log s

where H is the amount of information, n is the number of symbols (dots, dashes, letters, etc.) transmitted, and s is the number of distinct symbols available at each selection.
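As a toy illustration (my own example, not Hartley's; the base-2 logarithm is chosen so the result comes out in bits), Hartley's measure simply counts distinguishable messages, with no reference to heat or temperature anywhere:

```python
import math

def hartley_information(n_symbols, alphabet_size):
    """Hartley's H = n log s: the information in a message of
    n_symbols symbols, each drawn from alphabet_size equally
    likely symbols (base-2 logarithm gives the answer in bits)."""
    return n_symbols * math.log2(alphabet_size)

# a 10-character message over a 26-letter alphabet
H = hartley_information(10, 26)
print(H)  # ~47.0 bits; no joules, no temperature anywhere
```

Note that the units here are bits, not J/K, which is precisely the point of the warning in this section.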

Hartley's H-formula telegraph wire information function later came to be confused with Austrian physicist Ludwig Boltzmann's 1872 H-theorem in the 1948 paper "A Mathematical Theory of Communication" by American electrical engineer Claude Shannon, and ever since people have been confusing the mathematical subject of information theory with the physical science subject of thermodynamics, to no end. [4]

This mess was then compounded when information theory researchers discovered Szilard's demon, as described in Hungarian physicist Leo Szilard's 1929 “On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings”, an attempt at disproving the existence of Maxwell's demon, wherein he argued that the logarithmic interpretation of entropy could be used to determine the entropy produced during the ‘measurement’ of information the demon discerns when viewing the speeds and positions of the particles in his two compartments. [5]

Thus, in sum, these two topics (the mathematics of telegraphy and hypothetical demons) have justified to many the view that entropy has an information interpretation. Some scientists, such as physical chemist Arieh Ben-Naim, have gone to extreme lengths to embed the assertion that entropy, or rather heat, is simply information measurable in units of bits instead of joules, and have even proclaimed that the entire SI unit system should be thrown out the window and that the whole of science, all the way down to the sub-atomic and Planck length scales, should be re-written in units of bits. This is one of the most absurd things in all of science, yet one with many pied piper followers.

A proof or disproof of the relation of entropy to information, in a modern sense, requires a formal treatise, which has not been done. The modern reader should be warned that the use of logarithms to measure entropy for systems other than ideal gas systems is mostly baseless conjecture.

Human thermodynamics

In human thermodynamics, owing to a mixture of both the information and the multiplicity views of entropy, Boltzmann's entropy formula has since been used to model the entropy of any number of anthropomorphic quantities or qualities, with abandon, in nearly every scenario or situation conceivable. A random example, one of many, is American physicist Edwin Jaynes' 1991 article “How Should we Use Entropy in Economics”, in which he introduces some tentative outlines of how an economic system can be modeled as a thermodynamic system, such as how Willard Gibbs' 1873 formulation of thermodynamics might be carried over to an entropy function of macroeconomic variables:

where (X, Y, Z ...) are some type of macroeconomic variables, which he doesn't really go into, and W is the multiplicity factor of the macroeconomic state, which he describes as the "number of different microeconomic ways in which it can be realized", whatever that means, and which he tries to connect in some way to a French mathematician.

The justification for these types of theories or models, however, is nearly baseless, in that multiplicity approximations of entropy are, generally, only good for ideal gas systems, and information interpretations of entropy are based on Maxwell's demon and telegraph wire proofs, which are fictionalized abstractions having almost nothing to do with Clausius' logic of 'transformation equivalents' (entropy) or uncompensated transformations (entropy change). [7]

References

1. (a) Planck, Max. (1901). “On the Law of Distribution of Energy in the Normal Spectrum.”

(b) Müller, Ingo. (2007).

2. (a) Planck, Max. (1901). “

(b) Schmitz, John E.J. (2007).

(c) Boltzmann equation – Eric Weisstein’s World of Physics (states the year was 1872).

3. Photo of Boltzmann tomb (Vienna, 2005).

4. Hartley, Ralph V. L. (1928). “Transmission of Information”,

5. Szilárd, Leó. (1929). “On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings” (“Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen”),

6. Planck, Max. (1909).

7. Gladyshev, Georgi. (2010). “On the Thermodynamics of the Evolution and Aging of Biological Matter.”

8. Ubbelohde, Alfred René. (1947).

9. Definite integral – Math World.

10. Indefinite integral – Math World.

11. Jaynes, Edwin. (1991). “How Should we Use Entropy in Economics: Some Half-baked Ideas in Need of Criticism”, Feb 01.

12. Boltzmann equation (stomach tattoo) Capo3433, 19 Dec 2006 – News.BMEzine.com.

13. (a) Boltzmann, Ludwig. (1877). “On the Relation of a General Mechanical Theorem to the Second Law of Thermodynamics” (“Über die Beziehung eines allgemeinen mechanischen Satzes zum zweiten Hauptsatze der Wärmetheorie”),

(b) Kragh, Helge and Weininger, Stephen J. (1996). “Sooner Silence than Confusion: the Tortuous Entry of Entropy into Chemistry” (abs),

14. Author. (2011). “Why is the integral of 1/x equal to the natural logarithm of x?”, ArcSecond.WordPress.com, Dec 17.

15. Swendsen, Robert H. (2006). “Statistical Mechanics of Colloids and Boltzmann’s Definition of the Entropy” (abs),

16. (a) Einstein, Albert. (1910). “The Theory of the Opalescence of Homogeneous Fluids and Liquid Mixtures near the Critical State” (“Theorie der Opaleszenz von homogenen Flüssigkeiten und Flüssigkeitsgemischen in der Nähe des kritischen Zustandes” (pdf)),

(b) Pais, Abraham. (1982).

(c) Cohen, Ezechiel G.D. (2005). “Boltzmann and Einstein: Statistics and Dynamics – An Unsolved Problem”,

(d) Tsallis, Constantino, Gell-Mann, Murray and Sato, Yuzuru. (2005). “Asymptotically Scale-Invariant Occupancy of Phase Space Makes Entropy

(e) Bais, F. Alexander and Farmer, J. Doyne. (2007). “Physics of Information” (pdf), Santa Fe Institute working paper; in:

(f) Klyce, Brig. (2013). “The Second Law of Thermodynamics”, Panspermia.org.

Further reading

● Braunstein, Jerry. (1969). “States, Indistinguishability, and the Formula S = k ln W in Thermodynamics” (abs),

● Johnson, Eric. (2018).

External links

● Boltzmann’s entropy formula – Wikipedia.