In thermodynamics, the amount of disorder, as contrasted with order, in a system is often taken as a representation of entropy, or as a proportional measure or correlate of entropy. [1] In this sense, the higher the entropy of a structure or system, the greater its disorder. [2] This interpretation of entropy stems chiefly from the statistical theories of Austrian physicist Ludwig Boltzmann (1870s) and the later interpretation of his work by German physicist Max Planck (1901), from the earlier terminology of German physician and physicist Hermann von Helmholtz (1882), and from its connection to Walther Nernst's heat theorem (1907) and the third law of thermodynamics, all of which seem to be captured in Planck's principle of elementary disorder.

Entropy
In 1862, German physicist Rudolf Clausius introduced the aggregation and disgregation model of entropy.

Austrian physicist Ludwig Boltzmann seems to have originated the disorder view of entropy, though the exact article in which he did so remains to be tracked down.

In 1882, German physicist Hermann von Helmholtz used the word "Unordnung" (disorder) to describe entropy. [3] Specifically, in a February 1882 address to the Akademie der Wissenschaften zu Berlin, Helmholtz stated the following on the topic of the second law: [6]

“Unordered motion, in contrast, would be such that the motion of each individual particle need have no similarity to its neighbors. We have ample grounds to believe that heat-motion is of the latter kind, and one may in this sense characterize the magnitude of entropy as the measure of disorder.”

This famous disorder definition of entropy soon led to the great “evolution second law debate”, the public debate over the seeming conflict between evolution and the second law. In short, the argument runs as follows: if, in the 1865 words of Clausius, the second law declares that (a) the entropy of the universe tends to a maximum, and if, according to the 1882 views of Helmholtz, (b) entropy is a measure of disorder, then the combination of these two points seems, on the surface, to contradict the earlier 1859 view of Darwin that (c) species of higher order have evolved over time from species of lesser order. Mathematically speaking, when these statements are taken at face value:

a + b ≠ c

In 1973, French sociological anthropologist Roger Caillois famously summarized the debate as follows: “Clausius and Darwin cannot both be right.” [7]

One of the first to summarize the development of this conception was American mechanical engineer Joseph Klein in his 98-page 1910 book Physical Significance of Entropy or of the Second Law. [4] In the preface, he summarizes what he calls the “interpretations reached by Boltzmann and Planck”, noting that he draws most heavily upon Planck, whom he views as “the clearest expositor of Boltzmann”. Klein summarizes the view that Boltzmann and Planck reached the result that “the entropy of any physical state is the logarithm of the probability of that state, and this probability is identical with the number of complexions of the state.” Moreover, “this number is the measure of the permutability of certain elements of the state and in this sense entropy is the measure of the disorder of the motions of a system of mass points.” [4]
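As a rough numerical illustration of this complexion-counting view (a minimal sketch in Python; the four-level system and its occupation numbers are invented for the example, not drawn from Klein's text), the entropy k log W, with W the number of permutations or “complexions” of a state, is larger for a spread-out distribution of particles than for a concentrated one:

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def complexions(occupations):
    """Number of distinct arrangements (complexions) of N particles
    distributed over energy levels with the given occupation numbers."""
    w = math.factorial(sum(occupations))
    for n_i in occupations:
        w //= math.factorial(n_i)  # multinomial coefficient, exact integer
    return w

def entropy(occupations):
    """Entropy as k log W, W being the number of complexions of the state."""
    return k_B * math.log(complexions(occupations))

ordered = [4, 0, 0, 0]      # all four particles in one level: W = 1
disordered = [1, 1, 1, 1]   # one particle per level: W = 4! = 24
print(entropy(ordered))     # 0.0
print(entropy(disordered))  # k log 24 ≈ 4.39e-23 J/K

The concentrated state admits only one arrangement, so its entropy vanishes, while the spread-out state admits twenty-four, which on this view is precisely its greater disorder.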

The most notable derivation of a thermodynamic measure or interpretation of disorder was given in the famous 1944 book What is Life?, chapter six: “Order, Disorder and Entropy”, by Austrian physicist Erwin Schrödinger, who stated in loose terms that, for any body in question, entropy is proportional to the logarithm of its disorder. Schrödinger argues, based on the statistical thermodynamics investigations of Austrian physicist Ludwig Boltzmann and American engineer Willard Gibbs, that entropy is given by the following expression:

entropy = k log D

where k is the Boltzmann constant and D is a "quantitative measure of the atomistic disorder of the body in question". [5]
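In the same chapter, Schrödinger inverts this expression: taking the reciprocal 1/D as a direct measure of order, entropy taken with a negative sign, his “negative entropy”, becomes itself a measure of the order of the body: [5]

−(entropy) = k log (1/D)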

References
1. (a) Entropy – a measure of disorder; the higher the entropy the greater the disorder (Source: Oxford Dictionary of Chemistry, 2004).
(b) Entropy – in thermodynamics, a parameter representing the state of disorder of a system at the atomic, ionic, or molecular level; the greater the disorder the higher the entropy (Source: Barnes & Noble's Essential Dictionary of Science, 2004).
(c) Entropy – a measure of disorder in the universe or of the availability of the energy in a system to do work (Source: Gribbin's Encyclopedia of Particle Physics, 2000).
2. Entropy – a measure of disorder; the higher the entropy the greater the disorder (Source: Daintith, John. Oxford Dictionary of Science, 2005).
3. Anderson, Greg. (2005). Thermodynamics of Natural Systems (pg. 105). Cambridge University Press.
4. Klein, Joseph Frederic. (1910). Physical Significance of Entropy or of the Second Law (Preface). New York: D. Van Nostrand Co.
5. Schrödinger, Erwin. (1944). What is Life? (ch. 6: “Order, Disorder, and Entropy”, pgs. 67-75). Cambridge: Cambridge University Press.
6. Capri, Anton Z. (2007). Quips, Quotes, and Quanta (ch. 1: Thermodynamics: Founders and Flounderers, pgs. 1-10) [PDF]. World Scientific.
7. Caillois, Roger. (1973). Adventurous Coherences: Generalized Aesthetics, at the Heart of the Fantastic, Dissymmetry (Coherences Aventureuses: Esthetique Generalisee, au Coeur du Fantastique, la Dissymetrie). Paris: Gallimard.

External links
● Denbigh, K. G. (1989). “Note on Entropy, Disorder and Disorganization”. Brit. J. Phil. Sci., 40: 323-332.

