[Image caption: Science of Man (Dolloff, 1975)] The view of Shannon-Wiener entropy according to Norman Dolloff (1975), who sees information entropy as an "open sesame" to the sciences of man, thereby connecting information measures of things such as DNA to thermodynamics. [7]
In information theory, entropy (symbol H), also called information entropy or Shannon entropy, is an approximate measure of choice or information, measured in units of bits, in an electrical signal or message.

In 1948, American electrical engineer Claude Shannon proposed the following formula, modeled on Hartley information (1927), as the measure of binary-digit-based information:

H = −K Σᵢ pᵢ log pᵢ

where K is a positive constant and the pᵢ are a set of probabilities. [1]
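As a concrete illustration (not part of Shannon's original paper), taking K = 1 and base-2 logarithms gives H in bits; a minimal sketch in Python:

```python
import math

def shannon_entropy(probabilities, k=1.0):
    """Shannon's H = -K * sum(p_i * log(p_i)); in bits when K = 1 and log is base 2.

    Terms with p = 0 are skipped, following the convention 0 * log(0) = 0.
    """
    return -k * sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly one bit of choice per toss:
print(shannon_entropy([0.5, 0.5]))   # → 1.0

# A biased coin is more predictable, so its entropy is lower:
print(shannon_entropy([0.9, 0.1]))   # → ≈ 0.469
```

Note that the result depends only on the probability distribution of the symbols, not on their physical embodiment, which is precisely the sense in which the measure is "cut off" from atomic and molecular reality.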

This troublesome terminology, i.e. equating entropy with information, was borrowed from statistical thermodynamics at the suggestion of Hungarian-American mathematician John von Neumann and introduced in 1948 by Shannon, but without explicit conditional warnings as to its realm of applicability. [2] Because of its mathematical simplicity, namely that it refers, supposedly, to the thermodynamic concept of "entropy" while being cut off from atomic and molecular reality, it soon, within a period of eight years, according to Shannon, "ballooned to an importance beyond its actual accomplishments" as a technical tool for communication engineers, spreading into fields such as the thermodynamic modeling of life, evolution, cybernetics, biology, psychology, linguistics, fundamental physics, economics, the theory of organization, Maxwell's demon, and many others. [3]

For thermodynamicists, the 1948 use of the term "entropy" to model information in signals has been, in a general sense, an irritation. [5] The model employed by Shannon suggests to many individuals that anything which fits into the form of a logarithm has something to do with the second law of thermodynamics, which is not the case.

There are some, for instance, who believe that “information” will soon replace the Clausius version of “entropy”. Israeli physical chemist Arieh Ben-Naim, in his 2008 book A Farewell to Entropy, argues that “thermodynamics and statistical mechanics will benefit from replacing the unfortunate, misleading and mysterious term entropy with a more familiar, meaningful and appropriate term such as information, missing information or uncertainty. This replacement would facilitate the interpretation of the driving force of many processes in terms of informational changes and dispel the mystery that has always enshrouded entropy.” The essential mistake in this statement is that the "driving force" of natural processes is free energy or affinity, which are functions of enthalpy and entropy, not informational changes.

In any event, Ben-Naim advocates replacing entropy by information, a term that has become widely used in many branches of science. [4] This reasoning, however, is flawed: information is not a fundamental quantity, whereas "energy", or its derivatives "heat", "temperature", and "work", are. The idea of reducing thermodynamics to information, to note, is similar to Greek mathematician Constantin Carathéodory’s 1909 effort, in his axiomatic formulation of thermodynamics, to reduce the second law to a purely mathematical basis using a geometrical approach. [6]

1. Shannon, C.E. (1948). "A Mathematical Theory of Communication", Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, July, October.
2. Tolman, R.C. (1938). Principles of Statistical Mechanics, Oxford, Clarendon.
3. Shannon, C.E. (1956). "The Bandwagon", IRE Transactions on Information Theory.
4. Ben-Naim, Arieh. (2008). A Farewell to Entropy. World Scientific Publishing Co.
5. Müller, Ingo. (2007). A History of Thermodynamics: the Doctrine of Energy and Entropy (ch. 4: "Entropy as S = k ln W", pgs. 123-126). New York: Springer.
6. Georgiadou, Maria. (2004). Constantin Carathéodory: Mathematics and Politics in Turbulent Times, (section: "The Axiomatic Foundations of Thermodynamics", pgs. 47-50). Springer.
7. Dolloff, Norman H. (1975). Heat Death and the Phoenix: Entropy, Order, and the Future of Man (pg. xvi). Exposition Press.
