In thermodynamics, information thermodynamics, or "information theory thermodynamics", is an umbrella term for a set of related fields on the subject of the "thermodynamics of information processes", which attempt to interpret information, such as that found in communication signals or transmission lines, thermodynamically, e.g. via information entropy or conservation of information. These fields include information theory (1948), cybernetics (1948), general systems theory (1968), chaos theory (1970s), black hole thermodynamics (1975), e.g. the study of "information" lost in black holes as entropy, and other marginally connected subjects such as emergence or complexity theory. [1] The term “information thermodynamics” came into use as early as 1979. [2]

In 1929, Hungarian-American physicist Leó Szilárd devoted his thesis “On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings” to Maxwell’s demon, noting that the demon would require the use of information. [3] The 1948 paper “A Mathematical Theory of Communication”, by American engineer Claude Shannon, attempted to make a mathematical connection between statistical thermodynamics (gas phase systems) and information in telephone lines (current or voltage signals), thus initiating the science of information theory. [4]
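The formal parallel Shannon drew can be stated explicitly. As a sketch, Shannon’s information entropy of a message source has the same functional form as the Gibbs entropy of statistical thermodynamics, differing only by Boltzmann’s constant and the base of the logarithm:

```latex
% Shannon entropy of a source with symbol probabilities p_i (in bits)
H = -\sum_i p_i \log_2 p_i

% Gibbs entropy of a system with microstate probabilities p_i (in J/K)
S = -k_B \sum_i p_i \ln p_i
```

Szilárd’s demon analysis likewise ties one bit of information to an entropy change of \( k_B \ln 2 \), i.e. a maximum extractable work of \( W = k_B T \ln 2 \) per bit at temperature \( T \).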

The 1956 book Science and Information Theory by French-born American physicist Léon Brillouin attempted to blend thermodynamics, such as Maxwell’s demon, with communications and computing. [5] The frequently cited 1957 paper “Information Theory and Statistical Mechanics” by American physicist Edwin Jaynes attempted to connect equilibrium thermodynamics and the statistical mechanics of American engineer Willard Gibbs with information interpretations, and in doing so initiated what has come to be known as the maximum entropy (MaxEnt) school of thermodynamics. [6] The 1961 textbook Thermostatics and Thermodynamics: an Introduction to Energy, Information and States of Matter, by American engineer Myron Tribus, was one of the first publications to attempt to base the laws of thermodynamics on information theory rather than on the classical arguments. [7]
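The core of Jaynes’s MaxEnt argument can be sketched as a constrained optimization: maximizing the entropy of a probability assignment subject only to normalization and a known mean energy recovers the canonical (Boltzmann) distribution of equilibrium statistical mechanics:

```latex
% Maximize entropy subject to normalization and a fixed mean energy
\max_{\{p_i\}} \; -\sum_i p_i \ln p_i
\quad \text{subject to} \quad
\sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle

% Introducing Lagrange multipliers yields the canonical distribution
p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}
```

On this reading, equilibrium thermodynamics follows from inference under incomplete information rather than from dynamical postulates, which is the sense in which Jaynes’s approach grounds thermodynamics in information theory.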

In 1969, American mechanical engineer Robert Evans put forward his theory of essergy, in which he attempted to reformulate chemical thermodynamics in terms of information. In his own words: [11]

“A possible consequence of the proof may be a more general formulation for the concept of information based on Brillouin's principle of the equivalence of thermodynamic information and potential work. The proof indicates that negentropy is not as general a measure of potential work as is the quantity, essergy. This result could imply that essergy is a more general measure of thermodynamic information than negentropy, an implication that might lead to a broader formulation about information and, thus, new insight into the foundations of information theory.”

In the 1987 book Evolution, Thermodynamics, and Information: Extending the Darwinian Program, American biochemist Jeffrey Wicken adopts Shannon’s view of entropy as a measure of information, but proceeds to deploy the concept of information as a causal agency in biological evolution, defining organisms as “informed thermodynamic systems”. [8]

The 2007 book A Farewell to Entropy: Statistical Thermodynamics Based on Information, by Israeli physical chemist Arieh Ben-Naim, argues that thermodynamics and statistical mechanics would benefit from replacing the Clausius term "entropy" with a more familiar, meaningful, and appropriate term such as information, missing information, or uncertainty. This replacement, as Ben-Naim argues, would facilitate the interpretation of the "driving force" of many processes in terms of informational changes. [9]

The general science of information thermodynamics, if it is indeed to be considered a science, is, from a rigorous physics perspective, more often than not seen as a contrived sort of mathematical extrapolation from true thermodynamics, which is based on the heat and work interactions of the steam engine. Indeed, the suggestion that Shannon’s measure of “information” be called entropy, in the statistical thermodynamic sense, originated as a sort of mathematical joke between John von Neumann and Shannon. To exemplify this, Wicken, for instance, called Shannon’s terminological choice of borrowing the word entropy from statistical thermodynamics: [10]

“Loose language [that served] the dark god of obfuscation.”

The generally poor state of the subject, in modern terms, could possibly be attributed to the subject’s association with, and origination at, Bell Labs, through a number of its founders, such as John von Neumann, Claude Shannon, Léon Brillouin, etc., who may have been driven beyond the limits of academic credibility-justification by the lucrative state of the fervent financial flow at AT&T during those years.

1. (a) Wiener, Norbert. (1961). Cybernetics - or Control and Communication in the Animal and the Machine (2nd ed.). Cambridge, Massachusetts: The MIT Press.
(b) Bertalanffy, Ludwig von. (1968). General Systems Theory - Foundations, Development, Applications. New York: George Braziller.
(c) Yu, Francis T.S. (2000). Entropy and Information Optics. CRC Press.
(d) Trincher, Karl S. (1965). Biology and Information: Elements of Biological Thermodynamics. Consultants Bureau.
(e) Zotin, Aleksandr I. (1990). Thermodynamic Bases of Biological Processes: Physiological Reactions and Adaptations, (pg. 41). Walter de Gruyter.
2. (a) Bulletin de L'Académie Polonaise Des Sciences, (pgs. 119, 125, etc.). by Polska Akademia Nauk, Published by L'Académie, 1979.
(b) Mathematical Reviews (pg. 4950), by American Mathematical Society, 1982.
3. (a) Szilárd, Leó. (1929). “On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings”, Zeitschrift für Physik, 53, pgs. 840-56.
(b) English translation of “On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings” by Anatol Rapoport and Mechthilde Knoller in Maxwell’s Demon 2 (pgs. 110-19) by Harvey Leff and Andrew Rex.
4. Shannon, Claude E. and Weaver, Warren. (1949). The Mathematical Theory of Communication. Illinois: The University of Illinois Press.
5. Brillouin, Léon. (1956). Science and Information Theory. New York: Academic Press.
6. (a) Jaynes, E. T. (1957). “Information Theory and Statistical Mechanics”, Physical Review, 106: 620.
(b) Jaynes, E. T. (1957). “Information Theory and Statistical Mechanics II”, Physical Review, 108: 171.
7. Tribus, Myron. (1961). Thermostatics and Thermodynamics: an Introduction to Energy, Information and States of Matter. Van Nostrand.
8. (a) Wicken, Jeffrey S. (1987). Evolution, Thermodynamics, and Information: Extending the Darwinian Program. Oxford University Press.
(b) Corning, Peter A. (2005). Holistic Darwinism: Synergy, Cybernetics, and the Bioeconomics of Evolution, (pg. 467). University of Chicago Press.
9. Ben-Naim, Arieh. (2007). A Farewell to Entropy: Statistical Thermodynamics Based on Information. World Scientific Publishing Co.
10. Peterfreund, Stuart. (1990). Literature and Science: Theory and Practice, (pg. 223). Northeastern University Press.
11. Evans, Robert B. (1969). A Proof that Essergy is the Only Consistent Measure of Potential Work (abs), PhD thesis. Hanover, NH: Thayer School of Engineering, Dartmouth College.

Further reading
● Campbell, Jeremy. (1982). Grammatical Man - Information, Entropy, Language, and Life. New York: Simon and Schuster.
● Gleick, James. (1987). Chaos - Making a New Science. New York: Penguin Books.
● Weber, Bruce H., Depew, David J., Smith, James D. (1988). Entropy, Information, and Evolution: New Perspectives on Physical and Biological Evolution. MIT Press.
● Sardar, Ziauddin and Abrams, Iwona. (1998). Introducing Chaos. USA: Totem Books.
● Applebaum, David. (1996). Probability and Information - an Integrated Approach. Cambridge: Cambridge University Press.
● Avery, John (2003). Information Theory and Evolution. New Jersey: World Scientific.
● Baeyer, Hans Christian von. (2004). Information - the New Language of Science. Cambridge, Massachusetts: Harvard University Press.
● Yockey, Hubert P. (2005). Information Theory, Evolution, and the Origin of Life. Cambridge: Cambridge University Press.
