thermodynamic information
A 2020 Google search return for the key “thermodynamic information”, showing the proliferation of the false belief that information is a quantity of thermodynamics, or is defined by thermodynamics, which is NOT the case. [1]
In hmolscience, the thermodynamic information fallacy is the oft-made, incorrect assumption, arising from Sokal-affair-style confusion, that “information” is a “quantity” of thermodynamics; the misused idea, generally, that information is on par with energy, matter, or entropy, the latter of which it is, in many instances, incorrectly said to be equivalent to.

Overview
In fall 1940 to spring 1941, John von Neumann told Claude Shannon to call his new logarithmic quantity for the transmission of “information” (of the 1927 Hartley telegraph-transmission type) by the name “entropy”, as a supposed joke of some kind (see: Neumann-Shannon anecdote).

In 1948, Shannon, in his “A Mathematical Theory of Communication”, taking von Neumann’s suggestion to heart, introduced a new H-function model of “information, choice, and uncertainty”, passed as ones and zeros along telegraph lines, styled in some crude grasping way on Ludwig Boltzmann’s 1872 H-function model of the “heat”, symbol H, of an ideal gas, and christened his new formula by the name “entropy”.
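For reference, the two formulas whose surface resemblance drives the confusion can be set side by side, here in LaTeX notation (the subscripts are added for labeling only; K is Shannon’s arbitrary positive constant, f the molecular velocity distribution):

    H_{\text{Shannon}} = -K \sum_{i=1}^{n} p_i \log p_i
    \qquad
    H_{\text{Boltzmann}}(t) = \int f(\vec{v}, t)\, \ln f(\vec{v}, t)\, d^3v

Despite the shared letter and the logarithm, the former is a dimensionless measure over the probabilities p_i of message symbols, while the latter is a functional of a gas’s molecular velocity distribution which, up to sign and constants, tracks the thermodynamic entropy of an ideal gas.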

In the decades to follow, the popularity of this paper passed along the incorrect idea that the heat of thermodynamics has something to do with the information coding and transmission of computer science; only a few were keen to the situation:

“The entropy introduced in information theory is NOT a thermodynamical quantity.”
Dirk ter Haar (1954), Statistical Mechanics

“It is misleading in a crucial way to view ‘information’ as something that can be poured into an empty vessel, like a fluid or even energy.”
Anatol Rapoport (1956), “The Promise and Pitfalls of Information Theory” [2]

“The term ‘entropy’ is now widely used in social science, although its origin is in physical science. There are three main ways in which the term may be used. The first invokes the original meaning, referring to the unidirectionality of heat flow, from hot bodies to cold ones. The second meaning can be derived from the first via statistical mechanics; this meaning is concerned with measures of ‘evenness’ or ‘similarity’. The third meaning derives from information theory. The three distinct meanings are carefully described and distinguished, and their relationships to each other are discussed. The various uses of the three concepts in the social sciences are then reviewed, including some uses which confuse the different meanings of the term. Finally, modern work in thermodynamics is examined, and its implications for economic analysis are briefly assessed.”
John Proops (1987), “Entropy, Information, and Confusion in the Social Sciences”

Others, less knowledgeable about physics and information theory, were prone to gullibility. The 1978 use of this incorrect model by James Miller, later expanded on by Richard Adams (1988) and his “energy form” model, is just one of hundreds (if not thousands) of confused examples found in print. [1] As a general rule of thumb, to see whether a non-information-science-themed book is based on nonsense, keyword-search the book for “Shannon”; if his name returns, one can be fairly sure the book is a sinking-ship argument (a minimal search sketch follows below).
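A minimal Python sketch of this rule of thumb, assuming the book is available as an extracted plain-text string (the function name shannon_red_flag and the sample excerpt are illustrative, not from the source):

    import re

    def shannon_red_flag(book_text: str) -> bool:
        """Heuristic from the article: flag a non-information-science book
        that invokes 'Shannon', since such books tend to conflate
        information-theoretic and thermodynamic entropy.
        Matches the surname as a whole word."""
        return re.search(r"\bShannon\b", book_text) is not None

    # Illustrative usage on a hypothetical excerpt:
    excerpt = "The entropy of the firm, following Shannon, measures its disorder."
    print(shannon_red_flag(excerpt))  # True -> per the rule of thumb, be wary

The whole-word match avoids false hits inside longer words; a real check would, of course, also require reading the flagged passages in context.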

See also
Information entropy (quotes)
Shannon bandwagon

References
1. Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair”, Journal of Human Thermodynamics, 8(1): 1-120, Dec 19.
2. Rapoport, Anatol. (1956). “The Promise and Pitfalls of Information Theory”, Behavioral Science, 1:303-09; in: Modern Systems Research for the Behavioral Scientist (editor: Walter Buckley) (pgs. 137-42), Aldine, 1968.
