Etymology

The name "Shannon information" refers to the measure of information introduced by American electrical engineer Claude Shannon in his 1948 article "A Mathematical Theory of Communication", in which the information content of a message is quantified by the entropy of the probability distribution of its source.
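Shannon's quantity can be sketched numerically. The following minimal Python snippet (the function name is illustrative, not Shannon's) computes the entropy of a discrete distribution in bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits,
    of a discrete probability distribution."""
    # Terms with p = 0 contribute nothing (the limit p*log2(p) -> 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly 1 bit of information.
print(shannon_entropy([0.5, 0.5]))       # 1.0
# Four equally likely outcomes carry 2 bits.
print(shannon_entropy([0.25] * 4))       # 2.0
```

A biased source (e.g. `[0.9, 0.1]`) yields less than 1 bit, reflecting that its output is more predictable.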

In this paper, on the earlier joking suggestion of Hungarian-American mathematician John von Neumann, Shannon devotes a single paragraph to the mathematical similarity between his logarithmic measure of information, as it appears in transmission and storage, and the thermodynamic entropy of statistical mechanics, as embodied in Austrian physicist Ludwig Boltzmann's 1872 H-theorem. This false association soon gained currency.
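The formal similarity von Neumann alluded to is purely one of notation: Shannon's entropy and the Gibbs form of statistical-mechanical entropy share the same sum-over-probabilities shape, differing only in base and constant factor:

```latex
H = -\sum_{i} p_i \log_2 p_i \quad \text{(bits)}
\qquad\qquad
S = -k_B \sum_{i} p_i \ln p_i \quad \text{(J/K)}
```

In the first expression the $p_i$ are probabilities of symbols from a message source; in the second they are probabilities of microstates of a physical system, and $k_B$ is Boltzmann's constant.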

To avoid this false conflation, the name "Shannon information" should be used specifically, so that the two meanings are not confused. To be clear, Shannon information (or Shannon entropy, as it is often synonymously called) has nothing to do with thermodynamics, or with Boltzmann's H-theorem, which concerns the distribution of molecular velocities in a gas.

References

1. Shannon, Claude E. (1948). "A Mathematical Theory of Communication". Bell System Technical Journal 27: 379–423, 623–656.

External links

● Shannon information – Iscid.org.