In computer science, Shannon information is the logarithmic quantification of sequential transmissions of bits (1s and 0s) sent between a source and a receiver, such as through a telegraph cable. It is, in general, a measure of channel capacity or bandwidth, and it is defined so as to exclude physical effects, measuring only the abstract capacity of a signal to carry messages, independent of the medium and of whatever meaning the messages may have.
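This logarithmic quantification can be illustrated with a short calculation. The following Python sketch (a minimal illustration written for this entry, not code from Shannon's paper; the function name and probabilities are assumptions chosen only for the example) computes the average number of bits per symbol carried by a source with given symbol probabilities:

import math

def shannon_information(probabilities):
    # Average bits per symbol: the sum of -p * log2(p) over the
    # source's symbol probabilities; zero-probability symbols
    # contribute nothing to the sum.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair binary source (1s and 0s equally likely) carries 1 bit per symbol.
print(shannon_information([0.5, 0.5]))  # 1.0

# A biased source carries less information per transmitted symbol.
print(shannon_information([0.9, 0.1]))  # about 0.47

Note that the result depends only on the symbol probabilities, not on the physical medium (telegraph cable, radio signal, or otherwise) over which the symbols travel.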
Etymology
The name "Shannon information" refers to the conception of information given by American electrical engineer Claude Shannon in his 1948 article "A Mathematical Theory of Communication", in which information is quantified mathematically, whether in transmission (e.g. high or low currents or voltages in a telegraph wire, signal line, or fiber-optic line) or in storage (e.g. a device with two stable positions, such as a relay or flip-flop). [1]
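As a worked instance of this quantification, paraphrasing the storage example in Shannon's paper [1]: a single device with two stable positions can hold one of 2 distinguishable states, i.e. log2 2 = 1 bit, and a bank of N such devices can hold one of 2^N states, i.e. log2 2^N = N bits, so the capacity in bits grows as the logarithm of the number of distinguishable states.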
In this paper, on the earlier joke suggestion of Hungarian-American chemical engineer John von Neumann, Shannon devotes one conceptually corrupted paragraph to alluding to the supposed mathematical isomorphism whereby all logarithmic formulations of information, whether in transmission or in storage, are nothing more than the thermodynamic entropy of statistical mechanics, as defined by Austrian physicist Ludwig Boltzmann's 1872 H-theorem. This false association soon gained currency.
To avoid this false conflation, the name "Shannon information" should be used specifically, so that the two distinct meanings are not corrupted into one. To be clear, Shannon information (or Shannon entropy, as it is often synonymously called) has absolutely nothing to do with thermodynamics, or with Boltzmann's H-theorem, which concerns the distribution of molecular speeds in a gas body.
References
1. Shannon, Claude E. (1948). "A Mathematical Theory of Communication". Bell System Technical Journal, 27: 379–423 (July), 623–656 (October).
External links
● Shannon information – Iscid.org.