In information theory, Shannon entropy, or "information entropy", is the name given to the H function, a probability equation introduced by American electrical engineer Claude Shannon in his 1948 paper “A Mathematical Theory of Communication” and posited to measure the “information, choice, and uncertainty” of a discrete information source. [1] In stating this definition, Shannon noted that his H function is similar in form to Ludwig Boltzmann's H-function for the distribution of particle speeds in a body of gas, thereby first suggesting that the two functions might be related in some way, and second, adding confusion to confusion, naming his new function "entropy", making a roundabout claim that his theory had something to do with the second law of thermodynamics. This, however, is not the case. In any event, Shannon's measure of information soon came to be called "Shannon entropy" (the form of the H function is shown below). The status of the so-called Shannon entropy, according to American biochemist Jeffrey Wicken (1987) and American philosopher Charles Dyke (1988), is summarized as follows: [5]

“Despite the similarities, it is very doubtful that the second law holds for Shannon entropies.”
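For reference, the H function of the 1948 paper, for a discrete source with n possible symbols occurring with probabilities p_1, ..., p_n, has the form

H = -K \sum_{i=1}^{n} p_i \log p_i

where K is a positive constant fixing the unit of measure; with K = 1 and base-2 logarithms, H is measured in bits per symbol. [1]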

The term "Shannon entropy", aside from the commonly used term information entropy, is also referred to as "information theoretic entropy" or "Shannon-Weaver entropy", the latter signifying the efforts of American mathematician Warren Weaver, co-author of the the follow-up 1949 book The Mathematical Theory of Communication with Shannon on the same subject.

Statistical entropy
Many argue that Shannon entropy is equivalent to the statistical entropy used in physics (Boltzmann entropy). [2] In this sense, the term "Boltzmann-Shannon entropy" is often used, as well as "Gibbs-Shannon entropy" or "Boltzmann-Gibbs-Shannon (BGS) entropy". [3]
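The formal resemblance behind this claim is that between the Gibbs form of statistical entropy and Shannon's H function, which differ only by Boltzmann's constant k_B and the choice of logarithm base:

S = -k_B \sum_i p_i \ln p_i (Gibbs statistical entropy)

H = -\sum_i p_i \log_2 p_i (Shannon entropy, in bits)

In the physical case the p_i are probabilities of the microstates of a system; in Shannon's case they are probabilities of symbols or messages from a source. Whether this identification carries any physical content, beyond the shared mathematical form, is precisely the point in dispute.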

Clausius entropy
A rare few go so far as to argue that Shannon entropy is equivalent to Clausius entropy, although the connection is difficult to find. In 1948, for instance, when American engineer Myron Tribus was examined for his doctoral degree at UCLA, he was asked to explain the connection between the entropy defined by Claude Shannon and the entropy defined by Rudolf Clausius (1865). In 1998, Tribus commented in retrospect on this question: [4]

“Neither I nor my committee knew the answer. I was not at all satisfied with the answer I gave. That was in 1948 and I continued to fret about it for the next ten years. I read everything I could find that promised to explain the connection between the entropy of Clausius and the entropy of Shannon. I got nowhere. I felt in my bones there had to be a connection; I couldn’t see it.”
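For comparison, the entropy Clausius defined is a macroscopic quantity expressed in terms of heat and temperature rather than probabilities,

dS = \frac{\delta Q_{rev}}{T}

i.e. the change in entropy of a body equals the heat it absorbs reversibly divided by its absolute temperature, which is part of why the connection to Shannon's probability-based H function is not obvious.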

References
1. Shannon, Claude E. (1948). "A Mathematical Theory of Communication", Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, July, October.
2. Wachter, Armin and Hoeber, Henning. (2006). Compendium of Theoretical Physics (pg. 419). Birkhauser.
3. Mitra, Partha and Bokil, Hemant. (2007). Observed Brain Dynamics (pg. 11). Oxford University Press.
4. Tribus, M. (1998). “A Tribute to Edwin T. Jaynes”. In: Maximum Entropy and Bayesian Methods, Garching, Germany 1998: Proceedings of the 18th International Workshop on Maximum Entropy and Bayesian Methods of Statistical Analysis (pgs. 11-20), edited by Wolfgang von der Linden, Volker Dose, Rainer Fischer, and Roland Preuss. Springer, 1999.
5. Dyke, Charles. (1988). The Evolutionary Dynamics of Complex Systems: A Study in Biosocial Complexity (Shannon entropy, pg. 114). Oxford University Press.

See also
● Fisher entropy
● Havrda-Charvat entropy
● Kolmogorov entropy
● Kullback-Leibler entropy
● Rényi entropy
● Tsallis entropy

External links
Shannon entropy – KnowledgeRush.com.
