A 1993 chronology of the development of information theory, which starts with the work of Harry Nyquist (1924), Ralph Hartley (1928), and Claude Shannon (1948). [19]
See main: Information entropy (quotes); Shannon bandwagon

In 1948, American electrical engineer Claude Shannon adopted the physics term "entropy" as the new name for his logarithm-of-a-probability formulation of data transmission, telegraphy in particular, alluding to the premise that his formula and the formula for entropy in statistical mechanics have the same "form", after which people began to continually assume a connection between the two fields. To exemplify this confusion, the 2005 Oxford Dictionary of Science gives the following addendum to the definition of information theory: [1]
"Several branches of physics have been related to information theory. For example an increase in entropy has been expressed as a decrease in information. It has been suggested that it may be possible to express the basic laws of physics using information theory. See also: Landauer's priniciple; Zeilinger's principle."Here we see the recursive confusion that results from Shannon's misfortune naming choice of the formula for the mathematical quantification of the "measure of information, choice, and uncertainty" in a signal transmission by the name entropy; meaning that, the above science dictionary definition, in the Shannon-namesake sense of the matter, reduces to the following nonsensical statement, as the unacquainted reader would see things: "an increase in entropy has been expressed as a decrease in entropy."
Claude Shannon - American electrical engineer who founded information theory with his 1948 paper "A Mathematical Theory of Communication", in which he argued that entropy is a measure of information, thus initiating the questionable field of information theory thermodynamics.
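The H function discussed in the following passage is, presumably, Boltzmann's H function, which in a common modern form (reconstructed here to match the distribution f defined below) reads:

H = ∫ f(p,q,t) log f(p,q,t) dp dq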
where f(p,q,t) is a distribution function, or probability of finding, at a given time t, a particle with position q and momentum p, in which the distribution is assumed to evolve with time, owing to the proper motion of the molecules and their collisions. [9] The H theorem of Boltzmann states that this function decreases with time and tends towards a minimum which corresponds to the Maxwell distribution: if the distribution of the velocities is Maxwellian, the H function remains constant in time. For a system of N statistically independent particles, H is related to the thermodynamic entropy S through:
S = -NkH
The Boltzmann tombstone showing the S = k log W entropy formula, which many people mistakenly assume can be used to quantify "bits" in data storage and transmission.
S = k log W
W = K log m
H = n log S
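The expression to which the clause "where K is a positive constant" below attaches is, presumably, Shannon's defining formula for H, reconstructed here from his 1948 paper:

H = − K Σ pi log pi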
where K is a positive constant. Shannon then states that “any quantity of this form, where K merely amounts to a choice of a unit of measurement, plays a central role in information theory as measures of information, choice, and uncertainty.” Then, as an example of how this expression applies in a number of different fields, he references R.C. Tolman’s 1938 Principles of Statistical Mechanics, stating that “the form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics where pi is the probability of a system being in cell i of its phase space… H is then, for example, the H in Boltzmann’s famous H theorem.” The following excerpt is the key section in which Shannon defined his new variable, with implied thermodynamic connotations:
Shannon then gives what he calls "the entropy H in the case of two possibilities with probabilities p and q = 1 - p" and then goes on to declare:
Excerpt (page 11) of Claude Shannon's 1948 "A Mathematical Theory of Communication" in which he connects information to entropy; reference 8 therein reads: "See, for example, R. C. Tolman, Principles of Statistical Mechanics, Oxford, Clarendon, 1938."
“The quantity H [entropy] has a number of interesting properties which further substantiate it as a reasonable measure of choice or information.”
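For illustration only, and not something found in Shannon's paper or the sources cited here, the following is a minimal Python sketch of the quantity under discussion, assuming base-2 logarithms so that H comes out in bits; the binary_entropy helper is the two-possibility case, with probabilities p and q = 1 - p, mentioned above:

    import math

    def shannon_entropy(probabilities, base=2):
        # Shannon's H = -K * sum(pi * log pi); the choice of log base plays the
        # role of the constant K (base 2 gives H in "bits").
        return -sum(p * math.log(p, base) for p in probabilities if p > 0)

    def binary_entropy(p):
        # The two-possibility case: probabilities p and q = 1 - p.
        return shannon_entropy([p, 1 - p])

    print(binary_entropy(0.5))   # 1.0 bit: maximum uncertainty
    print(binary_entropy(0.99))  # ~0.08 bit: a nearly certain outcome

The binary case peaks at p = 1/2, at exactly one bit, which illustrates why the quantity reads as a "measure of choice or uncertainty": the more evenly the probabilities are spread, the larger H becomes.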
I = K ln P
See main: Neumann-Shannon anecdote

In the 1940s, after Shannon had been working on his equations for some time, he happened to visit the mathematician and chemical engineer John von Neumann. During their discussions, regarding what Shannon should call the "measure of uncertainty", or attenuation, in phone-line signals with reference to his new information theory, the following exchange took place (a conversation that varies depending on the source). In short, Neumann told Shannon:
“You should call [your measure of choice or information] entropy, because nobody knows what entropy really is, so in a debate you will always have the advantage.”
S = K log I
S = c log R
“No doubt Shannon and von Neumann thought that this was a funny joke, but it is not, it merely exposes Shannon and von Neumann as intellectual snobs. Indeed, it may sound philistine, but a scientist must be clear, as clear as he can be, and avoid wanton obfuscation at all cost. And if von Neumann had a problem with entropy, he had no right to compound that problem for others, students and teachers alike, by suggesting that entropy had anything to do with information.”