A 1993 chronology of the development of information theory, which starts with the work of Harry Nyquist (1924), Ralph Hartley (1928), and Claude Shannon (1948). [19]

Naming confusions

See main: Information entropy (quotes); Shannon bandwagon

In 1948, American electrical engineer Claude Shannon adopted the physics term "entropy" as the name for his logarithmic probability formulation of data transmission, telegraphy in particular, on the premise that his formula and the formula for entropy in statistical mechanics have the same "form", after which people began to continuously assume a connection between the two fields. To exemplify this confusion, the 2005 science dictionary edited by John Daintith states: [1]

"Several branches of physics have been related to information theory. For example an increase in entropy has been expressed as a decrease in information. It has been suggested that it may be possible to express the basic laws of physics using information theory. See also: Landauer's principle; Zeilinger's principle."

Here we see the recursive confusion that results from Shannon's unfortunate choice to name his formula for the mathematical quantification of the "measure of information, choice, and uncertainty" in a signal transmission "entropy".

In short, beginning in the last half of the 20th century, there has been a push to blend information theory concepts together with the laws of thermodynamics, e.g. by suggesting that an increase in entropy can be expressed as a decrease in information, to yield new laws, e.g. the law of conservation of information, or new branches of science, such as information theory and evolution, chaos theory, or complexity theory. [10]

Thermodynamicists, however, view the connection between entropy and information as superficial, illogical, and unjustified. [11] To cite one example, in 1950 American electrical engineer Claude Shannon estimated the "entropy" of written language to be 0.6 to 1.3 bits per character; and in modern times, through many convoluted probability arguments, one can find dozens of researchers arguing that this is the same quantity as German physicist Rudolf Clausius' 1865 definition of "entropy" in thermodynamics, which has units of joules per kelvin per mole (J/K∙mol). [15]
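Shannon's bits-per-character figure can be illustrated with a short sketch. Note that his 0.6 to 1.3 estimate came from n-gram statistics and human prediction experiments; the zeroth-order single-character frequency count below is a deliberate simplification, and the sample string is an arbitrary assumption. The point of the sketch is the units: the result is a dimensionless count of bits, not joules per kelvin.

```python
# Sketch: a zeroth-order estimate of the "entropy" of English text in bits
# per character. Shannon's 1950 figure of 0.6-1.3 bits/char used far more
# sophisticated n-gram and human-prediction methods than this frequency count.
from collections import Counter
from math import log2

def bits_per_char(text: str) -> float:
    """Zeroth-order entropy: H = -sum(p_i * log2 p_i) over character frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

sample = "the quick brown fox jumps over the lazy dog"  # arbitrary sample
h = bits_per_char(sample)
print(f"{h:.2f} bits/char")

# Note the units: bits per character, a pure number. Clausius' thermodynamic
# entropy carries units of joules per kelvin; the two quantities are not
# interchangeable merely because both formulas contain a logarithm.
```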

American molecular machines theorist Thomas Schneider, for one, argues that "information is not entropy" and that "Shannon entropy" is a misnomer. [20]

Overview

In the 1940s, while American electrical engineer Claude Shannon was developing the basic mathematics of information theory of telephone signals, an associate, Hungarian-American mathematician and chemical engineer John von Neumann, suggested that Shannon call the informational uncertainty associated with a random variable "entropy" because his basic formula was similar to the statistical formula for the entropy of an ideal gas as developed by Ludwig Boltzmann in 1872. Ever since, many individuals, especially outside the field of thermodynamics, such as mathematicians, have inadvertently assumed a physical connection between thermodynamic "irreversibility" in heat engine cycles and "information uncertainty" in communication lines. This has led to great confusion. The three main researchers regarded as being responsible for the alleged equivalence between information and negative entropy are French physicist Leon Brillouin (1950), American physicist Edwin Jaynes (1957), and Hungarian-American physicist Leo Szilard (1964). [14]

Claude Shannon - American electrical engineer who founded information theory with his 1948 paper "A Mathematical Theory of Communication", in which he argued that entropy is a measure of information, thus initiating the questionable field of information theory thermodynamics.

In more detail, information theory was founded by Shannon with the publication of his 1948 article "A Mathematical Theory of Communication." [4] The central paradigm of classical information theory is the engineering problem of the transmission of information over a noisy channel. The most fundamental results of this theory are Shannon's source coding theorem, which establishes that, on average, the number of bits needed to represent the result of an uncertain event is given by its entropy; and Shannon's noisy-channel coding theorem, which states that reliable communication is possible over noisy channels provided that the rate of communication is below a threshold called the channel capacity.

Origin of assumed thermodynamic connection

In 1872, Austrian physicist Ludwig Boltzmann formulated the H theorem, a statistical interpretation of Rudolf Clausius’ entropy, or “transformation content” of a working body, for an ideal gas system of particles with no appreciable interaction, thus finding a proof for the phenomenon of atomic and molecular irreversibility in steam engine cycles. [3] Boltzmann's H function relates to the entropy S as: [9]

S = -NkH

where N is the number of particles in the system and k is the Boltzmann constant.

The Boltzmann tombstone showing the S = k log W entropy formula, which many people mistakenly assume can be used to quantify "bits" in data storage and transmission.

In 1901, German physicist Max Planck put Boltzmann’s statistical formula in the form shown below, which is also carved on Boltzmann's tombstone (adjacent):

S = k log W

where k is the Boltzmann constant and W is the number of microstates, or "complexions", consistent with the macroscopic state of the system.
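The tombstone formula and a bit count can be compared numerically; a minimal sketch follows, with the microstate count W chosen arbitrarily for illustration. The same W yields a number with units of joules per kelvin in one case and a dimensionless bit count in the other.

```python
# Sketch: Boltzmann-Planck entropy S = k log W versus a bit count log2(W) for
# the same microstate count W, to make the unit mismatch concrete.
from math import log, log2

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI)

W = 2**20  # toy system with ~10^6 equally probable microstates (illustrative)
S_thermo = k_B * log(W)  # joules per kelvin
S_bits = log2(W)         # bits, a pure number

print(f"S = {S_thermo:.3e} J/K")
print(f"log2(W) = {S_bits:.0f} bits")
```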

Bell telephone labs

In a completely different field of study, in 1924, while working at Bell Telephone Labs, American electrical engineer Harry Nyquist published a paper called “Certain Factors Affecting Telegraph Speed”, containing the relation: [2]

W = K log m

where W is the speed of transmission of intelligence, m is the number of current values the line can distinguish, and K is a constant.

In 1928, American electronics researcher Ralph Hartley published the paper “Transmission of Information”, in which he used the word "information" as a measurable quantity, defined by:

H = n log S

where H is the amount of information, n is the number of symbols transmitted, and S is the size of the alphabet from which each symbol is selected.
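Hartley's measure can be sketched as follows; the message length and alphabet size are arbitrary illustrative choices. With a base-2 logarithm, Hartley's per-symbol quantity coincides with Shannon's later entropy in the special case where all symbols are equally probable.

```python
# Sketch: Hartley's 1928 measure H = n log S for a message of n symbols drawn
# from an alphabet of S symbols, using base-2 logarithms so H is in bits.
from math import log2

def hartley_bits(n_symbols: int, alphabet_size: int) -> float:
    return n_symbols * log2(alphabet_size)

# A 10-character message over the 26-letter alphabet (illustrative values):
print(hartley_bits(10, 26))  # about 47 bits

# Shannon entropy of one uniformly distributed letter equals Hartley's
# per-symbol value, log2(26):
p = 1 / 26
h_shannon = -sum(p * log2(p) for _ in range(26))
```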

Pulling this all together, so to speak, in 1944, while working at Bell Telephone Labs, Claude Shannon sought to solve the basic problem of communication, namely: "that of reproducing at one point, either exactly or approximately, a message selected at another point." In this direction, Shannon developed a number of fundamental results, such as defining the “bit” as the basic unit of information.

A Mathematical Theory of Communication

In 1948 Shannon published his famous paper “A Mathematical Theory of Communication”, in which he devoted a section to what he calls Choice, Uncertainty, and Entropy. In this section, Shannon introduces an “H function” of the following form:

H = -K Σ p_i log p_i

where K is a positive constant and p_i is the probability of occurrence of the i-th symbol or message.

Shannon then gives what he calls "the entropy" of a set of probabilities p_1, ..., p_n.

Excerpt (page 11) of Claude Shannon's 1948 "A Mathematical Theory of Communication" in which he connects information to entropy; where reference 8 is: See, for example, R. C. Tolman, Principles of Statistical Mechanics, Oxford, Clarendon, 1938.

“The quantity H [entropy] has a number of interesting properties which further substantiate it as a reasonable measure of choice or information.”
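Shannon's H function is straightforward to compute; a minimal sketch follows, with K = 1 and a base-2 logarithm so that H is measured in bits, and with arbitrary example distributions.

```python
# Sketch: Shannon's H function, H = -K * sum(p_i * log p_i), with K = 1 and
# base-2 logarithms so the result is in bits.
from math import log2

def shannon_H(probs, K=1.0):
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -K * sum(p * log2(p) for p in probs if p > 0)

print(shannon_H([0.5, 0.5]))   # fair coin: 1 bit of uncertainty
print(shannon_H([1.0]))        # certainty: zero uncertainty
print(shannon_H([0.25] * 4))   # uniform over 4 outcomes: maximal, 2 bits
```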

and from hereafter, German physicist Rudolf Clausius' 1865 thermodynamic quantity "entropy" began to be conflated with Shannon's statistical measure of choice and information in signal transmission.

Brillouin's conceptions of information

By 1956, owing to the work of Shannon and Schrödinger, French physicist Leon Brillouin, in his book Science and Information Theory, had come to define the information I contained in a situation as:

I = K ln P

where P is the number of a priori equally probable cases and K is a constant, which Brillouin set equal to the Boltzmann constant in order to connect information to thermodynamic entropy.
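Brillouin's formula can be sketched numerically; the case count P below is an arbitrary illustrative choice. The sketch makes plain that the thermodynamic units enter only through the choice of the constant K: setting K = 1/ln 2 yields bits, while setting K = k_B yields joules per kelvin, and nothing in the formula itself forces the latter identification.

```python
# Sketch: Brillouin's information I = K ln P for two choices of the constant K.
# The "thermodynamic" units arise purely from choosing K = k_B.
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K

P = 1024  # number of a priori equally probable cases (illustrative)
I_bits = (1 / log(2)) * log(P)   # K = 1/ln 2  -> result in bits
I_thermo = k_B * log(P)          # K = k_B     -> result in J/K

print(f"{I_bits:.0f} bits vs {I_thermo:.3e} J/K")
```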

Neumann's famous suggestion?

See main: Neumann-Shannon anecdote

In the 1940s, after Shannon had been working on his equations for some time, he happened to visit the mathematician and chemical engineer John von Neumann. During their discussions regarding what Shannon should call the “measure of uncertainty”, or attenuation, in phone-line signals with reference to his new information theory (a conversation that varies depending on the source), Neumann, in short, told Shannon:

“You should call [your measure of choice or information] entropy, because nobody knows what entropy really is, so in a debate you will always have the advantage.”

This famous comment classifies as both a famous entropy quotation and a classic entropy misinterpretation. One must always remember that any thermodynamic connotations made in information theory trace back to this ridiculous suggestion. To note, Neumann could have just as easily told Shannon to call his new quantity “information sensation”, based on its similarity to German physiologist Gustav Fechner’s 1860 statistical logarithmic psychological sensation formula: [16]

S = K log I

or

S = c log R

where S is the magnitude of the subjective sensation, I (or R) is the intensity of the physical stimulus, and K (or c) is a constant of proportionality. No one, however, would argue that Fechner's psychological sensation is the same as Shannon’s entropy, derived from the phenomenon of information loss in communication as defined in Shannon’s “Mathematical Theory of Communication” paper, merely because the formulas share a logarithmic form.

Objections

Ever since 1948, with the publication of Shannon's paper, there has been a growth in the assumed equivalence of heat engine entropy and the entropy of a message, as well as growth in the objections to this point of view. In 1999, to cite one example, American chemistry professor Frank Lambert, who for many years taught a course for non-science majors called "Enfolding Entropy" at Occidental College in Los Angeles, stated that another major source of confusion about entropy change as the result of simply rearranging macro objects comes from the information theory "entropy" of Claude Shannon. [12] In Shannon’s 1948 paper, as discussed, the word "entropy” was adopted at the suggestion of von Neumann. This step, according to Lambert, was “wryly funny for that moment”, but “Shannon's unwise acquiescence has produced enormous scientific confusion due to the increasingly widespread usefulness of his equation and its fertile mathematical variations in many fields other than communications". [13] According to Lambert, “certainly most non-experts hearing of the widely touted information entropy would assume its overlap with thermodynamic entropy. However, the great success of information "entropy" has been in areas totally divorced from experimental chemistry, whose objective macro results are dependent on the behavior of energetic microparticles. Nevertheless, many instructors in chemistry have the impression that information "entropy" is not only relevant to the calculations and conclusions of thermodynamic entropy but may change them.” This, according to Lambert, is not true. [12] In sum, according to Lambert, information "entropy" has no bearing on thermodynamic entropy.

In his 2007 book A History of Thermodynamics, German physicist Ingo Müller weighs in on the matter: [11]

“No doubt Shannon and von Neumann thought that this was a funny joke, but it is not, it merely exposes Shannon and von Neumann as intellectual snobs. Indeed, it may sound philistine, but a scientist must be clear, as clear as he can be, and avoid wanton obfuscation at all cost. And if von Neumann had a problem with entropy, he had no right to compound that problem for others, students and teachers alike, by suggesting that entropy had anything to do with information.”

Müller clarifies the matter by stating that: “for level-headed physicists, entropy (or order and disorder) is nothing by itself. It has to be seen and discussed in conjunction with temperature and heat, and energy and work. And, if there is to be an extrapolation of entropy to a foreign field, it must be accompanied by the appropriate extrapolations of temperature, heat, and work.” [11]

See also

● Computer science thermodynamics

References

1. Daintith, John. (2005).

2. Nyquist, Harry. (1924). “Certain Factors Affecting Telegraph Speed”

3. Boltzmann, Ludwig. (1872). “Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen.” Sitzungsberichte der Akademie der Wissenschaften, Wien, II, 66, 275. [English translation in: S.G. Brush. (1966).

4. Shannon, Claude E. (1948). "A Mathematical Theory of Communication",

9. Perrot, Pierre. (1998).

10. (a) Campbell, Jeremy. (1982).

(b) Yockey, Hubert P. (2005).

(c) Baeyer, Hans Christian von. (2004).

(d) Avery, John (2003). Information Theory and Evolution. World Scientific.

(e) Sardar, Ziauddin and Abrams, Iwona. (2004).

11. Muller, Ingo. (2007).

12. Lambert, Frank L. (1999). “Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms: Examples of Entropy Increase? Nonsense!”

13. Including: Golembiewski, R. T.

14. Mayumi, Kozo. (2001).

15. (a) Shannon, Claude E. (1950), "Prediction and Entropy of Printed English", Bell Sys. Tech. J (3) p. 50-64.

(b) Mahoney, Matt. (1997). "Refining the Estimated Entropy of English by Shannon Game Simulation," Florida Institute of Technology.

16. (a) Fancher, R. E. (1996).

(b) Sheynin, Oscar. (2004), "Fechner as a Statistician" (abstract), The British journal of mathematical and statistical psychology, 57 (Pt 1): 53-72, May.

17. Ben-Naim, Arieh. (2008).

18. Coveney, Peter V. and Highfield, Roger. (1992).

19. Aftab, O., Cheung, P., Kim, A., Thakkar, S., and Yeddanapudi, N. (2001). “Information Theory and the Digital Age” (§: Bandwagon, pgs. 9-11), Project History, Massachusetts Institute of Technology.

Robert Fano – Wikipedia.

20. (a) Schneider, Thomas D. (1991). “Theory of Molecular Machines. II. Energy Dissipation from Molecular Machines” (abs),

(b) Schneider, Thomas D. (1997). “Information is Not Entropy, Information is Not Uncertainty!”, Frederick National Laboratory for Cancer Research, Jan 4.

(c) Schneider, Thomas D. (2000). “Pitfalls in Information Theory and Molecular Information Theory”, Frederick National Laboratory for Cancer Research, Mar 13.

(d) Shannon entropy is a misnomer (section) – Schneider.ncifcrf.gov.

Further reading

● Thomson, Lauralee A. (1972).

External links

● Information theory – Wikipedia.