American electronics researcher Ralph Hartley's 1928 paper "Transmission of Information" explained how the "logarithm", in the form H = n log S, of the "number of possible symbol sequences" is the best "practical measure of information", specifically in regard to a telegraph operator sending 1s (HIs) and 0s (LOs) in a telegraph transmission; a model later used by American electrical engineer Claude Shannon in 1948 to found the science of information theory. [8]

Overview | Terminology misuse

In 1939-1940, newly-minted electrical engineer and mathematician Claude Shannon, while completing a post-doctoral fellowship at Princeton’s Institute for Advanced Study, was vacillating on what scientific name to give to his newly formulated logarithmic statistical quantification of the data involved in signal transmission: ‘information’ or ‘uncertainty’. To this query, Hungarian-born American chemical engineer and mathematician John Neumann suggested that Shannon use neither name, but rather borrow the name ‘entropy’ from thermodynamics, because: (a) the statistical mechanics versions of the entropy equations are mathematically isomorphic to Shannon’s, and (b) nobody really knows what entropy really is, so he would have the advantage in winning any arguments that might erupt. [14] Shannon, unfortunately, took this jocular advice to heart. In his 1948 article “A Mathematical Theory of Communication”, famous to some, infamous to others, building on the 1928 "Transmission of Information" article of American electronics researcher Ralph Hartley, which defined the “practical measure of information [as] the logarithm of the number of possible symbol sequences”, Shannon employed the namesake ENTROPY, coined originally by German physicist Rudolf Clausius in 1865 as the differential formulation of heat quantities going into or out of a body, as the logarithmic binary-digit measure of information in signal transmission, commenting specifically that this new information formulation “will be recognized as that of entropy as defined in certain formulations of statistical mechanics”. This referred specifically to Austrian physicist Ludwig Boltzmann’s so-called ‘minimum theorem’ (or H-theorem, as it later came to be called), introduced in his famous 1872 paper “Further Studies on the Thermal Equilibrium of Gas Molecules”, which derived a kinetic theory expression for the entropy of an ideal gas.
The long and the short of this near-inane situation, in the 1990 retrospective view of English science historians Peter Coveney and Roger Highfield, is that: [15]

“The two—information-theoretic ideas and thermodynamic entropy—have been repeatedly confused since the time of von Neumann.”

The conceptual misunderstanding of information is also summarized well by American philosopher Christian de Quincey, who in 2002 stated: [11]

“When physicists, chemists, biologists, neuroscientists, and psychologists adopt ‘information’ as an explanatory term, they often do so based on a confusion of categories, and a misunderstanding of its function in communications theory.”

In other words, information has a very specific meaning in (a) transmission of signals, i.e. high or low voltage pulses, represented by 1s and 0s, (b) storage, defined in semiconductor structures, and (c) processing or programming, represented by the flow of current through AND, OR, or NOT gates, defined by Boolean logic, but a meaning that has absolutely nothing to do with thermodynamics. [16]
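As a minimal illustration of sense (c), the Boolean gates mentioned above can be sketched on bits represented as the integers 0 and 1; this is a generic sketch, not tied to any particular hardware:

```python
# Boolean logic gates operating on bits represented as the integers 0 and 1.

def AND(a, b):
    # Output is 1 only when both inputs are 1.
    return a & b

def OR(a, b):
    # Output is 1 when at least one input is 1.
    return a | b

def NOT(a):
    # Output is the complement of the input bit.
    return 1 - a

# Truth table for the two-input gates:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b))
```

Any processing step in a digital circuit reduces to compositions of gates like these, which is the narrow sense in which "information" is processed.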

Boltzmann | 1894 ?

In 1949, American civil engineer and mathematician Warren Weaver, in a footnote to his expository companion piece to Shannon's paper, stated: [7]

“Shannon’s work roots back, as von Neumann has pointed out, to Boltzmann’s observations, in some of his work on statistical physics (1894), that entropy is related to ‘missing information’, inasmuch as it is related to the number of alternatives which remain possible to a physical system after all the macroscopically observable information concerning it has been recorded. Leo Szilard (Zeitschrift fur Physik, Vol. 53, 1925) extended this idea to a general discussion of information in physics, and von Neumann (Mathematical Foundation of Quantum Mechanics, Berlin, 1932, Chap V) treated information in quantum mechanics and particle physics. Shannon’s work connects more directly with certain ideas developed some twenty years ago by Harry Nyquist and Ralph Hartley, both of Bell Laboratories; and Shannon has himself emphasized that communication theory owes a great debt to Norbert Wiener for much of its basic philosophy [cybernetics]. Wiener, on the other hand, points out that Shannon’s early work on switching and mathematical logic antedated his own interest in this field; and generously adds that Shannon certainly deserves credit for independent development of such fundamental aspects of the theory as the introduction of entropic ideas. Shannon has naturally been specially concerned to push the applications to engineering communication, while Wiener has been more concerned with biological applications (central nervous system phenomena, etc.).”

The "1925" date Weaver gives for Szilard's Zeitschrift für Physik article is itself an error; the paper appeared in 1929. [9]

In any event, this so-called 1894 "missing information" statement by Boltzmann remains to be tracked down. The assertion that Boltzmann was thinking about "entropy" and "information" in 1894, or in any year for that matter, seems dubious: a case of mis-attribution started by Weaver's above 1949 footnote. [7] Weaver's mis-citation of the Szilard paper only further compounds the dubiousness of the issue.

Historical mis-attributions in information theory publications, to note, are common, especially in over-zealous agenda writings.

In any event, in the years to follow, the Boltzmann information assertion quickly became scientific folklore. In 1951, to exemplify, American physicist Jerome Rothstein restated the matter as follows: [18]

“Boltzmann himself saw later that statistical entropy could be interpreted as a measure of missing information.”

English-born American investigative journalist Jeremy Campbell (1982, 1990) would later go on to popularize this notion as well.

Information transmission | Logarithms

In 1928, American electronics researcher Ralph Hartley published the paper "Transmission of Information", in which he used the word 'information' as a measurable quantity, deriving the expression

H = n log S

where H is the amount of information, n is the number of symbols transmitted, and S is the number of possible symbols, summarizing:

“What we have done is to take as our practical measure of information the logarithm of the number of possible symbol sequences.”
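Hartley's measure can be illustrated numerically; the following minimal sketch uses base-2 logarithms, so that H comes out in bits, as in the later Shannon usage:

```python
import math

def hartley_information(n, S):
    """Hartley's 1928 measure H = n log S: the information in a message
    of n symbols, each drawn from an alphabet of S possible symbols."""
    return n * math.log2(S)

# A telegraph operator sending 8 binary symbols (HIs and LOs, S = 2):
print(hartley_information(8, 2))   # 8.0 bits

# The same 8 symbols drawn from a 26-letter alphabet carry more information:
print(hartley_information(8, 26))
```

The choice of logarithm base only fixes the unit; base 2 gives bits, matching the HI/LO telegraph model.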

Maxwell's demon & information

In the years 1929-1930, Hungarian-born American physicist Leo Szilard and American physical chemist Gilbert Lewis gave nearly identical, albeit independent (it seems), Maxwell demon type derivations in which they connected entropy to the knowledge gained in a binary measurement, albeit, as discussed below, with only Lewis actually employing the term “information”.

In 1929, Hungarian-born American physicist Leo Szilard argued that the logarithmic interpretation of entropy could be used to determine the entropy produced during the ‘measurement’ of information when a "memory-type" Maxwell’s demon (Szilard demon) discerns the speeds and positions of the particles in his two compartments. As a starting point, Szilard stated that the entropy produced by any random information measurement of the demon could be approximated by: [9]

S = k log 2

“This was the point of departure chosen by Szilard (1929), who laid the groundwork for establishing a conversion factor between physical entropy and information.”

This, however, is a misattribution. Szilard never uses the term “information” in his article; rather, he is concerned with the nature of “thermal fluctuations” (Brownian motion) in relation to the second law and the Maxwell’s demon argument, as previously touched on in the work of Marian Smoluchowski. His aim was to show that the “action” of measuring a system parameter, say a coordinate mapping a fluctuation of the system, by a Maxwell’s demon type intelligent being would, in the sensory-neurological system, involve a degradation or dissipation of energy in the motor nervous system, whereby the quantitative entropy production associated with this neurological “act” is, according to his estimates, of the order of k log 2.

On 6 Jun 1930, seemingly independent of Szilard's publication (which was in German), American physical chemist Gilbert Lewis published his article “The Symmetry of Time in Physics”, wherein he employs a Maxwell’s demon argument, with a "cylinder closed at each end, and with a middle wall provided with a shutter", in which he derives the result that, in the simplest case, if we have one molecule which must be in one of the two flasks, the entropy becomes less by: [12]

S = k ln 2

if we know which is the flask in which the molecule is trapped; after which he concludes, in what has become re-quoted frequently, that: [17]

“Gain in entropy always means the loss of information, and nothing more.”

The 1930 Lewis article does not cite Szilard, although the argument and derivations are oddly similar. American physicist Walter Grandy, in his 2008 historical outline of the development of the entropy concept, comments, in regard to Lewis’ 1930 article, that Lewis was “apparently unaware of Szilard’s work”. [13] Commenting on his article’s conclusions, in a 5 Aug 1930 letter to Irving Langmuir, Lewis stated that: [3]

“It was not easy for a person brought up in the ways of classical thermodynamics to come around to the idea that gain of entropy eventually is nothing more nor less than loss of information.”

It remains unclear exactly what Lewis meant by this.
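For reference, the quantity k ln 2 appearing in both the Szilard and Lewis arguments is a definite, very small entropy. The following sketch evaluates it, assuming the modern exact SI value of Boltzmann's constant:

```python
import math

# Boltzmann's constant, in J/K (exact SI defined value since 2019).
k = 1.380649e-23

# Entropy change associated with knowing which of two flasks holds
# the molecule (Lewis, 1930), or with one binary measurement by
# the demon (Szilard, 1929): S = k ln 2.
delta_S = k * math.log(2)
print(delta_S)   # about 9.57e-24 J/K
```

The minuteness of this number is one reason the thermodynamic and information-theoretic senses of "entropy" operate on such different scales.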

Neumann-Shannon anecdote

See main: Neumann-Shannon anecdote

In 1939-1940, Hungarian chemical engineer John Neumann suggested to American engineer Claude Shannon that he should call his new quantity of information by the name ‘entropy’, with the reasoning that the equations are similar (both are logarithmic in form) and that nobody knows what entropy really is, so in a debate one will always have the advantage. In 1948, Shannon took Neumann’s advice: in his famous paper "A Mathematical Theory of Communication", he credited Hartley's derivation as the point at which the logarithmic function became the natural choice for measuring information and, in the same paper, to the ire of many thermodynamicists, equated Hartley's 1928 telegraph "system" model with Clausius' 1865 heat engine "system" model. In short, Shannon, using a similar formulation to that above, declared that H, being a measure of information, choice, and uncertainty, is the same H as used in statistical mechanics, specifically the H of Boltzmann's famous H-theorem, concluding with: [10]

“We shall call H the entropy of the set of probabilities.”

Forever after, countless information theory scientists have taken any and all types of information, "which is a very elastic term, ... whether being conducted by wire, direct speech, writing, or any other method", in Hartley's words, as being a direct equivalent of thermodynamic entropy, as derived from the study of the steam engine. In his paper, Shannon went on to define the entropy H of the source in units of “bits per symbol”, in which, in the Hartley derivation, S = 2, corresponding to a source that can send only two voltage or current levels, high or low; hence Shannon’s entropy being measured in binary digits per symbol.
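Shannon's H, in bits per symbol, can be sketched as follows; for the two-level (HI/LO) source the maximum, reached at equal probabilities, is 1 bit per symbol:

```python
import math

def shannon_entropy(probs):
    """Shannon's H = -sum(p_i log2 p_i), in bits per symbol,
    for a source emitting symbols with the given probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair binary source (HI and LO equally likely): 1 bit per symbol.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# Biased binary source: less than 1 bit per symbol.
print(shannon_entropy([0.9, 0.1]))
```

Note that nothing in this formula refers to heat, temperature, or a working body; the clash with the thermodynamic sense of entropy discussed above is purely one of shared notation and name.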

This variation of information entropy, or “Shannon entropy”, has permeated science to the effect that, for some, information is seen as the mediator or key aspect of life and evolution. [5] In 1972, American biophysicist Lila Gatlin stated the following view on entropy and information: [6]

“Stored information varies inversely with entropy; lowered entropy means a higher capacity to store information.”

If this statement were taken literally, it would indicate an inverse relation between stored information and entropy. It is difficult, however, to track down the origin of this derivation.

Other | Sense perception

In 1847, Scottish physicist James Maxwell stated that the only thing that can be directly perceived by the senses is force; on this view, information can also be defined as a force. In this framework, as studied in human chemistry, "information", or more correctly information reception, received by the body and transmitted to the mind, is understood in terms of field particle exchanges, such as is found in the exchange force that holds human molecules (people) together in human chemical bonds.

See also

● Entropy (information)

● Information entropy

● Shannon entropy

References

1. Information (definition) – Dictionary.com

2. (a) Thims, Libb. (2007).

(b) Thims, Libb. (2007).

3. Letter from Gilbert Lewis to Irving Langmuir, 5 August 1930. Quoted in Nathan Reingold,

5. (a) Brillouin, Leon. (1962).

(b) Campbell, Jeremy. (1982).

(c) Avery, John. (2003).

6. Gatlin, Lila L. (1972).

7. (a) Shannon, Claude E. and Weaver, Warren. (1949).

(b) Campbell, Jeremy. (1982).

Campbell, Jeremy. (1990). “Observer and Object, Reader and Text: Some Parallel Themes in Modern Science and Literature”, in:

(c) Hokikian, Jack. (2002).

(e) Baeyer, Hans C.V. (2004).

8. Hartley, R. V. L. (1928). “Transmission of Information”,

9. Szilárd, Leó. (1929). “On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings” (Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen),

10. (a) Shannon, Claude E. (1948). "A Mathematical Theory of Communication" (bit, pg. 1),

(b) Shannon, Claude E. and Weaver, Warren. (1949).

11. De Quincey, Christian. (2002).

12. Lewis, Gilbert. (1930). “The Symmetry of Time in Physics”,

13. Grandy, Walter T. Jr. (2008).

14. Schement, Jorge R. and Ruben, Brent D. (1993).

15. Coveney, Peter and Highfield, Roger. (1990).

16. Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair”,

17. Ben-Naim, Arieh. (2012).

18. Rothstein, Jerome. (1951). “Information, Measurement, and Quantum Mechanics” (html),

19. Doyle, Bob. (2011).

20. Email communication between Libb Thims and Robert Doyle (17 Jan 2012).

21. Rapoport, Anatol. (1986).

Further reading

● Volkenstein, Mikhail V. (1986).

External links

● Information – Wikipedia.