Information
American electronics researcher Ralph Hartley's 1928 paper "Transmission of Information" (diagram), in which he explained how the logarithm, in the form x = y log z, specifically of the "number of possible symbol sequences", is the best "practical measure of information", in regard to a telegraph operator sending 1s (HIs) and 0s (LOs) in a telegraph transmission; a model later used by American electrical engineer Claude Shannon in 1948 to found the science of information theory. [8]
In science, information is a fact, unit of data, sensory input, or collection of knowledge that can be transmitted, processed, or stored. [1] In human chemistry, information can be understood as a component of the driving force. [2]

Overview | Terminology misuse
In 1939-1940, newly-minted electrical engineer and mathematician Claude Shannon, while completing a post-doctoral fellowship at Princeton’s Institute for Advanced Study, was vacillating over what scientific name to give to his newly formulated logarithmic statistical quantification of the data involved in signal transmission: ‘information’ or ‘uncertainty’. To this query, Hungarian-born American chemical engineer and mathematician John Neumann suggested that Shannon use neither name, but rather the name ‘entropy’ of thermodynamics, because: (a) the statistical mechanics versions of the entropy equations have the same mathematical form (the expressions are isomorphic), and (b) nobody really knows what entropy really is, so Shannon would have the advantage in winning any arguments that might erupt. [14] Shannon, unfortunately, took this jocular advice to heart. In his 1948 article “A Mathematical Theory of Communication”, famous to some, infamous to others, building on the 1928 "Transmission of Information" article of American electronics researcher Ralph Hartley, which defined the “practical measure of information [as] the logarithm of the number of possible symbol sequences”, Shannon employed the namesake ENTROPY, coined originally by German physicist Rudolf Clausius in 1865 as the differential formulation of heat quantities going into or out of a body, as the logarithmic binary-digit representation of information measurement in signal transmission, and commented specifically that this new information formulation “will be recognized as that of entropy as defined in certain formulations of statistical mechanics”, referring specifically to Austrian physicist Ludwig Boltzmann’s so-called ‘minimum theorem’ (or H-theorem, as it later came to be called), introduced in his famous 1872 paper “Further Studies on the Thermal Equilibrium of Gas Molecules”, which derived a kinetic theory expression for the entropy of an ideal gas. The long and the short of this near-inane situation, in the 1990 retrospective view of English science historians Peter Coveney and Roger Highfield, is that: [15]

“The two—information theoretic ideas and thermodynamic entropy—have been repeatedly confused since the time of von Neumann.”

The conceptual misuse and misunderstanding of information is also summarized well by American philosopher Christian de Quincey, who in 2002 stated: [11]

“When physicists, chemists, biologists, neuroscientists, and psychologists adopt ‘information’ as an explanatory term, they often do so based on a confusion of categories, and a misunderstanding of its function in communications theory.”

In other words, information has a very specific meaning in (a) transmission of signals, i.e. high or low voltage pulses, represented by 1s and 0s, (b) storage, defined in semiconductor structure, and (c) processing or programming, represented by the flow of current through AND, OR, and NOT gates, as defined by Boolean logic (see the sketch below), but a meaning that has absolutely nothing to do with thermodynamics. [16]
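As a minimal illustrative sketch, not drawn from any of the authors cited here, the following hypothetical Python snippet shows senses (a) and (c) of the term: information as a sequence of 1s and 0s, processed by Boolean AND, OR, and NOT operations:

def AND(a, b): return a & b   # output is 1 only if both input bits are 1
def OR(a, b): return a | b    # output is 1 if either input bit is 1
def NOT(a): return 1 - a      # invert a single bit

signal = [1, 0, 1, 1, 0]                  # sense (a): a transmitted sequence of HI/LO pulses
inverted = [NOT(bit) for bit in signal]   # sense (c): processing the bits through NOT gates
print(inverted)                           # [0, 1, 0, 0, 1]
print(AND(1, 0), OR(1, 0))                # 0 1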

Boltzmann | 1894?
In 1949, American civil engineer and mathematician Warren Weaver (1894-1978), in footnote one to his opening chapter "Recent Contributions to the Mathematical Theory of Communication" of the book The Mathematical Theory of Communication (the second chapter authored by American electrical engineer Claude Shannon), states the following rather attack-buffering historical platform, in which he makes the rather dubious assertion that in 1894 Austrian physicist Ludwig Boltzmann remarked, or was at least thinking, that entropy is related to 'missing information':

“Shannon’s work roots back, as von Neumann has pointed out, to Boltzmann’s observations, in some of his work on statistical physics (1894), that entropy is related to ‘missing information’, inasmuch as it is related to the number of alternatives which remain possible to a physical system after all the macroscopically observable information concerning it has been recorded. Leo Szilard (Zeitschrift fur Physik, Vol. 53, 1925) extended this idea to a general discussion of information in physics, and von Neumann (Mathematical Foundation of Quantum Mechanics, Berlin, 1932, Chap V) treated information in quantum mechanics and particle physics. Shannon’s work connects more directly with certain ideas developed some twenty years ago by Harry Nyquist and Ralph Hartley, both of Bell Laboratories; and Shannon has himself emphasized that communication theory owes a great debt to Norbert Wiener for much of its basic philosophy [cybernetics]. Wiener, on the other hand, points out that Shannon’s early work on switching and mathematical logic antedated his own interest in this field; and generously adds that Shannon certainly deserves credit for independent development of such fundamental aspects of the theory as the introduction of entropic ideas. Shannon has naturally been specially concerned to push the applications to engineering communication, while Wiener has been more concerned with biological applications (central nervous system phenomena, etc.).”

The "1925" Zeitschrift fur Physik Volume 53 Szilard “Article” referred to here, to note, seems to be a mis-citation, as Szilard’s famous Volume 53 article is his "1929" article “On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings”, in which the citation to his early "1925" article “On the Extension of Phenomenological Thermodynamics to Fluctuation Phenomena”, in Volume 32 of Zeitschrift fur Physik, is found.

In any event, this so-called 1894 "missing information" statement by Boltzmann remains to be tracked down. The assertion that Boltzmann was thinking about "entropy" in relation to "information" in 1894, or in any year for that matter, seems dubious, a case of mis-attribution started by Weaver's above 1949 footnote. [7] The Szilard mis-citation by Weaver only further compounds the dubiousness of the issue.

Historical mis-attributions in information theory publications, to note, are common, especially in over-zealous agenda writings. In the 2011 book Free Will: the Scandal in Philosophy, by American self-defined "information philosopher" Robert Doyle, we find the statement: "Kelvin's claim that information must be destroyed when entropy increases would be correct if the universe were a closed system." [19] In review of his book, in 2012 email discussions, American electrochemical engineer Libb Thims queried Doyle about this, as follows: “Kelvin never ‘claimed that information must be destroyed when entropy increases’; neither is ‘information mathematically the opposite of entropy’; you seem to have the casual habit of bending and twisting things around to suit your agenda, each of which in the long run is eroding away your credibility.” To which Doyle replied: “You are quite right that Kelvin did not use the term ‘information’. I used it (anachronistically, to be sure) in my introduction chapter. Kelvin was a follow-on to Laplace's Demon, which I graphed as the conservation of information: i.e., the perfect knowledge of a god or demon who knew all the positions, momenta, and force laws. I guess we all twist things sometimes. Thanks for the critical reading.” [20] The same scenario could be the case with Shannon and Weaver.

In any event, in the years to follow, the Boltzmann information assertion quickly became scientific folklore.
In 1951, for example, American physicist Jerome Rothstein restated the matter as follows: [18]

“Boltzmann himself saw later that statistical entropy could be interpreted as a measure of missing information.”

English-born American investigative journalist Jeremy Campbell (1982, 1990) would later go on to popularize this notion as well.

Information transmission | Logarithms
In 1928, American electronics researcher Ralph Hartley published the paper "Transmission of Information", in which he used the word information as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols (high or low voltages) from any other in a telegraph transmission, thus quantifying information as:

H = n log S

where S is the number of possible symbols and n the number of symbols in a transmission. Hartley then concludes with: [8]

“What we have done is to take as our practical measure of information the logarithm of the number of possible symbol sequences.”
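For illustration, a minimal computational sketch, not from Hartley's paper, of the measure H = n log S, assuming a base-2 logarithm (so that H comes out in binary digits) and a hypothetical eight-pulse HI/LO telegraph message:

import math

def hartley_information(n, S, base=2):
    # Hartley's measure H = n log S; the choice of base fixes the unit (base 2 gives binary digits).
    return n * math.log(S, base)

print(hartley_information(8, 2))    # 8.0: eight HI/LO pulses carry 8 binary digits
print(hartley_information(8, 26))   # ~37.6: eight symbols drawn from a 26-letter alphabet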

Maxwell's demon & information
In the years 1929-1930, Hungarian-born American physicist Leo Szilard and American physical chemist Gilbert Lewis gave nearly identical, albeit seemingly independent, Maxwell's demon type derivations in which they connected entropy to “information”, though not in the telegraphy signal transmission sense of the term, but rather in the sense of knowledge about the location of molecules in a two-compartment gas container arrangement.

Szilard

In 1929, Hungarian-born American physicist Leo Szilard argued that the logarithmic interpretation of entropy could be used to determine the entropy produced during the ‘measurement’ of information when a "memory-type" Maxwell’s demon (Szilard demon) discerns the speeds and positions of the particles in his two compartments. As a starting point, Szilard stated that the entropy produced by any such measurement by the demon could be approximated by: [9]

S = k log 2

where k is the Boltzmann constant and log is the natural logarithm, i.e. base e, which, to note, is often confused, in modern times, with the common logarithm (base 10). Szilard’s aim here was to show that "measurements themselves are accompanied by a production of entropy". Szilard, in turn, was a personal friend of Hungarian-born American chemical engineer John Neumann; in 1930, for example, they taught a theoretical physics seminar together with Erwin Schrodinger; and it would seem that it was Szilard's influence, through Neumann, that reached Claude Shannon, convincing him to call information by the name entropy. [2] Russian-born American mathematician Anatol Rapoport, in his 1986 General Systems Theory, comments on this 1929 Szilard paper that: [21]

“This was the point of departure chosen by Szilard (1929), who laid the groundwork for establishing a conversion factor between physical entropy and information.”

This, however, is a misattribution. Szilard never uses the term “information” in his article, but rather is concerned with the nature of “thermal fluctuations” (Brownian motion) in relation to the second law and the Maxwell’s demon argument, as previously touched on in the work of Marian Smoluchowski. His aim is to show that the energy associated with the "action" of measuring a system parameter, say a coordinate mapping the fluctuation parameter of the system, by a Maxwell’s demon type intelligent being would, in the sensory-neurological system, involve a degradation or dissipation of energy in the motor nervous system, whereby the quantitative entropy production associated with this degradation/dissipation, involved in this neurological “act”, is, according to his estimates, k log 2 units of entropy. Neither the term “information” nor “information theory” ever enters the discussion. The connection between Szilard and information seems to be but a later adumbration by other writers set on situating an agenda-based historical context.
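For a sense of scale, a minimal numerical sketch, assuming SI units and the modern value of the Boltzmann constant, of the k log 2 (natural logarithm) entropy production that Szilard estimates per measurement:

import math

k = 1.380649e-23        # Boltzmann constant, in J/K
S = k * math.log(2)     # Szilard's estimated entropy production per measurement act
print(S)                # ~9.57e-24 J/K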

Lewis
On 6 Jun 1930, seemingly independent of Szilard's publication (which was in German), American physical chemist Gilbert Lewis published his article “The Symmetry of Time in Physics”, wherein he employs a Maxwell’s demon argument, with a "cylinder closed at each end, and with a middle wall provided with a shutter", in which he derives the result that, in the simplest case, if we have one molecule which must be in one of the two flasks, the entropy becomes less by: [12]

S = k ln 2

if we know which is the flask in which the molecule is trapped; after which he concludes, in a statement that has since been re-quoted frequently, that: [17]

“Gain in entropy always means the loss of information, and nothing more.”

The 1930 Lewis article does not cite Szilard, although the argument and derivations are strikingly similar. American physicist Walter Grandy, in his 2008 historical outline of the development of the entropy concept, comments, in regard to Lewis’ 1930 article, that he was “apparently unaware of Szilard’s work”. [13] In comment on his article’s conclusions, in a 5 Aug 1930 letter to Irving Langmuir, Lewis stated that: [3]

“It was not easy for a person brought up in the ways of classical thermodynamics to come around to the idea that gain of entropy eventually is nothing more nor less than loss of information.”

It is unclear exactly what he means by this.

Neumann-Shannon anecdote
See main: Neumann-Shannon anecdote
In 1939-1940, Hungarian-born American chemical engineer John Neumann suggested to American engineer Claude Shannon that he should call information by the name ‘entropy’, the reasoning being that the equations are similar (both are logarithmic in form) and that nobody knows what entropy really is, so in a debate one will always have the advantage. In 1948, Shannon took Neumann’s advice: in his famous paper "A Mathematical Theory of Communication", he credited Hartley's derivation as the point at which the logarithmic function became the natural choice for measuring information and, in the same paper, to the ire of many thermodynamicists, equated Hartley's 1928 telegraph "system" model with Clausius' 1865 heat engine "system" model. In short, Shannon, using a formulation similar to that above, declared that H, being a measure of information, choice, and uncertainty, is the same H as used in statistical mechanics, specifically the H in Boltzmann's famous H-theorem, concluding with: [10]

“We shall call H the entropy of the set of probabilities.”

Forever after, countless information theory scientists have taken any and all types of information, "which is a very elastic term, ... whether being conducted by wire, direct speech, writing, or any other method", in Hartley's words, as being a direct equivalent to thermodynamic entropy, as derived from the study of the steam engine. In his paper, Shannon goes on to define the entropy H of the source in units of “bits per symbol”, in which, in the Hartley formulation, S = 2, corresponding to a source that can only send two voltage or current levels, high or low; hence Shannon’s entropy is measured in binary digits per symbol.
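As a minimal sketch of the measure just described, assuming a memoryless source with known symbol probabilities, Shannon's entropy H = -Σ p_i log2 p_i in bits per symbol can be evaluated as follows; the illustrative probabilities are hypothetical, and a fair two-level (HI/LO) source gives exactly one bit per symbol:

import math

def shannon_entropy(probs):
    # H = -sum(p * log2 p), in bits per symbol; zero-probability terms contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit/symbol: fair HI/LO source
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits/symbol: biased two-level source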

This variation of information entropy, or “Shannon entropy”, has permeated science to the effect that, for some, information is seen as the mediator or key aspect of life and evolution. [5] In 1972, American biophysicist Lila Gatlin stated the following view on entropy and information: [6]

“Stored information varies inversely with entropy; lowered entropy means a higher capacity to store information.”

If this statement were taken literally, it would indicate the following relation:

S ∝ 1/I

It is difficult, however, to track down the origin of this relation.

Other | Sense perception
In 1847, Scottish physicist James Maxwell stated that the only thing that can be directly perceived by the senses is force; on this view, information can also be defined as a force. In this framework, as studied in human chemistry, "information", or more correctly information reception, received by the body and transmitted to the mind, is understood in terms of field particle exchanges, such as is found in the exchange force that holds human molecules (people) together in human chemical bonds.

See also
Entropy (information)
Information entropy
Shannon entropy

References
1. Information (definition) – Dictionary.com
2. (a) Thims, Libb. (2007). Human Chemistry (Volume One), (preview), (Google books). Morrisville, NC: LuLu.
(b) Thims, Libb. (2007). Human Chemistry (Volume Two), (preview), (Google books). Morrisville, NC: LuLu.

3. Letter from Gilbert Lewis to Irving Langmuir, 5 August 1930. Quoted in Nathan Reingold, Science in America: A Documentary History 1900-1939 (1981), pg. 400.
5. (a) Brillouin, Leon. (1962). Science and Information Theory (2nd ed.). New York: Dover (reprint).
(b) Campbell, Jeremy. (1982). Grammatical Man - Information, Entropy, Language, and Life. New York: Simon and Schuster.
(c) Avery, John. (2003). Information Theory and Evolution. London: World Scientific.
6. Gatlin, Lila L. (1972). Information Theory and the Living System. Columbia University Press.
7. (a) Shannon, Claude E. and Weaver, Warren. (1949). The Mathematical Theory of Communication (pg. 3, footnote). University of Illinois Press.
(b) Campbell, Jeremy. (1982). Grammatical Man - Information, Entropy, Language, and Life (pg. 44). New York: Simon and Schuster.
(c) Campbell, Jeremy. (1990). “Observer and Object, Reader and Text: Some Parallel Themes in Modern Science and Literature”, in: Beyond the Two Cultures: Essays on Science, Technology, and Literature (pg. 26), ed. Joseph Slade and Judith Lee. Iowa State University Press.
(d) Hokikian, Jack. (2002). The Science of Disorder: Understanding the Complexity, Uncertainty, and Pollution in Our World (pg. 57). Los Feliz Publishing.
(e) Baeyer, Hans C.V. (2004). Information: the New Language of Science (pg. 73). Harvard University Press.
8. Hartley, R. V. L. (1928). “Transmission of Information”, Bell System Technical Journal, July, pgs. 535-64; presented at the International Congress of Telegraphy and Telephony, Lake Como, Italy, Sept. 1927.
9. Szilárd, Leó. (1929). “On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings” (Uber die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen), Zeitschrift fur Physik, 53, 840-56.
10. (a) Shannon, Claude E. (1948). "A Mathematical Theory of Communication" (bit, pg. 1), Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, July, October.
(b) Shannon, Claude E. and Weaver, Warren. (1949). The Mathematical Theory of Communication. Illinois: The University of Illinois Press.
11. De Quincey, Christian. (2002). Radical Nature: Rediscovering the Soul of Matter (thermodynamics, pgs. 32, 211; Teilhard, 8+ pgs.; dead matter, pg. 34). Invisible Cities Press.
12. Lewis, Gilbert. (1930). “The Symmetry of Time in Physics”, Science, 71:569-77, Jun 6.
13. Grandy, Walter T. Jr. (2008). Entropy and the Time Evolution of Macroscopic Systems, Volume 10 (pg. 19). Oxford University Press.
14. Schement, Jorge R. and Ruben, Brent D. (1993). Information Theory and Communication, Volume 4 (pgs. 43, 53). Transaction Publishers.
15. Coveney, Peter and Highfield, Roger. (1990). The Arrow of Time: a Voyage Through Science to Solve Time’s Greatest Mystery (pg. 253). Fawcett Columbine.
16. Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair”, Journal of Human Thermodynamics, 8(1):1-##.
17. Ben-Naim, Arieh. (2012). Discover Entropy and the Second Law of Thermodynamics: a Playful Way of Discovering a Law of Nature (pg. 12). World Scientific.
18. Rothstein, Jerome. (1951). “Information, Measurement, and Quantum Mechanics” (html), Science, 114: 171-75.
19. Doyle, Bob. (2011). Free Will: the Scandal in Philosophy (pg. 10). I-Phi Press.
20. Email communication between Libb Thims and Robert Doyle (17 Jan 2012).
21. Rapoport, Anatol. (1986). General Systems Theory: Essential Concepts and Applications (§Exorcising Maxwell’s Demon, pgs. 130-). Taylor and Francis.

Further reading
● Volkenstein, Mikhail V. (1986). Entropy and Information. Nauka Publishers (Springer, 2009).

External links
Information – Wikipedia.
