“Workers in other fields should realize that the basic results of the subject [communication channels] are aimed in a very specific direction, a direction that is not necessarily relevant to such fields as psychology, economics, and other social sciences.”

Very few people in the decades to follow, however, ever received this memo. As a result, many people in modern times actually believe that Shannon's information theory is rooted in thermodynamics, in the sense that his so-called "Shannon entropy" (a measure of highs and lows in information transmissions) is exactly the same as "Boltzmann entropy" (a particle velocity/position interpretation of "Clausius entropy"), which is incorrect.
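One concrete way to see the distinction (an illustrative calculation, not from the source text): Boltzmann's S = k_B ln W carries physical units of joules per kelvin, whereas Shannon's H = −Σ pᵢ log₂ pᵢ is a dimensionless count of bits. A minimal Python sketch:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI redefinition)

def boltzmann_entropy(W):
    """Boltzmann's S = k_B * ln(W): a physical quantity in J/K."""
    return k_B * math.log(W)

def shannon_entropy(probs):
    """Shannon's H = -sum(p * log2(p)): a dimensionless number of bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equally likely microstates vs. two equally likely symbols:
print(boltzmann_entropy(2))         # ~9.57e-24 J/K, a physical quantity
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit, a pure number
```

The two formulas share an algebraic shape, but only the first is tied to temperature, heat, and work; the second is a bookkeeping measure on symbol probabilities.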

Genius rankings

As a boy, he rigged a telegraph machine to the barbed-wire fence that ran along his country road, so that he could talk to a friend a half-mile away.

The Bandwagon

See main: Shannon bandwagon

In 1956, Shannon penned a now-famous editorial, “The Bandwagon”, wherein he pleaded with everyone to stop using his so-called information entropy theory outside of communications engineering proper. [10] Reaction articles to this editorial followed in the aftermath, including one from Norbert Wiener entitled “What is Information Theory?” [13] and, in 1958, one from American electrical engineer Peter Elias entitled “Two Famous Papers”. [12]

Thermodynamics

A huge misconception in the minds of many modern scientists is the view that Shannon's information theory is based on or derived from thermodynamics. This could not be farther from the truth. The truth behind the myth is that sometime in the 1940s, while Shannon was developing his theory, he happened to visit Hungarian chemical engineer John Neumann, a close friend of Leo Szilard, who had previously published a 1929 article on Maxwell's demon containing a calculation of the entropy generated during the information collection and storage processes of the demon's mind. Owing to this influence, Neumann suggested, humorously, that Shannon call his measure of voltage-and-current type information (highs and lows in telephone wires) by the name entropy. On this suggestion, in his 1948 paper, Shannon states, in a subtle yet very influential way that seemed to connect his work to thermodynamics, that:

"The form of H will be recognized as that of entropy as defined in statistical mechanics.”

Then, after explaining his formulation of H as a function of a set of probabilities involved in the transmission of information (line currents), he concludes, in reference to Austrian physicist Ludwig Boltzmann's famous 1872 H-theorem paper, that “H is then, for example, the H in Boltzmann's famous H theorem.” This analogy soon drew criticism:

“Everyone knows that Shannon’s derivation is in error.”— Benoit Mandelbrot (1961)

One of the greatest misconceptions in modern science, perpetuated most likely by the fragmentation and relative isolation of the various modern branches of science, is the view that Shannon was a great thermodynamicist. This is a hugely erroneous myth. Shannon was an electrical engineer who studied the transmission and coding of voltages and electrical currents in telephone wires, no more, no less. Shannon never had any formal education or training in thermodynamics, and his work had nothing to do with thermodynamics. Yet, for reasons that require further discussion, in online polls asking who the greatest thermodynamicist of all time is, Shannon's name pops up, which is a puzzling phenomenon. [6]

Questionable applications

After hearing of Mandelbrot’s 1961 criticism, Shannon continued to express “misgivings about using his definition of entropy for applications beyond communication channels.” [3] In any event, Shannon's warnings didn't help, and in the decades to follow his loosely verbalized association of his communication-theory H function with Boltzmann's statistical-thermodynamic H function led hundreds of individuals (examples: James Coleman (1964), Stephen Coleman (1975), Orrin Klapp (1978), Jay Teachman (1980), Kenneth Bailey (1990), etc.) to write theoretical papers and books on supposed connections between communication, information, entropy, and thermodynamics, all of which, of course, are unsubstantiated. Shannon's formulation has come to be known as information entropy, Shannon entropy, and information-theoretic entropy, among other names.

Tribus

To cite one dominant example of the influence of Shannon's thermodynamics-borrowed terminology: in 1948, American engineer Myron Tribus was asked during the examination for his doctoral degree, at UCLA, to explain the connection between Shannon entropy and Clausius entropy. In retrospect, in 1998, Tribus commented that he went on to spend ten years on this issue: [3]

“Neither I nor my committee knew the answer. I was not at all satisfied with the answer I gave. That was in 1948 and I continued to fret about it for the next ten years. I read everything I could find that promised to explain the connection between the entropy of Clausius and the entropy of Shannon. I got nowhere. I felt in my bones there had to be a connection; I couldn’t see it.”

Information entropy

See main: Information entropy, Shannon entropy, etc.

Shannon’s revolutionary idea of digital representation was to sample the information source at an appropriate rate and convert the samples to a bit stream. He characterized the source by a single number, the entropy, adapting a term from statistical mechanics, to quantify the information content of the source. For English-language text, Shannon viewed entropy as a statistical parameter that measured how much information is produced on average by each letter. [2] The equation for H that Shannon defines as entropy is:

H = −K Σᵢ pᵢ log pᵢ

in which K is a positive constant (amounting to a choice of unit) and the pᵢ are the probabilities of occurrence of the possible symbols.
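As a modern illustration (not code from Shannon's paper), H can be computed directly from a set of symbol probabilities. The minimal Python sketch below takes K = 1 and uses base-2 logarithms, so H is measured in bits per symbol:

```python
import math

def shannon_entropy(probs):
    """Shannon's H = -sum(p_i * log2(p_i)), in bits per symbol.
    Zero-probability symbols contribute nothing to the sum."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: two equally likely symbols carry exactly 1 bit each
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased source carries less information per symbol
print(shannon_entropy([0.9, 0.1]))   # ~0.469

# 26 equally likely letters: log2(26) ~ 4.70 bits is the upper bound;
# real English text, with its skewed letter frequencies, falls well below it
print(shannon_entropy([1/26] * 26))  # ~4.70
```

The uniform distribution maximizes H, which is why Shannon could treat the gap between log₂(26) and the measured entropy of English as redundancy available for compression.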

Difficulties on theory

The essential difficulty of Shannon’s idea of entropy is that its terminology is a verbal crossover, culled from statistical thermodynamics, but having little connection, if any at all, to thermodynamics. This has led countless writers, having little training in thermodynamics, to proffer some of the most illogical, backwardly reasoned papers ever written, however noble the intentions. These papers, from a thermodynamic point of view, become almost a strain on the mind to read. The 2007 views of German physicist Ingo Müller summarize the matter to a tee: [3]

“No doubt Shannon and von Neumann thought that this was a funny joke, but it is not, it merely exposes Shannon and von Neumann as intellectual snobs. Indeed, it may sound philistine, but a scientist must be clear, as clear as he can be, and avoid wanton obfuscation at all cost. And if von Neumann had a problem with entropy, he had no right to compound that problem for others, students and teachers alike, by suggesting that entropy had anything to do with information.”

Müller clarifies that “for level-headed physicists, entropy (or order and disorder) is nothing by itself. It has to be seen and discussed in conjunction with temperature and heat, and energy and work. And, if there is to be an extrapolation of entropy to a foreign field, it must be accompanied by the appropriate extrapolations of temperature, heat, and work.”

OMNI interview

In a 1987 interview by Anthony Liversidge of OMNI magazine, Shannon fielded questions on Wiener, information theory, and the second law: [8]

OMNI: Before you wrote your classic paper on The Mathematical Theory of Communication, Norbert Wiener went around the office at Bell Labs announcing ‘information is entropy’. Did that remark provoke you in any way to come up with information theory?

Shannon: No, I hadn’t even heard of that remark when I started my work. I don’t think Wiener had much to do with information theory. He wasn’t a big influence on my ideas there, though I once took a course from him.

OMNI: Do you agree with Norbert Wiener, who is reported to have denied any basic distinction between life and non-life, man and machine?

Shannon: That’s a loaded question! Let me say this. I am an atheist to begin with. I believe in evolution theory and that we are basically machines, but a very complex type.

OMNI: Does your theory give a hint of how life might have evolved, seemingly in the face of the second law of thermodynamics, which says that order should slowly disintegrate?

Shannon: The evolution of the universe is certainly a puzzling thing to me as well as to everybody else. It’s fantastic we’ve ever come to the level of organization we have, starting from a big bang. Nonetheless, I believe in the big bang.


Education

Shannon completed a BS in electrical engineering and a BS in mathematics at the University of Michigan in 1936. He then completed an MS in 1937, with the thesis "A Symbolic Analysis of Relay and Switching Circuits", wherein he showed how the mathematics of British mathematician George Boole, namely Boolean algebra, could be applied to the analysis and design of relay and switching circuits.
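The central idea of the relay-circuit thesis, that switching networks obey a two-valued algebra, can be sketched as follows (a modern illustration, not code from the thesis): contacts wired in series behave as logical AND, contacts wired in parallel as logical OR, and algebraic identities then let one simplify circuits on paper.

```python
def series(a, b):
    # Two relay contacts in series: current flows only if both are closed (AND)
    return a and b

def parallel(a, b):
    # Two relay contacts in parallel: current flows if either is closed (OR)
    return a or b

# A De Morgan-style identity of the kind used to simplify circuits:
# NOT (a AND b) == (NOT a) OR (NOT b), checked over all contact states
for a in (False, True):
    for b in (False, True):
        assert (not series(a, b)) == parallel(not a, not b)
print("relay algebra holds for all input combinations")
```

Equivalences like this meant a tangle of relays could be replaced by a provably equivalent, smaller network, which is why the thesis is often credited as the foundation of digital circuit design.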

Shannon then completed his PhD in 1940, under the direction of Vannevar Bush, at the Massachusetts Institute of Technology, with the dissertation "An Algebra for Theoretical Genetics", in which he tried to do for genetics what he had done for electronic circuits, but, according to physical economics historian Philip Mirowski, the attempt was premature. [15] It may be that it was from here that Lila Gatlin gleaned her misaligned inspiration to apply information theory to DNA transcription and gene expression, but this is only a conjecture.

In 1940, Shannon became a National Research Fellow at the Institute for Advanced Study in Princeton, New Jersey, staying there for one year, during which time the famous Neumann-Shannon anecdote occurred. [14]

In 1941, Shannon joined Bell Labs, remaining affiliated with this group, university appointments aside, until 1972.

Quotes | On

The following are quotes on Shannon:

“Apparently, Shannon is a genius.”— Vannevar Bush (1939), written comment

“Shannon is the most important genius you’ve never heard of, a man whose intellect was on par with Albert Einstein and Isaac Newton.”— Jimmy Soni (2017), “10,000 Hours With Claude Shannon” (Ѻ)

Quotes | By

The following are quotes by Shannon:

“Neumann is the smartest person I have ever met.”— Claude Shannon (c.1941)

“Although this wave of popularity is certainly pleasant and exciting for those of us working in the [information science] field, it carries at the same time an element of ‘danger’. Information theory is no panacea to solve nature’s secrets. It is too easy for our somewhat artificial prosperity to collapse overnight when it is realized that the use of a few ‘exciting words’ like information, entropy, redundancy, do NOT solve all our problems. Workers in other fields should realize that the basic results of the subject [information science] are aimed in a very specific direction, a direction that is NOT relevant to such fields as: psychology, economics, and other social sciences.”— Claude Shannon (1956), “The Bandwagon” (Ѻ), Mar

“I think Wiener had a great brilliance. I’m not putting down his great mind. I think he really did have a tremendous IQ and a tremendous grasp of many things.”— Claude Shannon (1982)

References

1. Shannon, Claude E. (1948). "A Mathematical Theory of Communication", Bell System Technical Journal, 27: 379-423, 623-656.

2. Claude Shannon 1916-2001 – Research.ATT.com.

3. Muller, Ingo. (2007). A History of Thermodynamics: the Doctrine of Energy and Entropy. Springer.

4. Tribus, M. (1998). “A Tribute to Edwin T. Jaynes”. In

5. (a) IEEE Transactions on Information Theory, December 1955.

(b) Hillman, Chris. (2001). “Entropy in the Humanities and Social Sciences.” Hamburg University.

6. Orzel, Chad. (2009). “Historical Physicist Smackdown: Thermodynamics Edition”,

7. Avery, John (2003).

8. (a) Shannon, Claude. (1987). “Claude Shannon: Interview: Father of the Electronic Information Age”, by Anthony Liversidge,

(b) Liversidge, Anthony. (1987). “Profile of Claude Shannon”, reprinted in:

(c) Kahre, Jan. (2002).

9. Horgan, J. (1992). “Claude Shannon” (abs),

10. (a) Shannon, Claude. (1956). “The Bandwagon”,

(b) Mitra, Partha and Bokil, Hemant. (2007).

11. Aftab, O., Cheung, P., Kim, A., Thakkar, S., and Yeddanapudi, N. (2001). “Information Theory and the Digital Age”, Project History, Massachusetts Institute of Technology.

12. (a) Elias, Peter. (1958). “Two Famous Papers” (pdf),

(b) Mitra, Partha, and Bokil, Hemant. (2008).

13. Wiener, Norbert. (1956). “What is Information Theory?”, IRE Transactions on Information Theory.

14. Claude Shannon – History-Computer.com.

15. Mirowski, Philip. (2002). Machine Dreams: Economics Becomes a Cyborg Science. Cambridge University Press.

16. Poundstone, William. (2005).

17. Price, Robert. (1982). “Interview: Claude E. Shannon”, Winchester, MA, Jul 28.

Further reading

● Shannon, Claude E. (1950). "Prediction and Entropy of Printed English", Bell System Technical Journal, 30: 50-64.

External links

● Claude Shannon – Wikipedia.