A 2010 parody of the Neumann-Shannon anecdote, by Israeli physical chemist Arieh Ben-Naim, from his three-page chapter section “Snack: Who’s Your Daddy?”, in which he retells the story such that Shannon has a new baby and he and his wife are deciding which name to pick: ‘information’ or ‘uncertainty’? Neumann suggests they name their son ‘Entropy’, after the name of Rudolf Clausius’ son, which Shannon decides to do, only to find out, in the years that follow, that people continually confuse his son with Clausius’ son and also misuse and abuse the name; after which it is suggested to Shannon that he change his son’s name to ‘Smomi’, a short acronym for “Shannon’s Measure Of Missing Information”. [2]

Date

The date of the infamous Neumann-Shannon conversation is predominantly attributed to the year 1940 and said to have occurred at the Institute for Advanced Study, Princeton, New Jersey, where John von Neumann was one of the main faculty members. [12]

To put the infamous conversation in chronological context:

Then, in the fall of 1940, Shannon accepted a National Research Fellowship that allowed him to work under German mathematician Hermann Weyl at the Institute for Advanced Study, Princeton, New Jersey. Then, in the spring of 1941, he was back at Bell Laboratories. [11]

This would put the conversation as having occurred between the fall of 1940 and the spring of 1941. It would seem likely that the conversation—being something important on Shannon’s mind—would have occurred early upon his arrival at the Institute, which would place the conversation in September to December of 1940.

A 2010 snapshot summary of the 1940 Neumann-Shannon anecdote from the lecture-notes-turned-book Biomedical Informatics by German computer scientist and cognitive theorist Andreas Holzinger. [15]

American information theory historians Jorge Schement and Brent Ruben, however, state that Shannon “spent a year (1939-1940) as a National Research Fellow at the Institute for Advanced Study, studying mathematics and Boolean algebra with Hermann Weyl.” [1] This 1939 date, however, seems to be incorrect, as most sources state that Shannon did not finish his PhD at MIT until 1940. [12] In agreement with this, Schement, in 2012, issued a correction retracting the "1939" date, stating that they may have jumped the gun in regard to this dating timeframe.

Shannon used the word "entropy" once in his classified 1945 “A Mathematical Theory of Cryptography”. [16]

Overview

The Neumann-Shannon anecdote has been retold so many times that it has been classified by some as an urban legend in science.

In April 1961, the story of the incident became public knowledge when American engineer Myron Tribus was invited to give a seminar at MIT on a new way to derive thermodynamics based on information theory. The audience, according to Tribus, was a critical one, comprised of students of American mechanical engineer Joseph Keenan, founder of the MIT school of thermodynamics, who “tried to rip it apart”, and also included French mathematician Benoit Mandelbrot among the critics.

It also happened to be the case that Shannon was in residence at MIT that week, so naturally enough Tribus went to see him. Shannon, according to Tribus, “was immediately able to dispel Mandelbrot’s criticism, but went on to lecture me on his misgivings about using his definition of entropy for applications beyond communication channels.” [3] During this meeting, Tribus queried Shannon as to his reason for choosing to call his information function by the name ‘entropy’, the details of which were first made public in his 1963 article “Information Theory and Thermodynamics”. [8]

“What’s in a name? In the case of Shannon’s measure the naming was not accidental. In 1961 one of us (Tribus) asked Shannon what he had thought about when he had finally confirmed his famous measure. Shannon replied: ‘My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more importantly, no one knows what entropy really is, so in a debate you will always have the advantage.’”

Likewise, Tribus, in his 1987 article “An Engineer Looks at Bayes”, recounts his discussion with Shannon on this question as follows: [5]

“The same function appears in statistical mechanics and, on the advice of John von Neumann, Claude Shannon called it ‘entropy’. I talked with Dr. Shannon once about this, asking him why he had called his function by a name that was already in use in another field. I said that it was bound to cause some confusion between the theory of information and thermodynamics. He said that von Neumann had told him: ‘No one really understands entropy. Therefore, if you know what you mean by it and you use it when you are in an argument, you will win every time.’”

Truncated variations of the above have been retold ever since.

In 1982, Shannon, in a recorded interview, commented, rather hazily, on this past event as follows: [17]

Shannon: Well, let me also throw into this pot, Szilard, the physicist. And von Neumann, and I’m trying to remember the story. Do you know the story I’m trying to remember?

Price: Well, there are a couple of stories. There’s the one that Myron Tribus says that von Neumann gave you the word entropy, saying to use it because nobody, you’d win every time because nobody would understand what it was.

Shannon: [laughs]

Price: And furthermore, it fitted p*log(p) perfectly. But that, but then I’ve heard . . .

Shannon: von Neumann told that to me?

Price: That’s what you told Tribus that von Neumann told that to you.

Shannon: [laughs – both talking at once]

Price: Bell Labs too, that entropy could be used. That you already made that identification. And furthermore in your cryptography report in 1945, you actually point out, you say the word entropy exactly once in that report. Now this is 1945, and you liken it to statistical mechanics. And I don’t believe you were in contact with von Neumann in 1945, were you? So it doesn’t sound to me as though von Neumann told you entropy.

Shannon: No, I don’t think he did.

Price: This is what Tribus quoted.

Shannon: Yeah, I think this conversation, it’s a very odd thing that this same story that you just told me was told to me at Norwich in England. A fellow —

Price: About von Neumann, you mean?

Shannon: Yeah, von Neumann and me, this conversation, this man, a physicist there, and I’ve forgotten his name, but he came and asked me whether von Neumann, just about the thing that you told me, that Tribus just told you, about this fellow. . .

Price: That was Jaynes, I imagine the physicist might have been [Edwin] Jaynes.

Shannon: Yes, I think it was, I think so. Do you know him?

Price: Well, he’s published in the same book as Tribus, you see. This is a book called The Maximum Entropy Formalism. You’ve probably seen that book, but they have chapters in it, and Jaynes, the physicist —

Shannon: Now, I’m not sure where I got that idea, but I think I, somebody had told me that. But anyway, I think I can, I’m quite sure that it didn’t happen between von Neumann and me.

Price: Right. Well, I think that the fact that it’s in your 1945 cryptography report establishes that, well, you didn’t get it from von Neumann, that you had made the p*log(p) identification with entropy by some other means. But you hadn’t been —

Shannon: Well, that’s an old thing anyway, you know.

Price: You knew it from thermodynamics.

Shannon: Oh, yes, from thermodynamics. That goes way back.

Price: That was part of your regular undergraduate and graduate education of thermodynamics and the entropy?

Shannon: Well, not in class, exactly, but I read a lot, you know.


Restated versions of the anecdote

The following are restated versions of the conversation, originally reported by Tribus (above), that vary depending on the source and point of view. If one looks closely at the versions reported by Tribus himself, we see him shifting from the report that Neumann said the function was already used in "statistical thermodynamics" (1963) to "statistical mechanics" (1971) to "thermodynamics" (1983), which is somewhat humorous in itself:

Evolution of the Neumann-Shannon Conversation

The conversation, between John von Neumann and Claude Shannon, occurred between fall 1940 and spring 1941 at the Institute for Advanced Study, Princeton, New Jersey. In April of 1961, Myron Tribus visited Shannon at his office at MIT and questioned him about the reason behind his "entropy" name adoption. The retellings, by source, are as follows:

Myron Tribus (1963) [8]:
"When Shannon discovered this function he was faced with the need to name it, for it occurred quite often in the theory of communication he was developing. He considered naming it "information" but felt that this word had unfortunate popular interpretations that would interfere with his intended uses of it in the new theory. He was inclined towards naming it "uncertainty" and discussed the matter with the late John von Neumann. Von Neumann suggested that the function ought to be called "entropy" since it was already in use in some treatises on statistical thermodynamics (e.g. ref. 12). Von Neumann, Shannon reports, suggested that there were two good reasons for calling the function "entropy". "It is already in use under that name," he is reported to have said, "and besides, it will give you a great edge in debates because nobody really knows what entropy is anyway." Shannon called the function "entropy" and used it as a measure of "uncertainty," interchanging the two words in his writings without discrimination."

Myron Tribus and Edward McIrvine (1971) [4]:
“What’s in a name? In the case of Shannon’s measure the naming was not accidental. In 1961 one of us (Tribus) asked Shannon what he had thought about when he had finally confirmed his famous measure. Shannon replied: ‘My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more importantly, no one knows what entropy really is, so in a debate you will always have the advantage.’”

Myron Tribus (1983), as quoted by Philip Mirowski (2002) [13]:
“You should call it ‘entropy’ for two reasons: First, the function is already used in thermodynamics under that name; second, and more importantly, most people don’t know what entropy really is, and if you use the word ‘entropy’ in an argument you will win every time.”

Jeremy Campbell (1982) [9]:
At first, Shannon did not intend to use such a highly charged term for his information measure. He thought “uncertainty” would be a safe word. But he changed his mind after a discussion with John von Neumann, the mathematician whose name is stamped upon some of the most important theoretical work of the first half of the twentieth century. Von Neumann told Shannon to call his measure entropy, since “no one really knows what entropy is, so in a debate you will always have the advantage.”

Peter Coveney and Roger Highfield (1992) [10]:
Shannon, the pioneer of information theory, was only persuaded to introduce the word 'entropy' into his discussion by the mathematician John von Neumann who is reported to have told him: “it will give you a great edge in debates because nobody really knows what entropy is anyway.”

John Avery (2003) [7]:
“The theory was in excellent shape, except that he needed a good name for ‘missing information’. ‘Why don’t you call it entropy’, von Neumann suggested. ‘In the first place, a mathematical development very much like yours already exists in Boltzmann’s statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage’.”

By 1990, Shannon’s

Discussion

There is some reference, to note, where it is claimed that Shannon said that he did not get the concept of entropy from John von Neumann, who had been working with entropy formulas since 1927, or from Alan Turing (Ellersick, 1984), with whom Shannon frequently met for lunch at Bell Labs in the 1940s, or from Norbert Wiener (Omni, 1987), who in 1946 was equating entropy with information, but rather that he derived his equation for the amount of information and found it was identical to the formula that physicists use to calculate the quantity known as entropy in thermodynamics (Bello, 1953). [6]

In his 1968 article “Information Theory”, Shannon commented on these connections as follows: [20]

“The formula for the amount of information is identical in form with equations representing entropy in statistical mechanics, and suggests that there may be deep-lying connections between thermodynamics and information theory. Some scientists believe that a proper statement of the second law of thermodynamics requires a term related to information. These connections with physics, however, do not have to be considered in the engineering and other [fields?].”
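The formal identity Shannon alludes to can be shown side by side. The following is a standard comparison, not taken from Shannon’s article, writing his measure with an arbitrary positive constant K as in his 1948 paper:

```latex
% Shannon's information measure, for a source whose symbols occur
% with probabilities p_1, ..., p_n (K an arbitrary positive constant):
H = -K \sum_{i=1}^{n} p_i \log p_i

% Gibbs entropy of statistical mechanics, for microstate
% probabilities p_i (k_B is Boltzmann's constant):
S = -k_B \sum_{i} p_i \ln p_i

% The two expressions are identical in form, differing only in the
% multiplying constant and the base of the logarithm.
```

This formal identity is what allows the p*log(p) function discussed in the Price interview above to be “likened to statistical mechanics” without any claim about a physical connection.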

Shannon, during his 1977 interview with graduate student science historian Friedrich-Wilhelm Hagemeyer, reportedly stated that he had not been influenced by Leo Szilard’s work. [18]

This, however, conflicts to some extent with the fact that Shannon was a close associate of Warren Weaver, who was the first to alert Leon Brillouin to the so-called importance of Szilard’s paper, and who, in 1950 at the Rockefeller Institute, formally introduced Szilard and Brillouin. [19]

Shannon, during his 1982 oral history interview, recorded on tape and conducted with communication engineer Robert Price (excerpted above), stated that he did not believe the von Neumann naming story, remarking “I’m quite sure that it didn’t happen between von Neumann and me.” [17]

References

1. Schement, Jorge R. and Ruben, Brent D. (1993).

2. Ben-Naim, Arieh. (2012).

3. Tribus, Myron. (1998). “A Tribute to Edwin T. Jaynes”. In

4. (a) Tribus, Myron and McIrvine, Edward C. (1971). “Energy and Information”,

(b) McIrvine, Edward C. (1959). “A Quantum Theory of Thermal Transport Phenomena in Metals” (abs), PhD Dissertation, Cornell University, Feb.

(c) Ben-Naim, Arieh. (2010).

5. Tribus, Myron. (1987). “An Engineer Looks at Bayes”, Seventh Annual Workshop: Maximum Entropy and Bayesian Methods, Seattle University, August, in:

6. (a) Shannon, Claude. (1987). “Claude Shannon: Interview: Father of the Electronic Information Age”, by Anthony Liversidge,

(b) Liversidge, Anthony. (1987). “Profile of Claude Shannon”, reprinted in:

(c) Kahre, Jan. (2002).

(d) Schement, Jorge R. and Ruben, Brent D. (1993).

(e) Bello, Francis. (1953). “The Information Theory”,

7. Avery, John (2003).

8. Tribus, Myron. (1963). "Information theory and thermodynamics", in:

9. Campbell, Jeremy. (1982).

10. Coveney, Peter V. and Highfield, Roger. (1992).

11. Slater, Robert. (1989).

12. (a) Claude Shannon – History-Computer.com.

(b) Mirowski, Philip. (2002).

(c) Day, Lance and McNeil, Ian. (1998).

13. (a) Tribus, Myron. (1983). “Thirty Years of Information Theory”, in:

(b) Mirowski, Philip. (2002).

14. Email communication with Libb Thims (20 Nov 2012).

15. Holzinger, Andreas. (2010).

16. (a) Price, Robert. (1982). “Interview: Claude E. Shannon”, Winchester, MA, Jul 28.

(b) Shannon, Claude. (1945). “A Mathematical Theory of Cryptography”, Memorandum MM 45-110-02, Sep 1, Bell Laboratories; declassified and published in 1949 in the

17. (a) Price, Robert. (1982). “A Conversation with Claude E. Shannon” (conducted: 28 July) (interview by Robert Price; edited by Fred Ellersick), in:

(b) Rogers, Everett M. (1994).

(c) Schement, Jorge R. and Ruben, Brent D. (1993).

18. (a) Guizzo, Erico M. (2003). “The Essential Message: Claude Shannon and the Making of Information Theory” (pg. 44). MS thesis, University of Sao Paulo, Brazil.

(b) Hagemeyer, Friedrich-Wilhelm. (1977). “Interview: Claude Shannon”, PhD dissertation “Die Entstehung von Informationkonzepten in der Nachrichtentecknik”, Free University of Berlin, Germany

(c) Erico Guizzo (2003) states that he has digital MP3 files, mailed to him by Hermann Rotermund, who transformed Hagemeyer’s tape-recorded analog interviews into MP3, in the form of a CD; which were used in the writing of his MS thesis.

19. (a) Lanouette, William. (1992).

(b) Mirowski, Philip. (2002).

20. (a) Shannon, Claude E. (1968). “Information Theory”,

(b) Guizzo, Erico M. (2003). “The Essential Message: Claude Shannon and the Making of Information Theory” (pg. 47). MS thesis, University of Sao Paulo, Brazil.

Further reading

● Dutta, Mahadev. (1968). “A Hundred Years of Entropy” (abs),

● Skagerstam, Bo-Sture K. (1975). “On the Notions of Entropy and Information”,

● Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (url),