A 2010 parody of the Neumann-Shannon anecdote by Israeli physical chemist Arieh Ben-Naim, from his three-page chapter section “Snack: Who’s Your Daddy?”, in which he retells the story such that Shannon has a new baby, and he and his wife are deciding which name to pick: ‘information’ or ‘uncertainty’? Neumann suggests they name the son after Rudolf Clausius’ son ‘entropy’, which Shannon decides to do, only to find, in the years that follow, that people continually confuse his son with Clausius’ son and also misuse and abuse the name; after which it is suggested to Shannon that he change his son’s name to ‘Smomi’, an acronym for “Shannon’s Measure Of Missing Information”. [2]
A 2010 snapshot summary of the 1940 Neumann-Shannon anecdote, from the lecture-notes-turned-book Biomedical Informatics by German computer scientist and cognitive theorist Andreas Holzinger. [15]
“The same function appears in statistical mechanics and, on the advice of John von Neumann, Claude Shannon called it ‘entropy’. I talked with Dr. Shannon once about this, asking him why he had called his function by a name that was already in use in another field. I said that it was bound to cause some confusion between the theory of information and thermodynamics. He said that von Neumann had told him: ‘No one really understands entropy. Therefore, if you know what you mean by it and you use it when you are in an argument, you will win every time.’”
The anecdote was put to Shannon directly in a later interview conducted by Robert Price:

Shannon:
Well, let me also throw into this pot, Szilard, the physicist. And von Neumann, and I’m trying to remember the story. Do you know the story I’m trying to remember?
Price:
Well, there are a couple of stories. There’s the one that Myron Tribus says that von Neumann gave you the word entropy, saying to use it because nobody, you’d win every time because nobody would understand what it was.
Shannon:
[laughs]
Price:
And furthermore, it fitted p*log(p) perfectly. But that, but then I’ve heard . . .
Shannon:
von Neumann told that to me?
Price:
That’s what you told Tribus that von Neumann told that to you.
Shannon:
[laughs – both talking at once]
Price:
Bell Labs too, that entropy could be used. That you already made that identification. And furthermore in your cryptography report in 1945, you actually point out, you say the word entropy exactly once in that report. Now this is 1945, and you liken it to Statistical Mechanics. And I don’t believe you were in contact with von Neumann in 1945, were you? So it doesn’t sound to me as though von Neumann told you entropy.
Shannon:
No, I don’t think he did.
Price:
This is what Tribus quoted.
Shannon:
Yeah, I think this conversation, it’s a very odd thing that this same story that you just told me was told to me at Norwich in England. A fellow —
Price:
About von Neumann, you mean?
Shannon:
Yeah, von Neumann and me, this conversation, this man, a physicist there, and I’ve forgotten his name, but he came and asked me whether von Neumann, just about the thing that you told me, that Tribus just told you, about this fellow. . .
Price:
That was Jaynes, I imagine the physicist might have been [Edwin] Jaynes.
Shannon:
Yes, I think it was, I think so. Do you know him?
Price:
Well, he’s published in the same book as Tribus, you see. This is a book called The Maximum Entropy Formalism. You’ve probably seen that book, but they have chapters in it, and Jaynes, the physicist —
Shannon:
Now, I’m not sure where I got that idea, but I think I, somebody had told me that. But anyway, I think I can, I’m quite sure that it didn’t happen between von Neumann and me.
Price:
Right. Well, I think that the fact that it’s in your 1945 cryptography report establishes that, well, you didn’t get it from von Neumann, that you had made the p*log(p) identification with entropy by some other means. But you hadn’t been —
Shannon:
Well, that’s an old thing anyway, you know.
Price:
You knew it from thermodynamics.
Shannon:
Oh, yes, from thermodynamics. That goes way back.
Price:
That was part of your regular undergraduate and graduate education of thermodynamics and the entropy?
Shannon:
Well, not in class, exactly, but I read a lot, you know.
Evolution of the Neumann-Shannon Conversation

Source conversation: between John von Neumann and Claude Shannon, occurring between fall 1940 and spring 1941 at the Institute for Advanced Study, Princeton, New Jersey.

Myron Tribus (1963) [8]: In April 1961, Myron Tribus visited Shannon at his office at MIT and questioned him about the reason behind his adoption of the name “entropy”. Tribus recounts the interview as follows: “When Shannon discovered this function he was faced with the need to name it, for it occurred quite often in the theory of communication he was developing. He considered naming it ‘information’ but felt that this word had unfortunate popular interpretations that would interfere with his intended uses of it in the new theory. He was inclined towards naming it ‘uncertainty’ and discussed the matter with the late John von Neumann. Von Neumann suggested that the function ought to be called ‘entropy’ since it was already in use in some treatises on statistical thermodynamics (e.g. ref. 12). Von Neumann, Shannon reports, suggested that there were two good reasons for calling the function ‘entropy’. ‘It is already in use under that name,’ he is reported to have said, ‘and besides, it will give you a great edge in debates because nobody really knows what entropy is anyway.’ Shannon called the function ‘entropy’ and used it as a measure of ‘uncertainty,’ interchanging the two words in his writings without discrimination.”

Myron Tribus and Edward McIrvine (1971) [4]: “What’s in a name? In the case of Shannon’s measure the naming was not accidental. In 1961 one of us (Tribus) asked Shannon what he had thought about when he had finally confirmed his famous measure. Shannon replied: ‘My greatest concern was what to call it. I thought of calling it “information”, but the word was overly used, so I decided to call it “uncertainty”. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more importantly, no one knows what entropy really is, so in a debate you will always have the advantage.”’”

Myron Tribus (1983), as quoted in Philip Mirowski (2002) [13]: “You should call it ‘entropy’ for two reasons: First, the function is already used in thermodynamics under that name; second, and more importantly, most people don’t know what entropy really is, and if you use the word ‘entropy’ in an argument you will win every time.”

Jeremy Campbell (1982) [9]: At first, Shannon did not intend to use such a highly charged term for his information measure. He thought “uncertainty” would be a safe word. But he changed his mind after a discussion with John von Neumann, the mathematician whose name is stamped upon some of the most important theoretical work of the first half of the twentieth century. Von Neumann told Shannon to call his measure entropy, since “no one really knows what entropy is, so in a debate you will always have the advantage.”

Peter Coveney and Roger Highfield (1992) [10]: Shannon, the pioneer of information theory, was only persuaded to introduce the word ‘entropy’ into his discussion by the mathematician John von Neumann, who is reported to have told him: “it will give you a great edge in debates because nobody really knows what entropy is anyway.”

John Avery (2003) [7]: “The theory was in excellent shape, except that he needed a good name for ‘missing information’. ‘Why don’t you call it entropy’, von Neumann suggested. ‘In the first place, a mathematical development very much like yours already exists in Boltzmann’s statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage’.”
“The formula for the amount of information is identical in form with equations representing entropy in statistical mechanics, and suggests that there may be deep-lying connections between thermodynamics and information theory. Some scientists believe that a proper statement of the second law of thermodynamics requires a term related to information. These connections with physics, however, do not have to be considered in the engineering and other [fields?].”
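In modern notation, the formal identity the quote points to can be sketched as follows. Here H denotes Shannon’s measure over a probability distribution p_i (in bits) and S the Gibbs entropy of statistical mechanics with Boltzmann’s constant k_B; this pairing of symbols is the standard textbook statement, not taken from the quoted text:

\[
H = -\sum_i p_i \log_2 p_i,
\qquad
S = -k_B \sum_i p_i \ln p_i = (k_B \ln 2)\, H .
\]

Evaluated on the same distribution p_i, the two expressions differ only by the constant multiplicative factor k_B ln 2, which is the sense in which the formula for the amount of information is “identical in form” with the statistical-mechanical entropy.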