“Neither I nor my committee knew the answer. I was not at all satisfied with the answer I gave. That was in 1948 and I continued to fret about it for the next ten years. I read everything I could find that promised to explain the connection between the entropy of Clausius and the entropy of Shannon. I got nowhere. I felt in my bones there had to be a connection; I couldn’t see it.”
Tribus' 1971 Maxwell's demon diagram, about which he says that the demon was finally "exorcised" in 1951 by French-born American physicist Leon Brillouin, who pointed out that if the demon were to identify the molecules, he would have to illuminate them in some way, causing an increase in entropy that would more than compensate for any decrease in entropy such a being could effect. [7]
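Brillouin's accounting can be sketched in standard notation (a reconstruction, not notation taken from Tribus' diagram): to see a molecule against the blackbody radiation filling the gas, the demon must scatter off it a photon of energy h\nu \gg k_B T. Absorbing that photon raises the entropy of the gas by at least

\Delta S_{\text{light}} \geq \frac{h\nu}{T} \gg k_B,

while the most the demon can lower the entropy with the resulting bit of information is

\Delta S_{\text{sort}} = -k_B \ln 2 \approx -0.69\, k_B,

so the net change \Delta S_{\text{light}} + \Delta S_{\text{sort}} > 0 and the second law stands.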
“Here was my Rosetta Stone! I went home and worked with that paper for a week, almost without sleep. All of my studies for a decade suddenly fell in place. A straightforward connection between Shannon’s entropy and the entropy of Clausius poured into the paper, almost without my hand.”
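The connection Tribus describes can be sketched in modern notation (a reconstruction, not the form of his own derivation): Shannon’s measure over the microstates of a system,

H = -\sum_i p_i \log_2 p_i,

multiplied by k_B \ln 2, becomes the Gibbs form of statistical-mechanical entropy,

S = -k_B \sum_i p_i \ln p_i = (k_B \ln 2)\, H,

which for W equally probable microstates reduces to Boltzmann’s S = k_B \ln W, and whose changes in equilibrium reproduce the Clausius definition dS = \delta Q_{\text{rev}}/T.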
“I took an overnight train and showed up in his office the next morning, surely acting like a crazed man. I remember going to the blackboard in his office and pouring out the derivation of the laws of thermodynamics.”
“Everyone knows that Shannon’s derivation is in error.”
“What’s in a name? In the case of Shannon’s measure the naming was not accidental. In 1961 one of us (Tribus) asked Shannon what he had thought about when he had finally confirmed his famous measure. Shannon replied: ‘My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more importantly, no one knows what entropy really is, so in a debate you will always have the advantage.’”
“The same function appears in statistical mechanics and, on the advice of John von Neumann, Claude Shannon called it ‘entropy’. I talked with Dr. Shannon once about this, asking him why he had called his function by a name that was already in use in another field. I said that it was bound to cause some confusion between the theory of information and thermodynamics. He said that von Neumann had told him: ‘No one really understands entropy. Therefore, if you know what you mean by it and you use it when you are in an argument, you will win every time.’”
“After my book on Thermodynamics appeared, I sat back and waited for the call from the Nobel Committee. They never wrote. But other people did write, and it was not at all what I wanted to hear. My attempts at getting engineering professors to adopt a new way to look at thermodynamics were a complete failure. Not only that, but I was attacked as someone who had taken intellectual shortcuts. Once, during a presentation at IIT in Chicago, I showed how Ed Jaynes’ methods developed, in four lines, the equations that Fowler and Guggenheim required hundreds of pages to reach. Not once did we mention ergodicity. The presentation did not convince most of the audience. Their reaction: Surely we had cheated somewhere.” — Myron Tribus (1998), “A Tribute to Edwin T. Jaynes” [8]
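The “four lines” Tribus alludes to are presumably Jaynes’ 1957 maximum-entropy derivation of the canonical distribution; a minimal sketch in standard notation (not necessarily that of the IIT presentation) runs: maximize H = -\sum_i p_i \ln p_i subject to \sum_i p_i = 1 and \sum_i p_i E_i = \langle E \rangle. Lagrange multipliers yield

p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},

so that at the maximum

S = k_B \left( \ln Z + \beta \langle E \rangle \right),

and identifying \beta = 1/(k_B T) recovers the thermodynamic relations that Fowler and Guggenheim reached through ergodic arguments.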