“In this little book the author has in the main sought to present the interpretation reached by Boltzmann and by Planck. The writer has drawn most heavily upon Planck, for he is at once the clearest expositor of Boltzmann and an original and important contributor. Now these two investigators reach the result that the entropy of any physical state is the logarithm of the probability of the state, and this probability is identical with the number of "complexions" of the state. This number is the measure of the permutability of certain elements of the state, and in this sense: Entropy is the measure of the disorder of the motions of a system of mass points. To realize more fully the ultimate nature of entropy, the writer has, in the light of these definitions, interpreted some well-known and much-discussed thermodynamic occurrences and statements.”
Klein's 1910 chapter "Reach and Scope of Entropy", in which he explains that entropy applies to any and all "bodies" of the universe and works through Max Planck's proof of this; compare: Moriarty-Thims debate (2009).
“The driving motive (or impelling cause) in all natural events is the difference between the existing entropy and its maximum value.”— Ludwig Boltzmann (c.1887), Publication; cited by Joseph Klein (1910) in: The Physical Significance of Entropy (pg. 98)
“The second law in its objective-physical form (freed from all anthropomorphism) refers to certain mean values which are found from a great number of like and ‘chaotic’ elements. This law has no independent significance, for its roots go down deep into the theory of probabilities. It is therefore conceivable that it is applicable to some purely human and animate events as well as to inanimate, natural events, provided the variable elements present constitute adequate haphazard for the calculus of probabilities.”— Joseph Klein (1910), The Physical Significance of Entropy; cited by Jay Labinger (1995) [3]