# Transmission of Information

Figure: four varieties of possible information sequences (high: 1, low: 0, or no signal), as transmitted in typical telegraph messages over increasing lengths of cable, A being a short transmission and D a long cable-length transmission.
In famous publications, “Transmission of Information” is a 1927 presentation given by American electronics engineer Ralph Hartley at the International Congress of Telegraphy and Telephony, in which he explained how to quantify mathematically, using logarithms, the capacity of a system to transmit information. The systems he considered were telegraph, telephone, and television transmission systems, which send information either as pulsations in radio waves or as changes in current and the correlative voltage level in wire telegraph lines. [1] The mathematical model of information in this paper formed the basis of the later 1948 model of information, and hence of "information theory", conceived by Claude Shannon, who called his measure "entropy" (or information-theoretic entropy) on the half-joking suggestion of John von Neumann, to the consternation of thermodynamicists in the years to follow, up to the present day.

## Overview
To illustrate his derivation, Hartley describes a hand-operated submarine telegraph cable system in which an oscillographic recorder traces the received message, or rather “information”, on photosensitive tape. The sending operator has at his or her disposal three positions of a sending key, corresponding to a high voltage, a low voltage, and no applied voltage. The figure above shows a given transmission, where A shows the sequence of key positions as they were sent, and B, C, and D are traces made by the recorder when receiving over an artificial cable of progressively increasing length. Trace B shows a signal that can be reconstructed to read the original sequence, trace C shows that more care is needed to reconstruct the original message, and trace D shows a hopelessly indistinguishable message.

To put this information transmission into a formulation, Hartley explains that at each point in the reading of the recorded tape of the transmitted signal, the reader must select one of three possible symbols (high, no-signal, low). If the reader makes two successive selections (the number of selections being symbolized by n), he or she will have 3², or 9, different permutations or symbol sequences. This scheme can then be extended to one in which, instead of three current or voltage levels to select from, the sender has s different current values to be applied to the line and to be distinguished from each other at the receiving end. With s symbols (or voltage levels) available at each selection, the number of distinguishable sequences of n selections is:

$s^n \,$
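This count can be sketched numerically in Python (an illustrative check, not from the paper; the names `s` and `n` follow the symbols above):

```python
def sequence_count(s: int, n: int) -> int:
    """Number of distinguishable sequences of n selections from s symbols."""
    return s ** n

# Hartley's example: 3 current levels (high, no-signal, low), 2 selections.
print(sequence_count(3, 2))  # -> 9
```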

He then notes that while the number of possible sequences grows exponentially with the number of selections, a practical measure of the amount of information transmitted ought to grow in proportion to it. On this basis, Hartley defines the value ‘H’ as the amount of information associated with n selections for a particular system and, through some derivation, arrives at the following logarithmic expression for information:

$H = n \log s \,$

“What we have done is to take as our practical measure of information the logarithm of the number of possible symbol sequences.”
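A minimal numerical sketch of this measure (function name and base choice are illustrative, not Hartley's):

```python
import math

def hartley_information(n: int, s: int, base: float = 10.0) -> float:
    """Hartley's measure H = n log s for n selections among s symbols."""
    return n * math.log(s, base)

# Two selections from three symbols: H = 2 log 3 = log 9,
# the logarithm of the number of possible symbol sequences.
print(round(hartley_information(2, 3), 4))  # -> 0.9542
```

Note that the base of the logarithm only fixes the unit of measure; the proportionality to n is what makes the measure practical.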

## Information theory
In his famous 1948 paper "A Mathematical Theory of Communication", American engineer Claude Shannon credited this derivation by Hartley as the point at which the logarithmic function became the natural choice for measuring information and, in the same paper, to the ire of many thermodynamicists, equated Hartley's 1928 telegraph "system" model with Clausius' 1865 heat engine "system" model. In short, Shannon, using a formulation similar to that above, declared that H, being a measure of information, choice, and uncertainty, is the same H as used in statistical mechanics, specifically the H in Boltzmann's famous H-theorem, concluding:

“We shall call H the entropy of the set of probabilities.”

Ever since, countless information theory scientists have taken any and all types of information, "which is a very elastic term, ... whether being conducted by wire, direct speech, writing, or any other method", in Hartley's words, as a direct equivalent of thermodynamic entropy, as derived from the study of the steam engine. In his paper, Shannon went on to define the entropy H of the source in units of “bits per symbol”, corresponding in the Hartley derivation to s = 2, a source that can send only two voltage or current levels, high or low; hence Shannon’s entropy is measured in binary digits per symbol.
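With s = 2 and a base-2 logarithm, H = n log₂ 2 = n, so each binary selection carries exactly one bit. A short illustrative sketch (function name is an assumption, not Shannon's notation):

```python
import math

def hartley_bits(n: int, s: int) -> float:
    """Information in bits for n selections among s equally likely symbols."""
    return n * math.log2(s)

print(hartley_bits(8, 2))  # 8 binary selections -> 8.0 bits
print(hartley_bits(2, 3))  # two selections on a three-level telegraph key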

## References
1. Shannon, Claude E. (1948). "A Mathematical Theory of Communication" (bit, pg. 1), Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, July, October.
2. Hartley, R. V. L. (1928). “Transmission of Information”, Bell System Technical Journal, July, pp. 535-64; presented at the International Congress of Telegraphy and Telephony, Lake Como, Italy, Sept. 1927.
