In the thermodynamics of information theory, conservation of information refers to the hypothesis or argument that "information" is physically equivalent to "energy", or in some versions of the argument to "entropy", and that there exists a conservation law for information analogous to those for the other conserved quantities of physics, e.g. mass, charge, etc.

History
The hypothesis of the so-called "conservation of information" seems to have originated in certain arguments of the 1975 article "The Arrow of Time" by American astrophysicist David Layzer, who, using a rather backward and inane blend of statistical mechanics and information theory, concludes: [5]

"Entropy and information are related by a simple conservation law, which states that the sum of the information and the entropy is constant and equal to the system's maximum attainable information or entropy under the given conditions."

Layzer then states that this is expressed mathematically as:

H + I = constant

and concludes that a "gain of information is always compensated for by an equal loss of entropy." This statement, to note, seems to be a restatement of American physical chemist Gilbert Lewis's 1930 conclusion, drawn from a Maxwell's demon analysis of molecular distributions, that "gain in entropy always means loss of information, and nothing more." [6]
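In Shannon's terms the relation is easy to make concrete, though it holds by construction rather than as a physical law. The following sketch, a hypothetical Python illustration and not something from Layzer's paper, computes the entropy H of a few four-state distributions and defines the information I as the deficit from the maximum attainable entropy, I = H_max - H; the sum H + I then comes out constant at H_max = 2 bits in every case:

```python
import math

def shannon_entropy(p):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Four equally accessible microstates give H_max = log2(4) = 2 bits.
h_max = math.log2(4)

for p in [(0.25, 0.25, 0.25, 0.25),   # fully mixed: H = 2, I = 0
          (0.70, 0.10, 0.10, 0.10),   # partly ordered
          (1.00, 0.00, 0.00, 0.00)]:  # fully ordered: H = 0, I = 2
    h = shannon_entropy(p)
    i = h_max - h  # "information" read as the deficit from maximum entropy
    print(f"H = {h:.3f}, I = {i:.3f}, H + I = {h + i:.3f}")
```

Note that the constancy is definitional: once I is defined as H_max - H, the sum H + I cannot do anything but stay fixed at H_max.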

In 1988, citing Layzer, Canadian zoologist Daniel Brooks and American systems ecologist Edward Wiley give the following four points of their so-called hierarchical information theory:

1. Total information capacity is conserved.
2. Order capacity plus disorder capacity equals total information capacity.
3. Disorder capacity always increases or stays the same.
4. Order capacity always decreases or stays the same.

which they call the "informational analog of the second law of thermodynamics for closed systems". In any event, they conclude with the inane statement that "this definition is entirely consistent with classical thermodynamics definitions". [7]
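As a numerical illustration of the four points, in a toy construction of my own rather than anything given by Brooks and Wiley, the sketch below reads "disorder capacity" as the Shannon entropy H, "order capacity" as H_max - H, and "total information capacity" as the fixed H_max = log2(n), then repeatedly mixes a distribution toward uniformity, a step under which H never decreases:

```python
import math

def shannon_entropy(p):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

n = 4
h_max = math.log2(n)        # point 1: total capacity, fixed at log2(4) = 2 bits
p = [1.0, 0.0, 0.0, 0.0]    # start fully ordered (zero entropy)

for step in range(5):
    disorder = shannon_entropy(p)  # point 3: never decreases under mixing
    order = h_max - disorder       # point 4: never increases under mixing
    # point 2: order + disorder = total capacity, by construction
    print(f"step {step}: disorder = {disorder:.3f}, order = {order:.3f}, "
          f"total = {order + disorder:.3f}")
    p = [0.5 * pi + 0.5 / n for pi in p]  # mix halfway toward the uniform distribution
```

As with Layzer's relation, points 1 and 2 hold here by definition; only points 3 and 4 carry content, and they amount to the ordinary statement that entropy is non-decreasing under mixing.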

In 1998, American mathematician and intelligent design advocate William Dembski proposed a conservation of information type theory, modeled on the conservation of energy, in his paper “Intelligent Design as a Theory of Information”. In this paper, Dembski reasons that there exists a “conservation law that governs the origin and flow of information”; that “information is not reducible to natural causes”; and that “the origin of information is best sought in intelligent causes”, such that “intelligent design thereby becomes a theory for detecting and measuring information, explaining its origin, and tracing its flow.” [2] The corollaries of the proposed law are the following:

(a) The specified complexity in a closed system of natural causes remains constant or decreases.
(b) The specified complexity cannot be generated spontaneously, originate endogenously or organize itself (as these terms are used in origins-of-life research).
(c) The specified complexity in a closed system of natural causes either has been in the system eternally or was at some point added exogenously (implying that the system, though now closed, was not always closed).
(d) In particular any closed system of natural causes that is also of finite duration received whatever specified complexity it contains before it became a closed system.

The theory was later elaborated on in Dembski's 2002 book No Free Lunch: Why Specified Complexity Cannot Be Purchased without Intelligence [4] and in the 2004 collection Uncommon Dissent: Intellectuals Who Find Darwinism Unconvincing. [3]

Religion
In religious thermodynamics, the conservation of information or "law of conservation of information" states that natural causes are incapable of generating complex specified information (taken as a marker of design by an intelligent agent) and that the complex specified information in a closed system of natural causes remains constant or decreases. [1]

Difficulties with the theory
Among the numerous difficulties with this contrived theory is that, by definition, something that can decrease in amount cannot be considered conserved. Another salient difficulty is that Dembski uses the term "closed system" where he should use "isolated system": a closed system can still exchange energy with its surroundings, whereas an isolated system exchanges neither energy nor matter. This type of misuse of thermodynamic terms and conceptions is common among mathematicians who attempt to extrapolate new thermodynamic theories. As famously stated in 1989 by Russian mathematician Vladimir Arnold: “every mathematician knows it is impossible to understand an elementary course in thermodynamics.” Beyond this, information is an anthropomorphic conception and is not something defined in the inherent structure of the universe. In this light, a simple reading of the history of the Library at Alexandria, the ancient world’s largest library, confirms that information is not conserved.

References
1. Shermer, Michael. (2007). Why Darwin Matters: The Case Against Intelligent Design (pgs. 71-73). Macmillan.
2. Dembski, William A. (1998). “Intelligent Design as a Theory of Information”, Access Research Network.
3. (a) In his 1998 article, Dembski states that “a full treatment (of the law of conservation of information) will be given in Uncommon Descent.”
(b) Dembski, William A. (2004). Uncommon Dissent: Intellectuals Who Find Darwinism Unconvincing. ISI Books.
4. Tellgren, Erik. (2002). “On Dembski’s Law of Conservation of Information” (PDF), TalkReason.
5. Layzer, David. (1975). “The Arrow of Time”, Scientific American, 233:56-69.
6. Lewis, Gilbert. (1930). “The Symmetry of Time in Physics”, Science, 71:569-77, Jun 6.
7. Brooks, Daniel R. and Wiley, E.O. (1988). Evolution as Entropy: Toward a Unified Theory of Biology (pg. 70). University of Chicago Press.

External links
Law of Conservation of Information (overview) - by Les Lane.
