In evolution thermodynamics, Daniel Brooks (1951-) is a Canadian zoologist noted for his 1982 so-called Brooks-Wiley theory, co-developed with American natural historian and systems ecologist Edward Wiley, as fully explained in their 1986 book Evolution as Entropy: Toward a Unified Theory of Biology, the gist of their thesis being that: [1]

“Evolution is a manifestation of the second law of thermodynamics. The entropy functions, or, more correctly, the partial entropy functions, associated with organismic diversity, however, are not the usual entropy functions encountered in the thermodynamic behavior of non-living systems; rather, they are the partial entropy functions associated with the genetic code and with other hierarchically organized aspects of the information systems of organisms.”

The first difficulty with this statement is that Brooks and Wiley attempt to resolve the so-called life vs non-life "dividing wall" issue by recourse to information theory, a type of scientific fool's gold when used outside of computer science proper.

Overview
It seems that the first collaboration of Brooks and Wiley was the 1984 chapter “Evolution as an Entropic Phenomenon”. [2]

The book is well-researched, with numerous references, all-in-all presenting a sort of melting-pot theory of thermodynamics and evolution, interspersing the discussion with a wide range of dissimilar theories, including Ludwig Boltzmann’s complexions, Gibbs free energy change, Helmholtz free energy change, Carnot efficiency, Ilya Prigogine's internal entropy model, the “energy flow” models of Charles Elton and Raymond Lindeman, the works of Alfred Lotka, Robert Ulanowicz, Jeffrey Wicken, and Harold Morowitz, the arrow of time, Dollo’s law, and the Boltzmann-Planck entropy equation:

$S = k \ln W \,$

which they assume to be equivalent to Shannon entropy, all centered around the hypothesis that living organisms differ from nonliving systems in that organisms contain something Brooks and Wiley term “instructional information”.
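The assumed equivalence rests only on a formal resemblance: for W equiprobable microstates, Boltzmann’s $S = k \ln W$ and Shannon’s information measure differ solely by the constant factor $k \ln 2$. A minimal numerical sketch in Python (function names and values are illustrative, not from the book):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(probs):
    """Shannon information measure H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def boltzmann_entropy(W):
    """Boltzmann-Planck entropy S = k ln W, for W equiprobable microstates."""
    return K_B * math.log(W)

# For W equiprobable microstates (each p_i = 1/W), H = log2(W) bits,
# and the two expressions coincide up to the constant factor k ln 2:
W = 8
H = shannon_entropy([1.0 / W] * W)  # = log2(8) = 3 bits
S = boltzmann_entropy(W)            # = k ln 8, in J/K
assert abs(S - K_B * math.log(2) * H) < 1e-35
```

The resemblance is purely formal: the probabilities in Shannon’s measure are symbol frequencies in a message, while those in statistical mechanics are microstate occupation frequencies, which is precisely the conflation Morowitz objects to below.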

Difficulties with the theory
The central weakness of their entire book is that the theory is couched in the telegraphy-based information transmission model of American electrical engineer Claude Shannon (1949), which has nothing to do with thermodynamics, entropy, or evolution, as well as in the later variants of Leon Brillouin (1962) and John Collier (1986), among others. Canadian physical information theorist John Collier, in his review, summarized the reception of the book as follows: [3]

“Critics argued that they abused terminology from information theory and thermodynamics.”

One of the most derisive and on-point critics was American biophysicist Harold Morowitz, whose cogent review entitled “Entropy and Nonsense” was innocuously retitled “Review of Brooks and Wiley” in the reference section of their rebuttal second edition; in the review, Morowitz concludes that Brooks and Wiley have simply been “mesmerized by the language and equations of physics” into producing a vacuous theory. Morowitz states: [4]

“At the beginning of the Preface, the authors of Evolution as Entropy point out that first and foremost evolution is a process. With this in mind the title can be restated as ‘a process as an extensive thermodynamic state variable’. It has the same grammatical status as Evolution as Volume or Evolution as Mass. I begin with this linguistic nitpicking because it is important to realize that this book uses imprecise meanings and poor writing to cover up the fundamental nonsense and emptiness of the underlying ideas.”

Morowitz goes on, paragraph after paragraph, ripping apart the so-called Brooks-Wiley hypothesis, which, as Wiley says, is “hardly anything to get excited about.” The reason Morowitz even bothered to give a review is as follows:

“The only reason for reviewing such a work is that a number of biologists untrained in thermal physics and information science have been fooled into believing that there is some content in the ‘Unified Theory’ of Brooks and Wiley.”

This, of course, is the crux of the issue: information theory has become a type of scientific fool's gold, creating profound confusion for many in the soft-science community. Morowitz, again, spells the situation out as follows:

“Since C.E. Shannon introduced the information measure in 1948 and showed a formal analogy between the information measure (−∑pᵢ log₂ pᵢ) and the entropy measure of statistical mechanics (−k∑fᵢ ln fᵢ), a number of works have appeared trying to relate ‘entropy’ to all sorts of academic disciplines. Many of these theories involve profound confusion about the underlying thermal physics to bolster otherwise trivial and vacuous theories.”

Likewise, five years later, American systems scientist Cliff Joslyn, in his 1991 article "On the Semantics of Entropy Measures of Emergent Phenomena", which touches on the same topic, summarizes that the "literature on the relation between thermodynamics and information is vast." In other words, confusion breeds confusion. [5] This is what Morowitz is aiming at: to end the confusion and curb the ever-growing proliferation of vacuous theories.

Driven processes
To their credit, in introducing their thermodynamic argument, Brooks and Wiley do mention in passing, on one page of their book, that processes can be either enthalpy driven (ΔH < 0) or entropy driven (ΔS > 0), or a mixture of both (enthalpy-entropy compensation), noting that the most favorable conditions are those of processes favored both enthalpically and entropically (ΔH < 0 and ΔS > 0), and that some reactions will proceed even when ΔH > 0, as long as TΔS > ΔH. A significant point they leave out is thermodynamic coupling, namely that natural processes are coupled with unnatural processes in such a way that the natural always wins out over the unnatural, as quantified by the Lipmann coupling inequality.
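The spontaneity criterion invoked here reduces to the sign of the Gibbs free energy change, ΔG = ΔH − TΔS: a process proceeds when ΔG < 0, whether the driving term is enthalpic, entropic, or both. A minimal sketch (function names and the example figures are illustrative, not from the book):

```python
def gibbs_free_energy_change(delta_H, delta_S, T):
    """Gibbs free energy change: dG = dH - T*dS (J/mol, J/(mol K), K)."""
    return delta_H - T * delta_S

def is_spontaneous(delta_H, delta_S, T=298.15):
    """A process is thermodynamically favorable when dG < 0."""
    return gibbs_free_energy_change(delta_H, delta_S, T) < 0

# Enthalpy driven: dH < 0 dominates a slightly unfavorable entropy term.
assert is_spontaneous(-50_000, -20)   # dG = -50000 + 298.15*20 < 0
# Entropy driven: dH > 0, but T*dS > dH, so dG < 0, as the text notes.
assert is_spontaneous(+10_000, +100)  # dG = 10000 - 298.15*100 < 0
# Unfavorable both ways: dH > 0 and dS < 0, so dG > 0.
assert not is_spontaneous(+10_000, -20)
```

The second case is the one Brooks and Wiley single out: an endothermic process can still run, provided the entropy gain at the operating temperature outweighs the enthalpy cost.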

Correct view
The correct view is that the theory of life is a defunct theory, replaced by the modern-day molecular view, in which moving structures, such as retinal, bacteria, fish, ants, mice, fleas, birds, butterflies, and humans, are considered heat-driven, surface-catalyzed, reactive, animate molecules, with turnover rates, formed by synthesis, not by evolution, a process governed by the Lewis equality for natural processes.

References
1. (a) Wiley, Edward O. and Brooks, Daniel R. (1982). “Victims of History: a Nonequilibrium Approach to Evolution” (abs), Systematic Zoology, 31:1-24.
(b) Brooks, Daniel R. and Wiley, Edward O. (1988). Evolution as Entropy: Toward a Unified Theory of Biology (pg. #). University of Chicago Press.
2. Brooks, Daniel R. and Wiley, E.O. (1984). “Evolution as an Entropic Phenomenon”, in: Evolutionary Theory: Paths to the Future, ed. J.W. Pollard, 141-71. New York: Wiley.
3. Collier, John D. (1986). “Entropy and Evolution” (abs), Biology and Philosophy, 1:5-24.
4. Morowitz, Harold J. (1986). “Entropy and Nonsense: Review of Brooks and Wiley” (abs), Biology and Philosophy, 1:473-76.
5. Joslyn, Cliff. (1991). "On the Semantics of Entropy Measures of Emergent Phenomena", Cybernetics and Systems, 22(6): 631-40.