Robert Doyle
In existographies, Robert O. Doyle (1936-) (CR:18), or "Bob Doyle", is an American physicist and philosopher, noted for his second law based “information philosophy” and his writings on free will.

Overview
In 2003, Doyle launched his site InformationPhilosopher.com, in which he expounds on his second law based “information philosophy” and writes encyclopedia-style articles on related biographies and subject matters. The gist of the site, as of 2008, seems to be a prolonged effort to frame an “ontic openings” styled, anti-deterministic, pro-free-will polemic, which elaborates on a search to answer the following query: [1]

“Given the second law of thermodynamics, which says that any system will over time approach a thermodynamic equilibrium of maximum disorder or entropy, in which all information is lost, and given the best current model for the origin of the universe, which says everything began in a state of equilibrium some 13.75 billion years ago, how can it be that living beings are creating and communicating new information every day? Why are we not still in that state of equilibrium?”

The utility of InformationPhilosopher.com is that it hosts over 200 online biographies of philosophers and scientists, many of which overlap thematically with Hmolpedia, in respect to entropy-connected topics.

Doyle’s first book, the 2011 Free Will: the Scandal in Philosophy, said to be based on the “freedom section” of the Information Philosopher website, takes aim at the view that, in modern times, “academic philosophers are convincing many young students that they are deterministic biological machines with a ‘compatibilist free will’”, by arguing, using his cache of philosophers as launching points, that humans have a sort of indeterminate, emergent, biological free will, or something along these lines. [2]

Shannon bandwagon
The gist of Doyle's entire effort is to use the music of the Shannon bandwagon, as an ontic opening, to sell the baseless argument that there is free will behind "choice", or something along these lines.
Entropy | Mis-information
Doyle, in short, is typical of many thinkers trying to oversell, or falsely sell, the term "information" of communication theory as a synonym for the "entropy" of physics, as Warren Weaver (1949) and Seth Lloyd (2016) have done (see following quotes):

“Shannon’s [information theory] work roots back, as von Neumann has pointed out, to Boltzmann’s observations [what ?], in some of his work on statistical physics (1894), that entropy is related to ‘missing information’.”
— Warren Weaver (1949), The Mathematical Theory of Communication (footnote, pg. 3)

“The great nineteenth-century statistical physicists James Maxwell, Ludwig Boltzmann, and Willard Gibbs derived the fundamental formulas of what would go on to be called ‘information theory’.”
— Seth Lloyd (2016), Programming the Universe (§: Information and Physical Systems, pg. 163)

as some kind of invented history, confused panacea, open sesame, or magic flute (see: entropy pied piper).

Boltzmann, correctly, contrary to Weaver's claim, NEVER made “observations” about “missing information”. [5] Weaver made a “contrived history” footnote in order to bolster support for his friend Claude Shannon’s new information theory.

Likewise, the "formulas" of information theory, correctly, contrary to Lloyd's claim, were derived by Ralph Hartley in his 1928 "Transmission of Information", NOT by Maxwell, Boltzmann, or Gibbs.
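Hartley's 1928 measure, and Shannon's 1948 generalization of it, can be sketched numerically; the following is a minimal illustration (the function names are ours, not Hartley's or Shannon's):

```python
import math

def hartley_information(n, s):
    """Hartley (1928): information in n selections from an
    alphabet of s equally likely symbols, in bits."""
    return n * math.log2(s)

def shannon_entropy(probs):
    """Shannon (1948): entropy of a symbol distribution, in bits
    per symbol: H = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# For s equally likely symbols, Shannon's per-symbol entropy
# reduces exactly to Hartley's log2(s):
s = 16
uniform = [1 / s] * s
assert math.isclose(shannon_entropy(uniform), math.log2(s))

# A 10-character message over that 16-symbol alphabet carries
# 10 * log2(16) = 40 bits under Hartley's measure:
print(hartley_information(10, 16))  # 40.0
```

This is the sense in which Shannon's formula is a generalization of Hartley's, rather than a derivation from the statistical physics of Maxwell, Boltzmann, or Gibbs: both measure symbol selections in a signal, not heat.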
In short, owing to what is called the hydraism problem, meaning that universal geniuses, supposedly, no longer exist, and hence that people can now, more often than not, scam other people with ideas from one field of study, the latter being ignorant of that field, the origin of the above Sokal affair, scientific snake oil, invented history style scam runs as follows. (a) Boltzmann, in his “Further Studies on the Thermal Equilibrium of Gas Molecules” (1872), wherein he derives his H-theorem (the symbol ‘H’ for heat), meant to quantify the velocity distributions of the atoms and molecules of a body of gas, which he aimed to “closely relate” to Clausius’ heat state function formula for entropy (1865), i.e. the inexact differential of heat δQ divided by the absolute temperature T, going into or out of a body, employed the “logarithm” (log), a mathematical tool invented by John Napier (c.1594). (b) Ralph Hartley, in his “Transmission of Information” (1928), wherein he derives an H formula (where ‘H’, for Hartley, is the amount of information associated with n selections for a particular telegraph system), meant to quantify the capacity of a telegraph system to transmit information, in the form of high (1) and low (0) pulses in a telegraph wire, also employed the logarithm. (c) John von Neumann (1940) then thought it would be a “funny joke” if Claude Shannon (1949) convoluted all of this into the punchline that Boolean numbers (1,0,1,0,0,1 …), i.e. computer language and or telegraphic communication signals, stored or transmitted in the form of voltage pulses or electromagnetic waves, can now be synonymously called the “entropy” of thermodynamics. [5] Millions of pages of published scientific confusion have resulted, over the last 80 years, from this "inside" joke.
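The formal overlap that the joke trades on is nothing more than the shared logarithm; setting the conflated quantities side by side, in their standard textbook forms, makes this plain:

```latex
% Clausius (1865): entropy as a heat state function
dS = \frac{\delta Q}{T}

% Boltzmann (1872): the H of the H-theorem, over the velocity
% distribution f of the molecules of a gas
H = \int f \ln f \, d^3v

% Hartley (1928): information in n selections from s symbols
H = n \log s

% Shannon (1948): entropy of a source with symbol probabilities p_i
H = -\sum_i p_i \log_2 p_i
```

Only the logarithm is common to all four: the Clausius and Boltzmann expressions quantify heat and molecular velocity distributions, while the Hartley and Shannon expressions quantify symbol selections in a transmitted signal.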

Doyle, in similar fashion, invents, rewrites, and contrives history, as he sees fit, so as to make for published scientific confusion; the following are but two salient examples:

“Nevertheless, Gibbs's idea of the conservation of information [what ?] is still widely held today by mathematical physicists.”
— Robert Doyle (2013), “Gibbs” (Ѻ), InformationPhilosopher.com

“This led many statistical physicists, notably J. Willard Gibbs, to claim that information is the same [what ?] wherever the particles are.”
— Robert Doyle (2019), “Scandals” (Ѻ), InformationPhilosopher.com

Gibbs, correctly, NEVER theorized about information. In fact, in Gibbs' entire collected works on thermodynamics (1876), including his work on statistical mechanics (1902), he uses the term “information” only ONE time, and does so in a letter, in the standard English-language sense of the term: [4]

“The thermochemical data on which such a prediction of E.M.F. and reversible heat is based must be something more than the heat of union of the radicles. They must give information on the more delicate question of the temperature at which that heat can be obtained. In the terminology of Clausius they must relate to entropy as well as to energy—a field of inquiry which has been far too much neglected.”
— Willard Gibbs (1887), “Letter to Oliver Lodge on Electrochemical Thermodynamics”, Jan 9; in Scientific Papers (pg. 407)

Things like Doyle publishing articles and books claiming that Gibbs was theorizing about the "conservation of information", or Weaver asserting that Boltzmann was thinking about heat as "missing information" (in the Boolean number sense of things), etc., are blatant disservices to science, producing what Ingo Muller correctly refers to as “obfuscation”:

“No doubt, Shannon and Neumann thought that this was a funny joke, but it is not! It merely exposes Shannon and Neumann as intellectual snobs. Indeed, it may sound philistine, but a scientist must be clear, as clear as he can be, and avoid wanton obfuscation at all cost.”
— Ingo Muller (2007), A History of Thermodynamics (pg. 124)

thereby leading to "great" mis-education of the minds of young students, which must later be unlearned, about one of the most important topics of human existence, namely: heat in respect to work. Doyle, for whatever reason, is oblivious to this, generally a repercussion of the fact that he is lost in the music of the Shannon bandwagon (as shown above). [5]

Education
Doyle completed his BS in physics at Brown University in 1958 and his PhD in astrophysics, with a thesis on the continuous spectrum of the hydrogen quasi-molecule, at Harvard University in 1968. Over the next three decades or so, Doyle invented a number of gaming, electronic, and programming devices. In 2003-2004, he launched InformationPhilosopher.com, a combination of encyclopedia-style biographies of a number of obscure “information philosophy” related thinkers, such as Edward Culverwell (Boltzmann H-theorem debater), and Doyle’s own blended philosophy, namely that atoms, molecules, humans, and larger structures are “information structures” governed by a blend of the second law, quantum mechanics, and the uncertainty principle, with which he investigates topics such as: freedom, value, purpose, and free will, among others. [3] In 2003, Doyle re-joined the astronomy department of Harvard in some capacity, to work on his humans-as-information-structures philosophy. Doyle states that he has about 150+ books on free will.

References
1. Home – InformationPhilosopher.com.
2. (a) Doyle, Bob. (2011). Free Will: the Scandal in Philosophy (thermodynamics, 12+ pgs). I-Phi Press.
(b) Doyle, Bob. (2011). “Free Will: the Scandal in Philosophy”, Blog.i-phi.org, Jun 19.
3. InformationPhilosopher.com (2004) – WayBack Machine.
4. (a) Gibbs, Willard. (1876). Scientific Papers of J. Willard Gibbs: Thermodynamics (information, 2-pgs). Longmans, 1906.
(b) Gibbs, J. Willard (1901). Elementary Principles in Statistical Mechanics - Developed with Special Reference to the Rational Foundation of Thermodynamics (information, 0-pgs). Dover.
5. Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (Ѻ), Journal of Human Thermodynamics, 8(1): 1-120, Dec 19.

Further reading
● Doyle, Robert O. (2009). "Free Will: it’s a normal biological property, not a gift or a mystery" (abs), Nature, 459: 1052, Jun.
● Doyle, Robert O. (2010). “Jamesian Free Will: The Two-Stage Model of William James”, William James Studies, Jun.

External links
Bob Doyle (inventor) – Wikipedia.
Bob Doyle (about) – InformationPhilosopher.com.
