**Theory of Heat**
James Clerk Maxwell

Information and Energy

**Information is a currency of reality, just like mass, energy, space and time.** This has been demonstrated experimentally by scientists who have been able to convert information into energy. Among the various thought experiments that led to the current observations, the idea of a mechanical device invented by James Clerk Maxwell (see the epigraph above) plays a central role. Analysis of the minimal set of functions that must be implemented to design a genome driving the life of a minimal cell revealed that several dozen Maxwell's demons are needed to animate the cell, allowing it to have a functional assembly line. For a general discussion see: myopic selection drives evolution; information of the chassis and information of the program in synthetic cells; bacteria as computers making computers; and life's demons.

In discussions with students and in a variety of lectures where I talked about Maxwell's demon, thinking it was a thought experiment familiar to my audience, I discovered that most of the participants did not know about this little being. Here (page created in January 2011) is a brief summary of its vivid history, which is far from over (see Information Processing and Thermodynamic Entropy). The general role of information in biology was discussed in my book La Barque de Delphes (Odile Jacob, 1998; translated as The Delphic Boat, Harvard University Press, 2003).

From antiquity to the Middle Ages, Nature was described using ten basic currencies: οὐσία, ποσότης, ποιότης, πρός τι, κεῖσθαι, ἕξις, τόπος, χρόνος, πράττειν, παθεῖν, or in Latin *essentia, quantitas, qualitas, ad aliquid, situs, habitus, locus, tempus, agere, pati*. An essential step in the understanding of Nature required the construction of a certain entanglement of these categories, a process which gradually reduced them to four: mass, space, time and, subsequently, energy. A remarkable achievement was reached when, following others, Einstein combined them in a surprisingly concise equation, E = mc^{2}. Yet it was clear that these universal categories do not account for many phenomena: no one has been able, for example, to derive the crystal lattice of a mineral as simple as sodium chloride from the equations of microscopic physics.

Several features of the ancient categories are not
straightforward: *qualitas* (quality), *ad aliquid*
(relationships), *situs* and *habitus* (positioning in
space-time), in particular, are not immediate consequences of mass,
energy, space or time.

The first three are fairly easy to grasp intuitively. Energy is more complicated: it is associated with a wide variety of connotations, often with some psychological background. Generally, it is understood that energy produces work; a car engine uses energy. Throughout the nineteenth century, at the birth of the industrial world, scientists and engineers wondered how energy could be made available to produce work. Sadi Carnot was the first to show that, when building a steam engine, energy had to be divided into two parts: a usable part, which produced work, and another part, which depended on the temperature of the system and could not be used to produce work [Carnot, 1824]. Indeed, steam engines required two temperature sources, and work was produced when a fluid (steam) flowed from the hot part of the machine to the cold part.
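Carnot's result can be put in numbers: the maximal fraction of heat convertible into work depends only on the two temperatures. A minimal sketch (the temperatures chosen below are illustrative, not from the original text):

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of heat convertible into work between two
    reservoirs; temperatures in kelvin. The rest of the energy must
    flow to the cold reservoir and cannot produce work."""
    if t_cold >= t_hot:
        raise ValueError("t_hot must exceed t_cold")
    return 1.0 - t_cold / t_hot

# A steam engine working between 450 K steam and a 300 K environment:
eta = carnot_efficiency(450.0, 300.0)
print(f"Carnot limit: {eta:.2%}")  # only about a third of the heat can become work
```

The point of the formula is Carnot's division itself: the usable part of the energy shrinks as the two temperatures approach each other.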

In 1850, Rudolf Clausius took up Carnot's point of view and began to formalise it, and in 1865 he proposed to name *Entropie* the part of energy associated with temperature that cannot be transformed into work [Clausius, 1850, 1865]. Created from Greek, like all correct neologisms in science, entropy expresses the idea of an internal metamorphosis (ἐν: within, and τροπή: alteration, change, transformation), or *Verwandlung* in German.

Later, James Clerk Maxwell, in his *Theory of Heat*, analysed the process and related it to the second law of thermodynamics, which states that in a closed material system temperature tends to become uniform [Maxwell, 1871, 1891]. For this, he had to introduce the idea
of the "molecular theory of matter", where motion is central: "*The
opinion that the observed properties of visible bodies apparently at
rest are due to the action of invisible molecules in rapid motion is
to be found in Lucretius.
Daniel
Bernoulli was the first to suggest that the pressure of air
is due to the impact of its particles on the sides of the vessel
containing it; but he made very little progress in the theory which
he suggested. Lesage and Prevost of Geneva, and afterwards Herapath
in his 'Mathematical Physics' made several important applications of
the theory. Krönig also directed attention to this explanation of
the phenomena of gases. It is to Professor Clausius, however, that
we owe the recent development of the dynamical theory of gases.*"
In gases, this means that if you start with a non-symmetrical
distribution, with hot gas molecules in one compartment and cold gas
molecules in an adjacent compartment, the system will evolve in such a
way that the temperature will be averaged out after a certain amount
of time has passed. The temperature here measures the degree of
agitation of the gas molecules: fast when it is hot, slow when it is
cold. This shift from a continuous description of matter to a
discontinuous, atomistic view was later extended to biology with the
birth of molecular biology. It is interesting to note that this took
about a century, and that the current situation, where 'information'
is slowly gaining ground, repeats a similar slow path.

In creating a link between information and entropy, Maxwell introduced the idea of a hypothetical being, a ‘demon’, which uses a built-in information processing capability to reduce the entropy of a homogeneous gas (at a given temperature). In short, the demon is able to measure the speed of the gas molecules and open or close a door between two compartments according to the speed of the molecules, keeping them on one side if they are fast and on the other side if they are slow. This results in two compartments, one hot and one cold, apparently reversing time and acting against the second law of thermodynamics.
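The demon's sorting can be mimicked in a toy simulation: molecules of a homogeneous gas approach the door with random speeds, and the demon lets fast ones through to one side and slow ones to the other. A minimal sketch (the speed distribution and threshold are arbitrary choices for illustration):

```python
import random
from statistics import fmean

random.seed(42)

# Homogeneous gas: every molecule's speed drawn from the same distribution.
speeds = [random.gauss(mu=1.0, sigma=0.3) for _ in range(10_000)]
threshold = 1.0  # the demon's criterion: faster than average goes "hot"

hot, cold = [], []
for v in speeds:
    # The demon "measures" each molecule and operates the door accordingly.
    (hot if v > threshold else cold).append(v)

print(f"hot side mean speed:  {fmean(hot):.3f}")
print(f"cold side mean speed: {fmean(cold):.3f}")
# A hot and a cold compartment emerge from a uniform gas; the catch,
# developed below, is the cost of the demon's measuring and remembering.
```

The mean speed (hence temperature) ends up higher on one side than the other, exactly the entropy decrease that the demon's information processing must somehow pay for.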

Much work has been done since this first view, and the idea that the creation of information requires energy was put forward by Leo Szilard to explain how Maxwell's demon could act [Szilard, 1929].

The role of thermodynamics in computation has been examined several times over the last half century. The physics of information processing has given rise to a considerable variety of attempts to understand how Maxwell's demon might work. One of the most important contributions to this work was the account given by Marian Smoluchowski, professor at the Jagiellonian University in Kraków. In a lecture in Göttingen, attended by the most creative physicists and mathematicians of the time, Smoluchowski gave details of the way Maxwell's demon could be implemented as a trap door, permitting information to be coupled to the availability of energy and to material states of molecules in the environment [Smoluchowski, 1914].

Later on, Szilard proposed, in a loose way, an account of the relationship between information and entropy [Szilard, 1929], and von Neumann in the 1950s followed suit, stating that each logical operation performed in a computer at temperature T must use an energy of kT ln 2, thereby increasing entropy by k ln 2 [see von Neumann, 1966]. This remained the accepted intuition until IBM, concerned by the limits this would impose on computation, asked its engineers to explore the situation and possibly propose remedies.

Fortunately for computer science (you could not work on the machine you are using at this very moment if this had reflected reality), this intuition proved to be wrong. Working at IBM on the limits of physical computation (limits which would have been rapidly reached had the Szilard-von Neumann intuition been valid), Rolf Landauer demonstrated, fifty years ago, that computation could be made reversible, hence not consuming any energy [Landauer, 1961]. To understand the meaning of this statement, let us summarise the bases of all computations. Three core Boolean operations, AND, NOT and REPLICATE, are enough to permit all kinds of logical operations. The operation AND is Boolean intersection (multiplication), as we learnt in our first years at school: it takes two binary inputs X and Y and returns the output 1 if and only if both X and Y are 1; otherwise it returns the output 0. Similarly, NOT takes a single binary input X and returns the output 1 if X = 0 and 0 if X = 1. REPLICATE takes a single binary input X and returns two binary outputs, each equal to X. Any Boolean function can be constructed by repeated combination of AND, NOT and REPLICATE. Another operation, which can be derived from these, ERASE, is essential to our topic. ERASE is a one-bit logical operation that takes a bit, 0 or 1, and restores it to 0.
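These definitions can be written out directly. The sketch below also derives one further gate, OR, from AND and NOT alone (via De Morgan's law), to illustrate the claim that the core operations suffice:

```python
def AND(x, y):      # two bits in, one bit out: many-to-one
    return x & y

def NOT(x):         # one bit in, one bit out: one-to-one, hence reversible
    return 1 - x

def REPLICATE(x):   # one bit in, two identical bits out
    return (x, x)

def ERASE(x):       # restore any bit to 0: irreversible by construction
    return 0

def OR(x, y):       # derived gate, by De Morgan: x OR y = NOT(AND(NOT x, NOT y))
    return NOT(AND(NOT(x), NOT(y)))

# Truth table of the derived gate:
assert [OR(x, y) for x, y in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 1]
# ERASE maps two distinct inputs onto one output: the input is unrecoverable.
assert ERASE(0) == ERASE(1) == 0
```

Note that NOT is one-to-one (its input can always be recovered from its output), while AND and ERASE are many-to-one; this distinction is exactly what Landauer's analysis, described next, turns on.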

Concretely, these operations are implemented as 'logic gates'. A logic gate is a physical device that performs a logical operation. Microprocessors combine millions, even billions, of logic gates to perform the complex logical operations found in computers such as the one you are using to read this text.

In his conceptual work, Landauer showed that reversible, one-to-one logical operations such as NOT can be performed **without consuming energy**. He also showed that irreversible, many-to-one operations such as ERASE require consuming at least kT ln 2 of energy for each bit of information lost. The core of the argument behind Landauer's theorem can be readily understood. Briefly, when a bit is erased, the information it contains must go somewhere, and there are only two possibilities. Either it moves to another place in the computer (or in the cell, if we consider cells as computers) corresponding to an observable degree of freedom, such as another memory location holding a known bit; in that case it has obviously not been erased but merely moved. Or it goes into places with unobservable degrees of freedom, such as the microscopic motion of molecules, and this results in an increase of entropy of at least k ln 2. Landauer played a seminal role at IBM in implementing the CMOS technology that was at the root of the construction of dense microprocessors. He was also the father of electronic circuits based on reversible logic, which exhibit considerable reductions in energy waste over conventional irreversible circuits.
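The kT ln 2 bound can be put in numbers; at room temperature it is a minuscule amount of energy, far below what present-day transistors actually dissipate. A sketch (only the physical constants are exact; the gigabyte example is illustrative):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # room temperature, K

# Minimum energy dissipated when one bit of information is erased:
landauer_bit = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {landauer_bit:.3e} J per bit")

# Erasing a gigabyte (8e9 bits) at the Landauer limit:
print(f"Erasing 1 GB: {landauer_bit * 8e9:.3e} J at minimum")
# Both figures are many orders of magnitude below the dissipation
# of real memories, which is why the bound long went unnoticed.
```

At 300 K the limit is about 3 × 10⁻²¹ J per bit, which makes clear why it constrains principles rather than present engineering practice.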

In 1973, Bennett extended Landauer's theorem, showing that all computations could be performed using only reversible logical operations, that is, without consuming energy [Bennett, 1973, 1988]. But where does the energy come from? To perform a logical operation, it is commonly extracted from a store of free energy, used in the processor that performs the operation, and finally returned to the initial store once the operation has been performed. We note here that in usual computers the store is a battery or an external electric supply, whereas in cells energy is distributed throughout the matter of the cell. This may have considerable consequences for the computing power of cells (not discussed here). The property of reversibility has been implemented in real computers under the term "adiabatic logic", and real circuits have been described in detail to explain how this works [Younis and Knight, 1994]. In the domain of Synthetic Biology, it is interesting to note that Tom Knight, one of the founders of iGEM at MIT, was seminal in the actualisation of this work. Hence the connection between information theory, computer science and biology is much deeper than laypersons (and many biologists) would like to think.

Back to Maxwell's demon: in a real calculation, errors occur, and getting rid of them requires an irreversible operation, deleting the erroneous information and replacing it with the correct information. Consequently, restoring the error-free situation dissipates energy. If the energy were not consumed, the system would be able to go back in time, and we would have created perpetual motion. How does this work in reality? The situation is similar to that proposed for Maxwell's demon: measure, store information, use it via replication of the measurement to restore the initial state, then erase the memory to reset the demon to its initial state. At the heart of this action are two logical processes, REPLICATE and ERASE.

If the error rate is x bits per second, for example, then error-correcting processes can be used to detect these errors and reject them into the environment at an energy cost of x kT ln 2 J s^{-1}, where T is the temperature of the environment. In fact, biological processes, even at the microscopic level, do not proceed bit by bit, but are highly redundant and change quite a large number of bits simultaneously. Indeed, at 300 K, the average temperature of the environment of life, thermal noise is quite high, so that redundancy is necessary to increase the signal-to-noise ratio. And the "quantum" of energy usually used is that of the hydrolysis of an "energy-rich" phosphate bond, typically the hydrolysis of ATP to ADP or of GTP to GDP.
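The kT ln 2 cost per rejected bit can be compared with life's usual energy quantum. A sketch, assuming roughly 50 kJ/mol for ATP hydrolysis under cellular conditions (a commonly quoted figure, used here only as an order of magnitude):

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro constant, 1/mol
T = 300.0             # K, the average temperature of the environment of life

bit_cost = k_B * T * math.log(2)   # minimum energy to reject one bit, J
atp = 50e3 / N_A                   # ~50 kJ/mol ATP hydrolysis -> J per molecule (assumed)

print(f"one rejected bit: {bit_cost:.2e} J")
print(f"one ATP molecule: {atp:.2e} J")
print(f"bits per ATP:     {atp / bit_cost:.0f}")
# One ATP hydrolysis dissipates enough energy to pay for tens of bits,
# consistent with redundant, many-bits-at-a-time biological error correction.
```

The ratio, a few tens of bits per hydrolysed ATP, illustrates why biology can afford to change many bits per energy quantum rather than proceeding bit by bit.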

While these types of processes have not been presented as concrete
illustrations of Maxwell's demon, we have a wealth of examples
illustrating behaviours of this type. John Hopfield suggested that, in
order to identify proofreading functions, we should be exploring "*known
reactions which otherwise appear to be useless or deleterious
complications*". And, indeed, in the protein translation process,
a proofreading step, using the protein EF-Tu bound to charged transfer RNA,
tests whether the incoming tRNA can correctly read the codon immediately
adjacent to the tRNA carrying the growing polypeptide, and
hydrolyses a GTP molecule when the correct association has been found,
thus acting as a Maxwell's demon [Hopfield, 1974]. We can note here
that this is why it is so important for cells to carry energy supports
(present in the covalent links making the backbones of macromolecules,
in thioesters and in phosphate bonds), and why it is of course
impossible that arsenic belongs to the backbone of energy-rich bonds,
contrary to a recent mass-media hype.
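Hopfield's scheme can be captured by a toy calculation: if a single recognition step accepts the wrong substrate with probability f, an energy-consuming proofreading step (the GTP hydrolysis above) lets essentially the same discrimination be applied twice, pushing the overall error rate toward f². The numbers below are illustrative, not measured values:

```python
# Toy model of kinetic proofreading (after Hopfield, 1974).
# f: probability that one recognition step accepts the wrong substrate,
# set by the binding free-energy difference of correct vs incorrect pairing.
f = 1e-2                 # assumed single-step error rate, for illustration

one_step_error = f       # plain equilibrium selection: errors ~ f
proofread_error = f * f  # selection plus one proofreading step: errors ~ f^2

print(f"without proofreading: {one_step_error:.0e}")
print(f"with proofreading:    {proofread_error:.0e}")
# Each rejection costs one GTP: energy is traded for accuracy,
# a working analogue of Maxwell's demon.
```

Squaring the error rate is exactly what makes the GTP expenditure worthwhile: accuracy beyond what any equilibrium binding step could provide.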

Such error-correcting routines are the norm in biological processes, and function as working analogues of Maxwell’s demon, obtaining information and using it to reduce entropy at an exchange rate of kT ln2 joules per bit, rejecting errors to the environment at a high rate to maintain reliable operations. This thinking is thus at the heart of what should be a renewed vision of the ageing process.

Bennett, C (1973) Logical reversibility of computation. *IBM Journal of Research and Development* 17, 525-532.

Bennett, C (1988) Notes on the history of reversible computation. *IBM Journal of Research and Development* 44, 270-277.

Carnot, S (1824) *Réflexions sur la puissance motrice du feu et sur les machines propres à développer cette puissance* (Bachelier, Paris).

Clausius, R (1850) Über die bewegende Kraft der Wärme und die Gesetze, welche sich daraus für die Wärmelehre selbst ableiten lassen. *Annalen der Physik* 155, 368-397.

Clausius, R (1865) Über verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie. *Annalen der Physik* 201, 353-400.

Hopfield, JJ (1974) Kinetic proofreading: a new mechanism for reducing errors in biosynthetic processes requiring high specificity. *Proc Natl Acad Sci U S A* 71, 4135-4139.

Landauer, R (1961) Irreversibility and heat generation in the computing process. *IBM Journal of Research and Development* 5, 183-191.

Maxwell, JC (1871, reprinted 1902) *Theory
of Heat* (Longmans, Green and Co, London).

Smoluchowski, M (1914) Vorträge über die kinetische Theorie der Materie und der Elektrizität. Lecture given at a conference held in Göttingen at the invitation of the Wolfskehl Foundation (Teubner, Leipzig, 1914). The conference also heard M. Planck, P. Debye, W. Nernst, A. Sommerfeld and H.A. Lorentz, and was introduced by David Hilbert and H. Kamerlingh Onnes.

Szilard, L (1929) Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen. *Zeitschrift für Physik* 53, 840-856.

von Neumann, J (posthumous, 1966) *Theory of Self-Reproducing
Automata* (University of Illinois Press, Urbana).

Younis, SG, Knight, T (1994) Asymptotically zero energy computing using split-level charge recovery logic. Technical Report AITR-1500, MIT AI Laboratory.