The Mystery of Life's Origin:

Reassessing Current Theories




CHAPTER 7

Thermodynamics of Living Systems


It is widely held that in the physical sciences the laws of thermodynamics have had a unifying effect similar to that of the theory of evolution in the biological sciences. What is intriguing is that the predictions of one seem to contradict the predictions of the other. The second law of thermodynamics suggests a progression from order to disorder, from complexity to simplicity, in the physical universe. Yet biological evolution involves a hierarchical progression to increasingly complex forms of living systems, seemingly in contradiction to the second law of thermodynamics. Whether this discrepancy between the two theories is only apparent or real is the question to be considered in the next three chapters. The controversy which is evident in an article published in American Scientist1, along with the replies it provoked, demonstrates that the question is still a timely one.

The First Law of Thermodynamics

Thermodynamics is an exact science which deals with energy. Our world seethes with transformations of matter and energy. Be these mechanical or chemical, the first law of thermodynamics---the principle of the Conservation of Energy---tells us that the total energy of the universe or any isolated part of it will be the same after any such transformation as it was before. A major part of the science of thermodynamics is accounting---giving an account of the energy of a system that has undergone some sort of transformation. Thus, we derive from the first law of thermodynamics that the change in the energy of a system (ΔE) is equal to the sum of the heat flow into (or out of) the system (Q) and the work done on (or by) the system (W). Mechanical work and energy are interchangeable; i.e., energy may be converted into mechanical work as in a steam engine, or mechanical work can be converted into energy, as in the heating of a cannon which occurs as its barrel is bored. In mathematical terms (where the terms are as previously defined):

ΔE = Q + W          (7-1)
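
A minimal numerical sketch of the bookkeeping in equation 7-1 may be helpful; the heat and work values below are invented for illustration, and the sign convention (heat into the system and work done on the system counted as positive) is assumed rather than taken from the text.

```python
# Hypothetical illustration of the first law (equation 7-1): dE = Q + W.
# Sign convention assumed here: Q > 0 for heat flowing INTO the system,
# W > 0 for work done ON the system.

Q = 500.0   # heat added to the system, in joules (assumed value)
W = -200.0  # work done BY the system on its surroundings, in joules

delta_E = Q + W   # change in the internal energy of the system
print(f"Change in system energy: {delta_E:+.0f} J")   # prints +300 J
```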

The Second Law of Thermodynamics

The second law of thermodynamics describes the flow of energy in nature in processes which are irreversible. The physical significance of the second law of thermodynamics is that the energy flow in such processes is always toward a more uniform distribution of the energy of the universe. Anyone who has had to pay utility bills for long has become aware that too much of the warm air in his or her home during winter escapes to the outside. This flow of energy from the house to the cold outside in winter, or the flow of energy from the hot outdoors into the air-conditioned home in the summer, is a process described by the second law of thermodynamics. The burning of gasoline, converting energy "rich" compounds (hydrocarbons) into energy "lean" compounds, carbon dioxide (CO2) and water (H2O), is a second illustration of this principle.

The concept of entropy (S) gives us a more quantitative way to describe the tendency for energy to flow in a particular direction. The entropy change for a system is defined mathematically as the flow of energy divided by the temperature, or,

ΔS ≥ Q / T          (7-2)

where ΔS is the change in entropy, Q is the heat flow into or out of the system, and T is the absolute temperature in degrees Kelvin (°K).

[Note: For a reversible flow of energy such as occurs under equilibrium conditions, the equality sign applies. For irreversible energy flow, the inequality applies.]

A Driving Force

If we consider heat flow from a warm house to the outdoors on a cold winter night, we may apply equation 7-2 as follows:

ΔStotal = ΔShouse + ΔSoutdoors = -Q/T1 + Q/T2          (7-3)

where ΔStotal is the total entropy change associated with this irreversible heat flow, T1 is the temperature inside the house, and T2 is the temperature outdoors. The negative sign on the first term notes the loss of heat from the house, while the positive sign on the second term recognizes the heat gained by the outdoors. Since it is warmer in the house than outdoors (T1 > T2), the total entropy will increase (ΔStotal > 0) as a result of this heat flow. If we turn off the heater in the house, it will gradually cool until the temperature approaches that of the outdoors, i.e., T1 = T2. When this occurs, the entropy change (ΔS) associated with heat flow (Q) goes to zero. Since there is no further driving force for heat flow to the outdoors, it ceases; equilibrium conditions have been established.
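
The numbers in the following sketch are hypothetical (the chapter gives none), but they show how equation 7-3 assigns a positive total entropy change to heat leaking from a warm house to the cold outdoors.

```python
# Hypothetical values; the chapter itself gives no numbers.
Q = 1.0e6      # heat leaking out of the house, in joules
T1 = 293.0     # indoor temperature, K (about 20 C)
T2 = 263.0     # outdoor temperature, K (about -10 C)

dS_house = -Q / T1                   # house loses heat: entropy decrease
dS_outdoors = Q / T2                 # outdoors gains heat: entropy increase
dS_total = dS_house + dS_outdoors    # equation 7-3

print(f"dS_house    = {dS_house:9.1f} J/K")
print(f"dS_outdoors = {dS_outdoors:9.1f} J/K")
print(f"dS_total    = {dS_total:9.1f} J/K  (> 0, since T1 > T2)")
```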

As this simple example shows, energy flow occurs in a direction that causes the total energy to be more uniformly distributed. If we think about it, we can also see that the entropy increase associated with such energy flow is proportional to the driving force for such energy flow to occur. The second law of thermodynamics says that the entropy of the universe (or any isolated system therein) is increasing; i.e., the energy of the universe is becoming more uniformly distributed.

It is often noted that the second law indicates that nature tends to go from order to disorder, from complexity to simplicity. If the most random arrangement of energy is a uniform distribution, then the present arrangement of the energy in the universe is nonrandom, since some matter is very rich in chemical energy, some in thermal energy, etc., and other matter is very poor in these kinds of energy. In a similar way, the arrangements of mass in the universe tend to go from order to disorder due to the random motion on an atomic scale produced by thermal energy. The diffusional processes in the solid, liquid, or gaseous states are examples of increasing entropy due to random atomic movements. Thus, increasing entropy in a system corresponds to increasingly random arrangements of mass and/or energy.

Entropy and Probability

There is another way to view entropy. The entropy of a system is a measure of the probability of a given arrangement of mass and energy within it. A statistical thermodynamic approach can be used to further quantify the system entropy. High entropy corresponds to high probability. As a random arrangement is highly probable, it would also be characterized by a large entropy. On the other hand, a highly ordered arrangement, being less probable, would represent a lower entropy configuration. The second law would tell us then that events which increase the entropy of the system require a change from more order to less order, or from less-random states to more-random states. We will find this concept helpful in Chapter 9 when we analyze condensation reactions for DNA and protein.
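
The chapter does not write down a formula for this statistical view, but one standard way to make it concrete is Boltzmann's relation S = k ln Ω, where Ω is the number of microscopic arrangements (microstates) consistent with a given macroscopic state. The toy system below, 100 two-state particles, is purely illustrative: the "mixed" (random) arrangement can be realized in vastly more ways than the perfectly ordered one, and so carries the higher entropy.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates: float) -> float:
    """Boltzmann's relation S = k ln(omega)."""
    return k_B * math.log(num_microstates)

# Toy system (hypothetical): 100 two-state particles.
N = 100
omega_ordered = 1                    # all particles in the same state: 1 arrangement
omega_random = math.comb(N, N // 2)  # half in each state: ~1e29 arrangements

print(f"S(ordered) = {boltzmann_entropy(omega_ordered):.3e} J/K")  # zero
print(f"S(random)  = {boltzmann_entropy(omega_random):.3e} J/K")   # much larger
```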

Clausius2, who formulated the second law of thermodynamics, summarizes the laws of thermodynamics in his famous concise statement: "The energy of the universe is constant; the entropy of the universe tends toward a maximum." The universe moves from its less probable current arrangement (low entropy) toward its most probable arrangement in which the energy of the universe will be more uniformly distributed.

Life and the Second Law of Thermodynamics

How does all of this relate to chemical evolution? Since the important macromolecules of living systems (DNA, protein, etc.) are more energy rich than their precursors (amino acids, heterocyclic bases, phosphates, and sugars), classical thermodynamics would predict that such macromolecules will not spontaneously form.

Roger Caillois has recently drawn this conclusion in saying, "Clausius and Darwin cannot both be right."3 This prediction of classical thermodynamics has, however, merely set the stage for refined efforts to understand life's origin. Harold Morowitz4 and others have suggested that the earth is not an isolated system, since it is open to energy flow from the sun. Nevertheless, one cannot simply dismiss the problem of the origin of organization and complexity in biological systems by a vague appeal to open-system non-equilibrium thermodynamics. The mechanisms responsible for the emergence and maintenance of coherent (organized) states must be defined. To clarify the role of mass and energy flow through a system as a possible solution to this problem, we will look in turn at the thermodynamics of (1) an isolated system, (2) a closed system, and (3) an open system. We will then discuss the application of open-system thermodynamics to living systems. In Chapter 8 we will apply the thermodynamic concepts presented in this chapter to the prebiotic synthesis of DNA and protein. In Chapter 9 this theoretical analysis will be used to interpret the various prebiotic synthesis experiments for DNA and protein, suggesting a physical basis for the uniform lack of success in synthesizing these crucial components for living cells.

Isolated Systems

An isolated system is one in which neither mass nor energy flows in or out. To illustrate such a system, think of a perfectly insulated thermos bottle (no heat loss) filled initially with hot tea and ice cubes. The total energy in this isolated system remains constant but the distribution of the energy changes with time. The ice melts and the energy becomes more uniformly distributed in the system. The initial distribution of energy into hot regions (the tea) and cold regions (the ice) is an ordered, nonrandom arrangement of energy, one not likely to be maintained for very long. By our previous definition then, we may say that the entropy of the system is initially low but gradually increases with time. Furthermore, the second law of thermodynamics says the entropy of the system will continue to increase until it attains some maximum value, which corresponds to the most probable state for the system, usually called equilibrium.

In summary, isolated systems always maintain constant total energy while tending toward maximum entropy, or disorder. In mathematical terms,

ΔE/Δt = 0     and     ΔS/Δt ≥ 0          (isolated system)          (7-4)

where ΔE and ΔS are the changes in the system energy and system entropy, respectively, for a time interval Δt. Clearly the emergence of order of any kind in an isolated system is not possible. The second law of thermodynamics says that an isolated system always moves in the direction of maximum entropy and, therefore, disorder.
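
A standard numerical illustration of equation 7-4, not taken from the text: two equal masses of water at different temperatures are mixed in a perfectly insulated container. The energy of the isolated system stays constant, which fixes the final temperature, while the total entropy, estimated from ΔS = mc ln(Tf/Ti) for each portion, increases. The masses and temperatures below are assumed values.

```python
import math

m = 1.0        # mass of each portion of water, kg (assumed)
c = 4186.0     # specific heat of water, J/(kg*K)
T_hot, T_cold = 350.0, 280.0   # initial temperatures, K (assumed)

# Energy conservation in an isolated system fixes the final temperature.
T_final = (T_hot + T_cold) / 2.0

# Entropy change of each portion: dS = m*c*ln(T_final / T_initial)
dS_hot = m * c * math.log(T_final / T_hot)     # negative (this portion cools)
dS_cold = m * c * math.log(T_final / T_cold)   # positive (this portion warms)
dS_total = dS_hot + dS_cold

print(f"T_final  = {T_final:.1f} K")
print(f"dS_total = {dS_total:+.1f} J/K  (positive, as equation 7-4 requires)")
```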

It should be noted that the process just described is irreversible in the sense that once the ice is melted, it will not reform in the thermos. As a matter of fact, natural decay and the general tendency toward greater disorder are so universal that the second law of thermodynamics has been appropriately dubbed "time's arrow."5

Closed Systems near Equilibrium

A closed system is one in which the exchange of energy with the outside world is permitted but the exchange of mass is not. Along the boundary between the closed system and the surroundings, the temperature may be different from the system temperature, allowing energy flow into or out of the system as it moves toward equilibrium. If the temperature along the boundary is variable (in position but not time), then energy will flow through the system, maintaining it some distance from equilibrium. We will discuss closed systems near equilibrium first, and then closed systems removed from equilibrium.

If we combine the first and second laws as expressed in equations 7-1 and 7-2 and replace the mechanical work term W by -PΔV (the work done on the system as its volume changes), where P is pressure and ΔV is the volume change, we obtain,

[NOTE: Volume expansion (ΔV > 0) corresponds to the system doing work, and therefore losing energy. Volume contraction (ΔV < 0) corresponds to work being done on the system.]

ΔS ≥ (ΔE + PΔV) / T          (7-5)

Algebraic manipulation gives

ΔE + PΔV - TΔS ≤ 0     or     ΔG ≤ 0          (7-6)

where

ΔG = ΔE + PΔV - TΔS

The term on the left side of the inequality in equation 7-6 is called the change in the Gibbs free energy (ΔG). It may be thought of as a thermodynamic potential which describes the tendency of a system to change---e.g., the tendency for phase changes, heat conduction, etc., to occur. If a reaction occurs spontaneously, it is because it brings a decrease in the Gibbs free energy (ΔG < 0). This requirement is equivalent to the requirement that the entropy of the universe increase. Thus, like an increase in entropy, a decrease in Gibbs free energy simply means that a system and its surroundings are changing in such a way that the energy of the universe is becoming more uniformly distributed.
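
As a minimal sketch of how ΔG serves as the criterion of spontaneity, the snippet below evaluates equation 7-6 for one set of invented values; they do not correspond to any particular reaction discussed in the text.

```python
def gibbs_change(dE: float, P: float, dV: float, T: float, dS: float) -> float:
    """Change in Gibbs free energy: dG = dE + P*dV - T*dS (equation 7-6)."""
    return dE + P * dV - T * dS

# Invented example values (SI units: joules, pascals, cubic meters, kelvin, J/K).
dG = gibbs_change(dE=-50_000.0, P=101_325.0, dV=-0.001, T=298.0, dS=30.0)

if dG < 0:
    print(f"dG = {dG:.0f} J: the change can occur spontaneously")
else:
    print(f"dG = {dG:.0f} J: the change will not occur spontaneously")
```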

We may summarize then by noting that the second law of thermodynamics requires,

ΔG/Δt ≤ 0          (closed system)          (7-7)

where Δt indicates the time period during which the Gibbs free energy changed.

The approach to equilibrium is characterized by,

ΔG/Δt → 0          (closed system)          (7-8)

The physical significance of equation 7-7 can be understood by rewriting equations 7-6 and 7-7 in the following form:

ΔS/Δt - (1/T)(ΔE/Δt + P ΔV/Δt) ≥ 0          (7-9)

or

ΔS/Δt - (1/T)(ΔH/Δt) ≥ 0

and noting that the first term represents the entropy change due to processes going on within the system, while the second term represents the entropy change due to the exchange of mechanical and/or thermal energy with the surroundings. This simply guarantees that the sum of the entropy change in the system and the entropy change in the surroundings will be greater than zero; i.e., the entropy of the universe must increase. For the isolated system, ΔE + PΔV = 0 and equation 7-9 reduces to equation 7-4.
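
Written out, the step from equation 7-6 to equation 7-9 is only a rearrangement: isolate the entropy term, then divide by the (positive) temperature T and time interval Δt. The enthalpy change ΔH = ΔE + PΔV used in the second form is not defined explicitly in the chapter, so it is noted in the sketch below.

```latex
% Sketch of the algebra from equation 7-6 to equation 7-9, in the chapter's symbols.
\[
  \Delta E + P\,\Delta V - T\,\Delta S \;\le\; 0
  \quad\Longrightarrow\quad
  \frac{\Delta S}{\Delta t}
    \;-\; \frac{1}{T}\!\left(\frac{\Delta E}{\Delta t} + P\,\frac{\Delta V}{\Delta t}\right)
    \;\ge\; 0
\]
\[
  \text{and, with the enthalpy change } \Delta H = \Delta E + P\,\Delta V:\qquad
  \frac{\Delta S}{\Delta t} \;-\; \frac{1}{T}\,\frac{\Delta H}{\Delta t} \;\ge\; 0 .
\]
```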

A simple illustration of this principle is seen in phase changes such as water transforming into ice. As ice forms, energy (80 calories/gm) is liberated to the surroundings. The change in the entropy of the system as the amorphous water becomes crystalline ice is -0.293 entropy units (eu)/gm (i.e., -80 cal/gm divided by 273°K). The entropy change is negative because the thermal and configurational entropy (or disorder) of water is greater than that of ice, which is a highly ordered crystal.

[NOTE: Configurational entropy measures randomness in the distribution of matter in much the same way that thermal entropy measures randomness in the distribution of energy.]

Thus, the thermodynamic conditions under which water will transform to ice are seen from equation 7-9 to be:

-0.293 - (-80/T) > 0          (7-10a)

or

T < 273°K          (7-10b)

For conditions of T < 273°K, energy is removed from the water to produce ice, and the aggregate disordering of the surroundings is greater than the ordering of the water into ice crystals. This gives a net increase in the entropy of the universe, as predicted by the second law of thermodynamics.
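
A quick numerical check of equations 7-10a and 7-10b, using only the figures quoted in the text (80 cal/gm liberated on freezing, hence a system entropy change of about -0.293 cal/gm·°K):

```python
# Values quoted in the text for the water -> ice transition.
heat_released = 80.0                     # cal per gram liberated when water freezes
dS_system = -heat_released / 273.0       # entropy change of the system, ~ -0.293 cal/(gm*K)

def freezing_is_spontaneous(T: float) -> bool:
    """Equation 7-10a: dS_system - dH/T > 0, with dH = -80 cal/gm."""
    dH = -heat_released
    return dS_system - dH / T > 0

for T in (263.0, 273.0, 283.0):
    print(f"T = {T:.0f} K: freezing spontaneous? {freezing_is_spontaneous(T)}")
# Prints True only below 273 K, in agreement with equation 7-10b.
```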

It has often been argued by analogy to water crystallizing to ice that simple monomers may polymerize into complex molecules such as protein and DNA. The analogy is clearly inappropriate, however. The ΔE + PΔV term (equation 7-9) in the polymerization of important organic molecules is generally positive (5 to 8 kcal/mole), indicating the reaction can never spontaneously occur at or near equilibrium.

[NOTE: If ΔE + PΔV is positive, the second term in equation 7-9 is negative because of the negative sign which precedes it. The inequality can then only be satisfied by ΔS being sufficiently positive, which implies disordering.]

By contrast, the ΔE + PΔV term in water changing to ice is negative, -1.44 kcal/mole, indicating the phase change is spontaneous as long as T < 273°K, as previously noted. The atomic bonding forces draw water molecules into an orderly crystalline array when the thermal agitation (or entropy driving force, TΔS) is made sufficiently small by lowering the temperature. Organic monomers such as amino acids resist combining at all at any temperature, however, much less in some orderly arrangement.
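
The same spontaneity test can be applied, for contrast, to the two cases the text compares. The -1.44 kcal/mole figure for ice is from the text; the +6 kcal/mole for polymerization is a representative value from the 5-8 kcal/mole range cited, and the polymerization entropy change is an assumed illustrative number (the text says only that ordering makes it negative).

```python
def spontaneous(dH_cal_per_mol: float, dS_cal_per_mol_K: float, T: float) -> bool:
    """Criterion from equation 7-9: dS - dH/T > 0 (equivalently, dG < 0)."""
    return dS_cal_per_mol_K - dH_cal_per_mol / T > 0

# Water -> ice: dH = -1440 cal/mole (the -1.44 kcal/mole cited in the text);
# dS = -1440/273 ~ -5.3 cal/(mole*K).
dH_ice, dS_ice = -1440.0, -1440.0 / 273.0
print("ice at 263 K:", spontaneous(dH_ice, dS_ice, 263.0))   # True
print("ice at 298 K:", spontaneous(dH_ice, dS_ice, 298.0))   # False

# Biomonomer polymerization: dH ~ +6000 cal/mole (within the cited 5-8 kcal/mole
# range); dS assumed negative (ordering), here an illustrative -10 cal/(mole*K).
# The criterion then fails at every temperature.
dH_poly, dS_poly = 6000.0, -10.0
for T in (263.0, 298.0, 373.0):
    print(f"polymerization at {T:.0f} K:", spontaneous(dH_poly, dS_poly, T))
```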

Morowitz6 has estimated the increase in the chemical bonding energy as one forms the bacterium Escherichia coli from simple precursors to be 0.0095 erg, or an average of 0.27 eV/atom for the 2 x 10^10 atoms in a single bacterial cell. This would be thermodynamically equivalent to having water in your bathtub spontaneously heat up to 360°C, happily a most unlikely event. He goes on to estimate the probability of the spontaneous formation of one such bacterium in the entire universe in five billion years under equilibrium conditions to be 10^(-10^11). Morowitz summarizes the significance of this result by saying that "if equilibrium processes alone were at work, the largest possible fluctuation in the history of the universe is likely to have been no longer than a small peptide."7 Nobel Laureate I. Prigogine et al., have noted with reference to the same problem that:
The probability that at ordinary temperatures a macroscopic number of molecules is assembled to give rise to the highly ordered structures and to the coordinated functions characterizing living organisms is vanishingly small. The idea of spontaneous genesis of life in its present form is therefore highly improbable, even on the scale of billions of years during which prebiotic evolution occurred.8
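
As a rough arithmetic check on Morowitz's bonding-energy figure (this calculation is not in the original text): 0.27 eV/atom over roughly 2 x 10^10 atoms does come to about 0.01 erg, in line with the 0.0095 erg quoted above.

```python
eV_to_erg = 1.602e-12          # 1 electron volt expressed in ergs

energy_per_atom_eV = 0.27      # Morowitz's average increase in bonding energy
atoms_per_cell = 2e10          # atoms in a single E. coli cell, as cited

total_erg = energy_per_atom_eV * eV_to_erg * atoms_per_cell
print(f"Total increase in chemical bonding energy ~ {total_erg:.4f} erg")  # ~0.009 erg
```
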
It seems safe to conclude that systems near equilibrium (whether isolated or closed) can never produce the degree of complexity intrinsic in living systems. Instead, they will move spontaneously toward maximizing entropy, or randomness. Even the postulate of long time periods does not solve the problem, as "time's arrow" (the second law of thermodynamics) points in the wrong direction; i.e., toward equilibrium. In this regard, H.F. Blum has observed:
The second law of thermodynamics would have been a dominant directing factor in this case [of chemical evolution]; the reactions involved tending always toward equilibrium, that is, toward less free energy, and, in an inclusive sense, greater entropy. From this point of view the lavish amount of time available should only have provided opportunity for movement in the direction of equilibrium.9 (Emphasis added.)
Thus, reversing "time's arrow" is what chemical evolution is all about, and this will not occur in isolated or closed systems near equilibrium.

The possibilities are potentially more promising, however, if one considers a system subjected to energy flow which may maintain it far from equilibrium, and its associated disorder. Such a system is said to be a constrained system, in contrast to a system at or near equilibrium which is unconstrained. The possibilities for ordering in such a system will be considered next.

Closed Systems Far from Equilibrium

Energy flow through a system is equivalent to doing work continuously on the system to maintain it some distance from equilibrium. Nicolis and Prigogine10 have suggested that the entropy change (ΔS) in a system during a time interval (Δt) may be divided into two components:

ΔS = ΔSe + ΔSi          (7-11)

where ΔSe is the entropy flux due to energy flow through the system, and ΔSi is the entropy production inside the system due to irreversible processes such as diffusion, heat conduction, heat production, and chemical reactions. We will note when we discuss open systems in the next section that ΔSe includes the entropy flux due to mass flow through the system as well. The second law of thermodynamics requires,

ΔSi ≥ 0          (7-12)

In an isolated system, ΔSe = 0 and equations 7-11 and 7-12 give,

ΔS = ΔSi ≥ 0          (7-13)

Unlike ΔSi, ΔSe in a closed system does not have a definite sign, but depends entirely on the boundary constraints imposed on the system. The total entropy change in the system can be negative (i.e., ordering within the system) when,

ΔSe < 0     and     |ΔSe| > ΔSi          (7-14)

Under such conditions a state that would normally be highly improbable under equilibrium conditions can be maintained indefinitely. It would be highly unlikely (i.e., statistically just short of impossible) for a disconnected water heater to produce hot water. Yet when the gas is connected and the burner lit, the system is constrained by energy flow and hot water is produced and maintained indefinitely as long as energy flows through the system.
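
A minimal bookkeeping sketch of equations 7-11 through 7-14, with invented numbers, shows how a sufficiently large negative entropy flux ΔSe can drive the total entropy change of the system negative (net ordering) even though the internal production ΔSi is always positive.

```python
def total_entropy_change(dS_e: float, dS_i: float) -> float:
    """Equation 7-11: dS = dS_e (flux from energy/mass flow) + dS_i (internal production)."""
    if dS_i < 0:
        raise ValueError("The second law (equation 7-12) requires dS_i >= 0")
    return dS_e + dS_i

# Invented values, in arbitrary entropy units.
dS_i = 5.0            # internal entropy production (always positive)
for dS_e in (0.0, -3.0, -8.0):
    dS = total_entropy_change(dS_e, dS_i)
    trend = "net ordering" if dS < 0 else "net disordering"
    print(f"dS_e = {dS_e:+.1f}, dS_i = {dS_i:.1f}  ->  dS = {dS:+.1f}  ({trend})")
# Only the last case satisfies equation 7-14 (|dS_e| > dS_i) and yields net ordering.
```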

An open system offers an additional possibility for ordering---that of maintaining a system far from equilibrium via mass flow through the system, as will be discussed in the next section.

Open Systems

An open system is one which exchanges both energy and mass with the surroundings. It is well illustrated by the familiar internal combustion engine. Gasoline and oxygen are passed through the system, combusted, and then released as carbon dioxide and water. The energy released by this mass flow through the system is converted into useful work; namely, torque supplied to the wheels of the automobile. A coupling mechanism is necessary, however, to allow the released energy to be converted into a particular kind of work. In an analogous way the dissipative (or disordering) processes within an open system can be offset by a steady supply of energy to provide for (-ΔSe)-type work. Equation 7-11, applied earlier to closed systems far from equilibrium, may also be applied to open systems. In this case, the ΔSe term represents the negative entropy, or organizing work, done on the system as a result of both energy and mass flow through the system. This work done on the system can move it far from equilibrium, maintaining it there as long as the mass and/or energy flow are not interrupted. This is an essential characteristic of living systems, as will be seen in what follows.

Thermodynamics of Living Systems

Living systems are composed of complex molecular configurations whose total bonding energy is less negative than that of their chemical precursors (e.g., Morowitz's estimate of ΔE = 0.27 eV/atom) and whose thermal and configurational entropies are also less than those of their chemical precursors. Thus, the Gibbs free energy of living systems (see equation 7-6) is quite high relative to the simple compounds from which they are formed. The formation and maintenance of living systems at energy levels well removed from equilibrium requires continuous work to be done on the system, even as maintenance of hot water in a water heater requires that continuous work be done on the system. Securing this continuous work requires energy and/or mass flow through the system, apart from which the system will return to an equilibrium condition (lowest Gibbs free energy, see equations 7-7 and 7-8) with the decomposition of complex molecules into simple ones, just as the hot water in our water heater returns to room temperature once the gas is shut off.

In living plants, the energy flow through the system is supplied principally by solar radiation. In fact, leaves provide relatively large surface areas per unit volume for most plants, allowing them to "capture" the necessary solar energy to maintain themselves far from equilibrium. This solar energy is converted into the necessary useful work (negative ΔSe in equation 7-11) to maintain the plant in its complex, high-energy configuration by a complicated process called photosynthesis. Mass, such as water and carbon dioxide, also flows through plants, providing necessary raw materials, but not energy. In collecting and storing useful energy, plants serve the entire biological world.

For animals, energy flow through the system is provided by eating high energy biomass, either plant or animal. The breaking down of this energy-rich biomass, and the subsequent oxidation of part of it (e.g., carbohydrates), provides a continuous source of energy as well as raw materials. If plants are deprived of sunlight or animals of food, dissipation within the system will surely bring death. Maintenance of the complex, high-energy condition associated with life is not possible apart from a continuous source of energy. A source of energy alone is not sufficient, however, to explain the origin or maintenance of living systems. The additional crucial factor is a means of converting this energy into the necessary useful work to build and maintain complex living systems from the simple biomonomers that constitute their molecular building blocks.

An automobile with an internal combustion engine, transmission, and drive chain provides the necessary mechanism for converting the energy in gasoline into comfortable transportation. Without such an "energy converter," however, obtaining transportation from gasoline would be impossible. In a similar way, food would do little for a man whose stomach, intestines, liver, or pancreas were removed. Without these, he would surely die even though he continued to eat. Apart from a mechanism to couple the available energy to the necessary work, high-energy biomass is insufficient to sustain a living system far from equilibrium. In the case of living systems such a coupling mechanism channels the energy along specific chemical pathways to accomplish a very specific type of work. We therefore conclude that, given the availability of energy and an appropriate coupling mechanism, the maintenance of a living system far from equilibrium presents no thermodynamic problems.

In mathematical formalism, these concepts may be summarized as follows:

(1) The second law of thermodynamics requires only that the entropy production due to irreversible processes within the system be greater than zero; i.e.,

ΔSi > 0          (7-15)

(2) The maintenance of living systems requires that the energy flow through the system be of sufficient magnitude that the negative entropy production rate (i.e., useful work rate) that results be greater than the rate of dissipation that results from irreversible processes going on within the system; i.e.,

|ΔSe| > ΔSi          (7-16)

(3) The negative entropy generation must be coupled into the system in such a way that the resultant work done is directed toward restoration of the system from the disintegration that occurs naturally and is described by the second law of thermodynamics; i.e.,

-ΔSe = ΔSi          (7-17)

where ΔSe and ΔSi refer not only to the magnitude of the entropy change but also to the specific changes that occur in the system associated with this change in entropy. The coupling must produce not just any kind of ordering but the specific kind required by the system.
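
Conditions 7-15 and 7-16 can be restated as simple numeric tests, as in the sketch below (values again invented); condition 7-17, which concerns how the entropy flux is coupled into specific, restorative chemical work, has no comparable numerical form and is only noted in the comments.

```python
def maintenance_conditions_met(dS_e: float, dS_i: float) -> bool:
    """Check the two quantitative conditions for maintaining a living system:
    equation 7-15 (dS_i > 0 for real irreversible processes) and
    equation 7-16 (|dS_e| exceeds dS_i, with dS_e negative).
    Equation 7-17, the requirement that the entropy flux be coupled into
    specific, restorative chemical work, is qualitative and not captured here."""
    return dS_i > 0 and dS_e < 0 and abs(dS_e) > dS_i

print(maintenance_conditions_met(dS_e=-8.0, dS_i=5.0))  # True: the flow can sustain the system
print(maintenance_conditions_met(dS_e=-2.0, dS_i=5.0))  # False: dissipation wins
```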

While the maintenance of living systems is easily rationalized in terms of thermodynamics, the origin of such living systems is quite another matter. Though the earth is open to energy flow from the sun, the means of converting this energy into the necessary work to build up living systems from simple precursors remains at present unspecified (see equation 7-17). The "evolution" from biomonomers to fully functioning cells is the issue. Can one make the incredible jump in energy and organization from raw material and raw energy, apart from some means of directing the energy flow through the system? In Chapters 8 and 9 we will consider this question, limiting our discussion to two small but crucial steps in the proposed evolutionary scheme, namely, the formation of protein and DNA from their precursors.

It is widely agreed that both protein and DNA are essential for living systems and indispensable components of every living cell today.11 Yet they are only produced by living cells. Both types of molecules are much more energy and information rich than the biomonomers from which they form. Can one reasonably predict their occurrence given the necessary biomonomers and an energy source? Has this been verified experimentally? These questions will be considered in Chapters 8 and 9.


References

1. Victor F. Weisskopf, 1977. Amer. Sci. 65, 405-11.

2. R. Clausius, 1855. Ann. Phys. 125, 358.

3. R. Caillois, 1976. Coherences Aventureuses. Paris: Gallimard.

4. H.J. Morowitz, 1968. Energy Flow in Biology. New York: Academic Press, p.2-3.

5. H.F. Blum, 1951. Time's Arrow and Evolution. Princeton: Princeton University Press.

6. H.J. Morowitz, Energy Flow, p.66.

7. H.J. Morowitz, Energy Flow, p.68.

8. I. Prigogine, G. Nicolis, and A. Babloyantz, November, 1972. Physics Today, p.23.

9. H.F. Blum, 1955. American Scientist 43, 595.

10. G. Nicolis and I. Prigogine, 1977. Self-Organization in Nonequilibrium Systems. New York: John Wiley, p.24.

11. S.L. Miller and L.E. Orgel, 1974. The Origins of Life on the Earth. Englewood Cliffs, New Jersey: Prentice-Hall, p.162-3.

