
Volume 3 Supplement 1

Special Issue: Coevolution

  • Education Article
  • Open access

Evolution and the Second Law of Thermodynamics: Effectively Communicating to Non-technicians

Abstract

Given the degree of disbelief in the theory of evolution by the wider public, scientists need to develop a collection of clear explanations and metaphors that demonstrate the working of the theory and the flaws in anti-evolutionist arguments. This paper presents tools of this sort for countering the anti-evolutionist claim that evolutionary mechanisms are inconsistent with the second law of thermodynamics. Images are provided to replace the traditional misunderstanding of the law, i.e., “everything always gets more disordered over time,” with a clearer sense of the way in which entropy tends to increase, allowing a thermally isolated system access to a greater number of microstates. Accessible explanations are also provided for the ways in which individual organisms are able to maintain low internal entropy and the advantages this conveys.

Much of the debate surrounding the foundations of evolutionary biology is a conversation occurring in the body politic rather than in the scientific community. As such, the means of engaging in this discussion must be carefully tailored to the context. Appeals to scholarly work and the use of technical notions, no matter how precise or empirically well-supported, will prove ineffective. The scientific community, so skilled at working within its own discourse conventions, must also concentrate on how to express these notions clearly to non-technicians. The term “rhetoric” has acquired an unfortunate connotation, but the synonymous phrase “effective communication” may be used for a project the academic community must actively engage in as part of its place in the division of intellectual labor.

The purpose of this paper is to begin to develop easily accessible explanations, images, and metaphors that assist non-technicians in understanding the workings of the natural world and that illustrate the flaws in anti-evolutionary arguments. In this paper, we seek to formulate effective tools for communicating the fallacies contained in the anti-evolution advocates’ argument that speciation by evolution violates the second law of thermodynamics (see, e.g., Morris 1987, 38–64).

Their argument runs like this:

  1. Evolutionary theory contends that current species developed from earlier life forms.

  2. These earlier life forms were simpler, having fewer capabilities and less complex systems.

  3. Therefore, evolutionary theory claims that organisms become better ordered over time.

  4. The second law of thermodynamics holds that entropy increases; that is, systems become more disordered over time.

  5. Therefore, evolutionary theory and the second law of thermodynamics cannot both be correct.

  6. Physics is a more basic and better-established field than biology.

  7. Therefore, we ought to prefer the second law of thermodynamics and reject evolutionary theory.

Among the chief errors of this argument are (a) its understanding of the second law of thermodynamics and the notion of entropy, (b) its sense of the scope of application of the second law of thermodynamics, and (c) its failure to understand the way in which the mechanisms underlying genetics are perfectly in line with physical law. What is needed are ways to communicate these flaws effectively to the general public.

The Second Law of Thermodynamics

The anti-evolutionists’ argument is based on an understanding of the second law of thermodynamics, according to which disorder always increases. This is a common misunderstanding of one of the more baffling principles in physics, which has a long and contentious history, having been formulated in different ways by Sadi Carnot, Rudolf Clausius, William Thomson (Lord Kelvin), Ludwig Boltzmann, and Max Planck. The second law is best known as the principle that rules out perpetual motion, something resulting from its origin in the question “how efficient can we make steam engines?”—a strangely pragmatic starting point for such an esoteric principle.

Scientists and engineers discovered that, when trying to convert one form of energy, e.g., heat, into another form of energy, e.g., motion, the transfer could never be made complete; some energy was always lost. Think of this in terms of currency. Whenever we exchange money, say from dollars to euros, the bank charges a transaction fee. So if we changed money back and forth, we would eventually go broke even with a fixed exchange rate.
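To see how quickly such a fee compounds, here is a minimal numerical sketch (the 2% fee and the $100 starting balance are our own illustrative choices, not figures from any real bank):

```python
# Repeatedly exchange a balance back and forth at a fixed rate,
# paying a hypothetical 2% fee on every transaction.
balance = 100.0   # starting funds in dollars (illustrative)
FEE = 0.02        # assumed per-transaction fee (illustrative)

for exchange in range(1, 201):
    balance *= (1 - FEE)  # each exchange loses a fixed fraction
    if exchange in (10, 50, 100, 200):
        print(f"after {exchange:3d} exchanges: ${balance:.2f}")

# after  10 exchanges: $81.71
# after  50 exchanges: $36.42
# after 100 exchanges: $13.26
# after 200 exchanges: $1.76
```

The balance decays geometrically: however small the fee, repeated conversion drains the account, just as repeated energy conversions steadily dissipate usable energy.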

Given its well-behaved sibling, the first law of thermodynamics (that energy is always conserved, neither created nor destroyed), researchers sought a means of quantifying and explaining this energy transaction fee. That explanation led them to posit a strange quantity, one not directly observable: entropy. It measures the “disorder” of a system in terms of the number of microstates—arrangements of molecules—accessible to a system in a given macrostate—having a particular temperature, pressure, and volume. They found that “in any process in which a thermally isolated system goes from one macrostate to another, the entropy tends to increase” (Reif 1965, 122).
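Although the authors keep the discussion verbal, it may help some readers to see the standard statistical definition behind this passage, Boltzmann’s relation, where $k_B$ is the Boltzmann constant and $\Omega$ is the number of microstates accessible in a given macrostate:

```latex
S = k_B \ln \Omega
```

On this definition, entropy tends to increase simply because systems wander toward macrostates that are compatible with vastly more microstates.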

The word “tends” sparked a firestorm, with physicists divided between those who took Clausius’ view that entropy, like every other physical quantity, is subject to absolute deterministic rules and therefore must increase, and those who took Boltzmann’s position that thermodynamic quantities are statistical averages, so the most we can say is that entropy almost certainly increases. Physicists traditionally were wedded to mechanistic pictures of point masses bouncing off of each other in accordance with well-structured Newtonian principles. Hence, those behind Clausius had a deep disdain for merely probabilistic principles, while those in line with Boltzmann argued that the number of interactions was far too large to be handled by normal means and that probabilistic claims were the best we could make for such large collections.

Boltzmann eventually won the day, and entropy is best thought of in terms of what physicists call “ensembles,” the set of all possible states of a thermal system. Entropy is a measure of the number of possible states in which you might find the system if you checked. Since heat flows from warm to cold, a system not in equilibrium is in flux; it is changing. This means that the number of possible states in which the system could be found increases over time.

Think of a deck of cards. If you sat down at a poker table with seven people to play five-card stud and dealt a brand new deck right out of the box, the results would be a foregone conclusion because new decks are packed in order. The person to the dealer’s left would necessarily be dealt the six and king of spades, the seven and ace of diamonds, and the eight of clubs, and would have the highest hand.

But if the cards are shuffled once, the results will be different. Since the top half of the deck is arranged in ascending order in spades and the shuffle will generally begin when the cards are divided roughly in half and interwoven roughly alternating every other card, there is a very good chance that the ace of spades will be one of the first cards dealt and will almost certainly end up in someone’s hand.

Now if the cards are shuffled seven, ten, or twenty times, the chances of that ace of spades showing up become less and less, and with each additional shuffle of the deck, the likelihood of getting the ace of spades approaches the likelihood of drawing any other card. That is what entropy measures. As the system approaches equilibrium, the chance of finding it in some particular state—some particular order of cards—approaches the likelihood of finding it in any other state—any other order of cards.
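The flattening described here can be made concrete with a small simulation. The following is a toy model of our own (the riffle-shuffle procedure and trial counts are illustrative assumptions, not anything from the paper); it tracks where the top card of a fresh deck ends up after repeated riffle shuffles:

```python
import random
from collections import Counter

def riffle(deck):
    """One Gilbert-Shannon-Reeds riffle: cut the deck binomially, then
    interleave, dropping the next card from each half with probability
    proportional to that half's remaining size."""
    cut = sum(random.random() < 0.5 for _ in deck)
    left, right = deck[:cut], deck[cut:]
    shuffled = []
    while left or right:
        if random.random() < len(left) / (len(left) + len(right)):
            shuffled.append(left.pop(0))
        else:
            shuffled.append(right.pop(0))
    return shuffled

def final_position(shuffles):
    deck = list(range(52))   # card 0 stands in for the top card of a new deck
    for _ in range(shuffles):
        deck = riffle(deck)
    return deck.index(0)     # where did the former top card land?

TRIALS = 20000
for k in (1, 3, 7):
    counts = Counter(final_position(k) for _ in range(TRIALS))
    top_quarter = sum(counts[i] for i in range(13)) / TRIALS
    print(f"{k} shuffle(s): P(former top card in top quarter) = {top_quarter:.2f}")
```

After one shuffle, the former top card is still almost always near the top of the deck; by seven shuffles, the probability of finding it in the top quarter has fallen close to the uniform value of 0.25. That flattening of the distribution over all possible positions is precisely what entropy quantifies.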

What the second law of thermodynamics does not say is that disorder always increases. Play poker long enough with well-shuffled decks, and you will eventually get dealt a royal straight flush. The chances are slim that such an ordering will appear on any given hand, but shuffling does not mean that order cannot accidentally appear, just that it is less and less likely. Play long enough with well-shuffled decks and eventually the original ordering from the box will reappear, as will the one in which everyone at the table is dealt four of a kind in ascending order. While the dealer of such a hand is unlikely to emerge from the game with his credibility (and perhaps his bodily features) intact, there is always the chance of such an order appearing after enough shuffles.

The second law of thermodynamics does not say that disorder necessarily increases in isolated systems (no cards being added or subtracted) that are not in equilibrium (the cards are being shuffled); rather, it says that the likelihood of finding the system in its original state, or any other given state, tends to approach the likelihood of finding it in any other state. Once we understand what the second law of thermodynamics really says, the anti-evolutionists’ representation of it as requiring increasing disorder is exposed as a misunderstanding.

The Scope of the Second Law of Thermodynamics

But it is not only the anti-evolutionists’ understanding of the content of the law that is faulty; so is their understanding of the scope of its applicability. The second law holds for systems that are thermally isolated and not in equilibrium. Let’s look at the first condition. A thermally isolated system is one in which energy is not being added or subtracted. This is crucial, as added energy can decrease entropy.

We may use energy to order systems in very straightforward ways. Let us posit that our thermodynamic system is a collection of ions, electrically charged molecules, in a long tube. We know that like charges repel and unlike charges attract. By using energy to charge a capacitor, we create a negative charge at one end of the tube and a positive charge at the other. The result is a smooth gradient in which positive ions are drawn toward the negatively charged end of the tube and negative ions toward the positively charged end. We achieve an order that would have been highly unlikely had the system been left to its own devices.

Similarly, an eight-year-old’s bedroom, when left to develop according to its natural happenings, will be able to occupy a larger and larger number of possible states. The next day’s homework assignment, the shirt worn two days ago, the bag of chips emptied the previous week could be under the bed, on the desk, in the closet, behind the book shelf, anywhere really. But if energy is added to the system—“I want this pig sty cleaned up now or you are not going to the movies with your friends tomorrow night”—then there is an increase in order, i.e., the number of possible states is massively decreased, with dirty laundry more likely to be found in or near the hamper and the required math assignment more likely in or near the book bag. Adding energy can counter the increase in entropy.

The anti-evolutionists’ argument uses the second law of thermodynamics and applies it to the Earth and its natural systems as if the eight-year-old would never be asked to clean his or her room. But this Pippi Longstocking hypothesis is false. The Earth is not a thermally isolated system because it receives constant energy from the Sun. This is the energy fixed by plants using photosynthesis, which is then acquired by herbivores that eat the plants and carnivores that eat the herbivores. It is certainly true that without this constant addition of energy to the Earth’s system, life would be impossible, but fortunately for us, the radiation we receive is like the constant motivation for the youngster to keep his or her room tidy. The second law of thermodynamics simply cannot be used the way the anti-evolutionists try to use it.

How Evolutionary Theory Is Consistent with the Second Law of Thermodynamics

In this section, we will present an accessible three-part strategy for showing how evolutionary theory is consistent with the second law of thermodynamics, and we provide figures to simply convey the main points. We show that entropy, far from opposing evolution, is a thermodynamic driving force that propels natural selection, the mechanism of evolution. Our approach is as follows: (1) We first describe how an inherent characteristic of all living organisms is that they are open systems that maintain greater order than their surroundings by importing free energy (nutrients) and exporting entropy (heat and waste); we focus on the role of the semi-permeable cell membrane as a mediator of internal order. (2) We then discuss how entropy can decrease locally within subsystems and how organismal complexity can increase over evolutionary time as long as there is a greater increase in entropy in another interlocking part of the system; we focus on the Sun as the Earth’s ultimate source of low entropy light and how primary producers (plants and cyanobacteria) capture this low entropy and drive the evolution of complexity. (3) Lastly, we discuss how organisms can be viewed thermodynamically as energy transfer systems, with beneficial mutations allowing organisms to disperse energy more efficiently to their environment; we provide a simple “thought experiment” using bacteria cultures to convey the idea that natural selection favors genetic mutations (in this example, of a cell membrane glucose transport protein) that lead to faster rates of entropy increases in an ecosystem.

What Are Organisms and How Do They Resist Entropy?

Living things have been elegantly described as “islands of order surrounded by an ocean of chaos” (Margulis and Sagan 1995). In one of the most influential publications on the nature of life, the Austrian physicist and Nobel Laureate, Erwin Schrödinger, considered that a fundamental attribute of living things is that they maintain high levels of internal order by “exporting entropy” to their environment (Schrödinger 1944). The Belgian chemist and Nobel laureate, Ilya Prigogine, helped popularize the notion that in thermodynamic terms, life can be considered a subset of a larger class of systems called “dissipative structures” (Prigogine and Stengers 1984). These dynamic, self-maintaining systems include cyclones, whirlpools, flames, and black holes and are characterized by importing useful forms of energy (free energy) and exporting (dissipating) less useful forms (entropy), particularly heat. As long as the structures are actively self-organizing and self-maintaining (in the case of organisms, “alive”), they remain far from thermodynamic equilibrium with their environment. An organism attains thermodynamic equilibrium with its environment only after death, when its body decomposes.

All known organisms consist of one or more cells, the most basic units of life. Each cell maintains a precise and constant internal physicochemical environment throughout its life that is distinct from its surroundings. This is achieved by expending energy acquired from externally derived nutrients (free energy) to fuel diverse regulatory processes that are collectively termed “metabolism” (Fig. 1). Therefore, organisms, and the individual cells that compose them, are open systems that continually exchange nutrients and wastes with their environment. In effect, all organisms maintain their low entropy status by “eating” free energy and “pooping” entropy. As the eminent physicist Roger Penrose (1989) explains: “Where indeed does our own low entropy come from? The organization in our bodies comes from the food that we eat and the oxygen that we breathe.”

Fig. 1

Cells maintain a relatively higher degree of order compared with their environment by continually importing free energy in the form of nutrients and exporting entropy as disordered wastes and heat. Cells selectively import ordered nutrients from a largely chaotic world via their semipermeable cell membranes, in which substrate-specific protein channels and transporters (colored cylinders) are embedded in a relatively impermeable phospholipid membrane (dotted lines). A cell’s metabolism converts nutrients into usable forms of energy (ATP) and into diverse biomolecules that are used for self-maintenance, reproduction, and growth. The costs of converting nutrients into these useful low entropy forms for self-preservation are the production of high entropy wastes and heat. The constant export of entropy from cells via the cell membrane ensures that cells maintain higher internal order compared with their external environment

One of the most important biochemical features of cellular life is the presence of a semi-permeable cell membrane that both separates external chaos from internal order and mediates the exchange of specific nutrients and wastes in a highly controlled fashion; the cell membrane, in large part, defines life as an open system. Although the phospholipid component of cell membranes is impermeable to most water-soluble compounds in the environment, cell membranes also contain diverse transmembrane protein channels and transporters that facilitate the passage of specific nutrients (e.g., glucose, amino acids, nucleotides) and other molecules necessary for life. Many nutrients are converted by the cell into usable energy (e.g., ATP, a stable and storable form of energy), assimilated into cellular organelles, used for structural support, or converted into enzymes, all of which are used to maintain a cell’s integrity (replace broken-down parts), as well as to mediate diverse physiological processes such as reproduction and growth. The chemical conversion of nutrients into useful forms usually produces toxic waste products and heat, all of which must be exported by the cell to its environment to ensure the cell’s survival. Ultimately, all organisms and their cellular constituents gain and preserve their internal ordered state by first importing free energy from their surroundings (eating), then converting the nutrients into useful forms (metabolizing), and finally exporting (pooping) an equal or greater amount of energy to their environment in the form of heat and high entropy wastes.

How Can Greater Organismal Complexity Evolve in an Entropic Universe?

Isaac Asimov (1984) characterized the fallacy of the creationist understanding of entropy: “In kindergarten terms, the second law of thermodynamics says that all spontaneous change is in the direction of increasing disorder—that is, in a ‘downhill’ direction. There can be no spontaneous buildup of the complex from the simple, therefore, because that would be moving ‘uphill’.” Asimov reasons, “An argument based on kindergarten terms is only suitable for kindergartens.” In this section, we will apply an understanding of entropy beyond the kindergarten level.

The second law of thermodynamics clearly does not prohibit the building of complexity from simplicity, hence the existence of complex structures like termite mounds and toaster ovens. The physical world is filled with countless examples of spontaneous order emanating from a less ordered state, such as gases (e.g., water vapor in clouds) condensing into a more ordered liquid state (rain) and liquids freezing into an even more highly ordered solid crystalline state (e.g., ice crystals). Perhaps the most dramatic and commonplace biological example of spontaneous order derived from a less ordered state is the development of a single cell, the zygote, into a complex multicellular adult human of billions of cells, possessing dozens of specialized organs, tissue classes, and terminally differentiated cell types. Clearly, snowflake synthesis and embryogenesis do not violate any physical laws, so what’s going on?

In a nutshell, the synthesis of order exacts an energetic price: The cost of converting a relatively disordered water droplet into a more ordered snowflake is the release of heat to the environment, and the cost of embryogenesis is the conversion of ordered nutrients into less ordered waste products and heat. In the end, the processes of snowflake synthesis and embryogenesis always contribute more net entropy to the system as a whole, consistent with the second law of thermodynamics. According to the creationist “kindergartener’s understanding of entropy” (Asimov 1984), neither snowflake synthesis nor animal development could possibly take place, let alone organismal evolution.

Having just discussed how individual organisms maintain consistently higher degrees of internal order compared with their surroundings, we now describe how the second law of thermodynamics is perfectly consistent with, indeed promotes, the progeny of some populations of organisms becoming incrementally more complex over evolutionary time.

A Gouldian Disclaimer

Natural selection produces organisms that are more adapted to their environments, but “more adapted” organisms are not necessarily more “complex” than their ancestors. Although natural selection has produced complex multicellular life from relatively simpler unicellular ancestors, we are in no way implying that increasing complexity is the general evolutionary trend—which it clearly is not (see Gould 1997). For example, many members of the unicellular kingdoms Monera (bacteria) and Archaea (archaebacteria) (i.e., the vast majority of life on Earth) have remained virtually unchanged over millennia, and similar (though far less dramatic) cases can be made for cockroaches and sharks, whose body forms have remained essentially unchanged throughout long stretches of animal evolutionary history. Furthermore, there are also examples of lineages that have become, arguably, less complex with evolution (e.g., loss of numerous organs and body parts in parasites, loss of eyes in deep sea and cave-dwelling fauna). Here, we are specifically addressing a thermodynamic paradigm that explains how evolutionary complexity can develop in the face of entropy, without suggesting that the development of complexity is inevitable. The anti-evolutionists’ caricature of evolution as inevitably increasing complexity across the board is simply inaccurate, even if some adaptations do increase complexity.

Even though net entropy increases over time in a thermally isolated system, local regions of reduced entropy (e.g., increased complexity) can develop spontaneously in open subsystems as long as there is a greater increase in entropy (a greater loss of order) in another interlocking part of the system. So long as entropy tends to increase in the entire system, the second law of thermodynamics is not violated. Evolution can occur locally within a system by moving thermodynamically “uphill” (building the complex from simpler precursors) in one subsystem (e.g., a population of organisms) as long as an interlocking part of the system (e.g., the Sun) moves thermodynamically “downhill” at a significantly faster rate and greater magnitude than evolution moves uphill.
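The bookkeeping in the preceding paragraph can be compressed into a single inequality. For a subsystem (say, a population of organisms) and its surroundings that together form a thermally isolated whole:

```latex
\Delta S_{\text{total}} = \Delta S_{\text{subsystem}} + \Delta S_{\text{surroundings}} \geq 0
```

Nothing prevents $\Delta S_{\text{subsystem}}$ from being negative (a local gain in order or complexity), provided $\Delta S_{\text{surroundings}}$ is positive and at least as large in magnitude.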

As Roger Penrose (1989) describes: “Contrary to a common impression, the earth does not gain [net] energy from the sun! What the earth does is to take energy in low-entropy form, and then spew it all back again into space, but in a high-entropy form. What the sun has done for us is to supply us with a huge source of low entropy. We (via the plant's cleverness), make use of this, ultimately extracting some tiny part of this low entropy and converting it into the remarkable and intricately organized structures that are ourselves.” These concepts can be challenging to visualize, and we present them in a simplified form in Fig. 2. The photons that emanate from the Sun and arrive at Earth are highly directed (they arrive from a narrow range of directions) and possess high energy (shortwave radiation). In contrast with incoming solar light, outgoing photons re-radiated from the Earth consist of low energy (longwave radiation) infrared light that is highly dispersed (photons moving in many different directions). Because the total energy carried by the outgoing photons is the same as that carried by the incoming photons, there are many fewer photons traveling toward Earth than there are photons re-radiated back into space. The Sun’s smaller number of highly directed, high-energy photons represents a state of much lower entropy compared with the greater number of highly dispersed, low energy photons re-radiated to space.
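A rough calculation conveys the size of this imbalance. The entropy carried by thermal radiation scales as its energy divided by the temperature of its source, so for the same energy throughput $E$, the ratio of outgoing to incoming entropy is set by the ratio of the two temperatures. Using round values of roughly 5800 K for the Sun’s surface and 255 K for the Earth’s effective radiating temperature (textbook numbers of our own choosing, not figures from Penrose):

```latex
\frac{S_{\text{out}}}{S_{\text{in}}} \approx \frac{E / T_{\text{Earth}}}{E / T_{\text{Sun}}} = \frac{T_{\text{Sun}}}{T_{\text{Earth}}} \approx \frac{5800\ \text{K}}{255\ \text{K}} \approx 23
```

And because the typical energy of a thermal photon is proportional to the temperature of its source, the same factor of roughly 23 estimates how many more photons the Earth radiates away than it receives.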

Fig. 2

Primary producers (photosynthetic plants and cyanobacteria; green cog) rely on the conversion of directed high energy (low entropy) sunlight (yellow arrows) into dispersed low energy (high entropy) infrared light (red arrows) to synthesize and store their own chemical energy (glucose and ATP). Glucose and ATP are used to maintain an organism’s lower entropy state compared with its environment. The thermodynamically “downhill” conversion of sunlight from low to high entropy is more than sufficient not only to turn the cog of life but also to drive the thermodynamically “uphill” evolution of complex multicellular life from relatively simpler single-celled ancestors

The Earth’s primary producers (photosynthetic plants and bacteria) make use of this low entropy, thereby reducing their own entropy. Non-photosynthetic organisms reduce their entropy by eating these primary producers either directly or indirectly and using the oxygen released by photosynthesis for cellular respiration. Therefore, photosynthetic primary producers can be viewed as a rotating cog in the machinery of life, powered by the conversion of low entropy sunlight to higher entropy infrared light (Fig. 2). This rotating cog interlocks with virtually all of Earth’s organisms and powers the machinery of life. The powering of life by converting sunlight from low to high entropy is analogous to the powering of a city from a river whose water flow rotates hydroelectric turbines to generate electricity. As long as the river provides enough water flow to turn the turbines, the city will be able to use the resulting electricity to maintain itself and stay “alive.”

However, does the Sun actually provide enough low entropy not simply to maintain life’s status quo but also to drive the “uphill” evolution of complex life? Or, using the river analogy, does the river flow provide enough hydroelectricity not simply to maintain the city but to accommodate the city’s growth and development (i.e., increased complexity in the forms of shopping malls, suburbs, water parks, etc.)? Using basic mathematics, physicist Daniel Styer (2008) has elegantly shown that the Earth is bathed in about one trillion times the amount of entropy flux required to support the evolution of complex life. Physicist Emory Bunn (2009) shows that the evolution of extant complex life is compatible with the second law of thermodynamics as long as the time required for life to evolve on Earth is at least 10⁷ s, or about 116 days. Since life has had roughly four billion years to evolve on Earth, the theory of evolution does not appear to be threatened by the second law of thermodynamics. Far from threatening evolution, as we will see, entropy actually functions as a thermodynamic driving force behind natural selection.
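Readers who want to check the orders of magnitude can do so in a few lines. The sketch below uses round textbook values of our own choosing (not the specific inputs of Styer 2008 or Bunn 2009) to estimate the rate at which the Sun–Earth system exports entropy:

```python
# Back-of-the-envelope entropy budget for the Earth. All inputs are
# rounded, commonly quoted values (illustrative assumptions, not
# figures taken from Styer 2008 or Bunn 2009).
P_ABSORBED = 1.2e17   # W: sunlight absorbed by the Earth (after albedo)
T_SUN      = 5800.0   # K: effective temperature of the solar surface
T_EARTH    = 255.0    # K: effective radiating temperature of the Earth
K_B        = 1.38e-23 # J/K: Boltzmann constant

# Energy arrives at the Sun's temperature and leaves at the Earth's,
# so the net rate of entropy production is:
dS_dt = P_ABSORBED * (1.0 / T_EARTH - 1.0 / T_SUN)
print(f"entropy export rate: {dS_dt:.1e} W/K")
print(f"   in natural units: {dS_dt / K_B:.1e} k_B per second")
# entropy export rate: 4.5e+14 W/K
#    in natural units: 3.3e+37 k_B per second
```

An entropy budget on the order of $10^{37}\,k_B$ per second, sustained over billions of years, dwarfs any plausible estimate of the entropy decrease embodied in living things; that disparity is the substance of Styer’s factor of about one trillion.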

Describing Evolution Using the Second Law of Thermodynamics

Kaila and Annila (2008) of the University of Helsinki have described biological evolution mathematically as an equation of motion in which, in the absence of an external high energy source, energy flows toward a stationary state (equilibrium), as described by the second law of thermodynamics. The physicists describe evolution as an energy transfer process, and since physical motion always takes the path of least resistance (i.e., the principle of least action), organisms can be depicted mathematically as dissipative systems that maximize the rate of entropy production in a system. As the physicists explained in an interview with Lisa Zyga (2008) of PhysOrg.com, “Nature explores many possible paths to level differences in energy densities, with one kind of energy transfer mechanism being different species within the larger system of the Earth.” Although an open system’s energy landscape is in constant flux, it always follows the most direct route (shortest path and steepest descent) to maximize the rates of energy dispersal and entropy production. Therefore, natural selection favors genetic mutations that lead to faster rates of entropy production.

In Fig. 3, we describe a simple thought experiment that illustrates this concept. Imagine three “thermally isolated systems” on a laboratory bench in the forms of three identical covered Petri dishes used for growing bacterial cultures. At time zero, there are no bacteria in the dishes, but each dish contains an identical amount of a nutrient substrate, glucose. Glucose (a product of photosynthesis) is a low entropy form of chemical energy, so the total amount of entropy in each dish at the start of the experiment is relatively low. If we leave one of the dishes undisturbed over a relatively long period of time, the glucose in that dish will very slowly degrade as it oxidizes (reacts with oxygen in the air) and is converted to heat and lower energy breakdown products, increasing the dish’s total entropy over time (red trajectory in Fig. 3).

Now, imagine that we add 100 identical bacteria to each of the two remaining dishes at time zero. These bacteria take up glucose from their environment via transmembrane glucose transport channels (see Fig. 1) and metabolize it, fueling cell division and bacterial growth; as the bacterial population increases in number, the amount of glucose in the system decreases, and the amount of metabolic waste and heat (entropy) increases with time in the first bacteria dish (blue trajectory, Fig. 3). Since the bacteria are highly organized dissipative structures that degrade glucose far more efficiently than atmospheric oxidation alone, the total amount of glucose in this dish is depleted much more rapidly than in the bacteria-free dish (red trajectory).

Lastly, imagine the second bacteria dish (green trajectory), with a starting population of 100 bacteria identical to that of the first (blue trajectory). These bacteria initially divide and consume glucose at the same rate as the blue trajectory bacteria; however, let us now assume that, at an early time point (green arrow), an individual bacterium experiences a rare beneficial mutation in the gene coding for its transmembrane glucose transport protein, enabling the progeny of this mutant bacterium to import environmental glucose at significantly faster rates than the original blue trajectory strain. The new, more efficient green strain will divide and consume glucose at an even faster rate than the blue strain, thus depleting the dish’s glucose and achieving maximum system entropy at an earlier time point. That is, natural selection favors the genetic mutation that leads to the faster rate of entropy production. Similarly, a random beneficial mutation in a muscle gene of a predator (say, a lion) that facilitates more rapid skeletal muscle contraction could allow progeny expressing the mutation to capture prey more efficiently, leading to an increased rate of net system entropy production (in this case, the conversion of zebras and wildebeests into higher states of entropy: heat and lion poop) while at the same time slightly decreasing entropy within small subsystems (namely, the population of lions). Far from contradicting biological evolution, entropy is a thermodynamic driving force that facilitates natural selection.
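A minimal numerical version of this thought experiment is sketched below. Every rate, yield, and population size is invented for illustration (none comes from the paper or from real E. coli kinetics); the only point is the ordering of the three depletion times:

```python
# Toy model of the three-dish thought experiment (Fig. 3). All
# parameters are illustrative assumptions, not measured values.
GLUCOSE0  = 1.0e9   # initial glucose per dish (arbitrary units)
YIELD     = 0.5     # new cells produced per unit of glucose consumed
OXIDATION = 0.01    # fraction of glucose lost abiotically per time step

def time_to_max_entropy(uptake_wt, uptake_mut=None, mutation_step=None):
    """Time steps until 99% of the glucose has been dissipated,
    i.e., the dish is near its maximum-entropy state."""
    glucose, wildtype, mutant = GLUCOSE0, 100.0, 0.0
    step = 0
    while glucose > 0.01 * GLUCOSE0:
        step += 1
        glucose -= glucose * OXIDATION            # slow abiotic oxidation
        eaten = min(glucose, uptake_wt * wildtype)
        glucose -= eaten
        wildtype += YIELD * eaten                 # growth tracks food intake
        if uptake_mut is not None:
            if step == mutation_step:
                mutant = 1.0                      # a single mutant appears
            eaten = min(glucose, uptake_mut * mutant)
            glucose -= eaten
            mutant += YIELD * eaten
    return step

print("red   (no bacteria):   ", time_to_max_entropy(uptake_wt=0.0))
print("blue  (stable strain): ", time_to_max_entropy(uptake_wt=1.0))
print("green (mutant arises): ", time_to_max_entropy(uptake_wt=1.0,
                                                     uptake_mut=2.0,
                                                     mutation_step=3))
```

Run as written, the bacteria-free dish takes hundreds of time steps to approach maximum entropy, the stable strain takes a few dozen, and the dish in which the faster-importing mutant arises finishes soonest of all, mirroring the red, blue, and green trajectories of Fig. 3.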

Fig. 3

A thought experiment: Natural selection of E. coli bacteria in Petri dishes favors the beneficial mutation of a glucose transport gene that results in a more efficient conversion of a thermally isolated system to a state of maximum entropy. Three dishes contain equal amounts of glucose at time 0. The glucose in the dish containing no bacteria will degrade relatively inefficiently through oxidation, achieving a state of maximum entropy over a long period of time (red line). In contrast, the glucose in the dish containing a genetically stable strain of bacteria (blue line) will be degraded much more rapidly, since bacteria are highly organized “dissipative structures” that efficiently metabolize glucose. The third dish at time 0 contains the same bacterial strain as the second dish, but soon after (time denoted by green arrow), a subpopulation experiences a rare beneficial mutation in a cell membrane glucose transport gene, rendering it a more efficient glucose transporter compared with the original strain. This more efficient glucose transporting strain (green line) will rapidly out-compete the original strain for glucose, achieving a state of maximum entropy for that dish at a time significantly earlier than the genetically stable bacterial strain (blue line)

Conclusion

The second law of thermodynamics is one of the most misunderstood aspects of physics, but it need not be. If we think of thermodynamic systems as poker hands, entropy is a measure of how well the cards are shuffled. This dispels the wrongheaded idea that the second law mandates increasing disorder because poker players playing with fair decks will sometimes get dealt very good—that is, very well-ordered—hands. What the law does say is that energy exchanges, like currency exchanges, come with a service fee, and we pay this fee in terms of increased entropy.

But the increase in entropy is to be expected only in thermally isolated systems, those in which energy is neither added nor removed. We do not live in such a system, because the Sun is constantly adding energy. We can use the Sun’s energy to overcome this increase in entropy just as an eight-year-old uses energy to increase the order in a formerly untidy room.

Living organisms, and the cells that make them up, are like machines that maintain much lower entropy than their surroundings, and, like a country, they come with borders (in this case, cell membranes) that mark off the region of decreased entropy. To survive, all living things must eat, and after eating, must poop. Just as we ingest and expel chemicals, so too we ingest free energy and flush out entropy. When we die, we stop eating (and pooping), and the increase in entropy is evident in the rotting of our corpse.

But entropy plays a role not only in the continuance of individuals and their parts but also in the evolutionary development of species and thereby in speciation. Advantageous mutations are ones that increase the efficiency of energy transfers within an ecosystem. When we calculate the amount of energy needed to push evolution thermodynamically “uphill,” it is clear that, like a powerful river, the Sun provides more than sufficient “flow” not only to turn the “hydroelectric turbines” of life’s foundation (the primary producers) but also to distribute “electricity” with incrementally increasing efficiency over time.

This story in all of its gory details involves complex aspects of physics, chemistry, and biology, but these metaphors can be employed to make the situation clear to non-technicians. It is not enough to know that the anti-evolutionists’ claim that speciation is incompatible with our best understanding of thermodynamics is flawed; we must be able to explain the errors clearly to those with no scientific background. These images ought to become part of a growing cache that scientists and philosophers develop to communicate scientific results in order to be more effective members of the wider popular conversation.

References

  • Asimov I. The ‘threat’ of creationism. In: Montagu A, editor. Science and creationism. New York: Oxford University Press; 1984. p. 182–93.


  • Bunn EF. Evolution and the second law of thermodynamics. Am J Phys. 2009;77(10):922–5.


  • Gould SJ. Full house: the spread of excellence from Plato to Darwin. New York: Three Rivers Press; 1997.


  • Kaila VRI, Annila A. Natural selection for least action. Proc R Soc A. 2008;464:3055–70.


  • Margulis L, Sagan D. What is life? New York: Simon & Schuster; 1995.


  • Morris H. Scientific creationism. El Cajon: Master Books; 1987.


  • Penrose R. The Emperor's new mind: concerning computers, minds, and the laws of physics. Oxford: Oxford University Press; 1989.

  • Prigogine I, Stengers I. Order out of chaos: man's new dialogue with nature. London: Flamingo; 1984.

  • Reif F. Fundamentals of statistical and thermal physics. New York: McGraw-Hill; 1965.


  • Schrödinger E. What is life? Mind and matter. Cambridge: Cambridge University Press; 1944.

  • Styer DF. Entropy and evolution. Am J Phys. 2008;76(11):1031–3.


  • Zyga L. Evolution as described by the second law of thermodynamics. PhysOrg.com; 2008. http://www.physorg.com/news137679868.html.


Author information


Corresponding author

Correspondence to Steven Gimbel.

Rights and permissions

Open Access This is an open access article distributed under the terms of the Creative Commons Attribution Noncommercial License ( https://creativecommons.org/licenses/by-nc/2.0 ), which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.


About this article

Cite this article

Schreiber, A., Gimbel, S. Evolution and the Second Law of Thermodynamics: Effectively Communicating to Non-technicians. Evo Edu Outreach 3, 99–106 (2010). https://doi.org/10.1007/s12052-009-0195-3

