Classical Theism

A brief introduction to classical theism. Classical theism is a systematic understanding of God shared among many Christian, Jewish, Pagan, Muslim, and Hindu thinkers throughout history. It is primarily philosophical rather than scriptural in origin, but it also opens up an intellectual space for understanding theism as a plausible and reasonable way to see reality. And so it makes for a useful point of entry into the world of scripture and religious experience.

With this episode I would like to do some systematic theology and focus on the most foundational subject of theology: God. Systematic theology is theology that pursues an orderly, rational, and coherent method. There are benefits to the systematic, orderly approach, which I want to take advantage of here. But it is admittedly not characteristic of the texts of scripture, which are often disorderly, uncanny, and occasionally contradictory. The systematic approach is a convenient way to understand and analyze theological concepts, but it’s usually not the way we actually encounter these things in religious experience. I’m reminded here of Blaise Pascal’s statement: “God of Abraham, God of Isaac, God of Jacob, not of the philosophers.” There’s much to be said for that sentiment. Nevertheless the systematic approach still has significant utility for comprehension and analysis. In talking about God in this systematic way the understanding of God I will take is that of classical theism.

In what follows I just want to lay out what classical theism is. I won’t get too much into arguments or proofs for God or for classical theism. That’s another topic. But I hope that just presenting what classical theism is will show it to be a very plausible and reasonable thing to believe. Even before taking any steps to argue for it or prove it.

First some definitions. Theism is the belief in the existence of God or gods. Monotheism is the belief that there is only one God. Classical theism is the belief that God is the source of all things. In more technical terms classical theism is the belief that God is metaphysically absolute. Classical theism is a form of monotheism but it’s more theoretically developed. It takes the belief that there is only one God and analyzes what that means, the way in which there is only one God, what this one God must be like. This is what makes it systematic, theological, and philosophical.

What does it mean for God to be metaphysically absolute, the source of all things? There are two major ways for there to be only one God. They are quite different and imply very different things about God’s nature. One way is for there to be a pre-existing reality in which God exists, a reality that is independent of God and prior to God. There’s a universe that happens to have a God in it and there’s only one God. The other way, the way of classical theism, is for God to be prior to everything. There is nothing without God. All reality depends on God for its existence. We could think of these loosely as God being inside all reality versus God being outside or beyond all reality.

In classical theism all of reality derives from God and depends on God. It’s even possible for God to be the only thing that exists. But it’s actually not possible for God not to exist. This is to say that God is absolutely necessary. Nothing else is necessary in this way. Everything else is contingent. It is possible for everything else not to exist. But it is not possible for God not to exist.

Classical theism tends to be philosophical, trans-religious, and trans-scriptural, meaning that it spans many religions and the texts of many religious traditions. Throughout history classical theists have been Christian, Pagan, Jewish, Muslim, and Hindu. Obviously classical theists in each of these traditions disagree on a lot. But they tend to agree in their classical theism and in their understanding of God’s primary attributes, even if they disagree on the specific things they believe God to have done in human history. Pagan classical theists include Plotinus and Proclus. Jewish classical theists include Philo of Alexandria and Maimonides. Christian classical theists include Augustine, Pseudo-Dionysius, Anselm, and Thomas Aquinas. Muslim classical theists include Ibn Sina and Ibn Rushd. I also think that many of the ideas of Hindu thinkers like Shankara and Ramanuja have much in common with classical theism.

What’s interesting about classical theism is that it basically starts from the premise of God’s metaphysically absolute nature and derives God’s attributes from there. These attributes often coincide with scripture, albeit not always perfectly, which is an important theological issue. But that’s also a topic for another time. The attributes of God in classical theism include the following:

Aseity
Necessity
Simplicity
Eternity
Immutability
Immateriality
Omnipotence
Omniscience
Perfect Goodness

Aseity is not a well-known term but it’s very important to the topic. The word comes from the Latin “a se”, meaning “from self”. Aseity is the property by which a being exists of and from itself, and not from anything else. God’s aseity means that God does not depend on anything else for his existence; not on the universe, not on anything at all.

Necessity is when something cannot fail to be the case. For example, logical truths are generally considered to be necessarily true. An example would be the proposition “If p and q, then p”. This would seem to be necessarily true. It couldn’t be otherwise. Philosophers might still debate that but it should at least be clear what we’re talking about with necessity. God’s necessity means that God cannot not exist. Understanding why that is and arguing for it is a bigger topic. But understanding the claim that God is necessary is key to understanding what classical theism is.
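As an aside, this sort of necessity is something proof assistants can check mechanically. Here is a minimal illustration in Lean 4 of the tautology above; p and q are placeholder propositions, and nothing about their content matters to the proof:

```lean
-- "If p and q, then p": a tautology, true no matter what p and q are.
-- The proof simply extracts the left half of the conjunction.
example (p q : Prop) : p ∧ q → p :=
  fun h => h.left
```

That indifference to the content of p and q is precisely what makes the truth necessary rather than contingent.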

Simplicity means not having any parts. According to classical theism God is simple in this way. God is not composed of parts. Put another way, God is not composite. Composite is the opposite of simple. Many philosophers consider divine simplicity to be the most important concept of classical theism and hold that all of classical theism derives from it and is ultimately equivalent to it. To understand some of the motivation behind this, consider that anything composite, made up of parts, has to be put together in the way that it is. That makes it dependent on whatever puts it together. So it couldn’t be the first thing, the source of all things.

Eternity refers to what exists outside of time. Eternity, as understood in classical philosophy, is different from how the word is commonly understood. There is the notion of things being everlasting, existing within time but lasting forever, for an infinite duration. But this is different from the kind of eternity in classical theism. God’s eternity is his existence outside of time itself. Time, in fact, would be one of the things created by God. We can imagine God looking at the passage of time as we look at the passage of time for characters in a book. For the characters in a story, if they were real, they would experience time sequentially. But for us as readers we can look at the story as a whole, all at once, because we are outside of the time of that story. Like the characters in that story, we experience our time sequentially. The past is behind us. The future is ahead of us. Only the present is before us. But for God it is all present and equally before him.

Immutability is the impossibility of changing. There’s definitely a relation here to eternity. God could hardly change across time since he exists outside of time itself. This brings up an interesting question about whether God, being immutable, will seem the same to us at all times. Not necessarily. Even if God doesn’t change, we do. For example, God is perfectly good and that doesn’t change. But our morality varies significantly, and the way we perceive God will vary accordingly, depending on whether our conduct is mostly moral or mostly immoral.

Immateriality, as the term suggests, is the quality of not being material. Even without a technical definition I think we all have a good intuition of what materiality is. In fact, it’s more difficult to think of anything that isn’t material. It’s the material that makes up our immediate experience. Matter is the stuff that kicks back when you kick it. Material things exist in time and space. If we refer to more modern chemistry and physics, matter is composed of particles, waves, and fields. Particles like protons, neutrons, and electrons have mass; particles like photons do not. But they’re all material. Material things interact with each other. They exchange momentum; they attract or repel each other through electric charge. Photons induce chemical reactions. But God, being immaterial, is not like any of these things.

How could any thing be immaterial? This was a question that Augustine had. He was finally able to conceptualize immaterial entities by way of Platonist and Neo-Platonist philosophy, which have a lot to say about immaterial forms. Today we most commonly come across immateriality in the form of abstract, mathematical, and logical objects. The philosopher Phillip Cary uses the example of the Pythagorean theorem. The Pythagorean theorem is not something that exists in space and time. It’s eternal, necessary, and omnipresent. It didn’t ever start being true and it will never stop being true. It cannot not be true. And it’s true everywhere. It’s not made up of particles, waves, or fields. It’s not something you handle or that kicks back. That gives an idea of what an immaterial thing can be like.

God is not an abstract, mathematical, or logical object. But he is immaterial in classical theism. He’s more like an abstract, mathematical, or logical object than he is like an electron, proton, or magnetic field.

Omnipotence is the quality of having unlimited power. This is closely related to God’s nature as metaphysically absolute, the source of all things. All things come from God and are the way they are because of God. There is no other source for all that is and no other power in serious competition with God. God is able to do anything that it is possible to do. What kind of constraints does that condition impose? What would be impossible for God? Contradiction, certainly. Even God cannot make something be the case and not be the case. You’ve probably heard the question, often asked in jest, “Could God make a stone so heavy that he couldn’t lift it?” Well, no. That would be a contradiction. Other constraints imposed by consistency may be more subtle. Why, for example, does God permit human history to proceed in certain ways, especially in ways we would much prefer it didn’t? Here again, self-consistency probably plays an important role. Human free will is an important constraint. And there are likely other, unknown constraints, resulting from God’s unrevealed purposes.

Omniscience is the quality of knowing everything. This too is closely related to being metaphysically absolute, the source of all things. As the cause of all things God also has knowledge of all things. If we imagine all things that can be known as a book, God knows all things in that book, not only because he has read it, but also because he wrote it. He is the author of all that is. Many of the foregoing points about omnipotence apply here as well. There’s a classic concern about the conflict between divine omniscience and human free will. If God knows everything, including everything that we will ever do, can we really be said to freely choose to do those things? That’s a complicated problem and a whole topic in itself. Without actually resolving that question I’ll just make an observation using the analogy of the author. There is a sense in which the author of a story is constrained by the story itself. Authors can arbitrarily impose nonsensical decisions on their characters. But good authors don’t. Good authors follow their stories where they naturally lead. Their characters, even though they’re fictional, have a kind of free will of their own. That’s just an analogy but I think something similar applies to God’s authorship of all things and his knowledge of them. On the one hand he is the author and cause of all things. But this authorship and resulting knowledge is not just arbitrary. The evolution of all things, especially of human history, makes sense and has a narrative coherence to it.

Finally, God is perfectly good. In Plato’s Republic, Socrates actually placed “the form of the Good” at the highest point on his spectrum of entities, the Divided Line. Goodness is not incidental to God’s nature but is absolutely intrinsic to who he is. One of the oldest problems in moral philosophy is whether God decrees what is good because it is good or whether it is good because he decrees it. This is a form of the Euthyphro Dilemma, based on another of Plato’s dialogues. Put another way, the question is whether God is prior to goodness or goodness prior to God. But in classical theism this is a false dilemma. God and the Good are not distinct at all. God is the Good.

Apart from classical theism, the great worry with the Euthyphro Dilemma is that if goodness is merely whatever God decrees it to be, then God could decree horrendous evils to be good. And they would have to be good. But under classical theism this is not possible. God is the Good. Neither God nor the Good is arbitrary. Horrendous evils cannot be made good, and God cannot and will not decree them so. To do so would be to contradict his own nature.

All of the foregoing is principally philosophical rather than scriptural or based on revelatory religious experience. Though it has been most developed by Christians, the foundations come largely from Platonist and Neo-Platonist philosophy, for example from Plotinus’s Enneads and Proclus’s Elements of Theology. Whether that is a weakness or a strength is a matter of perspective. I think it’s a strength, but it also means that for Christian theology classical theism is a starting point rather than an end point. And I consider it a great strength that classical theism spans so many traditions and schools of thought.

One of the best modern books on classical theism is David Bentley Hart’s The Experience of God. In that book he makes the following point:

“Certainly the definition of God I offer below is one that, allowing for a number of largely accidental variations, can be found in Judaism, Christianity, Islam, Vedantic and Bhaktic Hinduism, Sikhism, various late antique paganisms, and so forth (it even applies in many respects to various Mahayana formulations of, say, the Buddha Consciousness or the Buddha Nature, or even to the earliest Buddhist conception of the Unconditioned, or to certain aspects of the Tao…)” (p. 4)

I find the Hindu convergences especially fascinating. Shankara (circa 700 – 750) was an interpreter of Vedantic Hinduism, Advaita Vedanta to be specific. A central concept in that tradition is Brahman, the highest universal principle, the ultimate reality, the cause of all that exists. In Advaita Vedanta, Brahman is identical in substance to Atman, the Self or self-existent essence of individuals. Ramanuja (1017 – 1137) had a different interpretation called “qualified non-dualism”, which makes a greater distinction between Atman and Brahman. But Brahman, the ultimate reality behind all that exists, is central to the thought of both.

There are four modern authors on classical theism that I really like. These are David Bentley Hart, Edward Feser, James Dolezal, and Matthew Barrett.

I already mentioned David Bentley Hart’s book The Experience of God: Being, Consciousness, Bliss. Hart is an Orthodox Christian and also has an interesting affinity for Hinduism. In fact, the subtitle to his book – “Being, Consciousness, Bliss” – is a nod to the Hindu concept of Satcitananda, a Sanskrit term for the subjective experience of Brahman, the ultimate unchanging reality. Satcitananda is a compound word consisting of “sat”, “chit”, and “ananda”: being, consciousness, and bliss. These three are considered inseparable from Brahman.

Edward Feser’s book Five Proofs of the Existence of God goes through five proofs that he reworks from the ideas of five individuals: Aristotle, Plotinus, Augustine, Aquinas, and Leibniz. Each of the five proofs is classically theistic in nature. Later chapters in the book also go over the classical theist understanding of God’s nature in great detail.

James Dolezal’s major book on this subject is All That Is in God: Evangelical Theology and the Challenge of Classical Christian Theism. Dolezal pushes back on what he perceives as some drift away from classical theism in Evangelical theology. I mentioned earlier that some theologians place simplicity foremost among God’s attributes. Dolezal is one of these. Simplicity is central to his thought.

Matthew Barrett is a delightful theologian to read. He is editor of Credo Magazine and host of the Credo podcast. One of his common themes on Twitter is the need for Protestants and especially Evangelicals to take seriously the thought of Aquinas, the Church Fathers, and classical theism. His major book on the subject is None Greater: The Undomesticated Attributes of God.

Why talk about classical theism? To lay all my cards on the table, I desire for all to believe in God the Father, his Son Jesus Christ, and in the Holy Spirit. I am enthusiastically Christian and desire for all to be so as well, because I believe it is true. One of the first steps in this direction is belief in God. But in modernity belief in God is hardly a given. It might even seem implausible. How is believing in God any different from believing in Santa Claus, the Tooth Fairy, or the Flying Spaghetti Monster? Well, it’s actually extremely different. And I think that to really understand classical theism is to understand this difference.

God is not just an invisible being that we have to believe in, just because. Blind faith. Classical theism is much more philosophically reflective than that. To think about God is to think about and have some interest and curiosity about everything that exists, why it exists, and why it is as it is. It is maximally inquisitive and critically so. I believe that classical theism is very plausible and reasonable. That’s not actually why I believe in God or in Christianity. I attribute my belief to revelation from the Spirit. But intellectual openness and receptivity preceded that Spiritual revelation. Seeing classical theism to be a plausible and reasonable way to understand reality broke down intellectual and cultural barriers to spiritual receptivity. And that’s why I think it’s a topic worth talking about.

Star Trek: Rapture

Rick and Todd discuss the Star Trek: Deep Space Nine episode “Rapture” in which Captain Sisko, Emissary to the Bajoran Prophets, receives a series of dramatic visions. We discuss the interpretive frameworks of spiritual and secular worldviews, the high costs of prophecy, the reliability or trustworthiness of powerful entities, the interaction of spiritual experiences and the brain, and the importance of the visions in the Deep Space Nine narrative arc.

Evolutionary Biology With Molecular Precision

Evolutionary biology benefits from a non-reductionist focus on real biological systems at the macroscopic level of their natural and historical contexts. This high-level approach makes sense since selection pressures operate at the level of phenotypes, the observed physical traits of organisms. Still, it is understood that these traits are inherited in the form of molecular gene sequences, the purview of molecular biology. The approach of molecular biology is more reductionist, focusing at the level of precise molecular structures. Molecular biology thereby benefits from a rigorous standard of evidence-based inference by isolating variables in controlled experiments. But it necessarily sets aside much of the complexity of nature. A combination of these two, in the form of evolutionary biochemistry, targets a functional synthesis of evolutionary biology and molecular biology, using techniques such as ancestral protein reconstruction to physically ‘resurrect’ ancestral proteins with precise molecular structures and to observe their resulting expressed traits experimentally.

I love nerdy comics like XKCD and Saturday Morning Breakfast Cereal (SMBC). For the subject of this episode I think there’s a very appropriate XKCD comic. It shows the conclusion of a research paper that says, “We believe this resolves all remaining questions on this topic. No further research is needed.” And the caption below it says, “Just once, I want to see a research paper with the guts to end this way.” And of course, the joke is that no research paper is going to end this way because further research is always needed. I’m sure this is true in all areas of science but I think in two particular fields it’s especially true. One is neuroscience, where there is still so much that we don’t know. And the other is evolutionary biology. The more I dig into evolutionary biology the more I appreciate how much we don’t understand. And that’s OK. The still expansive frontiers in each of these fields are what make them especially interesting to me. Far from being discouraging, unanswered questions and prodding challenges should be exciting. With this episode I’d like to look at evolutionary biology at its most basic, nuts-and-bolts level: the level of chemistry. This combines the somewhat different approaches of both evolutionary biology and molecular biology.

Evolutionary biology benefits from a non-reductionist focus on real biological systems at the macroscopic level of their natural and historical contexts. This high-level approach makes sense since selection pressures operate at the level of phenotypes, the observed physical traits of organisms. Still, it is understood that these traits are inherited in the form of molecular gene sequences, the purview of molecular biology. The approach of molecular biology is more reductionist, focusing at the level of precise molecular structures. Molecular biology thereby benefits from a rigorous standard of evidence-based inference by isolating variables in controlled experiments. But it necessarily sets aside much of the complexity of nature. A combination of these two, in the form of evolutionary biochemistry, targets a functional synthesis of evolutionary biology and molecular biology, using techniques such as ancestral protein reconstruction to physically ‘resurrect’ ancestral proteins with precise molecular structures and to observe their resulting expressed traits experimentally. This enables evolutionary science to be more empirical and experimentally grounded.

In what follows I’d like to focus on the work of biologist Joseph Thornton, who is especially known for his lab’s work on ancestral sequence reconstruction. One review paper of his that I’d especially recommend is his 2007 paper, Mechanistic approaches to the study of evolution: the functional synthesis, published in Nature Reviews Genetics and co-authored with Antony Dean.

Before getting to Thornton’s work I should mention that Thornton has been discussed by biochemist Michael Behe, in particular in his fairly recent 2019 book Darwin Devolves: The New Science About DNA That Challenges Evolution. Behe discusses Thornton’s work in the eighth chapter of that book. I won’t delve into the details of the debate between the two of them, simply because that’s its own topic and not what directly interests me here. But I’d just like to comment that I personally find Behe’s work quite instrumentally useful to evolutionary science. He’s perceived as something of a nemesis to evolutionary biology but I think he makes a lot of good points. I could certainly be wrong about this, but I suspect that many of the experiments I’ll be going over in this episode were designed and conducted in response to Behe’s challenges to evolutionary biology. Maybe these kinds of experiments wouldn’t have been done otherwise. And if that’s the case Behe has done a great service.

Behe’s major idea is “irreducible complexity”. An irreducibly complex system is “a single system which is composed of several well-matched, interacting parts that contribute to the basic function, and where the removal of any one of the parts causes the system to effectively cease functioning.” (Darwin’s Black Box: The Biochemical Challenge to Evolution) How would such a system evolve by successive small modifications if no less complex a system would function? That’s an interesting question. And I think that experiments designed to answer that question are quite useful.

Behe and I are both Christians and we both believe that God created all things. But we have some theological and philosophical differences. My understanding of the natural and supernatural is heavily influenced by the thought of Thomas Aquinas, such that in my understanding nature is actually sustained and directed by continual divine action. I believe nature, as divine creation, is rationally ordered and intelligible, since it is a product of divine Mind. As such, I expect that we should, at least in principle, be able to understand and see the rational structure inherent in nature. And this includes the rational structure and process of the evolution of life. Our understanding of it may be minuscule. But I think it is comprehensible at least in principle. Especially since it is comprehensible to God. So I’m not worried about a shrinking space for some “god of the gaps”. Still, I think it’s useful for someone to ask probing questions at the edge of our scientific understanding, to poke at our partial explanations and ask, “how exactly?” But, perhaps different from Behe, I expect that we’ll continually be able to answer such questions better and better, even if there will always be a frontier of open questions and problems.

With complete admission that what I’m about to say is unfair, I do think that some popular understanding of evolution lacks a certain degree of rigor and doesn’t adequately account for the physical constraints of biochemistry. Evolution can’t just proceed in any direction to develop any trait to fill any adaptive need, even if there is a selection pressure for a trait that would be nice to have. OK, well that’s why it’s popular rather than academic, right? Like I said, not really fair. Still, let’s aim for rigor, shall we? Behe gets at this issue in his best-known 1996 book Darwin’s Black Box: The Biochemical Challenge to Evolution. In one passage he comments on what he calls the “fertile imaginations” of evolutionary biologists:

“Given a starting point, they almost always can spin a story to get to any biological structure you wish. The talent can be valuable, but it is a two edged sword. Although they might think of possible evolutionary routes other people overlook, they also tend to ignore details and roadblocks that would trip up their scenarios. Science, however, cannot ultimately ignore relevant details, and at the molecular level all the ‘details’ become critical. If a molecular nut or bolt is missing, then the whole system can crash. Because the cilium is irreducibly complex, no direct, gradual route leads to its production. So an evolutionary story for the cilium must envision a circuitous route, perhaps adapting parts that were originally used for other purposes… Intriguing as this scenario may sound, though, critical details are overlooked. The question we must ask of this indirect scenario is one for which many evolutionary biologists have little patience: but how exactly?”

“How exactly?” I actually think that’s a great question. And I’d say Joseph Thornton has made the same point to his fellow biologists, maybe even in response to Behe. In the conclusion of their 2007 paper he and Antony Dean had this wonderful passage:

“Functional tests should become routine in studies of molecular evolution. Statistical inferences from sequence data will remain important, but they should be treated as a starting point, not the centrepiece or end of analysis as in the old paradigm. In our opinion, it is now incumbent on evolutionary biologists to experimentally test their statistically generated hypotheses before making strong claims about selection or other evolutionary forces. With the advent of new capacities, the standards of evidence in the field must change accordingly. To meet this standard, evolutionary biologists will need to be trained in molecular biology and be prepared to establish relevant collaborations across disciplines.”

Preach it! That’s good stuff. One of the things I like about the conclusion to their paper is that it talks about all the work that still needs to be done. It’s a call to action (reform?) to the field of evolutionary biology. 

Behe has correctly pointed out that their research doesn’t yet answer many important questions and doesn’t reduce the “irreducible complexity”. True, but it’s moving in the right direction. No one is going to publish a research paper like the one in the XKCD comic that says, “We believe this resolves all remaining questions on this topic. No further research is needed.” Nature and evolution are extremely complex. And I think it’s great that Thornton and his colleagues call for further innovations. For example, I really like this one:

“A key challenge for the functional synthesis is to thoroughly connect changes in molecular function to organismal phenotype and fitness. Ideally, results obtained in vitro should be verified in vivo. Transgenic evolutionary studies identifying the functional impact of historical mutations have been conducted in microbes and a few model plant and animal species, but an expanded repertoire of models will be required to reach this goal for other taxa. By integrating the functional synthesis with advances in developmental genetics and neurobiology, this approach has the potential to yield important insights into the evolution of development, behaviour and physiology. Experimental studies of natural selection in the laboratory can also be enriched by functional approaches to characterize the specific genetic changes that underlie the evolution of adaptive phenotypes.”

For sure. That’s exactly the kind of work that needs to be done. And it’s the kind of work Behe has challenged evolutionary biologists to do. I think that’s great. Granted, that kind of work is going to be very difficult and take a long time. But that’s a good target. And we should acknowledge the progress that has been made. For example, earlier in the paper they note:

“The Reverend William Paley famously argued that, just as the intricate complexity of a watch implies a design by a watchmaker, so complexity in Nature implies design by God. Evolutionary biologists have typically responded to this challenge by sketching scenarios by which complex biological systems might have evolved through a series of functional intermediates. Thornton and co-workers have gone much further: they have pried open the historical and molecular ‘black box’ to reconstruct in detail — and with strong empirical support — the history by which a tightly integrated system evolved at the levels of sequence, structure and function.”

Yes. That’s a big improvement. It’s one thing to speculate, “Well, you know, maybe this, that, and the other” (again, being somewhat unfair, sorry). But it’s another thing to actually reconstruct ancestral sequences and run experiments with them. That’s moving things to a new level. And I’ll just mention in passing that I do in fact think that all the complexity in Nature was designed by God. And I don’t think that reconstructing that process scientifically does anything to reduce the grandeur of that. If anything, such scientific understanding facilitates what Carl Sagan once called “informed worship” (The Varieties of Scientific Experience: A Personal View of the Search for God). 

With all that out of the way now, let’s focus on Thornton’s very interesting work in evolutionary biochemistry.

First, a very quick primer on molecular biology. The basic process of molecular biology is that DNA makes RNA, and RNA makes proteins. Living organisms are made of proteins. DNA is the molecule that contains the information needed to make the proteins. And RNA is the molecule that takes the information from DNA to actually make the proteins. The process of making RNA from DNA is called transcription. And the process of making proteins from RNA is called translation. These are very complex and fascinating processes. Evolution proceeds through changes to the DNA molecule called mutations. And some changes to DNA result in changes to the composition and structure of proteins. These changes can have macroscopically observable effects.
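As a toy illustration of that flow, here is a minimal sketch in Python. The codon table is deliberately truncated to the handful of entries the example needs (the real genetic code has 64 codons), so this is a cartoon of the process, not a bioinformatics tool:

```python
# A cartoon of the central dogma: DNA -> RNA (transcription) -> protein
# (translation). Toy codon table: a small subset of the real 64-codon code.
CODON_TABLE = {
    "AUG": "M",   # start codon (methionine)
    "UCU": "S",   # serine
    "CCU": "P",   # proline
    "CUU": "L",   # leucine
    "CAA": "Q",   # glutamine
    "UAA": None,  # stop codon
}

def transcribe(dna: str) -> str:
    """Transcription: DNA coding strand -> messenger RNA (T becomes U)."""
    return dna.replace("T", "U")

def translate(rna: str) -> str:
    """Translation: read the mRNA three bases at a time into amino acids."""
    protein = []
    for i in range(0, len(rna) - 2, 3):
        amino_acid = CODON_TABLE[rna[i:i + 3]]
        if amino_acid is None:   # stop codon: release the finished protein
            break
        protein.append(amino_acid)
    return "".join(protein)

gene = "ATGTCTCCTCTTCAATAA"            # toy DNA sequence
print(translate(transcribe(gene)))     # -> MSPLQ
```

A single-base mutation in the DNA, say the second codon TCT becoming CCT, would turn that serine into a proline in the finished protein. That is the kind of molecular change, with potentially macroscopic consequences, that the rest of this discussion is about.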

In Thornton’s work with ancestral sequence reconstruction the idea is to look at a protein as it is in an existing organism, try to figure out what that protein might have been like in an earlier stage of evolution, and then to make it. Reconstruct it. By actually making the protein you can look at its properties. As described in the 2007 review:

“Molecular biology provides experimental means to test these hypotheses decisively. Gene synthesis allows ancestral sequences, which can be inferred using phylogenetic methods, to be physically ‘resurrected’, expressed and functionally characterized. Using directed mutagenesis, historical mutations of putative importance are introduced into extant or ancestral sequences. The effects of these mutations are then assessed, singly and in combination, using functional molecular assays. Crystallographic studies of engineered proteins — resurrected and/or mutagenized — allow determination of the structural mechanisms by which amino-acid replacements produce functional shifts. Transgenic techniques permit the effect of specific mutations on whole-organism phenotypes to be studied experimentally. Finally, competition between genetically engineered organisms in defined environments allows the fitness effects of specific mutations to be assessed and hypotheses about the role of natural selection in molecular evolution to be decisively tested.”

What’s great about this kind of technique is that it spans a number of levels of ontology. Evolution by natural selection acts on whole-organism phenotypes. So it’s critical to understand what these look like between all the different versions of a protein. We don’t just want to know that we can make all these different kinds of proteins. We want to know what they do, how they function. Function is a higher-level ontology. But we also want to be precise about what is there physically. And we have that as well, down to the molecular level. Atom for atom we know exactly what these proteins are.
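To give a flavor of the inference step, here is a minimal sketch of one classic way to infer ancestral states on a known tree: Fitch’s maximum-parsimony algorithm, applied one alignment site at a time. To be clear, Thornton’s lab uses more sophisticated maximum-likelihood and Bayesian methods, and the tree and amino acid states below are invented toy data; the sketch just shows the basic logic of working from observed tips back to ancestors:

```python
# Fitch parsimony for one alignment site on a toy tree. Each internal node
# gets the set of states compatible with the fewest substitutions below it.

def fitch(tree, site_states):
    """Return the set of most-parsimonious states at the root of `tree`."""
    if isinstance(tree, str):                 # leaf: the state was observed
        return {site_states[tree]}
    left, right = (fitch(child, site_states) for child in tree)
    # If the children's sets overlap, keep the intersection; otherwise take
    # the union, which implies a substitution somewhere below this node.
    return (left & right) or (left | right)

# Toy tree and observed amino acids at one site in four living species.
tree = ((("human", "mouse"), "lizard"), "fish")
site = {"human": "P", "mouse": "P", "lizard": "S", "fish": "S"}
print(fitch(tree, site))   # {'S'}: serine is inferred as the ancestral state
```

The toy result, an ancestral serine later replaced by a proline, deliberately echoes the serine-to-proline substitution (S106P) that will come up below.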

To dig deeper into these experimental methods I’d like to refer to another paper, Evolutionary biochemistry: revealing the historical and physical causes of protein properties, published in Nature Reviews Genetics in 2013 by Michael Harms and Joseph Thornton. In this paper the authors lay out three strategies for studying the evolutionary trajectories of proteins.

The first strategy is to explicitly reconstruct “the historical trajectory that a protein or group of proteins took during evolution.”

“For proteins that evolved new functions or properties very recently, population genetic analyses can identify which genotypes and phenotypes are ancestral and which are derived. For more ancient divergences, ancestral protein reconstruction (APR) uses phylogenetic techniques to reconstruct statistical approximations of ancestral proteins computationally, which are then physically synthesized and experimentally studied… Genes that encode the inferred ancestral sequences can then be synthesized and expressed in cultured cells; this approach allows for the structure, function and biophysical properties of each ‘resurrected’ protein to be experimentally characterized… By characterizing ancestral proteins at multiple nodes on a phylogeny, the evolutionary interval during which major shifts in those properties occurred can be identified. Sequence substitutions that occurred during that interval can then be introduced singly and in combination into ancestral backgrounds, allowing the effects of historical mutations on protein structure, function and physical properties to be determined directly.”

This first strategy is a kind of top-down, highly directed approach. We’re trying to follow exactly the path that evolution followed and only that path to see what it looks like.

The second strategy is more bottom-up. It is “to use directed evolution to drive a functional transition of interest in the laboratory and then study the mechanisms of evolution.” The goal is not primarily to follow the exact same path that evolution followed historically but rather to stimulate evolution, selecting for a target property, to see what path it follows. 

“A library of random variants of a protein of interest is generated and then screened to recover those with a desired property. Selected variants are iteratively re-mutagenized and are subject to selection to optimize the property. Causal mutations and their mechanisms can then be identified by characterizing the sequences and functions of the intermediate states realized during evolution of the protein.”
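In code, that mutate-screen-select loop might look like the following minimal sketch. Everything here is invented for illustration: the fitness function stands in for a laboratory assay, and real experiments screen physical protein libraries rather than strings:

```python
# A toy directed-evolution loop: mutagenize, screen, keep the best, repeat.
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
TARGET = "PQLS"   # invented optimum; fitness counts matches to it

def fitness(seq: str) -> int:
    """Invented stand-in for a functional assay of the desired property."""
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq: str) -> str:
    """Random point mutation at one position."""
    i = random.randrange(len(seq))
    return seq[:i] + random.choice(AMINO_ACIDS) + seq[i + 1:]

random.seed(0)
protein = "SKLQ"                      # starting sequence
for generation in range(50):
    library = [mutate(protein) for _ in range(100)]   # random variant library
    best = max(library, key=fitness)                  # the "screen"
    if fitness(best) > fitness(protein):              # selection step
        protein = best
print(protein, fitness(protein))      # climbs toward the selected property
```

The interesting scientific output is not the final sequence but the series of intermediate states the loop passes through, which is exactly what the quoted passage says gets characterized.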

If the first strategy is top-down and the second strategy is bottom-up, the third strategy is to cast a wide net. “Rather than reconstructing what evolution did in the past, this strategy aims to reveal what it could do.” In this approach:

“An initial protein is subjected to random mutagenesis, and weak selection for a property of interest is applied, enriching the library for clones with the property and depleting those without it. The population is then sequenced; the degree of enrichment of each clone allows the direct and epistatic effects of each mutation on the function to be quantitatively characterized.”
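The enrichment measurement in that passage is essentially a before-and-after frequency comparison for every variant in the library. A minimal sketch, with invented read counts standing in for real sequencing data:

```python
# Toy enrichment scores from invented sequencing counts, before and after
# one round of weak selection for a property of interest.
import math

counts_before = {"wild_type": 5000, "variant_A": 500, "variant_B": 500}
counts_after  = {"wild_type": 5200, "variant_A": 900, "variant_B": 40}

total_before = sum(counts_before.values())
total_after = sum(counts_after.values())

for variant in counts_before:
    freq_before = counts_before[variant] / total_before
    freq_after = counts_after[variant] / total_after
    # log2 enrichment: positive means the variant was favored by selection,
    # negative means it was depleted from the library.
    score = math.log2(freq_after / freq_before)
    print(f"{variant}: {score:+.2f}")
```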

Let’s look at an example from Thornton’s work, which followed the first, top-down approach. The most prominent work so far has been on the evolution of glucocorticoid receptors (GRs) and mineralocorticoid receptors (MRs). See for example the 2006 paper Evolution of Hormone-Receptor Complexity by Molecular Exploitation, published in Science by Jamie Bridgham, Sean Carroll, and Joseph Thornton.

Glucocorticoid receptors and mineralocorticoid receptors bind with glucocorticoid and mineralocorticoid steroid hormones. The two steroid hormones studied in Thornton’s work are cortisol and aldosterone. Cortisol activates the glucocorticoid receptor to regulate metabolism, inflammation, and immunity. Aldosterone activates the mineralocorticoid receptor to regulate electrolyte homeostasis of plasma sodium and potassium levels. Glucocorticoid receptors and mineralocorticoid receptors share a common origin, and Thornton’s work was to reconstruct ancestral versions of these proteins along their evolutionary path and test their properties experimentally.

Modern mineralocorticoid receptors can be activated by both aldosterone and cortisol but modern glucocorticoid receptors are activated only by cortisol in bony vertebrates. So in their evolution GRs developed an insensitivity to aldosterone.

The evolutionary trajectory is as follows. There are versions of MR and GR extant in tetrapods, teleosts (fish), and elasmobranchs (sharks). GRs and MRs trace back to a common protein from 450 million years ago, the ancestral corticoid receptor (AncCR). The ancestral corticoid receptor is thought to have been activated by deoxycorticosterone (DOC), the ligand for MRs in extant fish.

Phylogeny tells us that the ancestral corticoid receptor gave rise to GR and MR in a gene-duplication event. Interestingly enough this was before aldosterone had even evolved. In tetrapods and teleosts, modern GR is only sensitive to cortisol; it is insensitive to aldosterone.

Thornton and his team reconstructed the ancestral corticoid receptor (AncCR) and found that it is sensitive to DOC, cortisol, and aldosterone. Phylogenetic analysis revealed that precisely two mutations, amino acid substitutions, resulted in the glucocorticoid receptor phenotype: aldosterone insensitivity and cortisol sensitivity. These amino acid substitutions are S106P, from serine to proline at site 106, and L111Q, from leucine to glutamine at site 111. Thornton synthesized these different proteins to observe their properties. The protein with just the L111Q mutation did not bind to any of the ligands: DOC, cortisol, or aldosterone. So it is unlikely that the L111Q mutation would have occurred first. The S106P mutation reduces aldosterone and cortisol sensitivity but the protein remains highly DOC-sensitive. With both the S106P and L111Q mutations in combination, aldosterone sensitivity is reduced even further but cortisol sensitivity is restored to levels characteristic of extant GRs. A mutational path beginning with S106P followed by L111Q thus converts the ancestor to the modern GR phenotype by functional intermediate steps and is the most likely evolutionary scenario.
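The logic of that last inference is simple enough to put in a few lines. Here is a minimal sketch that encodes the reported phenotypes as sets of ligands each reconstructed receptor responds to (a crude simplification of the actual dose-response data) and checks which order of the two substitutions keeps every intermediate functional:

```python
# Which order of the two historical substitutions keeps every intermediate
# receptor functional? Phenotypes crudely encoded from the results above.
from itertools import permutations

responds = {
    "AncCR":       {"DOC", "cortisol", "aldosterone"},  # ancestral receptor
    "S106P":       {"DOC"},   # cortisol/aldosterone response greatly reduced
    "L111Q":       set(),     # binds none of the three ligands
    "S106P+L111Q": {"cortisol"},   # the modern GR-like phenotype
}

for first, _second in permutations(["S106P", "L111Q"]):
    path = ["AncCR", first, "S106P+L111Q"]
    # A path is plausible only if every step responds to at least one ligand.
    viable = all(responds[step] for step in path)
    print(" -> ".join(path), "| viable:", viable)
```

Only the S106P-first order passes, which is exactly the paper’s conclusion about the most likely evolutionary scenario.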

Michael Behe has commented that this is an example of a loss of function whereas his challenge to evolutionary biology is to demonstrate how complex structures evolved in the first place. That’s a fair point. Still, this is a good example of the kind of molecular precision we can get in our reconstruction of evolutionary processes. This does seem to show, down to the molecular level, how these receptors evolved. And that increases our knowledge. We know more about the evolution of these proteins than we did before. That’s valuable. We can learn a lot more in the future using these methods and applying them to other examples. 

One of the things I like about this kind of research is that it not only shows what evolutionary paths are possible but also which ones are not. Another one of Thornton’s papers worth checking out is An epistatic ratchet constrains the direction of glucocorticoid receptor evolution, published in Nature in 2009, co-authored by Jamie Bridgham and Eric Ortlund. The basic idea is that in certain cases once a protein acquires a new function “the evolutionary path by which this protein acquired its new function soon became inaccessible to reverse exploration”. In other words, certain evolutionary processes are not reversible. This is similar to Dollo’s Law of Irreversibility, proposed in 1893: “an organism never returns exactly to a former state, even if it finds itself placed in conditions of existence identical to those in which it has previously lived … it always keeps some trace of the intermediate stages through which it has passed.” In that 2009 paper Bridgham, Ortlund, and Thornton state: “We predict that future investigations, like ours, will support a molecular version of Dollo’s law: as evolution proceeds, shifts in protein structure-function relations become increasingly difficult to reverse whenever those shifts have complex architectures, such as requiring conformational changes or epistatically interacting substitutions.”

This is really important. It’s important to understand that evolution can’t just do anything. Nature imposes constraints both physiologically and biochemically. I think in some popular conceptions we imagine that “life finds a way” and that evolution is so robust that organisms will evolve whatever traits they need to fit their environments. But very often they don’t, and they go extinct. And even when they do, their evolved traits aren’t necessarily perfect. Necessity or utility can’t push evolution beyond natural constraints. A good book on the subject of physiological constraints on evolution is Alex Bezzerides’s 2021 book Evolution Gone Wrong: The Curious Reasons Why Our Bodies Work (Or Don’t). Our anatomy doesn’t always make the most sense. It’s possible to imagine more efficient ways we could be put together. But our evolutionary history imposes constraints that don’t leave all options open, no matter how advantageous they would be. And the same goes for biochemistry. The repertoire of proteins and nucleic acids in the living world is determined by evolution. But the properties of proteins and nucleic acids are determined by the laws of physics and chemistry.

One way to think about this is with a protein sequence space. This is an abstract multidimensional space. Michael Harms and Joseph Thornton describe this in their 2013 paper.

“Sequence space is a spatial representation of all possible amino acid sequences and the mutational connections between them. Each sequence is a node, and each node is connected by edges to all neighbouring proteins that differ from it by just one amino acid. This space of sequences becomes a genotype–phenotype space when each node is assigned information about its functional or physical properties; this representation serves as a map of the total set of relations between sequence and those properties. As proteins evolve, they follow trajectories along edges through the genotype–phenotype space.”

What’s crucial to consider in this kind of model is that most nodes are non-functional states. This means that possible paths through sequence space will be highly constrained. Not just any path is possible. There may be some excellent nodes in the sequence space that would be perfect for a given environment. But if they’re not connected to an existing node via a path through functional states they’re not going to occur through evolution.
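Here is a minimal sketch of that picture: a toy sequence space over three-letter “proteins” on a five-letter alphabet, with a breadth-first search that may only step through functional nodes. The functional set is invented, and real sequence spaces are astronomically larger, but the point about connectivity comes through:

```python
# Toy genotype space: nodes are sequences, edges connect sequences one
# amino-acid change apart, and evolution may only traverse functional nodes.
from collections import deque

alphabet = "SPKLQ"                                   # toy amino-acid alphabet
functional = {"SKL", "SKQ", "SPQ", "PPQ", "QQQ"}     # invented functional set

def neighbors(seq):
    """All sequences that differ from `seq` at exactly one position."""
    for i, old in enumerate(seq):
        for aa in alphabet:
            if aa != old:
                yield seq[:i] + aa + seq[i + 1:]

def accessible(start, goal):
    """Breadth-first search restricted to functional intermediates."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in neighbors(path[-1]):
            if nxt in functional and nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None   # functional, maybe even optimal, but evolutionarily cut off

print(accessible("SKL", "PPQ"))   # ['SKL', 'SKQ', 'SPQ', 'PPQ']
print(accessible("SKL", "QQQ"))   # None: no functional path reaches QQQ
```

QQQ here plays the role of the “excellent node” that selection can never reach: it is functional, but every one-mutation neighbor of it is not, so no path of functional intermediates leads there.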

To conclude, it’s an exciting time for the evolutionary sciences. Our understanding of the actual physical mechanisms of inheritance and evolution, down to the molecular level, is leaps and bounds ahead of where it was a century ago. Darwin and his associates had no way of knowing the kinds of things we know now about the structures of nucleic acids and proteins. This makes a big difference. It’s certainly not the case that we have it all figured out. That’s why I put evolutionary biology in the same class as neuroscience when it comes to what we understand compared to how much there is to understand. We’re learning more and more all the time just how much we don’t know. But that’s still progress. We are developing the tools to get very precise and detailed in what we can learn about evolution.