|posted 15 December 2002|
William H. Calvin, "Gott neu erfinden,” ("Reinventing god.") chapter for Tobias Daniel Wabbel (ed.), Im Anfang war (k)ein Gott (Patmos, Dusseldorf, 2004), pp. 175-185.Original English at http://WilliamCalvin.com/2003/reinventing.htm.
English serial rights available. This is for a German-language book with chapters by Gregory Benford, William H. Calvin, George V. Coyne, Richard Dawkins, Johannes V. Feitzinger, John F. Haught, Donald D. Hoffman, Hans Küng, William C. Mitchell, Bill Napier, F. David Peat, Douglas Preston, Ulf von Rauchhaupt, Martin Rees, Rupert Sheldrake, Ian Tattersall, Gerd Theißen, Frank J. Tipler, Charles Townes, Roger Trigg, Tobias Daniel Wabbel, Ulrich Walter, Carl Friedrich von Weizsäcker, Franz M. Wuketits.
William H. Calvin
If God did not exist, it would be necessary to invent him.
Once ancient Greek and Islamic science took off with renewed vigor in Europe about 500 years ago, increasingly naturalistic views of the world emerged. Most of the scientists then believed in a conventional god, but they were curious about how god did it, wanting to understand the design. Relying on the supernatural in your explanation came to be seen as an unfortunate shortcut; it got a bad name through overuse, as a cover for ignorance. With more experience, scientists became very sensitive to how many ways there are to fool yourself. Leaps of faith were to be avoided as a proven way for your logic to lead you down a garden path into nonsense.
Avoiding an explanatory crutch is, of course, different from denying the existence of a god. It’s more of a strategy for not jumping to premature conclusions. But atheism is indeed an occupational hazard for certain scientists – not in all of science, but at least in certain areas such as cosmology and the neurosciences. While I want to be careful about throwing out the baby with the bathwater, I do suffer from both occupational predispositions to nonbelief. I once aspired to do cosmology. Now I am a neuroscientist concerned with the evolution of higher intellectual functions and ethical judgments – and that’s about as close as science gets to investigating what most people would see as the soul. You have to take care about what assumptions you make or you will just end up “discovering” your hidden assumptions. (I enjoyed years of singing in the church choir as a youth but, since then, cathedral architecture and organ music have been the main draw.)
God can be reinvented as, at a minimum, a metaconcept encompassing the major organizational principles of the universe and of life, especially intelligent life. (I will not consider here the optional add-ons such as an afterlife, reincarnation, or an interventionist god who responds to pleading, flattery, gifts, and even commands.) What needs to be saved and incorporated into a more open-ended version of “the author of the universe”?
Reinvention or abandonment may not be the only choices, of course. And I am not the right person to flesh out a useful reinvention – I’ve met a number of theologians who are fans of science and they seem more likely to get it right than I am. But let me try to cover a few of the skeletal scientific concepts that it would be nice to see in a humanistic revision.
Reductionism is handy for revealing building blocks, working your way down to quarks. But in working back upward, you get to see the coming-together principles at work – an entirely different scientific undertaking.
Building blocks are assembled to make new entities, as when hydrogen and helium were cooked in our local supernova about 7 billion years ago to produce the heavier elements found in our bodies, such as carbon and oxygen. Atoms can combine to make a molecule (and who would ever have suspected that two gases would combine to make a liquid called water!). Then there is self-organization, as when sugar crystals form in the bottom of your glass of supersaturated iced tea. Or a whirlpool forms over the drain of your bathtub. Or hexagons form in the surface of the cooking porridge when you fail to stir it. The round heads of bees, pounding on the walls of little tunnels in wax, form hexagonal cross-sections in consequence – all without planning or insight.
Processes do not begin and end but rather turn one thing into another. In the familiar phase transitions, solids turn into liquids (or sometimes directly into vapors). Same molecules, yet very different physical properties. Most processes don’t have much of a memory (the atmospheric circulation preserves traces of major events for only a few years; the ocean circulation for perhaps 1,500 years as mixing is very slow). They may “know” what state they are in now, but they usually “forget” how they got there. Experience doesn’t buy much that way.
Up to Biology
Biology incorporates some of the experience of earlier generations, making possible – though not guaranteeing – improvements. In DNA evolution, the code (genotype) and the body and behaviors it produces (the phenotype) are separate, though it probably all got started without that separation (as in RNA evolution). The genes become a rough-but-handy repository of what worked in earlier environments. Experience can accumulate over thousands of generations, thanks to fairly faithful duplication of the molecular string as a cell divides.
The cell itself is a major step up because it rewards genes working well together; they either live together or die together by rupturing the envelope they live within. The principle is carried further in eukaryotes, where the cell membrane encompasses formerly free-living organisms that were good at working together. The reinvention of god would probably want to take note of that, even celebrate it once a year.
Then one sees multicellular organisms and cells that specialize: a specialized cell has the genes to be any type of cell but only some genes are expressed, and so the cell becomes a photoreceptor rather than a gamete. Now a whole colony of cell types lives, reproduces, and dies together.
But products are not the same as principles and what a reinvention needs to encompass is a process that makes things “better.” (The scare quotes are necessary, as a century of theorizing has shown.) Indeed, the evolution of a new species’ body type is made possible because a certain type of evolutionary process earlier evolved, one that can turn one type of existing body into a novel one, never before seen. If god is to be thought of as a creator, then this process is surely a key one to understand.
Turning the creative crank
The key process is the one that Charles Darwin (and later Alfred Russel Wallace) figured out, the one that caused Thomas Huxley to say, when reading Darwin's book manuscript before its publication in 1859, “How stupid not to have thought of it before.” Two and a half millennia of very smart philosophers trying to solve the problem of how the evolutionary crank is turned, and then the answer turns out to be so simple. If the catechism contains the most solemnly defined dogmas of a religion, then under the reinvention you might want to include the six essentials of the Darwinian creative process.
It was unfortunate that Charles Darwin named his theory "natural selection," as that is only one of the essentials of the process. Variations, then selection, then more variations centered on the more successful (at surviving, finding mates, rearing offspring) of the first round. Keep doing this, and some very improbable things of high quality can gradually be shaped up.
One can summarize Darwin's bootstrapping process in various ways. A century ago, Alfred Russel Wallace emphasized variation, selection, and inheritance. (It reminds me of a three-legged stool: evolution takes all of them to stand up.) But as I explain at more length in A Brain for All Seasons (from which this section is adapted), there are some hidden biological assumptions in that three-part summary. When trying to make the list a little more abstract to encompass non-biological possibilities, I wound up listing six ingredients that are essential (in the sense that if you're missing any one of them, you're not likely to see much progress):
1. There's a pattern of some sort (a string of DNA bases called a gene is the most familiar such pattern, though a cultural meme – ideas, tunes – may also do nicely).
2. Copies can be made of this pattern (indeed the minimal pattern that can be semi-faithfully copied tends to define the pattern of interest).
3. Variations occur, typically from copying errors or superpositions, more rarely from a point mutation in an original pattern.
4. A population of one variant competes with a population of another variant for occupation of a space (bluegrass competing against crabgrass for space in my backyard is an example of a copying competition).
5. There is a multifaceted environment that makes one pattern's population able to occupy a higher fraction of the space than the other (for grass, it's how often you water it, trim it, fertilize it, freeze it, and walk on it). This is the “natural selection” aspect for which Darwin named his theory, but it's only one of six essential ingredients.
6. And finally, the next round of variations is centered on the patterns that proved somewhat more successful in the prior copying competition. (The "inheritance principle.")
Try leaving one of these out, and your quality improvement lasts only for the current generation – or it wanders aimlessly, only weakly directed by natural selection.
There are at least five things that speed up evolution. First is speciation, where a population becomes resistant to successful breeding with its parent population and thus preserves its new adaptations from being diluted by unimproved immigrants. The crank now has a ratchet. Then there is sex (systematic means of creating variety by shuffling and recombination – Don't leave variations to chance!). Splitting a population up into islands (that temporarily promote inbreeding and limit competition from outsiders) can do wonders. Another prominent speedup is when you have empty niches to fill (where competition is temporarily suspended and the resources so rich that even oddities get a chance to grow up and reproduce). Climate fluctuations, whatever they may do via culling, also promote island formation and empty niches quite vigorously on occasion, and so may temporarily speed up the pace of evolution.
This process can work in computers as well as in biology, and one suspects that it could work elsewhere in the universe in noncarbon-based life forms. You need a patterned “memory” like genes, a copying process that isn’t perfect, and then the rest of the process seems likely to spring into action. So here we have a self-organizing principle which, thanks to the addition of a memory, takes random variations and makes them into increasingly sophisticated organisms – and eventually us.
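The six ingredients above can indeed be run in a computer. What follows is a minimal toy sketch, not the author's own model: the "gene" is a bit string, the "environment" is a hypothetical fitness function that favors matches to a target, and the names (`fitness`, `copy_with_errors`, `next_generation`) are illustrative inventions.

```python
import random

random.seed(42)

TARGET = [1] * 20          # stand-in for the multifaceted environment's preference
POP_SIZE = 60              # the fixed "space" that variant populations compete for
MUTATION_RATE = 0.02       # copying is only semi-faithful (ingredient 3)

def fitness(pattern):
    """Ingredient 5: the environment lets one pattern occupy more space."""
    return sum(1 for a, b in zip(pattern, TARGET) if a == b)

def copy_with_errors(pattern):
    """Ingredients 2 and 3: copies are made, and variations creep in."""
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in pattern]

def next_generation(population):
    """Ingredients 4 and 6: variants compete for the space, and the next
    round of variations is centered on the more successful patterns."""
    ranked = sorted(population, key=fitness, reverse=True)
    survivors = ranked[:POP_SIZE // 2]          # losers vacate the space
    return [copy_with_errors(random.choice(survivors))
            for _ in range(POP_SIZE)]

# Ingredient 1: a pattern -- here random bit strings standing in for genes.
population = [[random.randint(0, 1) for _ in range(20)]
              for _ in range(POP_SIZE)]

for generation in range(60):
    population = next_generation(population)

best = max(population, key=fitness)
print(fitness(best))       # quality ratchets upward across generations
```

Deleting any one ingredient illustrates the point in the text: remove the ranking in `next_generation` (no selection, no inheritance) and the population wanders aimlessly; set the mutation rate to zero (no variation) and improvement stops at whatever the first generation happened to contain.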
Up to Brains
Even single-cell organisms have behavior, as when bacteria tumble to randomize their direction of travel when the smell of food starts to fade. Having lots of “moves” tends to require an animal to have a specialized collection of nerve cells.
Such animals can learn from the immediate past. "After this, therefore because of this" is a simple inference that will prevent eating something that made you sick the last time that you ate it. It leads to a lot of superstitious behavior (the philosophers call it the post hoc, ergo propter hoc fallacy), but in situations where you are short of time or knowledge, it is a good rule of thumb.
Back before evolution invented image-forming eyes, nearly all animals were soft-sided (and so didn’t fossilize very well). The Cambrian explosion 543 million years ago was when many animals added shells and skeletons that made nice fossils – and the animals appear to have done this in response to predators that had evolved image-forming eyes, instead of making do with the usual fan of photoreceptors that is sufficient to navigate for most purposes. Not only were armor and camouflage now required for protection, but life surely became much more complicated what with chases, giving advantages to animals with a more versatile brain.
The brain of a fish doesn’t have to be very big to be effective. Some of the simplest mammals get by on a small brain that requires less than one percent of the blood supply. The average mammal uses about three percent, with predators having somewhat bigger brains than prey. We humans use about 16 percent, but our big brain is a fairly recent development. It enlarged three-fold over the pint-sized brains of our closest relatives, the chimpanzees and bonobos.
Even though they pay attention to social happenings in the manner of other primate societies, chimps and bonobos don’t augment this with gossip (and more than half of human discourse is catching up on who did what to whom). Chimps throw sticks and stones in an effort to intimidate but rarely as a hunting technique; they are never seen practicing their technique to improve their accuracy or versatility. Nor do the apes exhibit much in the way of shared attention, nothing like the way in which a child directs an adult’s attention to a third object. (“Look at that!”) Or an adult actively teaches a child. All that develops in the ape-to-human branch of evolution.
Up to an Intellect that can make ethical choices
A mere two-and-a-half million years ago, our ancestors were spun off from those upright apes with small brains and little toolmaking skill that had developed on the forest fringe as Africa became drier and drier. The spinoff had a somewhat larger brain and some toolmaking abilities.
Many of the behaviors we value – the reassuring touch and the arm around the shoulder, the hugging and kissing – turn out to be shared with our closest cousins among the great apes. That spinoff likely had them too. Sharing is generally found only between mother and offspring, but chimps also have a limited form of meat sharing. Reciprocal altruism, a fancy name for doing favors for friends, has an amazingly long growth curve – you can double your payoffs by sharing more things, with more people, over longer periods of time, onwards and upwards to today’s versions seen in third-party peacekeeping forces, which is real altruism indeed.
Those behaviors that we especially value, structured language and rationality, are not seen in the chimpanzees and bonobos. And they may not have played much of a role in that two-and-a-half-million-year story, appearing only in the last one percent of the period, about 50,000 years ago in the midst of the most recent ice age, long after modern brain size had been achieved. This may have been when structured language, contingent planning, games with rules, or trains of logic finally took hold culturally.
With such structured thought, people can now pretend and lie, imitate, deceive, and simulate alternative courses of action. We pyramid levels of organization: from phonemes to words, words to sentences, sentences to narratives to ethics. We can fantasize about the future, and sometimes peer far enough into the future fog to keep from speeding past a better alternative. We now have an extraordinary ability to juggle abstractions, building a mental house of cards with enough stability to find a better metaphor, to compose a more compelling poem.
And with structured, speculative thought, we can imagine how a contemplated course of action will impact others. This is the basis of ethical behavior. But, like everything else, it can also be used for the exact opposite – say, as the terrorist attempts to create panic and death by stampede. Planning ahead also makes possible the escalation of raids into real warfare by utilizing training and stockpiling of supplies.
But structured thought does make possible the great cultural developments we associate with philosophy, science, and religion. With education, they take hold in a new generation and are often improved. Our ancestors developed altruism into disaster relief, welfare, and volunteer fire departments. They also came to reflect on what they saw, to develop a public conscience:
The disillusion and despair that characterize the political vision of Thucydides provide, paradoxically, evidence of the moral advance that had taken place in Eurasia in the first millennium B.C. Until then, people had lived in a world without a public conscience. Wars often had turned men into beasts in the past; what was new in fifth-century Greece was that Thucydides, Euripides, and others were shocked by it.
–David Fromkin, The Way of the World (Knopf 1998), p.63
Structured thought is what makes human consciousness so different from that of other animals (we may not be able to read their minds but we can see their behavior; if they could contemplate the way we can, they would be doing things to their advantage that we would see in their behaviors – but don’t). That big step up overlaps a lot with the concept of a soul.
Does it become independent of the body? Yes, in the sense of contributions to the culture and the examples we set for others, which live on after our deaths. But in Consciousness, the neurologist Adam Zeman writes [p.151], “It may be arrogant to deny that consciousness can ever slip its moorings in the brain – after all, much of the world’s population believes firmly that it can – but the evidence of this happening is tenuous at best.” And that tends to be the consensus among neuroscientists.
The search for coherence, where everything hangs together
With our kind of consciousness, we can even attempt to understand how our brains work and why they evolved. There is indeed a search for meaning. The Darwinian process operates in our brains on the time scale of thought and action as we create a plan or utterance of high quality – something better than our nighttime dreams, where we see cognitive processes free-wheeling without much quality control.
Dreams provide us with a nightly experience of people, places, and occasions that simply do not fit together. Fortunately our movement command centers are inhibited during most dreams, so we don’t get into trouble acting on dangerous nonsense. Awake, we are always searching for coherence, trying to shape up combinations that “hang together” well enough to act on. When our quality control fails and incoherence is the best thing our consciousness has available during waking hours, it tends to be called hallucination, delusion, or dementia.
A great deal of our consciousness – indeed, our intelligence – involves guessing well, as we try to make a coherent story out of fragments. We search for coherence in our surroundings, ways in which things unexpectedly hang together – and the pleasure we get from finding hidden patterns is not just from jigsaw and crossword puzzles or listening to Bach. Coherence finding – the search for meaning – has spawned an enormous range of art and technology, pyramiding complexity while miniaturizing it all, allowing computers to expand what we can accomplish with a little thought.
But what satisfies us, among all the new possibilities for coherence that we turn up, is based on the little-understood emotional intelligence carried over from the ape heritage and the life of the ice ages. Given our routine search for meaning, it is not surprising that religious concepts arose, and it should not be surprising that they will change as we understand brains and evolution better.
Still, it needs to be said that the light of evolution is just that – a means of seeing better. It is not a description of all things human, nor is it a clear prediction of what will happen next.
– Melvin Konner, The Tangled Wing (2001)
In the future
It might be objected that the reinvention of god, as sketched out here, starts to look a lot like Science writ large (real science, that is, not the advanced technology that people constantly confuse with science). But then that’s what, before recent generations, most scientists thought they were doing, figuring out how god did it.
Is the “author of the universe” nothing more than the principles I have sketched? My guess is that revisionist theologians will suggest something more, and that scientists will continue to try to find that “something more” emerging from the natural world. It will be interesting to see what evolves from that interaction.
William H. Calvin is a neurobiologist at the University of Washington in Seattle and the author of A Brain for All Seasons: Human Evolution and Abrupt Climate Change, which won the 2002 Phi Beta Kappa book award for science. Copyright ©2003 by the author. http://faculty.washington.edu/wcalvin