
REVIEW | Natural History 113(5):52-56 (June 2004)

 

The Fate of the Soul

Centuries of "experimental philosophy" and cognitive neuroscience have led to a revolutionary understanding of how the brain makes the mind.

 

By William H. Calvin

  

Soul Made Flesh:
The Discovery of the Brain
and How It Changed the World

by Carl Zimmer
Free Press, 2004; $26.00

The Birth of the Mind:
How a Tiny Number of Genes
Creates the Complexities
of Human Thought

by Gary Marcus
Basic Books, 2004; $26.00    
                                       

 

If any organ could claim to be the seat of feeling and intellect, surely it was the heart. Until three centuries ago, that seemed a fact too obvious to contest. Unlike other organs, you can feel your heart pounding away inside you. If you start thinking exciting thoughts, it beats even faster. If it stops beating, you are animated no more. And so the heart seemed to be the seat of the soul.

"Soul" was the name for what animated something, what gave it goals and the ability to make things happen. Just as people now distinguish hardware from software, anatomy from physiology, brain from mind, nouns from verbs, and form from function, it was once commonplace to distinguish body from soul.  Besides The Soul, philosophers also believed in various "little souls," which made the bodily organs into something more than meat. The stomach's soul, for instance, was said to attract food down from the mouth. Once seventeenth-century science began to realize the heart is just a humble pump, it was as if the soul had suddenly fled the chest like a restless ghost to lodge itself in the head.

Today we physiologists would point out that the "little soul" animating an organ is simply its function, which arises from the emergent properties of a "committee" of cells. And we would suggest that the big, catchall Soul is one of the brain's higher functions.

Only forty years ago, it also seemed obvious that the world was divided into animated stuff and non-animated stuff. But now, instead of a sharp boundary between the living and the inert, there is a gray zone at the level of molecular biology. The still-useful distinction is expressed by the special word we employ for the formerly animated: "dead."

What really counts, physiologists now know, is "brain dead." Even though some ancient philosophers knew the brain plays a role in paralysis, seizures, and behavioral derangement, that knowledge was regularly overlooked for the following 2,000 years. The Delphic oracle's reputed advice to "know thyself" has had a rocky road. No one understood what was inside the brain. No one was able to imagine how all that fatty stuff could animate us, enabling us to think complex thoughts and communicate them to others. Soul, mind, and brain all overlap -- but how much? Can we do without one category entirely?

Two new books now provide important perspectives on that question for the general reader. Carl Zimmer's Soul Made Flesh traces the rise in England of experimental philosophy through the lives of the so-called virtuosi -- anatomists, physicians, and philosophers -- in the dozen years before they banded together to form the Royal Society in London in 1660. It was the virtuosi who began to replace Aristotle's theory of the soul with knowledge about the body and the brain gleaned for the first time through the scientific method. In The Birth of the Mind, Gary Marcus writes from the twenty-first-century perspective of how the brain makes mind ("soul" has now been dropped from the scientific vocabulary). He describes the biological basis for higher mental processes, and explains how the gene-controlled process of wiring up the brain leads to behavioral differences between individuals -- the inborn source of the unique individuality of every mind.

Like most brain scientists, I am inconsistent in using the term "mind" (and I haven't heard a serious discussion about the soul's interface with the brain for thirty years). Some say "Mind is what brains do," but most of what the brain does is routine and no different from what all other animal brains do: controlling the search for food and mates, analyzing the sensory inputs, and deciding what to do next. What are so obviously mindlike are the higher intellectual functions involving structured thought. And despite the accomplishments of centuries of science, which are celebrated in these two books, scientific knowledge of how and why our remote ancestors first developed these higher capacities is still anything but complete.

Some 50,000 years ago a burst of technological and artistic activity erupted in Africa and soon became a great profusion of art, trading, body decoration, and new tools. The material evidence of that creative explosion is taken as an indicator of the mind's "big bang": the time after which Homo sapiens did things from which we infer that, for the first time, people could think long, complicated thoughts, much as we do today.

What triggered that "modernity"? Was it an enhanced ability to imitate? Planning ability? The use of symbolism, even words? Many suspect that the spark 50,000 years ago may have come from the development of structured language.

A protolanguage made of nothing more complex than short sentences, similar to the ones uttered by two-year-olds, could have been around for a long time, slowly building vocabulary without lengthening sentences. Without longer sentences, though, our ancestors probably lacked long and complex thoughts. That most likely restricted them to a mental life in the here-and-now.  They would have been unable to see themselves as the narrators of a life story, always (as we are today) at a crossroads between alternative interpretations of the past and various paths projected into possible futures. (They might not have worried much, either. Although they saw death every day, without the ability to speculate about the future they could not conceive of their own mortality.)

Yet there is a major barrier to creating longer sentences. As the number of words increases, there are so many ways they could relate to one another that you drown in ambiguity. Short sentences -- at least in context -- are seldom ambiguous, so structuring is optional. But long sentences -- the kind that children today are beginning to figure out at age three -- are possible only through structuring language with syntax. It works like this: I can have a model in my mind of who did what to whom, where, when, and why. If you and I share a knowledge of how to place words and phrases around a verb to tell a little story, and of how phrases and clauses can be nested inside one another, you can correctly guess the novel set of relationships I'm thinking about, just from the clues in the short string of sounds I utter. You thus recreate my model of events in your mind. This everyday exercise in structured speech, even if its only use was to gossip about who did what to whom, likely facilitated logic, narrative, and contingent planning -- perhaps even structured music.

Nevertheless, you may ask, weren't our ancestors gradually getting smarter, as the brain enlarged threefold in the past several million years? Bigger is smarter, is better -- why, it seems obvious.

That common assumption, however, is challenged by what archaeologists have been finding in the past few decades. There were two early periods of human history, each lasting a million years, without obvious signs of toolmaking progress, despite all of the brain enlargement going on at the same time. The increases in brain size must have been driven by something that has not been preserved for the archaeologists to find -- perhaps protolanguage, imitation, expanding cooperation, or more accurate throwing. Perhaps cleverness was a by-product? But if the brain-size increase resulted in gradually increasing cleverness (again, the common assumption), note that it didn't gradually improve their toolmaking. Oops. Even more to the point, by the time of the mind's "big bang," people who looked like us, big brain and all, had been running around Africa for more than 100,000 years without showing signs of modern behaviors like fine toolmaking. Oops again. The big brain may (or may not) turn out to be necessary for our kind of intelligence, but it sure isn't sufficient for modernity.

Once writing was invented, around 3200 B.C., knowledge could not be lost as easily as before; you could actually learn from dead people, and even reanimate their ideas. Indeed, as Zimmer's historical account makes clear, the ideas about the soul expounded first by Aristotle and then by Galen, the Greek philosopher-physician of second-century Rome, kept popping up -- and preventing progress -- for two millennia. Beginning in the sixteenth century, as standards improved for what constituted an adequate explanation, many traditional concepts about human bodily and mental animation began to seem simplistic, or even erroneous. In the seventeenth century, as Zimmer recounts, the English physician William Harvey figured out that the "soul" of the heart seemed to be all about pumping endlessly. The organ just didn't seem to have the right stuff for all those other functions ascribed to it.

The search for a better seat of personhood soon began to focus on the brain. Christopher Wren, remembered today mainly for his grand architecture and for rebuilding London after the great fire of 1666, was particularly skillful at dissecting brains. (He also invented intravenous injection -- pretty good for an Oxford professor of astronomy.) Wren's countryman Thomas Willis, an anatomist and physician who plays a central role in Zimmer's history, "did for the brain and nerves what William Harvey had done for the heart and blood: made them a subject of modern scientific study." As Zimmer makes clear, however, Wren, Willis, and the other virtuosi were forced not only to invent the practice of science as they went along, but also to navigate the treacherous waters of well-established doctrine regarding the soul.

Willis and the rest of the virtuosi who emerged from the English Civil War pondered how they should go about gathering knowledge through experiments and observations, but only in an ad hoc way. It was John Locke who subsequently transformed this kind of thinking into a full-blown philosophy, one that would become the heart of the scientific method.

 


 

The new science of human nature conflicted with some vested interests concerning the soul. Selling indulgences, for instance, to ensure preferred treatment for your soul in the afterlife, had become a big business, aided by the invention of the printing press. The tortures imposed on dissenters by the inquisitions of the Roman Catholic Church attested to the dangers of thinking differently, and many an early scientist-philosopher was wary and guarded for good reason. The natural philosophers who populate Soul Made Flesh were no exception. "In 1666," Zimmer writes, "bishops blamed [London's] fire and plague on [Thomas Hobbes's] atheism." Although Hobbes was never formally charged as a heretic, he was "forbidden to write ever again about human nature."

Even medical men such as Willis had to tread warily through both the religious and the social conventions. Zimmer notes that for most of his working life, Willis was allowed to dissect only the bodies and brains of condemned criminals -- his results could thus be ignored because they pertained only to the brains of the "abnormal." Willis, however, was good at persuading relatives of his aristocratic patients to surrender the bodies of their dead for autopsies.

Because the brains belonged to England's ruling class, it became hard for his readers to dismiss his observations. The respectability of his success allowed Willis to expand his mechanical, chemical explanations of the brain to include the soul itself without being accused of heresy.

That tactic of Willis's for gaining scientific acceptance, as Zimmer points out, was a clever bit of social jujitsu.

 


 

One might think, in the enlightened present, that holding nonconformist views about the comings and goings of the soul would not be criminalized -- but that's what is happening. The fallacy of "the little person inside" (about which, more in a minute) has long confused matters even for modern psychology students, who expect "a viewer" to be at some location inside the brain. Centuries ago, a little person was imagined to lie within a sperm. (Now the little person is imagined inside the fertilized egg. This is not progress.) The little person or soul causes endless confusion in otherwise responsible reasoning about regulating abortion.

"When life begins" is a phrase that already carries with it the idea that the soul pops out of a starting gate at the moment the sperm enters the egg. Next we see the dubious line of reasoning that concludes that a single cell has achieved legal personhood. It's only another small leap to claiming that interference with such a one-cell stage of a fertilized human egg is manslaughter or murder.

Few people, however, realize that nature seems rather careless with early embryos; many beginnings are not finished. At least one in four embryos is spontaneously aborted in the first several months. In women who smoke too much (or drink from the wrong water supply), three out of four may be lost. (The usual figures of between 10 and 15 percent for "pregnancy loss" refer to what happens even later, once pregnancy becomes obvious.)

Those numbers are, of course, far greater than those of elective abortions.

So when conflicts arise in the early stages of pregnancy, many people have concluded that the beginnings need not be finished -- that other considerations (time, place, health, resources, the father, other responsibilities) can reasonably be taken into account by the prospective mother. Many biologists -- and some modern theologians, too -- would add that, just as a pile of construction materials and some assembly instructions do not constitute a house, neither do a fertilized egg and its genome constitute a person, absent a lot of "value added" over many, many months.

Whatever one thinks about the soul and its connection with the contemporary abortion conflict, the terms in which that issue is argued make it abundantly clear that big ideas still matter. And the soul is one of the big ideas of all time.

 


Zimmer gives us a history of early concepts of soul and mind, in Soul Made Flesh, and Marcus gives us an overview of contemporary notions of mind, in The Birth of the Mind. In a nutshell, the two books tell the story of how centuries of scientific inquiry have led to new and revolutionary explanations for what animates us.

Many of us, as I mentioned earlier, imagine a little person inside the head watching sensory inputs, then telling the muscles what to do. It took a long time for scientists to realize that ascribing thought to a little person inside the head is the equivalent of asking, "What makes a car move?" and answering, "Another little car inside" rather than "An engine." But to explain thinking, it is all too easy to argue in a circle. And that classic beginner's mistake is not always innocuous; it sets you up to view a fertilized egg as also containing a little person inside.

With what, however, does science replace the little person inside? How does the brain make mind? To begin to address those questions -- to do justice to the complexity of human imagination, foresight, and capacity for reflection -- you have to come to grips with three basic conceptual features of human mentality.

First, mental life and functionality develop gradually. They occupy no single spot in the brain. And they form a push-and-pull web of influences rather than a falling-domino chain of causation.

Second, human mental life depends, crucially, on structuring to keep concepts from blending together like a summer drink. Structuring makes complex sentences possible, such as "I think I saw him leave to go home," in which three sentences nest inside a fourth, like Russian dolls. Structuring enables people to test out chains of logic, enjoy complex music, play games with rules, make contingent plans for the weekend.

Third, and probably most difficult, it must be possible for structured mental activity to become qualitatively improved. How do you manage to do something structured that you've never done before -- say, utter a long sentence about a friend's hopes and fears? Somehow you start with an incoherent jumble of concepts, then you improve its quality, editing it into a more coherent sentence in a second or two, before you finally decide to go with it.

How did the human animal ever acquire such features of mind? The only relevant process known in nature is Darwin's variation and selection. Of course, one can see the Darwinian process at work on a grand time scale, in the evolution of new species. But one also sees its results after any flu shot, in the response of the body's immune system to the challenge of the vaccine, creating better and better antibodies. The Darwinian process is the foundation of biology, without which nothing makes much sense (yet many parents do not wish their children to hear about it). Biologists are just beginning to explore how the brain could apply natural selection to the memories it stores in order to improve the quality of, say, a verbal performance --and do it all in the few instants between an incoherent thought and a structured utterance.

 


Soul Made Flesh provides an account of the first big steps toward an understanding of how the brain makes mind. Zimmer, a science writer and the author of Evolution: The Triumph of an Idea, the companion volume to the eight-hour PBS television series of the same name, has written a fine intellectual history of early neuroscience. It is full of drama, and it brings to life the struggles for insight that begin in William Harvey's time with the flowering of physiology.

Most of us regularly fail to distinguish how from why, a process from an object, distributed from pointlike, structured from simple, gradual ramp-ups from sudden beginnings. Scientists, in the course of centuries of investigation, have made all those mistakes; but they also, eventually, corrected them. We still eagerly compete to discover our present misconceptions, one of the things that makes doing science so different from other endeavors.

One long-since-corrected but persistent misconception, at least among nonscientists, is that "science says" genes determine behavior and destiny.  If you share that misconception, you probably need to read The Birth of the Mind.

The real story, as Marcus is at pains to emphasize, is about the flexible interactions between genes and the ways the brain is wired up, then subsequently between experiences and how genes are expressed in the brain. What emerges from those interactions are behavioral propensities that allow for an ever-widening set of choices, not "fate." "A brain built by pure blueprint," Marcus writes, "would be at a loss if the slightest thing went wrong; a brain that is built by individual cells following self-regulating recipes has the freedom to adapt."

Marcus, a psychology professor at New York University and the author of The Algebraic Mind: Integrating Connectionism and Cognitive Science, neatly explains why genes are less like blueprints and more like recipes. A blueprint has point-to-point correspondences between plan and construct. A recipe often shows no such correspondence: indeed, what comes out of the oven is often impossible to reconcile with its list of ingredients. Similarly, Marcus explains, there is seldom a single gene for the variable aspects of the body, such as eye color. Instead a gene is usually part of a committee of genes in which some push while others pull to help control a process.

Marcus also explains how genetic variations change the receptors sticking out from the surface of a so-called pathfinder cell. During embryonic development those variations can give rise to alternative "wiring diagrams" of brain tissue, which, in turn, promote some behaviors more than others. Finally, in considering the prospects for genetically modified humans, Marcus squarely faces the problem of unintended consequences. Soon, he notes, geneticists will be able to synthesize "whatever genes we like." But, he warns:

For many years it will be difficult, if not impossible, to gauge the potential side effects of a given gene manipulation in advance. I can live with a buggy beta-test version of a new software package, but I don't want to have to restart my child.

 


The fate of the soul, I suspect, is to be reinvented again and again. That's because one nonessential aspect of it -- that little person inside -- is a beginner's error. Even today, when higher education provides a much better explanation for the emergence of persons and their roles and responsibilities toward one another in a society, the old version survives, because it is so easily reinvented by each succeeding generation.

The problem is serious because relying on the "little person" concept may force us to devalue things people might want to retain. Some optional add-ons to the soul (which vary around the world) include: comforting the bereaved or downtrodden, intimidating a misbehaving child, proselytizing, reaching for the greater meaning of self and life. Many are invaluable appeals to kindness or long-term individual responsibility that could readily stand on their own. The ghostly prop (the "little person," the soul) carries a danger with it: when a historical or scientific analysis casts doubt on "the little person within," some will throw out the baby with the bathwater and turn away from the valuable teachings.


Yet a stripped-down concept of soul might continue to stand for the uniqueness that different genes, in conjunction with different formative experiences and different personal decisions, confer on each individual. While the term "individual" might suffice, the term "soul" might better connote human foresight, ethics, and sense of responsibility, the personal track record and outlook on life that should matter to each of us. All those ideas are well worth emphasizing, no matter what one's religious tradition or beliefs about an afterlife.

Once on the right track, science is pretty good at turning the crank. The coming decades will likely see a revolution in our thinking about how one cell slowly becomes a real person, gradually able to comprehend life's great journey.

 

WILLIAM H. CALVIN is the author of A Brief History of the Mind: From Apes to Intellect and Beyond (Oxford University Press, 2004). He won the Phi Beta Kappa book prize for his previous book, A Brain for All Seasons. He is a neurobiologist and affiliate professor of psychiatry and behavioral sciences at the University of Washington in Seattle.
