Conversations with Neil’s Brain
The Neural Nature of Thought & Language
Copyright © 1994 by William H. Calvin and George A. Ojemann.

You may download this for personal reading but may not redistribute or archive without permission (exception: teachers should feel free to print out a chapter and photocopy it for students).


William H. Calvin, Ph.D., is a neurophysiologist on the faculty of the Department of Psychiatry and Behavioral Sciences, University of Washington.

George A. Ojemann, M.D., is a neurosurgeon and neurophysiologist on the faculty of the Department of Neurological Surgery, University of Washington.

14
How the Brain Subdivides Language


“ESTO ES UN ELEFANTE,” Neil says, from under the sterile tent in the O.R. The slide projector advances. “Esta es una manzana,” he continues.
      Since Neil has been bilingual from childhood, George is going to locate his Spanish naming areas, to see if they differ from the English naming areas. The neuropsychologist has reloaded the slide trays in the projector, and Neil is asked to name the pictures of elephants, apples, and other objects, in Spanish this time. And again the stimulating probe is placed on the same brain areas, to determine whether any of those sites are essential for naming in this second language.
      As Neil’s mapping in Spanish proceeds, it becomes apparent that stimulating the earlier naming sites does not always block naming in Spanish. And at some sites where English naming was unaffected, naming in Spanish is disturbed.

[FIGURE 69: Naming objects in two languages]

YESTERDAY AFTERNOON, after Neil had checked into the hospital room, I went with George when he discussed the details of the operation with Neil.
      George described once again why the operation was being done, the probability that it would control Neil’s seizures (“operations don’t always work”), its risks (“low but not zero”), and the mechanics of positioning and testing and such.
      “We’re going to do some extra testing during the stimulation tomorrow,” George said. “Since you use your Spanish in your business, we have to worry about where that’s located. And since I hear that your appetite for books is insatiable, I think we’d better do some extra testing of reading localization.”
      “I hope you’ll give me something easier to read in the O.R. than those convoluted sentences that describe brain pathways,” Neil said.
      “The sentences we use are probably stolen from grade-school textbooks,” George said. “Areas for reading are often close to the epileptic area and aren’t always in the same place as sites important to naming in English.”
      “I always suspected that reading and speaking didn’t use the same piece of brain,” Neil commented. “I once had one of my speeches transcribed, one that sounded pretty good. But it was terrible in the written version. And vice versa — all those prepared speeches that look good on paper — they never sound right when read from a podium. The rules are just different. But different areas for English and Spanish?”
      “Well,” George continued, “strokes in bilingual patients often cause more problems with one of their languages than the other.”
      “How’s that?”
      “Sometimes it’s probably because one language had been used more often than the other. If it’s the less-used language that’s impaired, that’s hardly surprising. Or maybe the patient’s native tongue is ‘better recorded’ because it was written on a ‘blank slate’ of childhood. Perhaps the remaining language is the one used most recently, before the stroke. But occasionally, neurologists encounter a patient whose remaining language abilities can’t be explained away by one of these common-sense aspects.”
      Neil nodded, and George continued. “Following a stroke, someone might speak only a language that she learned as an exchange student. Even though she hasn’t used it since her teens, that’s all she can now speak.”
      Indeed, I mentioned, the immediate family is quite distressed because nobody knows the language. There’s a frantic search to find someone who can guess what language it is.
      “Cases such as that have forced language theorists to consider whether second languages are housed elsewhere, not in the same place in the brain as the native tongues. And we think we know why this happens,” George explained. “In the dozen or so patients who have had stimulation mapping of two languages, some degree of separation of naming sites in the two languages has been the rule, although there are also sites common to both languages.”
      “Are naming areas for the second language smaller?” asked Neil from his chair by the window.
      “Actually, they’re slightly larger than those for the first language,” George replied. “In other words, the second language can be disrupted from a nickel-sized site rather than a dime-sized site. It’s often hard to separate ‘when you learn it’ from ‘how good you get,’ but the first language may be somewhat more compactly organized than later ones — its naming sites are not spread out as widely.”
      It’s rare for a second language to be found in the opposite hemisphere — at least judging from the most reliable methods for establishing such lateralization, the Wada test and the effects of strokes and tumors. Regardless of whether the language is pictographic or phonetic, written or spoken, it usually depends on left-brain mechanisms.
      “Even sign languages depend on the left brain, just as do oral languages and reading. A neuropsychologist, Ursula Bellugi, studied the effects of strokes in deaf patients who have communicated with American Sign Language since birth. She found that left-brain strokes interfere with signing, and right-sided strokes do not. Furthermore, stroke location in the sign-language users tends to predict expressive and receptive types of difficulty, just as in the hearing and speaking patient. It shows you that this is a cortical specialization for language we’re dealing with, not speech or hearing.”
      Different naming areas can be found for sign and oral language, based on observations in a few hearing patients who learned sign language because of a deaf family member. Just as in the more traditional bilingual patients, mapping shows a partial separation of sites that disrupt signing or speaking the name of the same object pictures.
      George then got down to the business at hand. “Neil, each time we have someone awake during one of these operations, we also try to learn a little something more about how the brain works. These are research studies and not crucial to treating your seizures, so we don’t have to do them. But awake operations are a unique opportunity to learn more about such functions as language, things that can be studied only in humans.”
      Neil nodded, and George continued. “So what we would like to do tomorrow, if it’s agreeable with you, is a special stimulation study — in addition to the one we have to do to perform your operation safely, localizing your two languages and reading. The only major down side to this is that it will lengthen the operation about 20 or 30 minutes, and you’ll be awake that extra time. The extra time probably doesn’t change the risk of the operation significantly, except that it might very slightly increase the risk of an infection.”
      “What have you been trying to find out?”
      “Well, we’ve been looking into the location of different categories of names. We’ve been comparing ‘animals’ to ‘tools’ — that’s because there are a few stroke patients who can name one of those categories and not another, as if their animal names were stored in the region of cortex that their stroke destroyed. We’ve also looked at stimulation effects on understanding of speech sounds, and the face and tongue movements you have to make to produce speech sounds. Recently we’ve been examining the ability to produce verbs from nouns — I say ‘bike’ and you say ‘ride.’ It’s all the rage in language studies now.”
[FIGURE 70: Cortical zones with naming sites]

     
[FIGURE 71: Cortical zones with reading sites]

      “Sounds like syntax.”
      “No, that’s just the parts of speech and their common partners. There are other sites, though, where stimulation during reading produces mistakes that involve the syntax of the sentence. Patients make mistakes on verb endings, pronouns, conjunctions, and prepositions — but not on nouns or verb stems.”
      “Such as?”
      “Changing ‘them’ to ‘we’ or ‘she’ to ‘it.’ Reading ‘If my son is late for class again’ as ‘If my son will getting late for class again.’ Things like that.”
[FIGURE 72: Cortical zones with syntax errors]

      “My English teacher in high school must have thought I was missing my sites for syntax.”
      “All in all,” George went on, “these studies suggest that language is taken apart by the brain. That different parts of language are processed in different areas, just as the visual system takes apart the visual image into colors and contours and movements using different specialized regions of cortex. But we are only beginning to learn how language is separated out. Some of the separations we expected to find haven’t proved to be very obvious in the physiology.”
      Our overall impression from the varied development, stroke, and stimulation studies is that, at the level of the cortex, language is fragmented into many different components, each processed in a separate area, as though there were many different computers running in parallel, each assigned a small portion of a problem. A major question is, of course, how it is all pulled together to speak a meaningful sentence. But perhaps we will first need to discover the sites for nouns and verbs, for adverbs and adjectives, for declarative sentences and questions, for embedded phrases and metaphors.
      Yet some caution is in order. Nouns and verbs seem like such sensible categories, but the history of science is full of attempted categories that failed (such as the four humors, or the categories of phrenology). Are we looking for the right categories as we map the language cortex? Is there a different way to slice the speech-and-language pie? Are there more fundamental questions that we should be asking?
      “And what’s on the research agenda for tomorrow?” Neil asked.
      “We’ll try to get some more information about a particularly odd finding,” answered George. “While it’s mostly different places for different things in our language studies, there are two functions that are usually changed from the very same places.”
      “Which ones?”
      “The ability to understand speech sounds, like ‘ba’ and ‘pa’ — and the ability to move the face and tongue in the sequences of movement needed to produce language. That’s what we want to investigate tomorrow — the first is simply a listening task, and the second is a performance task. My technician will come by your room after dinner to rehearse you on the tests that we’ll use.”
      “I thought that speech production and speech understanding were really separated in the brain. Production in the front, and understanding in the rear. And now you tell me they overlap. What about wiretapping some of my neurons? I talked to a couple of other patients of yours who have had operations like mine, and one said he’d participated in a study of individual neurons. And another talked about a brain wave study.”
      “Yes,” George responded, “we have several other studies that try to discern the physiologic mechanisms that generate language. We can’t do every study in each patient, because that would take too much time. But I think we’ll have time to try some wiretapping on you, if you’re agreeable.”
      “By all means.” Neil asked a few questions about this, read over the form asking if he was willing to participate in the research study, and then signed the form.
      Microelectrodes are fine wires about the thickness of a hair. They’re so fine that they can get close to individual neurons and record their impulses. But to do that, you have to stick the microelectrodes into the cortex.
      “Since we want to preserve any brain that’s important to language,” explained George, “we can only do microelectrode recordings in brain that we’re going to subsequently remove, which means only in areas that aren’t essential for language. Even so, those studies have had some very interesting results. In one of those studies, we compared neuron activity during naming aloud to naming silently. And then to matching a spatial feature on the same pictures, such as the angle of a colored line superimposed on the picture. So you see the same pictures each time but pay attention to different things.”
      “Of course, the spatial thing should be a right-brain function and the naming a left-brain one. Isn’t it?”
      “When we do those microelectrode recordings from the right temporal lobe of patients who we know are left dominant for language, we find neurons that change activity with naming. Just as we do recording from the left temporal lobe in other patients like you.”
      “So this is from neurons in cortical areas that aren’t close to naming sites — but they still change their activity during naming? Right?”
      “About a fifth of the neurons we record from, in either the left or the right temporal lobe, get active during naming,” George explained. “And the same thing is true for the spatial task. So the special role of the right brain in spatial functions also isn’t a matter of having all the active neurons on that side.”
      “So does anything distinguish the ‘dominant’ side in these studies?”
      “Yes, the changes in activity with language occur earlier in neurons in the left temporal lobe,” George continued. “And many more of the neurons show reductions in activity (inhibition) than do neurons on the right. An exactly reversed pattern is seen for neurons active with the spatial measure — earlier activity and more inhibition on the right.”
      “So the neurons that get there first determine the response? Another survival of the fittest for neuron activity?”
      “Possibly,” George said. “We think the inhibition is part of an inhibitory surround that helps focus neural activity in the dominant hemisphere. It’s interesting that few neurons, if any, seem to be active during both the spatial and language measures.”
      “So it’s different neurons for different functions?” Neil said.
      “Thus far. Although a few neurons were active with both naming aloud and naming silently, most were active with only one of these, too. We’ve tested reading aloud and reading silently. And we’ve recorded a few neurons during naming the same pictures in two languages. The findings are the same. Neurons change activity with only one function.”
      “And their neighbors do the same thing?”
      “Not so far. Once in a while, the microelectrode will record activity from two or three neurons at the same time. Usually each of those nearby neurons will respond to something different. That might be a pretty good arrangement, just to help make associations. Such as having the neurons for naming next to the neurons for reading.”
      “Is that why this part of the brain is called association cortex?”
      No, the name’s been around for a hundred years, used for all of the neocortex except the primary sensory and motor areas. Instead of calling it terra incognita like the old mapmakers, they called it the association cortex. Which, in part, it surely is: cortex has the reputation for handling associations. There is thought to be a division of labor with the subcortical regions, which handle the better-established skills such as riding a bike.
      The brain wave studies, of course, can be done from any part of the exposed brain surface, such as the language areas, not just from what is going to be removed later. So they give a broader picture, though without quite as much detail. You tend to judge “getting active” by changes in the EEG of a region, where it becomes faster but with smaller excursions.
      “The EEG change during naming that characterizes the naming sites is called desynchronization,” George added, “and it likely reflects activity of the selective attention system from the thalamus. That system selects the cortical areas appropriate to the task at hand. For example, all the naming sites become active at about the same time, after the slide comes on the screen — the frontal lobe naming sites aren’t lagging behind the temporal lobe naming sites, the way we’d expected.”
      “On the theory that language involved a serial process,” Neil observed, “first decoding in the temporal lobe and then expression in the frontal lobe?”
      “That was indeed the guess,” George said. “And it was wrong. Our studies didn’t find evidence for serial brain wave changes. All sites seemed to be turned on at once, at the beginning of a language event, and they stayed on during the whole event. That’s the way many functions in animal cortex seem to be organized, too — parallel activation of dispersed cortical areas.”
      “And the changes are widespread,” George continued. “We found that activation of motor speech areas was present even with silent naming, so our inner speech includes activation of motor systems.”
      “Whew! My brain as a committee of all these disparate areas still doesn’t sit very well with me,” Neil said. “And having all the committee members talking at once isn’t making it any better. Is there any mechanism to make a particular committee hang together? Or hang separately, as the case may be?”
      “We’re currently looking for that mechanism in some of our newer brain wave studies. In various animal studies, there is a coherence in the EEG between different brain areas involved in the same task. Sometimes it shows up as a lot more wiggles in the higher frequency range, up around 25 to 70 Hertz. Maybe such a frequency also links together the separate areas for some one language function. That’s what we’re examining, looking for activity at some frequency in that range that is synchronous at the different language sites during the appropriate language function.”
      “We certainly haven’t found such a frequency yet,” George said, “but we’re still looking. That’s the nature of science. You get an idea of how things work from previous studies, and figure out a way to test it, and sometimes you’ve guessed right and sometimes not.”
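      To make the analysis George describes a little more concrete: it amounts to estimating the coherence between the EEG recorded at two language sites and asking whether it rises in the 25-to-70-Hertz band during a language task. The short sketch below is only an illustration of that idea, not the authors’ actual procedure; the sampling rate, the signal lengths, and the random stand-in data are all hypothetical.

      # A hypothetical sketch of the coherence measurement described above.
      # All data are random stand-ins; real signals would come from the
      # electrodes on the exposed cortex during the naming task.
      import numpy as np
      from scipy.signal import coherence

      fs = 500.0                     # assumed sampling rate, in Hz
      rng = np.random.default_rng(0)

      # Ten seconds of stand-in EEG from two language sites.
      site_a = rng.standard_normal(int(10 * fs))
      site_b = rng.standard_normal(int(10 * fs))

      # Magnitude-squared coherence versus frequency (0 = unrelated,
      # 1 = perfectly synchronous at that frequency).
      freqs, coh = coherence(site_a, site_b, fs=fs, nperseg=int(fs))

      # Average coherence inside the 25-70 Hz band mentioned in the text.
      band = (freqs >= 25) & (freqs <= 70)
      print(f"Mean 25-70 Hz coherence: {coh[band].mean():.3f}")

      In a study of the kind George describes, the same band-averaged coherence would be computed during the language task and during a control condition, looking for task-specific synchrony between the separate language sites.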
     

LANGUAGE ALSO DEPENDS on several other motor areas of the brain, I told Neil, in addition to the important one surrounding the sylvian fissure. One of these areas is high in the frontal lobe, near the midline, immediately in front of the leg motor cortex. This is a region called the supplementary motor area, a staple of motor organization in many mammals. Especially in the monkeys used in animal research, this area is thought to be important to initiating, planning, and programming of complex movements.
      When this area is damaged in the left brains of humans, the effect is initially quite dramatic. The patient is mute, can’t speak a word. And he can’t move the other side of his body, either. In a few days, recovery begins. At first speech returns — but in an odd way. Initially, at least, the patient’s speech is much better when merely talking at random, rather than when speech is requested. The problem seems to be in getting started with movements, including those of speech. George has described it as similar to the car’s starter motor being broken.
      Recovery continues over the next few weeks and is usually complete. In that respect, the effects of cortical damage to the supplementary motor area are quite different from damage around the sylvian fissure, where the effects may be permanent. Apparently the supplementary motor system on the other side can take over the “starter” function.
      The supplementary motor area has shown increased activity during learning of new movements. The cortical region between it and the corpus callosum, the cingulate gyrus, has shown activity with word reading in the blood flow studies on normal volunteers. Stimulation of the supplementary motor area in the O.R. results in movements, usually of a complex nature. In some parts of the supplementary motor area, stimulation will block speech. In monkeys, stimulation in the supplementary motor area and the cingulate gyrus results in vocalization. Monkey neurons in that region change their activity when the monkey hears vocalizations — often selectively, responding only to vocalizations of their own species.
     

NEIL ASKED ABOUT SWEARING. While it is certainly speech, is it language? Or at least, language of the beyond-the-apes variety.
      The brain area immediately below the supplementary motor area, the cingulate gyrus, is part of the “emotional” limbic system. An attractive hypothesis is that the activity of this region in man and monkey is related to emotional speech, especially expletives. Preservation of emotional speech is a characteristic of many patients with aphasia from strokes of the perisylvian region.
      Patients who are aphasic are not usually mute. Regardless of how reserved or proper they may have been before the stroke, their limited language is sometimes dominated by swearing. One of many unresolved issues in the brain organization of language is whether this emotional speech is more resistant to the effects of brain damage because it depends on a different brain area, like the cingulate gyrus, or because it depends on the same brain areas as other language but is preserved because of its simplicity and extensive associations.
      “Is that where all the swearing comes from in the Tourette’s people?”
      Impulsive vocalizations, most often grunts, characterize Tourette’s syndrome. About 30 percent of the time, these largely involuntary vocalizations consist of shouted words, sometimes obscenities. Except for being interrupted by expletives at points in speech where the rest of us would usually just pause, the patient’s language — it’s males, three to one — is otherwise normal.
      These patients also have motor tics, such as blinking, nodding, tongue protrusion, sniffing, and even hopping and squatting. These usually start earlier in childhood, along with some grunts and barks. The obscenities are added to the repertoire in late grade school or high school.
      “Just as in normal youngsters,” Neil observed.
      Tourette’s is highly familial and, in affected families, looks as if it might be a male version of what, in females, turns instead into an obsessive-compulsive disorder. Current evidence indicates disordered function in the motor centers deep in the cerebral hemispheres, another clue that emotional speech and other language may depend on separate brain areas.
      The supplementary motor area is the only site where animal and human vocalization can be closely related. Either this region is part of a more primitive communication system than human language, or it is an example of an evolutionarily older system being co-opted into language. No one has yet produced swearing with human brain stimulation, or even tried a mapping study during emotional speech.
      “So tomorrow I’m going to lose some neurons in my left temporal lobe,” Neil observed. “Probably even neurons that may be active during naming and other such language tasks — but they’re not essential for them? Right?”
      “That’s why we talk about naming sites as essential sites,” George replied. “The other stimulation sites are where naming can continue even though they are temporarily confused by the electrical buzz. The places that are merely active during naming don’t seem to be essential in the same way, even though they obviously participate in the process. We can identify them as active using microelectrode recordings, brain wave, and blood flow studies. They’re doing something that we don’t understand very well. But in epileptics where these areas cause trouble occasionally, we find that we can remove many of them without causing more trouble than we cure.”
