Bodytalk
Laura Spinney
New Scientist, 8 April 2000.

JANE is telling Ian a story. At one point, one of her characters slips a lethal substance into somebody else's drink. To illustrate the action, Jane moulds her hand into a C-shape and tilts it, as if pouring from a cup. Nothing strange about that, you might think, except that Jane is blind and well aware that Ian is, too.

Everybody gestures. Sometimes a visual sign is used as a substitute for speech, as when Winston Churchill famously held up two fingers for victory. But we also produce gestures spontaneously and unwittingly as we speak, even when we are on the telephone. It's not clear why we do it. Popular psychology texts promise to help us uncover the messages revealed by our body language. And some researchers maintain that gesturing has nothing to do with improving the recipient's understanding, claiming instead that it helps the speaker find the right word by acting as a kind of three-dimensional aide-memoire.

But some recent studies of children and primates have led to the revival of a different and controversial interpretation. To Michael Corballis from the University of Auckland in New Zealand, the gestures made by congenitally blind people and the hand waving we do while we're chatting on the telephone speak not of a lack of communicative power, but of a deep evolutionary link between speech and gesture.

Our gestures are not only an adjunct to speech, Corballis claims. They may have been our earliest method of communication. Early humans communicated using their whole bodies in a form of mime. Speech evolved out of this ancient body language, and gesture is all that remains of it today, he says. Gesture and speech have co-evolved, and the connection is so close that we can't do one without the other. It's an old habit and it doesn't die just because an individual is born blind.

The theory was first put forward by the French philosopher Étienne Bonnot de Condillac in 1746, and given another airing in the 1970s by the late Gordon Hewes, an anthropologist at the University of Colorado at Boulder. Hewes had combed through historical case notes of explorers encountering new peoples, and was struck by the fact that in the absence of dictionaries and interpreters, they resorted to a limited repertoire of gestures to make themselves understood. That suggested to him the existence of an underlying, universal human language based on nonverbal communication.

Hewes's theory gained little credence, however. With no hard evidence about how early humans communicated, it seemed more logical to argue that language evolved from the sort of crude vocalisations made by apes. Then last year, Corballis revived the theory once more.

He believes that the hooting and alarm calls made by our closest animal relatives are fundamentally different from our speech. In fact, says Corballis, our vocal expressions of emotion--laughter, sobs and shrieks--are the more likely descendants of those primitive alarm calls. He believes this could explain why attempts to teach chimps to speak have largely failed. While some have learned to grasp a very basic grammar--Kanzi, for instance, the pygmy chimp pupil of Sue Savage-Rumbaugh at Georgia State University--they never progress beyond the grammatical complexity understood by two-year-old children.


Pointing the way

To explain how a non-verbal language might have developed, Corballis describes the formation of Africa's Rift Valley, where the ancestors of the great apes remained on the western side while those that were to become hominids were cut off to the east. On the eastern side, forest eventually gave way to carnivore-infested savannah, and Corballis suggests that the danger of being spotted may have contributed to selection for silent, stealthy cooperation and communication.

The information to be conveyed would have been mainly the whereabouts of predators and prey, he says, and the first messages may have been simple actions such as pointing. Today, gesture still captures that spatial component. And vision is our most highly developed sense, not hearing as you'd expect if vocal communication was the original form. Bipedalism may also have been a by-product of our desire to communicate. Our ancestors needed their hands free to carry a spear and light a fire but also, perhaps, to warn of the approaching sabre-tooth.

Once this form of communication was established, the pressure to communicate in the dark and when the speaker was hidden could have helped to propel language towards a spoken form, Corballis believes. By transferring the burden of communication to the voice, early humans were able to talk and perform manual tasks simultaneously, which would have helped with the coordination of armed hunting parties and passing on tool skills to the uninitiated.

"Language skates on the surface of a mimetic culture," says psychologist Merlin Donald of Queen's University in Kingston, Ontario. He agrees that somewhere in our evolutionary history speech took over from gesture as the main conduit of language, just as children communicate by mime for the first two years of life, while they master the mechanics of speech. But mime survives as a separate channel of communication even in adulthood, in ritualised behaviours such as courtship and dance, Donald says.

Corballis doesn't see mime and speech as separate channels, but rather as a progression of forms. He backs his theory by pointing to the close neurological link between movement and speech. If movement in the form of gesture gave rise to language, then you would expect some overlap between motor and language areas in the human brain, he argues. And that's exactly what you find.

Brain imaging has shown that a region called Broca's area, which is important for speech production, is active not only when we speak, but when we wave our hands. Conversely, motor and premotor areas are activated by language tasks even when those tasks have been robbed of their motor aspect--during silent reading, for instance--and particularly by words that have a strong gestural component, such as verbs and tool names.

Impairments of language and coordination are closely related, too. People with a condition called Broca's aphasia can put names to things but have trouble stringing sentences together. Their gestures are similarly impoverished: they produce fewer of the hand movements that mark the flow or rhythm of the accompanying speech, relying instead on those that convey its content.

A more direct link between movement and meaning is found in a group of brain cells known as mirror neurons, which fire not only when a monkey makes a particular movement, but also when it watches another making the same movement. They are found in a region of the monkey brain that corresponds to the speech production area in humans, and it is possible that they may be the key to allowing individuals to understand one another's gestures--the development that made language possible.

Jana Iverson from the University of Missouri-Columbia and her colleague Esther Thelen from Indiana University at Bloomington point to this evidence of the brain's organisation to argue that gesture and speech are two outlets for the same thought processes. But they don't think that this necessarily means that gesture came first. One possibility, suggested by Elizabeth Bates from the University of California at San Diego, is that language could have hijacked neural systems that were geared to more basic sensory and motor tasks. Although the two systems usually perform separate functions, from time to time language could spill over into the motor mode, generating gestures. But according to Corballis, it is much more likely that gesture once carried the whole linguistic burden. Nothing else, he believes, can explain the huge amount of information that can be conveyed by gesture alone.

Its role in communication today is still evident. When people see a storyteller's gestures as well as hear the voice, they pick up around 10 per cent more accurate information about the story than when they hear the voice alone, say psychologists Geoffrey Beattie and Heather Shovelton of the University of Manchester. "Gestures are every bit as rich communicatively as speech," says Beattie. "Meaning is divided between the hand and the mouth."

Moreover, gesture can be called upon to replace speech--in religious communities where a vow of silence has been taken, in the tic-tac system used by bookmakers and, of course, by the deaf. Conventional sign languages, such as American Sign Language, share all the syntactic complexities of spoken languages. More interesting, says Susan Goldin-Meadow of the University of Chicago, are the gestures invented by deaf children of hearing parents before they are taught a formal sign system. Even though their parents' spontaneous gestures lack any kind of linguistic structure, these children invent gestural lexicons with which they are not only able to construct sentences, but also to express difficult concepts such as past or future action.

Gesture speaks, says Goldin-Meadow. It tells the listener something. But it has other uses, too. To her, it is clear that gesture must also be performing some function for the speaker. One theory, put forward by psychologist Robert Krauss of Columbia University in New York, is that when racking your memory for the right word, gesture opens up a second channel into the mental lexicon where words and meanings are stored.

It is surprising, says Krauss, how often people's gestures represent visually a concept that is eluding them verbally--when they are in the "tip-of-the-tongue" state. Last year, Krauss and Uri Hadar of Tel Aviv University videotaped the conversation of brain-damaged patients suffering from anomic aphasia--the inability to name things. Those patients gestured significantly more than patients with a conceptual impairment, who could name objects without being able to explain what they were. When you need to put a name to a thing, Krauss concludes, gesture helps.

Krauss's findings have not always been confirmed by other studies. In the British Journal of Psychology last year, Beattie and Jane Coughlan, also of the University of Manchester, described inducing a tip-of-the-tongue state in 60 students, by asking them to fit uncommon words to descriptions. Half of them were asked to keep their arms folded, while the other half were free to gesture. If Krauss is right, gesturing should have fuelled the lexical search engine. But the gesturing group were no better at finding the correct word.

Yet when naive observers watched videos of the students' gestures without the soundtrack, they were able to select the right word from a set of five alternatives far more reliably than if they had been guessing. They were clearly extracting useful information from the gestures. The problem, according to Goldin-Meadow, is one of definition. When Krauss talks about communication, he means intentional communication. But whether or not it's conscious, she says, gesture may still be communicating something. Besides, even if gesture aids memory, a more ancient communicative role would not be ruled out.

The idea that body language taps into non-conscious thought is not a new one. It has spawned generations of self-help books on how to succeed in interviews, or read the signs that your boss fancies you. Consider the indentation at the base of the neck, says David Givens, director of the Center for Nonverbal Studies in Spokane, Washington. Revealing it is a universal sign of submission and approachability in all mammals and a courtship cue in humans. So a man who loosens his tie in the presence of a potential mate may unwittingly be expressing his attraction.

Often, says Goldin-Meadow, people use gestures to express notions that they are not consciously able to articulate--either because they don't understand them, or because they are painful or embarrassing. Their gestures may even express something at odds with what they are saying. Children, in particular, are vulnerable to this kind of gesture-speech mismatch as they acquire new knowledge.

One example is number conservation: adults know that when a row of 10 objects is rearranged, there are still 10 objects. But a six-year-old is likely to say that the number has changed because the objects have been moved. Confronted with a second, reference row of 10 objects, however, the child might point back and forth between objects in the two rows, indicating some understanding of their one-to-one correspondence. The child is in an unstable transition phase between an old, incorrectly held notion and a new, correct one. A child in this state is ready to learn the new idea, says Goldin-Meadow, and with the right guidance will grasp it.

Gesture may perform many functions. But if Corballis is right it could help resolve a long-running debate about whether language emerged gradually or all at once, in a "big bang". Some linguists have argued that grammar is not something that could have evolved slowly--you either have it or you don't--and that therefore it must have exploded onto the hominid scene at some point in our history, perhaps with the emergence of Homo sapiens roughly 150 000 years ago. Corballis takes a different view. He thinks language in the syntactic sense--where rules govern how components are combined to convey different meanings--evolved slowly through gesture, but with an ever-expanding vocal dimension. He thinks that gesture was relieved of syntax as spoken language developed, but evidently it didn't become obsolete.

The picture of language evolution goes some way to explaining the arbitrary nature of words, says Corballis. The very first gestures may have mimicked the objects they were intended to represent. But as time went on, in a mime version of Chinese whispers, they became more abstract and removed. Later, symbolic gestures were coupled to random sounds, which is how words that sound nothing like the objects or events they describe came to be assigned to them. "The big bang was not the sudden emergence of syntactic language," he says, "but rather the turning point when vocal speech could carry the syntactic component without any gestural involvement."

Further reading:

"Neural expectations: a possible evolutionary path from manual skills to language" by M. A. Arbib and G. Rizzolatti, Communication and Cognition, vol 29, p 393 (1997)

"The role of gesture and mimetic representation in making language the province of speech" by S. Goldin-Meadow and D. McNeill, in The Descent of Mind edited by M. C. Corballis and S. Lea (OUP, 1999)

"Hand, mouth and brain: the dynamic emergence of speech and gesture" by J. M. Iverson and E. Thelen, Journal of Consciousness Studies, vol 6, no 11-12, p 19 (1999)

Laura Spinney is a writer based in London