Saturday, May 21, 2011

Monkey see, monkey do: Mirror neurons and language

The fact that speech is rather messy makes perceiving and making sense of it quite a formidable task. In fact, the human brain is the only object in the known universe that can both perceive speech and fully understand it. Computers may be getting a bit better at recognizing speech than they used to be, but even the best of them are still unacceptably bad at making sense of it, and will remain so for a long time. Some of our closest relatives don't do that badly considering that they didn't evolve to use language, but they still can't compete with us. Not by a long shot.


The Motor Theory of Speech Perception

The aforesaid messiness of speech, particularly the problem of lack of invariance, led American psycholinguist Alvin Liberman and his colleagues to propose a Motor Theory of Speech Perception more than half a century ago. While the theory has been revised over the years, at its core has always lain the rather bold (and, at the time, visionary!) proposal that humans perceive speech by reference to speech production. In other words, the theory states that when we listen to somebody speak, we are not merely passive recipients of the speech signal. Rather, we mentally model the articulatory gestures (e.g. lip and tongue movements) that the speaker is performing, and it is this modelling that gets comprehension of what we are hearing off the ground.

The Motor Theory has had both its advocates and its critics ever since its original publication. Some of the criticisms have revolved around Janet Werker's finding (replicated and expanded numerous times, though also refined considerably) that infants can perceive a large number of the sounds appearing in human languages, including sounds they have never heard their caregivers utter, well before they produce their first real word, and that they stop being sensitive to sounds not present in their native language(s) as they inch closer to starting to speak (around age one). If perception really proceeded by reference to production, it is not obvious how infants could perceive contrasts they cannot yet produce.

Also, as other critics pointed out, proposing that we mentally model others' articulatory gestures as we listen to them speak does not really solve the problem of lack of invariance: the lip and tongue movements required to produce the first sound of cat are not exactly the same as those required for the first sound of cut. In response, the theory was revised to refer to "intended phonetic gestures" (roughly, the movements we plan our speech organs to perform, rather than the movements they actually perform). The rejoinder to this revision has been that the concept of an intended gesture is probably too vague to be testable with the experimental techniques currently at our disposal.

Overall, it could be said that the Motor Theory has fared better in areas such as speech-language pathology and theoretical linguistics than in the field of speech perception itself, as it does not seem sufficient to fully account for our ability to perceive speech. However, the fact that mental modelling of speech gestures may not be the sole mechanism enabling us to figure out what sounds we are hearing does not mean that it's not one of the mechanisms we rely on. The relatively recent discovery of mirror neurons, brain cells which fire both when an animal performs an action and when it observes another animal perform the same action, may lend new credence to the Motor Theory of Speech Perception.

Mirror neurons in monkeys

Mirror neurons were first discovered in the premotor cortex of macaques. The premotor cortex (in both macaques and humans) is involved in the planning of actions (such as moving your legs, taking a sip of water, or grasping an object) and is located in front of the primary motor cortex, which is crucial for executing actions. Both can be seen in this illustration of the human brain.


Thus, macaques have neurons in their premotor cortex which fire both when the monkey performs an action and when it observes another monkey (or even a human!) perform that same action. [1] Particularly interesting are the mouth mirror neurons of macaques.

Most of these neurons fire when the monkey either performs or observes a feeding-related mouth action. In other words, the "active" and "mirror" functions of such neurons are related to the same action. However, in a smaller proportion of these neurons, there is a discrepancy between the apparent "active" and "mirror" functions. While such neurons still fire when the monkey performs a feeding-related mouth action, in "mirror mode" they respond best to communicative mouth actions by other monkeys! [2] This may point to an evolutionary connection between ingestive and communicative mouth movements. There is also a group of macaque mirror neurons which fire both when the monkey performs a hand action and when it hears the sound of that action. [3] Very intriguingly, these neurons are located in the macaque analogue of Broca's area, a region of the human brain that is indispensable for language functioning. Broca's area can be seen in this picture (alongside another patch of cortex crucial for language use, Wernicke's area, and the arcuate fasciculus, the fibre tract that connects the two).


Before moving on to a discussion of mirror neurons in humans, I'll just say one more thing about macaques. There are mirror neurons in the inferior parietal cortex of macaques (see the illustration in the linked article) which respond differently to the same action depending on what action follows it (e.g. grasping an object in order to eat it versus grasping it in order to move it). This is true in both "active" and "mirror" modes. [4] This is a truly important discovery, as it points to a potential neural mechanism for understanding others' intentions. Understanding others' intentions, a key component of theory of mind, is, of course, critical for social functioning, including the acquisition of language by human infants.

Mirror neurons in humans

Two areas of the brain's surface largely involved in motor functioning are also implicated in the observation of actions. One is located in the lower part of the parietal lobe; the other lies in the region of the frontal lobe that borders the temporal lobe and sits close to the parietal lobe. The latter region, present on both the left and right sides of the brain, is roughly equivalent to Broca's area (which is located in the left hemisphere in the overwhelming majority of right-handers, as well as in most left-handers). This illustration shows the lobes of the human brain. (Note that this brain is oriented in the opposite direction from those in the previous illustrations; this is a view of the right hemisphere.)


Not surprisingly, these two areas, involved in both movement and the observation of actions, are precisely where most human mirror neurons are found [5]. Note that Broca's area, which is heavily involved in speech production (but, as multiple recent studies show, is also activated during comprehension), is practically brimming with mirror neurons! This, then, invites questions about a possible connection between movement, the perception of action, and language. Kind of sounds like the Motor Theory of Speech Perception, doesn't it?

Before I start talking about some neat experimental findings that speak to this connection, I need to say a bit about transcranial magnetic stimulation (TMS) and motor evoked potentials (MEPs). Since the brain relies on electricity to transmit information within neurons, it is possible to use electrodes to stimulate the brain during surgery and evoke various types of responses in the patient (who, incidentally, is awake). For instance, you might get a certain muscle to twitch. While the use of this technique has led to some very important discoveries, it is obviously not possible to do this type of research with healthy volunteers. Enter TMS! Relying on the principle of electromagnetic induction, TMS uses a powerful electromagnet, such as the one in the picture below, to induce electrical activity in the brain.



TMS can be used to stimulate the brain and produce some type of motor response as well as to temporarily (and reversibly!) disable small areas of the cortex (and observe the effect this has on behaviour). Finally, it appears that TMS can be used to treat depression, but this is not our topic here.
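To get a quantitative feel for the induction principle at work here, a back-of-envelope Faraday's-law estimate helps. The following Python sketch uses generic textbook ballpark figures; the peak field, rise time, and loop size are my own illustrative assumptions, not the specs of any particular stimulator:

```python
import math

# Rough Faraday's-law estimate of the electric field a TMS pulse
# induces in cortical tissue. All numbers are illustrative ballparks.
B_peak = 2.0         # peak magnetic field under the coil, in tesla
rise_time = 100e-6   # time for the field to reach its peak, in seconds
r = 0.01             # radius of an imagined circular loop of cortical
                     # tissue, in metres (1 cm)

dB_dt = B_peak / rise_time      # rate of field change, ~20,000 T/s
emf = math.pi * r**2 * dB_dt    # voltage induced around the loop
E = emf / (2 * math.pi * r)     # electric field along the loop, V/m

print(f"dB/dt ≈ {dB_dt:,.0f} T/s")
print(f"EMF around a loop of radius 1 cm ≈ {emf:.1f} V")
print(f"induced electric field ≈ {E:.0f} V/m")
```

An induced field on the order of 100 V/m, concentrated under the coil, is enough to depolarize cortical neurons, which is why a brief pulse can make a muscle twitch without any electrode touching the brain.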

OK, so if you can use TMS to zap certain brain regions and to get certain muscles to twitch, you can easily place sensors on a person's skin right on top of the muscle you expect to control in this way and measure the strength of the response (called a motor evoked potential, or MEP) caused by the zapping.
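In case you're wondering what measuring "the strength of the response" amounts to concretely, an MEP is usually quantified as the peak-to-peak size of the deflection in the muscle's electrical signal (EMG) during a short window after the pulse. Here is a minimal Python sketch of that computation; the function name, signature, and the 15-50 ms window are my own illustrative choices rather than anything prescribed by the studies cited below:

```python
import numpy as np

def mep_amplitude(emg, fs, stim_sample, window_ms=(15, 50)):
    """Peak-to-peak MEP amplitude from a single EMG sweep.

    emg:         1-D array of EMG voltages from the skin sensors
    fs:          sampling rate in Hz
    stim_sample: index of the sample at which the TMS pulse occurred
    window_ms:   post-stimulus window to search; 15-50 ms is a
                 plausible latency range for a hand muscle (real
                 studies tune this per muscle and participant)
    """
    start = stim_sample + int(window_ms[0] / 1000 * fs)
    stop = stim_sample + int(window_ms[1] / 1000 * fs)
    segment = emg[start:stop]
    return float(segment.max() - segment.min())
```

In a typical experiment, this number would be computed for every stimulation trial and then averaged within each experimental condition.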

With this in mind, let's turn to some interesting experimental results. For instance, one group of researchers [6] applied TMS over the primary motor cortex of the left hemisphere to elicit MEPs in the right-hand muscles of healthy participants while the participants did different things (observing actions and gestures, looking at objects, etc.). The muscular responses evoked by the magnetic stimulation were stronger when an action was being observed, and it did not matter whether the action was exerted upon an object or was mere arm movement. Crucially, the evoked potentials were larger only in those muscles which the participant would need to use to perform the action he or she was watching.

Another experiment [7] used functional magnetic resonance imaging (fMRI) to investigate whether human mirror neurons are only activated by observing other humans do stuff, or whether observing a monkey or a dog perform an action would also make them fire (we saw above that monkeys' mirror neurons do indeed fire in response to actions performed by humans). It turns out our mirror neurons fire when we see a human, monkey, or dog bite something, as well as when we see a human speak or a monkey smack its lips (a communicative gesture). However, human mirror neurons do not respond when a person watches a dog bark (only visual areas get activated in this case). It appears, then, that our mirror neurons only respond when we observe actions that are part of our own repertoire, probably resulting in a much more personalized understanding of such actions [5]. Recall the importance of understanding others' intentions (theory of mind) for members of a highly social species such as Homo sapiens.

So what about language?

The main thing to note here is that a large proportion of the human mirror neurons that respond to hand and mouth actions are located smack dab in the middle of Broca's area! This hints at an intriguing evolutionary connection between gesturing and language; the idea that language evolved out of gesture is one of a number of currently competing theories of how language might have arisen in humans. (See [5] for more on this, as well as for some interesting arguments for why language is less likely to have evolved from involuntary animal calls.)

There is also experimental evidence that manual gesturing and language directly interact through the mirror neuron system. For instance, TMS/MEP experiments show that the area of the motor cortex which controls the right hand (located in the left cerebral hemisphere) becomes more excitable while participants are reading aloud, but the areas which control the left hand and either leg do not. This increase in excitability can't be due to speech articulation itself, as articulatory movements are controlled by both hemispheres. Rather, it seems to be specifically related to language processing! [8] Convergent evidence comes from studies in which people with aphasia (a spectrum of language disorders caused by brain damage) are asked to name objects, which is generally hard for people with aphasia. Naming is facilitated when accompanied by right-hand pointing gestures, but only for patients whose aphasia results predominantly from damage to the frontal lobes (the location of Broca's area). [9]

Interestingly, humans appear to have evolved mirror neurons responsive to speech sounds. In one experiment, MEPs were recorded from the tongue muscles of subjects who received TMS bursts to the left motor cortex while listening to words containing either a double [f] sound or a double [r] sound. The relevant difference between these sounds is that the former requires very little tongue movement, while the latter is primarily produced with the tongue. The recorded MEPs were larger while the subjects were listening to words containing the double [r] sound. [10] Similarly, the excitability of lip muscles following a TMS burst to the left motor cortex is higher when people are listening to speech or viewing speech-related mouth movements than when they are viewing eye and brow movements, and there is no such increase in MEPs when the motor cortex of the right hemisphere is stimulated. [11]
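The logic of comparisons like these is simple enough to sketch in a few lines of Python. The numbers below are invented for illustration (they are not the published data from [10]); the point is just the shape of the analysis: one mean MEP amplitude per participant per condition, compared with a paired test.

```python
import numpy as np
from scipy import stats

# Invented per-participant mean tongue-MEP amplitudes (mV), one value
# per listening condition. Purely illustrative; not data from [10].
rr_words = np.array([1.8, 2.1, 1.9, 2.4, 2.0, 2.2])  # double [r] words
ff_words = np.array([1.5, 1.7, 1.6, 1.9, 1.6, 1.8])  # double [f] words

# The mirror-neuron prediction: tongue MEPs should be larger while
# hearing tongue-heavy [rr] words. Since each participant contributes
# a value to both conditions, a paired t-test is appropriate.
t, p = stats.ttest_rel(rr_words, ff_words)
print(f"mean difference = {np.mean(rr_words - ff_words):.2f} mV, "
      f"t({len(rr_words) - 1}) = {t:.2f}, p = {p:.4f}")
```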

It would seem, then, that there is something to the Motor Theory of Speech Perception after all. Mirror neurons present us with a plausible brain mechanism which might enable speech perception to proceed with reference to articulation. What is not clear at present is to what extent speech perception crucially depends on the listener's brain modelling the speaker's articulatory gestures. Apart from the criticisms of the Motor Theory mentioned above, there is another reason to allow for the possibility that it does not: aphasic individuals with severe damage to the left frontal lobe, which often includes Broca's area (and, presumably, in many cases a large part of the mirror neuron system), are often still able to understand individual words as well as most connected speech. If forced to make an educated guess, I'd say that, if anything, speech perception may be enhanced by the mirror neuron system rather than crucially hinging on it. But even this is just a guess. Much research remains to be done.

Another distinct possibility is that the mirror neuron system is particularly important for imitation, and therefore for language learning, particularly if it is also true that it is important for understanding others' intentions. This too merits intensive investigation.

At any rate, the link between mirror neurons and language, whatever it ultimately turns out to be, is a tantalizing and fascinating research topic, and it will continue to inspire and intrigue cognitive scientists of all stripes for a long time to come.


References

[7] Buccino, G., Lui, F., Canessa, N., Patteri, I., Lagravinese, G., Benuzzi, F., et al. (2004). Neural circuits involved in the recognition of actions performed by nonconspecifics: An fMRI study. Journal of Cognitive Neuroscience, 16, 114-126.

[10] Fadiga, L., Craighero, L., Buccino, G., & Rizzolatti, G. (2002). Speech listening specifically modulates the excitability of tongue muscles: A TMS study. European Journal of Neuroscience, 15, 399-402.

[6] Fadiga, L., Fogassi, L., Pavesi, G., & Rizzolatti, G. (1995). Motor facilitation during action observation: A magnetic stimulation study. Journal of Neurophysiology, 73, 2608-2611.

[2] Ferrari, P. F., Gallese, V., Rizzolatti, G., & Fogassi, L. (2003). Mirror neurons responding to the observation of ingestive and communicative mouth actions in the monkey ventral premotor cortex. European Journal of Neuroscience, 17, 1703-1714.

[4] Fogassi, L., Ferrari, P. F., Gesierich, B., Rozzi, S., Chersi, F., & Rizzolatti, G. (2005). Parietal lobe: From action organization to intention understanding. Science, 308, 662-667.

[1] Gallese, V., Fadiga, L., Fogassi, L., & Rizzolatti, G. (1996). Action recognition in the premotor cortex. Brain, 119, 593-609.

[9] Hanlon, R. E., Brown, J. W., & Gerstman, L. J. (1990). Enhancement of naming in nonfluent aphasia through gesture. Brain and Language, 38, 298-314.

[3] Kohler, E., Keysers, C., Umiltà, M. A., Fogassi, L., Gallese, V., & Rizzolatti, G. (2002). Hearing sounds, understanding actions: Action representation in mirror neurons. Science, 297, 846-848.

[8] Meister, I. G., Boroojerdi, B., Foltys, H., Sparing, R., Huber, W., & Töpper, R. (2003). Motor cortex hand area and speech: Implications for the development of language. Neuropsychologia, 41, 401-406.

[5] Rizzolatti, G., & Craighero, L. (2007). Language and mirror neurons. In M. G. Gaskell (Ed.), The Oxford handbook of psycholinguistics (pp. 771-785). Oxford: Oxford University Press.

[11] Watkins, K. E., Strafella, A. P., & Paus, T. (2003). Seeing and hearing speech excites the motor system involved in speech production. Neuropsychologia, 41, 989-994.
