
Guest post: Language on the left?

9 Mar, 2011

The human brain is split into two halves, the left and the right hemisphere. But to what extent are language functions found mainly in one hemisphere, and why might this be? In the first in a series of posts from scientist bloggers, Professor Sophie Scott describes how there are two sides to language in the brain.

Speech and language were the first ‘functions’ to be associated with particular brain areas. In 1861, Paul Broca described a brain area that appeared to be very important in speech production. The area, in the posterior third of the left inferior frontal gyrus, came to be known as ‘Broca’s area’. Broca identified it by studying the brain of one of his patients, known as Tan, who had a brain lesion and could say only the word “Tan” (in addition to a few more colourful words).


Figure 1: The brain of “Tan”, showing a lesion in the left inferior frontal gyrus.

In 1881, Carl Wernicke described lesions in the left temporal lobe (figure 2) that were associated with problems in the perception of spoken language. Both of these findings were incredibly influential, partly because acquired language problems still typically follow damage to the left side of the brain.

 


Figure 2: Wernicke’s description of the ‘speech receptive area’ in the left temporal lobe, marked “x” (Broca’s area is marked “y”).

As one might imagine, 150 years of neuropsychology and neuroscience have enabled us to refine some of our ideas about the neural basis of speech perception and production. One important development has been the demonstration that the right hemisphere is significantly involved in both the perception and the production of speech. This contribution can, however, be quite complex. For example, in normal, conversational speech production, the region corresponding to Broca’s area in the right hemisphere is actively suppressed (Blank et al., 2003).

I am interested in what might modulate this suppression. For example, when people with the language disorder Broca’s aphasia recover speech production, this recovery is associated with increased activation in the right hemisphere’s counterpart of Broca’s area. But is right Broca’s area associated with other kinds of change in speech? We continuously modulate our voices to adapt to our environment, in terms of both its acoustics and its social demands. We unconsciously change our voices when we speak in a noisy room, and this change can be quite specific to different kinds of background noise (Cooke and Lu, 2010). We also change our voices a lot depending on whom we are talking to – people tell me I talk exactly like my mother after I’ve been speaking with her on the phone, but I can’t tell that I am doing it. In my lab, we are just starting to identify the brain systems involved in this kind of modulation, and it will be interesting to see to what extent, and how, right hemisphere mechanisms are involved in these changes.

In speech perception, things have become yet more complex. While linguistic information in speech is processed in the left temporal lobe regions that Wernicke described, we also see very strong activation of the right temporal lobe when people listen to spoken language. This seems to reflect the fact that, when we speak, we produce an incredibly complex sound – human speech is probably the most complex sound made by a single sound source. And when we speak we express information not just about the words we say, but also about who we are, where we come from, how old we are, how well we are, whether we are a man or a woman, and what mood we are in.

Functional imaging studies indicate that right temporal lobe areas are especially concerned with processing many of these other kinds of information in our voices. For example, the recognition of a speaker’s identity is associated with right temporal lobe areas – patients who cannot recognise a speaker by their voice typically have lesions in this region.

Of course, in normal speech perception these factors interact – studies have shown that we very rapidly adapt to speaker-specific idiosyncrasies, but that we don’t retune our whole speech perception network (Eisner and McQueen, 2006). Thus, if I hear Jonathan Ross describing a “wed wobin”, I am not surprised when the person he’s talking to says “yes, a red robin”. These speaker-specific mechanisms are strong – we understand people better if we are familiar with their voices. This suggests that, though the right and left temporal lobes might be sensitive to different aspects of the speech signal, they work together to enable us to understand speech.

The elephant in the room is why linguistic representations and processes are so associated with the brain’s left hemisphere in the first place. The left lateralisation of language is seen in 96 per cent of right-handed people, and is still there in 73 per cent of left-handed people (Knecht et al., 2000). It is there for men and women equally. People whose language centres are not in their left hemisphere have them in their right hemisphere: there is no evidence of people with an intermediate, more equally divided representation of language across the left and right sides of the brain. And if the language-dominant hemisphere is damaged, the non-dominant hemisphere can take over its function. Does this mean that the non-dominant hemisphere still performs linguistic functions in some low-key way? Or that it can adapt following damage to the brain (or perhaps even that it is released from some form of suppression)?

Ideas about why the linguistic aspects of speech perception are left lateralised tend to be a bit circular – one prominent argument is that the left hemisphere is good at processing the acoustic properties of speech. But even setting aside the evidence for this claim, why should there be such differences between the left and the right sides of the brain to start with? Is speech perception on the left because speech production is left lateralised? Possibly, but that still leaves the question of why speech production would be left lateralised.

Perhaps by focusing on language we are looking in the wrong place altogether. There are other kinds of asymmetry between the hemispheres. Damage to the right hemisphere can lead to lasting problems with orienting attention in space and time, such as in left spatial neglect, where people do not pay attention to the left side of space, don’t talk to people who stand on that side of them, don’t eat food on that side of their plate and so on. People with left hemisphere damage can show the opposite pattern of ignoring the right-hand side of their world, but this is considered less common and patients tend to recover quickly.

This suggests that there may be differences in how attentional mechanisms operate in the two hemispheres. Along different lines, my UCL colleague Professor Tim Shallice comes to this problem through his research into high-level thought and monitoring systems. He thinks of the left hemisphere as the side that is good at categorising and classifying objects in the world. Language fits this view well: categorisation and classification are critical processes in language. I suggest that, in order to understand the lateralisation of language in the human brain, we should keep other, non-linguistic processes in mind.

References

  • Blank SC, Bird H, Turkheimer F, & Wise RJ (2003). Speech production after stroke: the role of the right pars opercularis. Annals of Neurology, 54 (3), 310-320. PMID: 12953263
  • Cooke M, & Lu Y (2010). Spectral and temporal changes to speech produced in the presence of energetic and informational maskers. The Journal of the Acoustical Society of America, 128 (4), 2059-2069. PMID: 20968376
  • Eisner F, & McQueen JM (2006). Perceptual learning in speech: stability over time. The Journal of the Acoustical Society of America, 119 (4), 1950-1953. PMID: 16642808
  • Knecht S, et al. (2000). Handedness and hemispheric language dominance in healthy humans. Brain, 123 (12), 2512-2518. DOI: 10.1093/brain/123.12.2512

    Sophie Scott is a Professor of Cognitive Neuroscience at UCL and a Wellcome Trust Senior Research Fellow in Basic Biomedical Science. She blogs at Listening In.

    This is part of a series of guest posts from Trust-associated scientists who blog. If you’re a scientist who blogs, we’d love to hear from you. Contact the Editor m.looi@wellcome.ac.uk

    Image credits: Nature Neuroscience
Comments
    1. Dorothy – 9 Mar, 2011 4:47 pm

      The right hemisphere biases you mention are important, but I am not sure they relate to language laterality. For years I thought, like most people, that the left- and right-hemisphere specialisations were complementary, reflecting a kind of compartmentalisation of function. Yet the evidence is growing that the biases to the left for language and to the right for visuospatial/attentional functions are independent. So at the population level you have most people left-lateralised for language and right-lateralised for visuospatial skills, but those who show right-hemisphere language are as likely as anyone else to show right hemisphere visuospatial skills, and vice versa. See:
      Whitehouse, A. J. O., & Bishop, D. V. (2009). Hemispheric division of function is the result of independent probabilistic biases. Neuropsychologia, 47, 1938-1943.
      Badzakova-Trajkov, G., Haberling, I. S., Roberts, R. P., & Corballis, M. C. (2010). Cerebral asymmetries: Complementary and independent processes. PLOS One, 5(3), 9.
      We (Richard Rosch, Nic Badcock and I) are about to submit another paper using transcranial Doppler ultrasound replicating these findings with a new right hemisphere task that varies in difficulty.

      • 10 Mar, 2011 11:54 am

        I agree completely, Dorothy, that differences between the two hemispheres are not well accounted for as complementary, or as varying along different dimensions. But I do think it’s interesting that attentional mechanisms seem to differ between the left and the right (and that they might just be different, rather than complementary). For example, to hear speech as a coherent stream, you need to glue it all together perceptually – something I can’t do when I encounter a click language, for example. I must stress I don’t have any evidence that this is something the LH is especially good at, but it’s why I have been thinking about attention and language. And I look forward to your Doppler paper with interest.

