Most people think of dyslexia as a reading problem – a learning disability that causes letters to get jumbled up. But new research by MIT scientists suggests an even more basic cognitive difference sets people with dyslexia apart: they have difficulty recognizing voices speaking their own language.
The finding, published today in the journal Science, adds to the evidence that what underlies the reading problems in dyslexia may be fundamental problems in how the brain processes language, including distinguishing words and speech sounds.
In general, people are better at linking a voice to a speaker when the person is talking in a language they understand. Indeed, the new study found that non-dyslexic English speakers presented with an array of cartoon characters, each with a different voice, were good at identifying which voice went with which character when the characters were speaking English — and not as good when it came to Chinese. But the surprise came when people with dyslexia were given the same task.
“We were struck by the severity. The [dyslexic] participants were high-functioning students, going to college; not children who avoided reading altogether. And yet their ability to learn to recognize one voice from another was no better for English than it was for Chinese,’’ said John Gabrieli, a professor of cognitive neuroscience at MIT. “This just shows how substantial this difficulty really is.’’
Dyslexia is not a problem with hearing; instead, something goes awry in the brain’s processing of audio signals.
Patricia Kuhl, co-director of the Institute for Learning and Brain Sciences at the University of Washington, said that research in her laboratory shows that babies’ abilities to differentiate between sounds – “pat’’ vs. “bat,’’ for example — can predict the size of their vocabulary and basic language skills such as rhyming, up to five years of age.
The new research, she said, adds to what has been known about impairments in speech perception by highlighting just how important the social process of voice recognition — figuring out who is speaking — is.
The importance of social interactions in learning has become apparent: one study found that 9-month-old babies can learn the sounds and words of a foreign language by interacting with a person, but not by passively watching television.
“Socially, we want to know who [is talking] – and are they happy, is this person angry at me, is it my child or is it my mother. Who is it and what mood are they in?’’ Kuhl said. “That makes a difference with regard to interpretation of what was said.’’
Glenn Rosen, associate professor of neurology at Beth Israel Deaconess Medical Center, who studies animal models of dyslexia, said that the new research is intriguing. Dyslexia has long been associated with reading, he said, because it’s something the brain has to be taught to do, marshalling different brain regions to read a simple sentence — unlike spoken language, an ability children naturally acquire without much instruction.
“What this research and some other more recent findings are showing pretty clearly is that dyslexics have difficulty in language in general,’’ Rosen said.
Gabrieli said future research will look at what brain regions are active in people with dyslexia and others when they are trying to recognize voices, and whether people with dyslexia perform differently on other tasks, such as facial recognition.