Dyslexia is a learning disability we typically associate with reading and writing. But a study that appeared in the journal Science in late July of 2011 looked at how dyslexia affects the aural perception of language – that is, the way people with dyslexia hear words. EarthSky spoke with Tyler Perrachione, a graduate student at MIT who worked with Dr. John Gabrieli on the Science study.
These two researchers discovered that dyslexics have a fair amount of trouble distinguishing voices. That finding could alter how dyslexics are taught, and also reshape experts’ understanding of the roots of dyslexia, according to Perrachione. He summarized the major findings of the study:
Our study has two main implications. The first is, we show that people recognize voices better when they can understand the language that’s being spoken. So, for example, you and I would be better at recognizing voices if they’re speaking in English than voices speaking in a language we can’t understand, like Chinese, for instance.
The second major finding is that the “familiar language” advantage is not something dyslexics have – they don’t get this boost for recognizing voices in their native language. Which means they’re less accurate at identifying voices than someone who doesn’t have dyslexia.
The researchers figured this out by exposing subjects with and without dyslexia to recorded voices. Some voices spoke English, and others spoke a foreign language: Mandarin Chinese. Each voice was paired, on the computer, with a unique cartoon avatar.
At the end of a listening round, all the study participants were asked to match the voices to the correct avatars – in other words, to indicate the “identity” of the speaker.
Non-dyslexics matched voices correctly about 70 percent of the time when their native language – English – was being spoken. Their accuracy dropped to 50 percent when a foreign language – Mandarin – was being spoken.
By contrast, dyslexics had the same trouble distinguishing a voice whether that voice was talking in the dyslexics’ native language (English) or a foreign language (Mandarin). Their voice matching stayed at 50 percent, regardless of the language. Perrachione summed up the implications:
There’s a couple of interesting implications with this line of research. It’s been a hypothesis in the field that somehow the reason people with dyslexia struggle to learn to read is that somehow, in their mind, they don’t have a very good representation of the sounds of their language … but they can understand speech just fine.
When you think about that, he said, that last part doesn’t make a whole lot of sense.
And this is the first time we were able to show that there’s something about auditory processing that depends on recognizing the sounds of your language that people with dyslexia are impaired at. And suggests we’re kind of on the right track, as far as science goes, at pursuing that as an underlying cause.
He said this finding can boost the learning of dyslexics in the classroom:
Recognizing voices is very important for a variety of different kinds of social and linguistic tasks that people are doing. Say you’re at a restaurant and lots of different people are talking, and you’re talking to somebody, you want to be able to tell their voice from all the other voices. And that’s something that teachers and educators can take into account for students who have dyslexia, is that if there’s a noisy environment, they may have trouble following the teacher’s voice if other people are talking.
In other words, this study suggests that the best learning environment for dyslexics is a quiet one. Perrachione indicated that his study also raised some questions about the way human brains are wired for language, versus the way (other) animals’ brains are wired for calls:
Because we have words and we have language, that gives us the opportunity to recognize voices in a different way than any other species. The brains of different species are organized very differently [from ours]. And there’s been some nice studies about how voice recognition happens in the brains of humans and monkeys. It’s a relatively new field. Not a lot is known.
Bottom line: A study that appeared in the journal Science in late July of 2011 looked at how dyslexia affects the aural perception of language – that is, the way dyslexics hear words. EarthSky spoke with Tyler Perrachione, a graduate student at MIT who worked with Dr. John Gabrieli on the Science study.
Beth Lebwohl researches, writes and helps produce science content in audio and video formats for EarthSky. She is one of the authors on EarthSky.org, a script-writer for our podcasts, and helps host our English science podcasts in 90-second, 8-minute and 22-minute formats. Beth came to EarthSky in 2006 from the American Museum of Natural History's Department of Astrophysics, where she was surrounded by some of the greatest telescope-building, equation-wielding, code-writing physicists of our time. And they made her think . . . this science thing . . . it's pretty cool.