Over the last few years it’s become apparent that humans, like bats, can make effective use of echolocation by emitting click sounds with the tongue and listening for the echoes that result. Now a team led by Lore Thaler at the University of Western Ontario has conducted the first ever investigation into the neural correlates of this skill.
Thaler and her colleagues first had to overcome the practical challenge of studying echolocation in the noisy environment of a brain scanner, in which participants are required to keep their heads still. The researchers established that two blind, experienced echo-locators, EB and LB, were able to interpret with high accuracy the recordings of tongue clicks and echoes they’d made earlier, and so this form of passive echolocation was studied in the scanner.
Among several remarkable new insights generated by the research, the most important is that EB and LB exhibited increased activity in their visual cortices, but not their auditory cortices, when they listened to click-and-echo recordings made outdoors, compared with when they listened to the exact same recordings with the subtle echoes removed. No such differential activity was detected in two age-matched, sighted male controls.
The finding suggests that it is the visual cortex of the blind participants, not their auditory cortex, that processes echoes. This visual cortex activity was stronger in EB, who lost his sight at an earlier age than LB and is the more experienced echo-locator. The echolocation skills of both blind participants are nonetheless remarkable: both are able to cycle, and both can identify objects and detect movement. EB, but not LB, also showed evidence of a contralateral pattern in his echo-processing brain activity, just as sighted people show for the processing of light. That is, activity was greater in the brain hemisphere opposite to the source of stimulation.
Just how the visual cortex extracts meaningful information from subtle echo sounds must await future research. The researchers' best guess is that the relevant neurons perform 'some sort of spatial computation that uses input from the processing of echolocation sounds that was carried out elsewhere, most likely in brain areas devoted to auditory processing.' Establishing the functional role of the cerebellar regions that were also differentially activated by echo sounds in the echo-locators will likewise require further research.
‘… [O]ur data clearly show that EB and LB use echolocation in a way that seems uncannily similar to vision,’ the researchers concluded. ‘In this way, our study shows that echolocation can provide blind people with a high degree of independence and self-reliance in their daily life. This has broad practical implications in that echolocation is a trainable skill that can potentially offer powerful and liberating opportunities for blind and vision-impaired people.’
If this research has piqued your interest in echolocation, a previous research paper on the topic by Antonio Martinez and his co-workers explained that anyone, blind or sighted, can learn the skill. In fact, they reported that after two hours' practice a day for two weeks you should be able to recognise, while blindfolded, whether or not there is an object in front of you.
_________________________________
Thaler, L., Arnott, S., & Goodale, M. (2011). Neural correlates of natural human echolocation in early and late blind echolocation experts. PLoS ONE. doi:10.1371/journal.pone.0020162
This post was written by Christian Jarrett for the BPS Research Digest.