Showing posts with label Brain. Show all posts

Wednesday, November 2, 2011

Male-to-female transsexuals have "male" brains

People with "gender dysphoria" feel as though their sexual identity doesn't match their biological sex. A popular theory is that such people have a brain with physical characteristics that match the sex they identify with. So, for a man who feels like he is a woman - a male-to-female transsexual - the proposal is that he has a female brain "trapped" in a male body. Now in one of the first studies of its kind, Ivanka Savic and Stefan Arver have scanned the brains of 24 heterosexual, pre-operative male-to-female transsexuals and compared their structure to the brains of 24 heterosexual male and 24 heterosexual female controls. Homosexual transsexuals were omitted to help avoid the complicating influence of sexuality on the results. None of the transsexual participants had taken any hormone treatments, which is another factor that could have skewed the findings.

The scans threw up several of the structural brain differences associated with biological sex that have been reported before. For example, the men's brains had more grey matter in the cerebellum (involved in motor control) and lingual gyrus (involved in vision) and less grey matter and white matter in the precentral sulcus (part of the frontal lobe), compared with the women's brains. The men also had smaller hippocampi (involved in memory) than the women. In all these respects the brains of the male-to-female transsexuals resembled the brains of the male control group. Likewise, the male-to-female transsexuals, like the male controls, had more asymmetric brains than the female controls. "The present study does not support the dogma that male-to-female transsexuals have atypical sex dimorphism in the brain but confirms the previously reported sex differences in structural volumes, gray, and white matter fractions," the researchers said. In other words, the male-to-female transsexuals may have felt like women, but their brains had structural characteristics typical of men.

But that's not to say that the male-to-female transsexual participants had brains that were unremarkable. Compared with the male and female controls, they had a smaller thalamus (the brain's relay centre) and putamen (an area involved in motor control) and increased gray matter in the right insula and inferior frontal cortex (regions involved in representing the body, among other functions). Savic and Arver advised treating these differences with caution. They've never been found before so need to be replicated with a larger sample. And even if confirmed, it's not clear what these differences mean, or whether they are a cause or consequence of gender dysphoria. "One highly speculative thought is that the enlargement of the ... insular and inferior frontal cortex ... could derive from a constant rumination about one's own body," the researchers said.

More research is needed, with larger samples and including studies of homosexual transsexuals and female-to-male transsexuals. "Any interpretation must, therefore, proceed cautiously and can at this point only be highly speculative," the researchers said.
_________________________________

Savic, I., and Arver, S. (2011). Sex Dimorphism of the Brain in Male-to-Female Transsexuals. Cerebral Cortex, 21 (11), 2525-2533. DOI: 10.1093/cercor/bhr032

Post written by Christian Jarrett for the BPS Research Digest.

Sunday, August 28, 2011

Animal-sensitive cells discovered in the human brain

A part of the human brain that's involved in emotion gets particularly excited at the sight of animals, a new study has shown. The brain structure in question is the amygdala: that almond-shaped, sub-cortical bundle of nuclei that used to be considered the brain's fear centre, but which is now known to be involved in many aspects of emotional learning.

Florian Mormann and his colleagues didn't use a brain scanner for their main study. Instead they inserted electrodes directly into the brains of 41 patients with epilepsy, who were undergoing neurosurgery as part of their treatment. This allowed the researchers to present the patients with different pictures and to record the resulting activity of nearly 1,500 individual brain cells, located in the amygdala, hippocampus, and entorhinal cortex (all regions are found in the medial temporal lobe; the latter two are involved in memory).

The dramatic result was that cells in the right-sided amygdala, but not the other regions, were far more likely to respond to pictures of animals, and to be aroused more powerfully by them, as compared with pictures of people (mostly celebrities), landmarks and objects (e.g. food and tools). By contrast, hippocampus cells responded similarly to the different picture categories, whilst the entorhinal cortex cells showed a reduced likelihood of response to pictures of people.

Cells in the right-sided amygdala weren't only more likely to respond to the sight of animals than other pictures, and to do so more powerfully, they also did so extra fast, with a mean latency of 324ms. This wasn't true for the other brain regions. Although this suggests the sight of animals is processed with extra efficiency by the amygdala, the latency is not so short as to suggest bypassing of the cortex (the crumpled, outer layer of the brain associated with conscious processing).

Because the amygdala is involved in fear learning, among other functions, it's tempting to interpret these findings alongside fossil evidence showing that early hominids were preyed on by carnivores, and alongside findings relating to "prepared learning" - this is our innate or early predisposition to have our attention grabbed by threats, such as snakes, faced by our ancestors rather than by contemporary threats like guns. Other research shows that animals are more likely to be detected, than vehicles or even buildings, in change blindness tasks, in which an object or animal appears in a scene that remains otherwise unchanged. However, Mormann's team noted that there was no relation between the likelihood or speed of response of amygdala cells and the nature of the animal pictures as either threatening or harmless.

The researchers said the differential response to animals by amygdala cells is "truly categorical" and "argues in favour of a domain-specific mechanism for processing this biologically important class of stimuli."

"A plausible evolutionary explanation," they continued, "is that the phylogenetic importance of animals, which could represent either predators or prey, has resulted in neural adaptations for the dedicated processing of these biologically salient stimuli."
_________________________________

Mormann, F., Dubois, J., and 10 others (2011). A category-specific response to animals in the right human amygdala. Nature Neuroscience, in press.

Post written by Christian Jarrett for the BPS Research Digest.

Thursday, August 18, 2011

Empathy breeds altruism, unless a person feels they have low status. A brain-scan study with a lesson for riot-hit England

In a defining image of the recent English riots, a man helped an injured youngster to his feet while an accomplice stole from the same victim's bag. This sheer lack of empathy on the part of the perpetrators has shaken observers to their core. How could humans display such a lack of altruism toward their fellow man?

A possible clue comes from a new brain imaging study that has examined links between the neural correlates of empathy, an act of altruism, and participants' subjective sense of their social status. Among people who feel they have low status, the study finds, increased neural markers of empathy are actually related to reduced altruism. The researchers surmised this is because any feelings of empathy are quashed by a grudging sense of low status. This could be a kind of defence mechanism whereby self-interest dominates over empathy for others. A possible lesson is that by reversing people's feelings of low status, through educational opportunities and other interventions, we all gain, by reinstating the usual link between empathy and altruism.

Yina Ma and her team at Peking University scanned the brains of 33 student participants while they watched numerous video clips of people being pricked painfully in the face or hand by a needle, or touched on those same parts by a cotton bud (referred to as a Q-tip in the US). Extra activity in the brain, in response to the needle clips versus cotton bud clips, was taken to be a neural marker for empathy (seeing someone else in pain is known to trigger activity in the pain matrix of one's own brain).

The participants also rated their own empathy levels and their subjective sense of their socio-economic status. They were shown a ladder with ten rungs, with the top rung representing people with the best jobs and education and most money; participants then indicated which rung they saw themselves as occupying. Although the participants were students at the same university they varied in their subjective sense of status. Finally, the participants were left alone in a room with an anonymous donation box, labelled as raising money to help impoverished patients with cataracts.

Among participants who considered themselves privileged in terms of socio-economic status, there was a positive relationship between empathy and altruism. The more neural signs of empathy they displayed in the scanner (based on extra activity in the left somatosensory cortex when viewing needle clips), the more empathy they said they had, and the more money they chose to donate to charity. By contrast, among participants who considered themselves lower in socio-economic status, the opposite pattern was observed. The greater their empathy-related brain activity in the scanner (based on extra right somatosensory cortex and inferior frontal cortex activity in response to needle clips), the less empathy they said they had, and the less money they chose to donate to charity. The researchers said the empathy-related inferior frontal cortex activity observed in these participants could be a sign of inhibitory processes quashing the emotional impact of seeing another person in pain.

Note that there was no absolute difference in the amount of money donated by participants who self-identified as low or high socio-economic status. The finding is more subtle and suggests empathy has a differential effect on our altruistic behaviour depending on how we see our standing in the world.
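The crossed pattern can be illustrated with a toy correlation. The numbers below are invented for illustration only (they are not data from Ma et al.): a per-participant "empathy index" (extra somatosensory activity for needle versus cotton-bud clips) is correlated with donation amount separately within each self-rated status group.

```python
import numpy as np

# Invented illustration only (NOT the study's data): an "empathy index" per
# participant and the amount donated, split by self-rated status group.
high_status_empathy = np.array([0.2, 0.5, 0.8, 1.1, 1.4, 1.7])
high_status_donation = np.array([2.0, 3.5, 4.0, 5.5, 6.0, 7.5])  # rises with empathy

low_status_empathy = np.array([0.2, 0.5, 0.8, 1.1, 1.4, 1.7])
low_status_donation = np.array([7.0, 6.0, 5.5, 4.0, 3.0, 2.5])   # falls with empathy

def corr(x, y):
    """Pearson correlation coefficient between two 1-D arrays."""
    return float(np.corrcoef(x, y)[0, 1])

r_high = corr(high_status_empathy, high_status_donation)
r_low = corr(low_status_empathy, low_status_donation)
print(f"high-status r = {r_high:+.2f}, low-status r = {r_low:+.2f}")
```

The point of the sketch is that a pooled correlation across both groups would wash out to near zero, which is why the study's within-group analysis matters.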

"Our findings have significant implications to the social domain," the researchers said, "in that, besides improving objective socio-economic status, raising subjective socio-economic status via education may possibly manifold altruistic behaviours in human society."

The findings add to a complex literature that suggests lower socio-economic status is sometimes associated with more empathy and altruism, but sometimes associated with reduced empathy.
_________________________________

Ma, Y., Wang, C., and Han, S. (2011). Neural responses to perceived pain in others predict real-life monetary donations in different socioeconomic contexts. NeuroImage, 57 (3), 1273-1280. DOI: 10.1016/j.neuroimage.2011.05.003

This post was written by Christian Jarrett for the BPS Research Digest.

Monday, August 1, 2011

The hypnotised brain

Forget swinging pocket watches and unedifying stage antics, hypnosis is a genuinely useful tool for studying psychogenic symptoms - that is, neurological symptoms with no identifiable organic cause (known in psychiatry as "conversion disorder", the idea being that emotional problems are "converted" into physical ailments).

Consider hand paralysis, which some patients complain of in the absence of any neurological injury or disease. In a new study led by Martin Pyka at the University of Marburg, hand paralysis was induced in 19 healthy participants through hypnosis, thus providing a model of what may be going on in conversion hand paralysis. The hypnotised participants had their brains scanned while they rested calmly, and these results were then compared against a second scanning session in which the participants were not hypnotised.

The main result is that hypnosis-induced hand paralysis was associated not with brain areas involved with inhibiting movement (e.g. the supplementary motor area, located towards the front of the brain), but with increased coupling between regions associated with representation of the self (especially the precuneus, located in the parietal lobe, and the posterior cingulate cortex), and with regions that represent and monitor one's own movements (the dorsolateral prefrontal cortex). This suggests it's not so much that the participants' hand control was suppressed, but that they no longer believed they had the power to move their hands. This fits the findings from an earlier brain imaging study of a woman with conversion paralysis, which found changes in brain areas associated with self-monitoring and auto-biographical memory, but not areas associated with motor inhibition.

"We believe that the suggestions given during induction of hypnosis, which started with metaphors such as 'the left hand feels weak, heavy, adynamic,' 'any energy leaves the hand,' and continued with direct instructions like 'the left hand is paralysed, you cannot move the hand anymore,' induced an altered self-perception of the participants and their motor abilities," the researchers said. They acknowledged that a weakness of their study was that they'd deliberately recruited highly suggestible participants: "Thus, it is unclear whether the reported functional coupling can only be attributed to the neurofunctional impact of hypnosis or also to the selection of the subjects," they said.

As an aside, Jean-Martin Charcot, the "Napoleon of neurology", considered hypnosis-proneness to be a hallmark of patients with hysteria - a now defunct catch-all diagnosis, which included patients with conversion disorder. At the end of the 19th century at the Salpêtrière Hospital in Paris, Charcot often hypnotised his hysterical patients during his series of hugely popular public demonstrations of the condition. Hypnosis also became a common means of treatment for hysteria (although Charcot himself was not an advocate), whereby the entranced patient revealed, often via new emerging "personalities", the past traumas and fixed ideas at the root of their physical ailments. Hypnosis as a treatment fell out of favour with Freud's rise to prominence: he believed it was possible to get to the root of a patient's subconscious problems by talking to them directly, without the need for hypnosis.
_________________________________

Pyka, M., Burgmer, M., Lenzen, T., Pioch, R., Dannlowski, U., Pfleiderer, B., Ewert, A., Heuft, G., Arolt, V., and Konrad, C. (2011). Brain correlates of hypnotic paralysis—a resting-state fMRI study. NeuroImage, 56 (4), 2173-2182. DOI: 10.1016/j.neuroimage.2011.03.078

This post was written by Christian Jarrett for the BPS Research Digest.

Tuesday, July 26, 2011

Brain scans could influence jurors more than other forms of evidence

It's surely just a matter of time until functional MRI brain scans are admitted in US and UK courts. Companies like No Lie MRI have appeared, and there have been at least two recent attempts by lawyers in the USA to submit fMRI-based brain imaging scans as trial evidence.

Functional MRI gauges fluctuating activity levels across the brain, with experts divided on the merits of using the technology as a high-tech lie detection measure (see earlier). The late David McCabe, who died earlier this year, and his colleagues put that debate to one side. They asked: if fMRI evidence were to be allowed in courts, would it have a particularly influential effect on jurors' decisions? There's good reason to think it might. For example, a 2008 study by Deena Weisberg found that lay people and neuroscience students (but not neuroscience experts) were more satisfied by bad scientific explanations when they contained gratuitous mentions of neuroscience.

For the new study, 330 undergrads at Colorado State University read a vignette about a criminal trial in which a defendant was accused of killing his estranged wife and lover. Various points of evidence were mentioned and summaries of testimony and cross-examination were provided (the vignette amounted to two pages).

Crucially, a sub-set of the participants read a version in which fMRI evidence was cited: "... there was increased activation of frontal brain areas when Givens [the defendant] denied killing his wife and neighbour, as compared to when he truthfully answered questions." For comparison, other participants read a version that either included incriminating evidence from polygraph, from thermal imaging technology (which measures changes in facial skin temperature), or that contained no lie-detection technology.

The key finding was that participants who read the brain-imaging version were far more likely (76 per cent) to say they considered the defendant guilty, compared with participants who read the other versions (47 to 53 per cent). Moreover, the lie-detection evidence was more likely to be cited by participants in the fMRI condition as key to their decision, as compared with participants who read versions that didn't mention fMRI.
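To get a feel for how striking that gap is, here is a minimal sketch of a two-proportion z-test. The counts are invented to roughly match the reported percentages (the paper's actual group sizes and analysis may differ):

```python
import math

# Hypothetical illustration of the design: compare guilty-verdict rates
# between an fMRI condition and a control condition. Counts are invented
# to approximate the reported 76% vs ~50% figures, NOT taken from the paper.
def two_proportion_z(guilty_a, n_a, guilty_b, n_b):
    """z statistic for the difference between two independent proportions."""
    p_a, p_b = guilty_a / n_a, guilty_b / n_b
    p_pool = (guilty_a + guilty_b) / (n_a + n_b)          # pooled proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# e.g. 61 of 80 mock jurors (76%) vs 40 of 80 (50%)
z = two_proportion_z(61, 80, 40, 80)
print(f"z = {z:.2f}")  # a z well above 2 suggests the rates genuinely differ
```

With these illustrative counts the z statistic comes out well above the conventional 1.96 cut-off, which is why a 76 versus roughly 50 per cent split in samples of this size is treated as a meaningful difference rather than noise.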

The participants were not entirely seduced by fMRI. Some of them were given a slightly different version of the fMRI vignette, in which the expert witness warned about the technology's unreliability. These participants came to a similar proportion of guilty verdicts as the participants who read the vignette versions that lacked fMRI evidence. So it seems the persuasive influence of fMRI evidence can be tempered easily enough if people are reminded of its limitations.

The researchers acknowledged the obvious weaknesses of their study: the use of students as mock jurors, the use of vignettes rather than a real trial, and so on. These caveats aside, they said their data show that fMRI evidence could be more influential than other types of evidence. "... [T]hough determining whether that indicates the evidence would lead to unfair prejudice, confusion of the issues, misleading the jury, or needless presentation of cumulative evidence is a complex issue," they said. "At the very least, it appears that juries should be informed of the limitations of fMRI evidence."
_________________________________

McCabe, D., Castel, A., and Rhodes, M. (2011). The Influence of fMRI Lie Detection Evidence on Juror Decision-Making. Behavioral Sciences and the Law. DOI: 10.1002/bsl.993

Further reading: The brain on the stand, by Jeffrey Rosen, New York Times magazine.

This post was written by Christian Jarrett for the BPS Research Digest.

Wednesday, July 6, 2011

Why is a touch on the arm so persuasive?

A gentle touch on the arm can be surprisingly persuasive. Consider these research findings: library users who are touched while registering rate the library and its personnel more favourably than the non-touched; diners are more satisfied and give larger tips when waiting staff touch them casually; people touched by a stranger are more willing to perform a mundane favour; and women touched by a man on the arm are more willing to share their phone number or agree to a dance. Why should this be? Until now, research in this area has been exclusively behavioural: these effects have been observed, but we don't really know why. Now a study has made a start at understanding the neuroscience of how touch exerts its psychological effects.

Annett Schirmer and her colleagues used EEG to record the surface electrical activity of the brains of dozens of female participants who were tasked with looking at neutral or negative pictures (e.g. a basket or a gun to the head). Before each picture appeared, the participants were sometimes touched on the arm by a female friend; touched by a mechanical device (a pressure cuff); or they received no touch. The idea was to see whether and how being touched changed the way the brain responded to emotional and neutral pictures.

A further detail is that the mechanical touch was described as either under the friend's control, with the friend located elsewhere, or under computer control. This was to see if physical proximity matters and whether it matters who does the touching. For comparison, a final experiment also tested the effect of an auditory tone, which preceded some pictures but not others.

The most important finding is that a touch on the arm enhanced the brain's response to emotional pictures, as revealed by the size of what's known as the late positive component (LPC) of electrical brain activity. The LPC is thought to be associated with evaluative mental processes, and a touch led to a greater LPC for emotional pictures compared with neutral ones.

Touch had this effect regardless of how it was administered and who did the administering (friend or machine). This suggests the reported effects of touch are largely "bottom up" - that is, based mainly on the incoming stimulation - rather than "top down", to do with beliefs about the meaning of the touch. Unlike touch, the auditory tone didn't increase the brain's sensitivity to emotional pictures.
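As a rough sketch of what an LPC contrast amounts to computationally: epochs of EEG are averaged over trials into an event-related potential (ERP), and the mean amplitude in a late time window is compared across conditions. The waveforms below are synthetic, with the late positivity built in by construction; they are not the study's data, and the window boundaries are assumptions for illustration.

```python
import numpy as np

# Synthetic sketch of an LPC contrast (NOT the study's EEG data).
fs = 250                            # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)       # 1-second epoch after picture onset
lpc_win = (t >= 0.4) & (t <= 0.8)   # assumed late-positive-component window

def lpc_amplitude(epochs):
    """Mean amplitude in the LPC window of the trial-averaged ERP."""
    erp = epochs.mean(axis=0)       # average over trials -> ERP waveform
    return float(erp[lpc_win].mean())

rng = np.random.default_rng(1)
base = np.sin(2 * np.pi * t)        # toy background waveform
# Emotional pictures (with touch) get an extra 2-unit late positivity,
# by construction, so the contrast below should recover roughly 2.
touch_emotional = np.stack([base + 2.0 * lpc_win + rng.normal(0, 0.1, t.size)
                            for _ in range(30)])
touch_neutral = np.stack([base + rng.normal(0, 0.1, t.size)
                          for _ in range(30)])

diff = lpc_amplitude(touch_emotional) - lpc_amplitude(touch_neutral)
print(f"LPC difference (emotional - neutral, with touch): {diff:.2f}")
```

Averaging over trials is what makes the contrast detectable: the injected effect survives the averaging while the per-trial noise largely cancels out.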

"Emotional information presented concurrently with touch may be more motivating such that more processing resources are allocated to them than to emotional information presented without touch," the researchers said.

One consequence of this, Schirmer's team speculated, could be that the touched person is primed to be more altruistic, consistent with previous behavioural results. "Based on the present findings," they explained, "we propose that such behaviour occurs because the tactile signal alerts its recipient and enhances the processing of concurrent events, particularly if they are emotional. Such enhanced processing may then, among others, boost empathy and increase the likelihood that the touch recipient acts in favour of the toucher."
_________________________________

Schirmer, A., Teh, K., Wang, S., Vijayakumar, R., Ching, A., Nithianantham, D., Escoffier, N., and Cheok, A. (2011). Squeeze me, but don't tease me: Human and mechanical touch enhance visual attention and emotion discrimination. Social Neuroscience, 6 (3), 219-230. DOI: 10.1080/17470919.2010.507958

This post was written by Christian Jarrett for the BPS Research Digest.

Monday, July 4, 2011

Doubts cast on imagery as a rehab tool for stroke patients

Wouldn't it be marvellous if brain-damaged stroke patients could use mental practice to rehabilitate their weakened limbs? This isn't as far-fetched as it sounds. Merely imagining performing a movement, or watching someone else execute a movement, provokes activity in the same brain areas that are involved when carrying out that movement with your own body. This suggests imagery exercises could help forge new connections in damaged neural networks involved in actual bodily movement. Indeed, several small-scale studies have reported that mental imagery helps stroke patients recover their limb use, above and beyond the benefits from standard physical therapy.

What's been lacking is a larger study with recently afflicted patients, an adequate control condition, and with the imagery intervention kept separate from standard physical therapy. Now psychologist Magdalena Ietswaart and her colleagues have published the results from just such a study. Sadly the outcome is disappointing.

Ietswaart's team recruited 121 patients within one to six months of their having suffered a stroke, all of whom had significant weakness in one of their arms. Forty-one of these patients were then enrolled on an intensive four-week mental imagery intervention, which involved a total of nine hours of supervised exercises and four hours of independent work.

The programme was extraordinarily thorough. As well as basic imagination exercises designed to target the damaged brain areas involved in motor control (e.g. imagining opening and closing the hand), there were also mirror and video techniques to aid the imagination process. For example, placing the weakened hand under a video display of a moving healthy hand can create the illusion that the weakened limb is moving, thus triggering activity in relevant brain areas. There was also a mental rotation exercise, involving rotating pictures of hands - again this has been shown to stimulate the desired motor areas of the brain.

Of the remaining patients, 39 were enrolled on a four-week placebo programme designed to match all the mental effort and therapist attention involved in the imagery programme. But instead of using motor imagery, this group spent their time visualising flowers and other static scenes. A final group of 41 patients had care as usual. Patients in all groups underwent standard physical therapy, but this was kept separate from the imagery work.

When tested soon after the intervention phase, patients in all groups had shown improvement in use of their weakened limb compared with baseline. But here's the rub: there was no difference between groups, either in the amount of limb improvement, or in secondary measures such as independent living. This result suggests the positive outcome for imagery found in previous small studies may have been based on non-specific effects, such as increased motivation. Alternatively, it may be that mental imagery only works as an adjunct to physical exercises, helping to consolidate the progress made with specific, related movements. This new study is the first to test mental imagery as a separate intervention in its own right.

The new findings undermine the idea that mental imagery on its own can help the brain forge new functional connections. If imagery only works by consolidating the benefits of related physical exercise, the researchers said this would significantly diminish its value as a rehabilitation intervention. Apart from anything else, they noted, it would suggest mental imagery could only be used to help patients who are already capable of performing physical exercises.
_________________________________

Ietswaart, M., Johnston, M., Dijkerman, H., Joice, S., Scott, C., MacWalter, R., and Hamilton, S. (2011). Mental practice with motor imagery in stroke recovery: randomized controlled trial of efficacy. Brain, 134 (5), 1373-1386. DOI: 10.1093/brain/awr077

This post was written by Christian Jarrett for the BPS Research Digest.

Wednesday, June 22, 2011

Living in a city, or growing up in one, is associated with heightened brain sensitivity to social stress

Without fanfare or formal announcement, human civilisation has passed a momentous milestone. For the first time, more of us now live in cities than in rural communities. The benefits are numerous: more jobs, better access to educational and health services, more potential friends, and on the list goes. Yet city living has its dark side. Crime, deprivation and inequality are usually higher and so are rates of mental illness, including more anxiety, depression and schizophrenia. A new paper has made one of the first attempts to understand the neural effects mediating this link between urban life and mental strife.

Across several studies, Florian Lederbogen and his team (at the University of Heidelberg and Douglas Mental Health Institute) placed volunteers in a brain scanner and engaged them in a task designed to create social stress. Participants had to answer tricky arithmetic problems as fast as possible, whilst receiving negative, critical feedback from the researchers and others, via headphones or a video display. The crucial question was whether the effect of this task on the brain would vary as a consequence of whether each participant currently lived in a city, a town or the countryside, and also where they grew up. Some participants were recruited via local newspaper advertisements, but unfortunately the majority were university undergrads.

There were two striking results. The stressful task triggered more amygdala activity in city-dwellers than townies, and more amygdala activity in townies compared with rural folk. Second, the task aroused more activity in the anterior cingulate cortex (ACC) of those participants raised in a more urban environment, regardless of where they currently lived. These associations were highly specific - no other brain areas were differentially activated according to urban/rural status. A raft of demographic variables, including household size, income, personality and self-reported health, played no part in the results. Also, demanding tasks (memory and face recognition), with the social stress element removed, did not lead to differential activity in the amygdala or ACC according to participants' current urban/rural status or upbringing.

The amygdala, often likened to an almond, is part of the brain's limbic system and is involved in emotional processing. That this region was apparently sensitised to social stress in the city dwellers "can plausibly be related to epidemiological observations," the researchers said, such as the higher rates of anxiety disorder in cities. The ACC, meanwhile, is involved in stress regulation, among other things. It's also been called the "oh shit" centre, for its function in looking out for unexpected outcomes. The researchers pointed out that schizophrenia, which is more common in cities, is associated with reduced ACC volume and connectivity abnormalities with the amygdala. Schizophrenia usually emerges in adolescence so it's notable that the city-link with ACC activity was based on participants' upbringing location rather than their current dwelling location. A follow-up study by Lederbogen's team further established that an urban upbringing was associated with reduced connectivity between the ACC and amygdala.

There's no question these are interesting results but they are crude. For example, we don't know what aspects of city living led to sensitised amygdala activation, or what aspect of an urban upbringing is associated with ACC function. Moreover, cities vary hugely and these results are based specifically on German urban and rural environments - perhaps the results would be different if the study were replicated on a different continent (although the researchers predict their results would be even larger in countries where the rural/urban discrepancies are greater).

We also don't know what it means to have an amygdala that's more aroused by social stress, or whether that sensitivity is permanent or not. For some broader context, consider that people with larger, more complex social networks have been shown to have bigger amygdalae. Perhaps - and this is pure speculation - city living is associated with having a more complex social life, and therefore an enlarged, more sensitive amygdala. By this account, the amygdala finding in the current study has provided evidence of adaptive neural plasticity, just as much as it may have uncovered a pathological vulnerability. Consistent with this interpretation, it's notable that the study participants were all psychologically healthy (potential volunteers were excluded if they had past or present mental health problems). The cross-sectional nature of the current research also means we don't know if city living causes the observed brain differences, or if people with certain kinds of brain are drawn to urban versus rural environments.

A final shortcoming is that the brain differences associated with urban life did not correlate with cortisol levels triggered by the stressful tasks. Cortisol is a biological marker of stress, so if heightened amygdala and ACC activity were indicative of sensitivity to stress, you'd expect participants with extra activity in these brain regions to have shown corresponding increases in cortisol.

Lederbogen and his colleagues said their study had shown "neural effects of urban upbringing and habituation on social stress processing in humans" and was a first step in what they hope will be "a new empirical approach for integrating social sciences, neurosciences and public policy to respond to the major health challenges of urbanisation."
_________________________________

F Lederbogen, P Kirsch, L Haddad, F Streit, H Tost, and six others (2011). City living and urban upbringing affect neural social stress processing in humans. Nature DOI: 10.1038/nature10190

This post was written by Christian Jarrett for the BPS Research Digest.

Tuesday, June 7, 2011

The neuroscience of Batman, or how the human brain performs echolocation

Over the last few years it’s become apparent that humans, like bats, can make effective use of echolocation by emitting click sounds with the tongue and listening for the echoes that result. Now a team led by Lore Thaler at the University of Western Ontario has conducted the first ever investigation into the neural correlates of this skill.
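
Although the study is about brain activity rather than acoustics, the basic physics that makes echolocation informative is simple: an object's distance follows from the delay between the click and its echo, because the sound travels to the object and back. A minimal sketch (my illustration, not from the paper):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def distance_from_echo(delay_s: float) -> float:
    """Distance to a reflecting object, given the click-to-echo delay."""
    return SPEED_OF_SOUND * delay_s / 2  # halved: out-and-back path

# An echo arriving 10 ms after the click implies an object about 1.7 m away.
print(distance_from_echo(0.010))  # ~1.7 m
```

The subtlety the echolocators exploit, of course, goes well beyond this: the recordings also carry cues to an object's size, texture and direction.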

Thaler and her colleagues first had to overcome the practical challenge of studying echolocation in the noisy environment of a brain scanner, in which participants are required to keep their heads still. The researchers established that two blind, experienced echo-locators, EB and LB, were able to interpret with high accuracy the recordings of tongue clicks and echoes they’d made earlier, and so this form of passive echolocation was studied in the scanner.

Among several remarkable new insights generated by the research, the most important is that EB and LB exhibited increased activity in their visual cortices, but not their auditory cortices, when they listened to clicks and echo recordings taken outside, compared with when they listened to the exact same recordings with the subtle echoes omitted. No such differential activity was detected in two age-matched sighted male controls.

The finding suggests that it is the visual cortex of the blind participants that processes echoes, not their auditory cortex. This visual cortex activity was stronger in EB, who lost his sight at an earlier age than LB and is more experienced at echolocation. However, the echolocation skill of both blind participants is remarkable: both are able to cycle, and they can identify objects and detect movement. EB, but not LB, showed evidence of a contra-lateral pattern in his echo-processing brain activity, just as sighted people do with the processing of light. That is, activity was greater in the brain hemisphere opposite to the source of stimulation.

Just how the visual cortex extracts meaningful information from subtle echo sounds must await future research. The researchers' best guess is that the relevant neurons perform ‘some sort of spatial computation that uses input from the processing of echolocation sounds that was carried out elsewhere, most likely in brain areas devoted to auditory processing.’ Establishing the functional role of the cerebellar processing that was also differentially activated by echo sounds in the echolocators must likewise await future research.

‘… [O]ur data clearly show that EB and LB use echolocation in a way that seems uncannily similar to vision,’ the researchers concluded. ‘In this way, our study shows that echolocation can provide blind people with a high degree of independence and self-reliance in their daily life. This has broad practical implications in that echolocation is a trainable skill that can potentially offer powerful and liberating opportunities for blind and vision-impaired people.’

If this research has piqued your interest in echolocation, a previous research paper on the topic by Antonio Martinez and his co-workers explained that anyone, blind or sighted, is able to learn the skill. In fact, they said that after two hours' practice a day for two weeks you should be able to recognise, blindfolded, whether there is an object in front of you or not.
_________________________________

L Thaler, S Arnott, and M Goodale (2011). Neural Correlates of Natural Human Echolocation in Early and Late Blind Echolocation Experts. PLoS ONE DOI: 10.1371/journal.pone.0020162

This post was written by Christian Jarrett for the BPS Research Digest.

Friday, May 27, 2011

Debunking people's belief in free will takes the intention out of their movements

Undermining a person's belief in free will alters the way their brain prepares for a voluntary movement. Davide Rigoni and his colleagues, who made the finding, aren't sure what the precise mechanism for this effect is, but they speculated that bursting the free will bubble somehow causes people to put less intentional effort into their movements.

Rigoni's team tested thirty participants on a version of Benjamin Libet's classic task from the 1980s. This requires that participants watch a dot proceed round a clock face, that they make a voluntary finger movement at a time of their choosing (the current study had participants press a button), and then make a mental note of the position of the clock at the time they made their decision to move. Libet's controversial discovery, replicated here, was that the brain begins preparing for the finger movement several hundred milliseconds prior to the conscious decision to move, as revealed by electrical activity recorded via electrodes on the scalp. The finding implies that free will is illusory.

For Rigoni's task, an additional detail was that half the participants read a passage debunking our sense of free will (see comments for the text) before they completed the Libet task. The other half acted as controls and read a passage about consciousness that didn't mention free will.

The new finding was that the earliest phase of preparatory brain activity, known as "the readiness potential", differed between the two groups. This early component (around 1300 to 400 ms prior to the voluntary movement) was weaker in the brains of the participants who'd had their belief in free will diminished. Moreover, a questionnaire administered afterwards showed that this effect on brain activity was greater among the participants who reported having less belief in free will. In contrast, later phases of preparatory brain activity were not correlated with belief in free will.

What do we know about the early phase of preparatory brain activity that was affected? Quoting Lang (2003), Rigoni and his colleagues said that this early phase is associated specifically with movements that are executed with the "introspective feelings of the willful realisation of the intention to move at a particular time." In English, this means the early phase of preparatory brain activity is associated with just the kind of movement under study - a deliberate movement initiated at a consciously chosen time. The implication is that undermining someone's sense of free will leads them to invest less intention into an intentional movement. Exactly how one does that, and what it means, remains unclear. Rigoni's team conceded in their conclusion: "How disbelief in free will affects intentional effort is an open question."

They added: "In sum, our results indicate that beliefs about free will can change brain processes related to a very basic motor level, and this suggests that abstract belief systems might have a much more fundamental effect than most people would expect."

The study builds on past research showing how undermining people's belief in free will affects their social behaviour, for example encouraging them to cheat.
_________________________________

Rigoni, D., Kuhn, S., Sartori, G., and Brass, M. (2011). Inducing Disbelief in Free Will Alters Brain Correlates of Preconscious Motor Preparation: The Brain Minds Whether We Believe in Free Will or Not. Psychological Science, 22 (5), 613-618 DOI: 10.1177/0956797611405680

Previous Digest reports featuring Libet's classic task: Libet Redux: Free will takes another hammering and Exposing some holes in Libet's classic free will study.

This post was written by Christian Jarrett for the BPS Research Digest.

Wednesday, April 13, 2011

Your brain unscrambles words in the mirror but then switches them back again

We humans can recognise things from different angles and orientations. As Jon Duñabeitia and his colleagues observe in their new paper, a tiger is still a tiger whether you see it facing rightwards or leftwards. When it comes to words, though, this skill largely vanishes - mirror-reversed words are especially tricky to read. It makes sense that the brain becomes sensitive to orientation in this way because, unlike the tiger, a 'd' isn't a 'd' when it faces the other way: 'b' (and the same is true for other letters).

The question that Duñabeitia set out to answer is what happens, in the case of letters, to the brain's usual ability to recognise things regardless of their orientation? Is the automatic reversal process somehow unlearned for letters, or is it merely suppressed at a later stage of processing? Given how recently in our evolutionary history we started reading and writing, the latter seems more likely.

However, a recent brain imaging study using fMRI, led by Stanislas Dehaene, suggested that the automatic reversal process was completely blocked when dealing with letters. Dehaene's team found that mirror-reversed words failed to produce a priming effect, either in terms of brain activity or behavioural performance. That is, the subliminal flash of a mirror-reversed word didn't speed up participants' recognition of that same word when it subsequently re-appeared the right way around. This suggests the mirror-reversed words weren't switched around and processed normally by the brain.

But what if the temporal resolution of fMRI is too poor to detect early mirror reversal processes? Duñabeitia's team performed an experiment in which normal and mirror-reversed words were flashed up subliminally prior to repeated presentations of those same words, but they used electroencephalography (EEG) to measure their participants' brain activity. Unlike fMRI, EEG can measure changes in brain activity over sub-second periods (although its spatial resolution is much poorer).

In contrast with Dehaene's findings, Duñabeitia did observe a priming effect for mirror-reversed words. At 150ms after a prime, brain activity differed between mirror-reversed and normally oriented prime words, but by 250ms the brain's response to the two kinds of prime was the same. In other words, the brain detects the mirror-reversed orientation but by 250ms it has switched it around the right way. By 400ms (still less than half a second) after the prime, the pattern had changed again, so that the mirror-reversed prime and normally oriented prime now provoked different patterns of activity (located towards the back of the brain). This could be the postulated suppression process in action.

The intriguing implication of this research is that when reading mirror-reversed words your brain automatically flips them the right way around - for an imperceptible instant you have a mirror-reading ability - but then it suppresses that effect, putting the mirror reversal back in place again, which is why the words are awkward to read. This interpretation is consistent with the finding that many young children are capable of spontaneous mirror-writing and reading, perhaps because they have yet to develop the suppression of the automatic reversal process. There are also reports of brain injury prompting the onset of mirror reading.

This new research is more than just a curiosity: it could help further our understanding of dyslexia, which in some cases is associated with the unwelcome automatic rotation of letters and words. 'Now we know that rotating letters is not a problem that is exclusive to some dyslexics, since everybody does this in a natural and unconscious way,' said Duñabeitia. 'But what we need to understand is why people who can read normally can inhibit this, while others with difficulties in reading and writing cannot.'
_________________________________

Duñabeitia, J., Molinaro, N., and Carreiras, M. (2011). Through the looking-glass: Mirror reading. NeuroImage, 54 (4), 3004-3009 DOI: 10.1016/j.neuroimage.2010.10.079 [Article pdf via author website].

Tuesday, April 5, 2011

Psychologists destroy money for the sake of science

When the British acid house band The KLF videoed the burning of a million pounds in 1994 on the Isle of Jura, they might not have realised it, but they were likely activating the left hemisphere tool network of anyone who watched.

In a splendid case of science imitating one of the quirkier corners of life, Cristina Becchio and her colleagues, including the British husband and wife team Chris and Uta Frith, scanned the brains of twenty people as they watched brief video clips of 100 or 500 Danish Kroner bank notes (worth ten or fifty pounds, respectively) being torn or cut in half*. For comparison, the participants also viewed the same value notes being folded or looked at, and they also viewed valueless notes with scrambled imagery on them being destroyed or folded.

Compared with the other video clips, the sight of bank notes being destroyed led to increased activation in brain regions previously associated with looking at, identifying and using tools - that is, the left fusiform gyrus and the left posterior precuneus. This activation was greater when it was higher value notes being destroyed. Participants also said they felt more aroused and less comfortable when watching the money being destroyed than when watching the other videos.

Why wasn't the inferior parietal lobule, the final part of the so-called left hemisphere tool network, activated? Perhaps because activity here is associated with specific motor skills and hand movements involved in tool use, and the use of money isn't dependent on any particular skilled movements.

The researchers' interpretation of their finding is that the videos showing the cutting and tearing of money prompted participants to focus on the usual function of money as a tool for representing the value of goods and services. '... [T]he fact that the brain does treat money as a tool for tracking exchange on a precise scale suggests that a tool explanation of money is more than just a useful metaphor,' they said.

An alternative explanation for the results is simply that the extra activation during the destructive clips was caused by the emotional effect of seeing money destroyed, in line with the participants' subjective accounts of how they felt. But Becchio and her team doubt this is the true cause of their results - they found no activation in brain regions usually associated with financial loss and there were no correlations between levels of brain activity and the arousal and comfort ratings.

The KLF were unavailable for comment.
_________________________________

Becchio, C., Skewes, J., Lund, T., Frith, U., Frith, C., and Roepstorff, A. (2011). How the brain responds to the destruction of money. Journal of Neuroscience, Psychology, and Economics, 4 (1), 1-10 DOI: 10.1037/a0022835

*Prior agreement for this was obtained from the Danske Bank of Denmark, to whom damaged notes were returned after the study was completed.

Thursday, March 31, 2011

People who are more aware of their own heart-beat have superior time perception skills

What underlies our sense of time? A popular account claims an internal pacemaker emits regular pulses, which are detected by an accumulator. The amount of accumulated pulses represents the amount of time that's passed.
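
The pacemaker-accumulator idea can be sketched in a few lines of Python (a toy illustration only, not the model as tested in the paper; the pulse rate and noise level are made-up parameters):

```python
import random

def accumulated_pulses(duration_s: float, rate_hz: float = 10.0,
                       noise: float = 0.1,
                       rng: random.Random = random.Random(42)) -> int:
    """Count noisy pacemaker pulses emitted over an interval."""
    count, t = 0, 0.0
    while True:
        # each inter-pulse interval jitters around 1/rate
        t += (1.0 / rate_hz) * (1.0 + rng.gauss(0, noise))
        if t > duration_s:
            return count
        count += 1

# Longer intervals accumulate more pulses, so the count encodes duration.
print(accumulated_pulses(8), accumulated_pulses(20))
```

On this account, anything that speeds or slows the pacemaker (arousal, drugs, perhaps bodily rhythms like the heart-beat) should distort perceived duration, which is exactly the kind of evidence discussed below.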

Trouble is, this is all very theoretical and no-one really knows how or where in the brain these functions are enacted. One suggestion is that the pulses are based on bodily feedback and in particular the heart-beat. Consistent with this is a recent brain imaging study that showed activity in the insula (a brain region associated with representing internal bodily states) rose linearly as people paid attention to time intervals (pdf). Now a behavioural study by Karin Meissner and Marc Wittmann has built on these findings by showing that people who are more sensitive to their own heart-beat are also better at judging time intervals.

Thirty-one participants listened to auditory tones of either 8, 14, or 20 seconds' duration. After each one, they heard a second tone and had to press a button when they thought its duration matched the first. Counting was forbidden during the task, and a secondary, number-based memory task helped enforce this rule. Heart-beat perception accuracy was measured separately and simply involved participants silently counting their own heart-beats over periods of 25, 35, 45 and 60 seconds.
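
Heart-beat perception accuracy in tasks like this is commonly scored by comparing the beats a participant counted with the number actually recorded across the intervals. Here is a sketch of the standard heartbeat-counting formula (the example numbers are hypothetical, and this paper's exact scoring may differ):

```python
def heartbeat_perception_score(recorded, counted):
    """Mean accuracy across intervals: 1 = perfect, lower = less accurate."""
    scores = [1 - abs(r - c) / r for r, c in zip(recorded, counted)]
    return sum(scores) / len(scores)

# Beats actually recorded vs silently counted over the four intervals
# (hypothetical values):
recorded = [28, 39, 50, 66]
counted = [25, 36, 44, 60]
print(round(heartbeat_perception_score(recorded, counted), 3))  # 0.901
```

A participant with a score near 1 is closely "in tune" with their heart-beat; it was people like this who tended to reproduce the tone durations more accurately.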

The take away message is that the participants who were more in tune with their heart-beats also tended to perform better at the time estimation task. A further detail is that physiological measures taken during the encoding part of the task showed that as time went on, the participants' heart-rate slowed progressively, and their skin conductance (i.e. amount of sweat on the skin) reduced. Moreover, the rate of change in a participant's heart-rate (but not skin conductance) was linked with the accuracy of their subsequent time estimates.

'These results suggest that the processing of interoceptive signals [i.e. of internal bodily states] in the brain might contribute to our sense of time,' Meissner and Wittmann concluded.

The new findings add to past research showing that patients with cardiac arrhythmia are poorer than controls at time estimation tasks, and that drug-induced speeding or slowing of the autonomic nervous system (including heart-rate) affects people's under- or over-estimation of time intervals.
_________________________________

Meissner, K., and Wittmann, M. (2011). Body signals, cardiac awareness, and the perception of time. Biological Psychology, 86 (3), 289-297 DOI: 10.1016/j.biopsycho.2011.01.001

Wednesday, March 30, 2011

Sweaty work in the hunt for the brain basis of social anxiety

Anxiety has overtaken depression to become the most commonly diagnosed psychological disorder in the United States, with social anxiety its most frequent manifestation. Extreme social anxiety is thought to stem partly from bad experiences - being laughed at in class, blushing in front of friends, choking on a first date - such that a person learns to fear social situations. But that's unlikely to be the whole story. Social anxiety runs in families, suggesting some people have an innate predisposition for the disorder. The authors of a new study believe they've identified, for the first time, a neural correlate of this vulnerability.

Wen Zhou and colleagues scanned the brains of nineteen women while they were exposed to the smell of two types of men's sweat, a floral scent, and the human steroid (and putative pheromone) androstadienone. One of the male sweat types was sexual, the other was neutral, and they were collected from men's armpits as they watched either a sexual film or an educational documentary. The women weren't told what the different smells were or where they came from.

Human sweat is known to convey social signals. For example, it's been shown that people can tell a person's emotional state purely from the smell of their sweat. The key findings in this new study are that the two types of sweat, compared to the other odours, led to increased activation in the orbitofrontal cortex (OFC) of the women's brains, and that the level of this activation was related to the women's amount of self-reported trait social anxiety. The women didn't have any psychiatric diagnoses but the higher they scored on a measure of trait social anxiety (e.g. they said they felt uncomfortable in large groups), the less activation they exhibited in their OFC when exposed to the men's sweat.

It's important to emphasise that most of the women (nearly 90 per cent) didn't realise the smells were from humans, and the smells had no effect on their in-the-moment mood or anxiety levels. Consistent with this, the different smells didn't differentially affect the amygdala, a bilateral subcortical structure associated with fear processing. What the study appears to be showing is that subconscious social signals trigger increased OFC activity compared with nonsocial smells, and that the level of this activity is moderated by trait social anxiety.

Why the OFC? The OFC is heavily interconnected with the amygdala and is known to be involved in the learning of rewards and punishments and in decision-making. Another brain imaging study found that public speaking was associated with increased activation in the amygdala and reduced activation in the OFC. So it makes sense that people with a predisposition for social anxiety may have an OFC that functions differently from those without such a disposition.

'Whether such inherent variations can be directly mapped onto genetic differences or personality traits in both normal and clinical populations, is an important open question and this deserves serious studies in the future,' the researchers said.
_________________________________

Zhou, W., Hou, P., Zhou, Y., and Chen, D. (2010). Reduced recruitment of orbitofrontal cortex to human social chemosensory cues in social anxiety. NeuroImage DOI: 10.1016/j.neuroimage.2010.12.064

Friday, March 25, 2011

More serious brain injuries associated with more life satisfaction

Psychologists investigating the well-being of patients with an acquired brain injury (ABI) have documented a curious phenomenon, whereby the more serious a person's brain injury, the higher their self-reported life-satisfaction.

With the help of the charity Headway UK, Janelle Jones and her colleagues recruited 630 people (aged 9 to 81) with an acquired brain injury. Most had sustained their injuries from road accidents, with other causes including stroke and falls. Based on the time they'd spent in a coma, the majority of the participants' injuries were judged to be moderate to severe.

The participants answered a brief, 20-item questionnaire about their sense of identity (e.g. 'I think of myself as someone who has survived a brain injury'), their social support, relationship changes since their injury, and their life-satisfaction.

Having a strong sense of identity, seeing oneself as a survivor, having plenty of social support and improved relationships were all independently related to higher life satisfaction. These different factors also influenced each other. '...[I]t is likely that personal identity and social network support factors operate in a cyclical way,' the researchers said, 'whereby becoming personally stronger from effectively relying on social support also makes individuals more likely to continue to seek out social support and, in that way, to develop social capital.'

Perhaps the most curious finding was that participants who'd sustained more serious injuries tended to report being more satisfied with their lives. This association was mediated by the social and identity factors - that is, participants who'd sustained a more serious injury also tended to identify more strongly as a survivor, and to have more social support and improved relationships.

An obvious suggestion is that the more seriously injured participants might not have complete insight into their lives. Jones and her colleagues doubt this is the case, in part because of the logic of the results, with identity and social support mediating the higher life satisfaction among these participants.

'Sustaining a head injury does not always lead to a deterioration in one's quality of life,' the researchers concluded. '...[D]ata from this study serves to tell a coherent story about the way in which the quality of life of those who experience ABIs can be enhanced by the personal and social "identity work" that these injuries require them to perform. ... Nietzsche, then, was correct to observe that that which does not kill us can make us stronger.'
_________________________________

Jones, J., Haslam, S., Jetten, J., Williams, W., Morris, R., and Saroyan, S. (2011). That which doesn't kill us can make us stronger (and more satisfied with life): The contribution of personal and social changes to well-being after acquired brain injury. Psychology and Health, 26 (3), 353-369 DOI: 10.1080/08870440903440699

Monday, March 14, 2011

A real study of magicians' fake movements

Magicians trick us with their sleights of hand, reaching for objects that aren't there and pretending to drop others that they've really kept hold of. This ability is all the more remarkable because research has shown how poor the rest of us are at faking reaching gestures and other movements. Now Cristiana Cavina-Pratesi and her colleagues have used motion-tracking technology to investigate how the magicians do it.

First off, ten magicians and ten controls reached for and picked up a wooden block, or mimed reaching and picking up an imaginary block situated next to the real one. Just as the participants began reaching, their sight was completely obscured by shutter glasses - this was to simulate the way that magicians often look away from where they're reaching. The participants' grasps were performed either with forefinger and thumb or little-finger and thumb, and markers were worn on these digits so they could be monitored with a motion-tracking system.

Just as has been found in earlier research, the controls' pantomime grasping movements were quite distinct from the real thing - the 'maximum grip aperture' (the maximum gap between thumb and finger) was smaller, as was a metric called the 'grip overshoot', calculated from the position of the thumb and fingers during the actual grasp. In contrast, the magicians' maximum grip aperture and grip overshoot were the same whether they actually grasped a real wooden block, or mimed grasping an imaginary one next to it.
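
For a sense of how such a metric is derived, maximum grip aperture can be computed from the marker data as the peak 3-D distance between the thumb and finger markers over the course of the reach. This is a hypothetical sketch with made-up trajectories, not the authors' analysis code:

```python
import math

def max_grip_aperture(thumb_xyz, finger_xyz):
    """Peak thumb-finger separation (same units as the marker data)."""
    return max(math.dist(t, f) for t, f in zip(thumb_xyz, finger_xyz))

# Toy frame-by-frame marker positions (mm): the grip opens, peaks,
# then closes on the object.
thumb = [(0, 0, 0), (0, 0, 0), (0, 0, 0), (0, 0, 0)]
finger = [(20, 0, 0), (55, 0, 0), (80, 0, 0), (42, 0, 0)]
print(max_grip_aperture(thumb, finger))  # 80.0
```

Real grasps reliably show a larger peak aperture than pantomimed ones in most people; the striking result here is that the magicians' mimed values matched their real ones.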

Having confirmed that magicians' fake movements really are like the real thing, a second experiment, involving batteries rather than wooden blocks, made things harder. This time, the miming condition was performed without a real, to-be-grasped object anywhere in sight. The seven magicians and seven controls performed their real grasps as before, but when the miming grasps were performed, the batteries were hidden away. Curiously, under these conditions, the magicians were no better at faking than the controls.

The researchers said this suggests that 'the talent of magicians lies in their ability to use visual input from real objects to calibrate a grasping action toward a separate spatial location (that of the imagined object).'

How do they develop this ability? Cavina-Pratesi's team think it reflects a flexibility in the magicians' occipito-parietal system (located towards the back of the brain). 'This flexibility,' they said, 'might exploit mechanisms similar to those underlying people's ability to adapt to spatially displacing prisms through repeated target-directed movements.' They're referring here to the human ability to adapt to prism glasses that distort the visual world. At first the glasses are disorientating, but most people are able to adapt quickly. The researchers said future brain imaging studies will help reveal exactly what's going on in the magicians' brains as they perform their trickery.
_________________________________

Cavina-Pratesi, C., Kuhn, G., Ietswaart, M., and Milner, A. (2011). The Magic Grasp: Motor Expertise in Deception. PLoS ONE, 6 (2) DOI: 10.1371/journal.pone.0016568

Friday, March 4, 2011

Beauty and its neural reward are in the eye of the crowd

Following the crowd really can change the value we see in things
Let's be honest, most of us do it, at least some of the time. We modify our own opinions in line with what other people think, especially our friends and peers.

A problem for psychologists investigating the effect of peer influence is that it can be tricky to tell whether people are simply acquiescing in public, for show, or if their attitudes really have changed. A new study by a team of psychologists at Harvard University has used an innovative mix of behavioural and brain-scan methods to show that peer influence really can change how people value something, in this case the attractiveness of a face.

Fourteen male participants performed a series of 'hot-or-not' style ratings of pictures of 180 women's faces. For the majority of the faces, after they'd made their own rating, the students were shown the average rating given to that face by hundreds of previous participants. This was actually fixed by the researchers and was sometimes higher than the participant's own rating and sometimes lower.

About half an hour later, the participants rated the same faces again, but this time had their brains scanned whilst they did so. The revelation here was that the effect of the faces on reward-related regions in the participants' brains depended on the feedback the participants had received earlier about how their peers had rated those faces.

Let's focus on those faces that a participant had earlier given equal attractiveness ratings to, and which you'd therefore think they'd find equally rewarding to look at. In fact, among these faces, those that they'd been told were rated as more attractive by previous participants triggered more reward-related brain activity (the participants also increased the attractiveness ratings they gave to these faces). In contrast, the faces they'd earlier been told were rated as less attractive by peers triggered less reward activity, and were now rated as less attractive by the participants.

A financial game played during the same scanning session allowed the researchers to pinpoint the brain areas involved in receiving monetary reward - the orbitofrontal cortex and nucleus accumbens. It was these same brain regions that were more active when the participants looked at female faces which they'd earlier been told were rated as more attractive by other men.

This isn't the first time that brain imaging has been used to show how social factors can alter the value we place on things. For example, a wine-tasting study tricked participants into drinking the same wine twice, once thinking it was an expensive bottle and another time thinking it was a cheap one. The participants' reward pathways were activated more when they thought the wine was expensive. In a similar fashion this new study suggests that the pleasure we find in looking at a face is dependent not just on what we think of it, but on what we think other people think of it.

'Rather than the result of individual weakness and faulty character, conformity appears to arise from the same neural systems that guide behaviour towards highly-valued outcomes, including such basic needs as food, water, and opportunities for reproduction,' wrote the team, led by Jamil Zaki. 'This emerging understanding of the neural basis of social influence suggests that members of our species are not only remarkable in their willingness to adopt the opinions and norms of others, but equally remarkable in their fundamental motivation for doing so.'
_________________________________

Jamil Zaki, Jason Mitchell, and Jessica Schirmer (2011). Social influence modulates the neural computation of value. Psychological Science, in press.

Tuesday, February 22, 2011

Stroke cures man of life-long stammer

Thanks to the success of the movie The King's Speech, most of us are familiar with the 'developmental' kind of stammering that begins in childhood. However, more rarely, stammering can also have a sudden onset, triggered by illness or injury to the brain. Far rarer still are cases where a person with a pre-existing, developmental stammer suffers from brain injury or disease and is subsequently cured. In fact, a team led by Magid Bakheit at Moseley Hall Hospital in Birmingham, who have newly reported such a patient, are aware of just two prior adult cases in the literature.

Bakheit's patient, a 54-year-old bilingual man, suffered a stroke that caused damage to the left side of his brain stem and both hemispheres of his cerebellum - that's the cauliflower-shaped structure, associated with motor control and other functions, which hangs off the back of the brain. The man's brain damage left him unsteady on his feet, gave him difficulty swallowing, and slightly slurred and slowed his speech. But remarkably, his life-long stammer, characterised by repetitions of sounds, and which had caused him social anxiety and avoidance, was entirely gone - an account corroborated by his wife. By the time of his discharge from hospital, the slowing of his speech was much improved and yet thankfully his stammer remained absent.

The researchers can't be sure, but they think the remission of the man's stammer is likely related to his cerebellum damage, which may have had the effect of inhibiting excessive neural activation in that structure. This would be consistent with previous research showing that people who stammer have exaggerated activation in the cerebellum compared with controls, and with the finding that successful speech therapy is associated with reductions to cerebellum activation compared with pre-treatment. A second, related possibility is that, pre-stroke, the man's cerebellum was somehow having a detrimental effect on his basal ganglia (a group of sub-cortical structures involved in motor control and other functions) and that this adverse effect was ameliorated by the stroke-induced damage. This would be consistent with reports of stammers developing in patients with diseases, such as Parkinson's, that affect the basal ganglia.

A third and final possibility, the researchers said, is simply that the slowing of the man's speech somehow aided the remission of his stammer. Indeed, deliberately reducing the rate of speech is an established therapeutic approach for stammering. However, this certainly wasn't a conscious strategy employed by the patient and, as we've seen, his stammer remained in remission even as his speech rate improved.

'The complete remission of stammering following a posterior circulation stroke in our patient suggests that the cerebellum and/or its connections with brain structures has an important role in maintaining developmental stammering,' the researchers concluded.
_________________________________

Bakheit AM, Frost J, and Ackroyd E (2011). Remission of life-long stammering after posterior circulation stroke. Neurocase, 17 (1), 41-45. PMID: 20799135