crasch (crasch) wrote,

Heart joins head in moral maze
14 September 2001


People may rely on emotion as much as reason when deciding moral dilemmas.


If five people are trapped on a railway track and a train is approaching, is it morally right to divert the train onto another track where there is only one person? Most people would say yes. Would it be right to push a person onto the track to prevent the train from hitting the other five? This time, most people would say no.

The different responses puzzle philosophers, because the principle - sacrifice one life to save five - is the same in both cases.

Magnetic resonance images now show that our brains process the two dilemmas in fundamentally different ways, using brain regions responsible for emotion only in the second case.

"We've known almost nothing about how the brain handles
moral dilemmas," says psychologist Jonathan Cohen, who
conducted the experiment with colleagues at Princeton
University in New Jersey. "Now it appears that when people
make moral decisions, emotional responses play as much of a
role as logical analyses."

When study participants made moral decisions about
situations that have a personal element, such as throwing
people off a sinking lifeboat, activity soared in four parts
of the brain involved in processing emotion. Meanwhile, it
sank in three regions associated with working memory, which
stores and processes information in the short term.

The reverse happened when subjects judged less personal moral dilemmas, such as keeping the money found in a lost wallet, or considered questions that were not moral issues, such as whether to travel by bus or train in a given situation.

"This is fascinating research, which brings emotion firmly
into the process of reasoning itself," says Helen Haste, an
expert on the psychology of morality at the University of
Bath in England. Many researchers have regarded moral
reasoning as a purely analytical process, and deemed emotion
as "something that gets in the way of reason", she says.

Gut feeling

Psychologists have had many clues to the importance of
emotion in moral decision-making, says Joshua Greene, who
led the Princeton study. Most famous is the
nineteenth-century Vermont railroad worker Phineas Gage. He
was transformed from a well-respected, law-abiding citizen
to a shiftless, quarrelsome drifter after an iron rod passed
through his eye socket and out of the back of his head in an accident.

"Right after the accident he seemed fine - he could talk,
and do mathematics," Greene says. "But his moral behaviour
changed dramatically, even though his basic reasoning
ability seemed intact." More recently, Greene says, a
patient who suffered similar brain damage started making
disastrous moral decisions in his personal life, even though
he could analyse abstract moral dilemmas logically.

Perhaps the most crucial finding of the study, Greene says, was that people took significantly longer to conclude that it was appropriate to push a person in front of the train than to decide it was inappropriate. "The people who said it was appropriate had to fight their emotions, so they were more hesitant," he says. "This says that emotion isn't just incidental, but really exerts a force on people's decisions."
The study makes no judgement about what decisions are moral,
Cohen emphasizes. "What we've done has nothing to do with
what is morally right, we are just describing how people
come to decisions," he says. "That doesn't mean they've come
to the right decision."

At the same time, there could be good reasons to trust our
gut responses, he suggests. "Emotions may well be important
adaptations. We don't have to write them off as silly,
murky, irrational responses."


1. Greene, J. et al. An fMRI investigation of emotional engagement in moral judgment. Science 293, 2105-2108 (2001).