Thursday, March 22, 2007

Folie et déraison: Histoire de la folie à l'âge classique

Peter Singer, fearless consequentialist, writes without irony:


"When we condemn the behaviour of a politician, celebrity, or friend, we often end up appealing to our moral intuitions. "It just feels wrong!" we say. But where do these intuitive judgments come from? Are they reliable moral guides?

Recently, some unusual research has raised questions about the role of intuitive responses in ethical reasoning. Joshua Greene, a philosophy graduate now working in psychology at Harvard, studied how people respond to a set of imaginary dilemmas. In one, you are standing by a railroad track when you notice that a trolley, with no one aboard, is heading for a group of five people. They will all be killed if it continues on its current track. The only thing you can do to prevent these five deaths is to throw a switch that will divert the trolley on to a side track, where it will kill only one person. When asked what you should do in these circumstances, most people say you should divert the trolley on to the side track, thus saving a net four lives.

In another dilemma, the trolley is about to kill five people. This time, you are standing on a footbridge above the track. You cannot divert the trolley. You consider jumping off the bridge, in front of the trolley, thus sacrificing yourself to save the people in danger, but you realise you are too light to stop the trolley. Standing next to you is a very large stranger. The only way you can prevent the trolley from killing five people is by pushing this stranger off the bridge into the path of the trolley. He will be killed, but you will save the other five. When asked what you should do in these circumstances, most people say that it would be wrong to push the stranger.

This judgment is not limited to particular cultures. Marc Hauser, at Harvard University, has put similar dilemmas on the web in what he calls a Moral Sense Test (moral.wjh.harvard.edu). After receiving tens of thousands of responses, he finds remarkable consistency despite differences in nationality, ethnicity, religion, age and sex.

Philosophers have puzzled about how to justify our intuitions in these situations, given that, in both cases, the choice seems to be between saving five lives at the cost of taking one. Greene, however, was more concerned to understand why we have the intuitions, so he used functional magnetic resonance imaging, or fMRI, to examine what happens in people's brains when they make these moral judgments.

Greene found that people asked to make a moral judgment about "personal" violations, like pushing the stranger off the footbridge, showed increased activity in areas of the brain associated with emotions. This was not the case with people asked to make judgments about relatively "impersonal" violations like throwing a switch. Moreover, the minority of subjects who did consider that it would be right to push the stranger off the footbridge took longer to reach this judgment than those who said that doing so would be wrong.

Why would our judgments and emotions vary in this way? For most of our evolutionary history, human beings have lived in small groups, in which violence could be inflicted only in an up-close and personal way, by hitting, pushing, strangling, or using a stick or stone. To deal with such situations, we developed immediate, emotionally based intuitive responses to the infliction of violence on others. The thought of pushing the stranger off the bridge elicits these responses. On the other hand, it is only in the past couple of centuries - not long enough to have any evolutionary significance - that we have been able to harm anyone by throwing a switch that diverts a train. Hence the thought of doing it does not elicit the same emotional response as pushing someone off a bridge.

Greene's work helps us understand where our moral intuitions come from. But the fact that our moral intuitions are universal and part of our human nature does not mean that they are right. On the contrary, these findings should make us more sceptical about relying on our intuitions. There is, after all, no ethical significance in the fact that one method of harming others has existed for most of our evolutionary history, and the other is relatively new. Blowing up people with bombs is no better than clubbing them to death. And the death of one person is a lesser tragedy than the death of five, no matter how that death is brought about. So we should think for ourselves, not just listen to our intuitions." (Peter Singer, "Reason With Yourself," The Guardian, Tuesday, March 20, 2007).

Ogged, of the blog Unfogged, with Swiftian modesty appends a note on a study indicating:

"Damage to an area of the brain behind the forehead, inches behind the eyes, transforms the way people make moral judgments in life-or-death situations, scientists are reporting today. In a new study, people with this rare injury expressed increased willingness to kill or harm another person if doing so would save others' lives."

To which Ogged comments:

"Consequentialism: Like Brain Damage, But Without The Excuse." (Ogged, Unfogged, 03/21/2007).
