The Wildlife Conservation Society has launched an international effort to prevent the extinction of a dozen turtle species. I marvel at this: people from around the world collaborating to save creatures because it’s the right thing to do.
The marvel is that human beings are capable of selflessness, empathy, doing the right thing. We need more of those marvels. Of course, the Wildlife Conservation Society is not unique in this regard—yet another marvel. It is also not unique in getting itself into moral dilemmas.
Last year, the head of the organization in India was forced to resign because he participated in the implementation of the Forest Rights Act, environmental legislation of a kind common in many developing countries, which restores the traditional land rights of tribal peoples. International organizations such as the Wildlife Conservation Society oppose these laws because they believe that tribal peoples will not properly manage wild places.
The dilemma was: protect the wildlife or protect the tribal people. Some argued that it was a false dilemma, that in fact protecting tribal rights would also protect wildlife. However, although almost all moral dilemmas have questions of fact at their core, I want to draw your attention to how humans actually work out what to do, and in particular to the critical role of the context in which moral dilemmas arise and are resolved.
In the last decade or so, both psychologists and neuroscientists have uncovered what goes on in our brain when we make moral decisions. A typical experiment consists of posing a moral dilemma to someone and identifying what areas of the brain light up when he or she decides what to do.
A classic moral dilemma used in these experiments is called the Trolley Problem.
A runaway trolley is headed toward a switch that you control. Along one track, five people are working. Along another, one person is working. You can't stop the trolley; you can only change which track it runs along. If you throw the switch, the trolley kills the single worker, saving the other five. Do you throw the switch?
Nine out of ten people throw the switch. The brain area that lights up is one used for calculation. Essentially, those who throw the switch are making a utilitarian decision: one life for five.
But if the dilemma is changed, a different area of the brain lights up. This dilemma, again a classic, is called the Man on the Bridge Problem. The trolley is coming and there are five workers on the track, but this time, instead of standing at a switch, you're standing on a bridge over the track. Next to you is a large man who, if thrown in front of the trolley, would bring it to a stop before hitting the five workers. Do you throw the large man in front of the trolley?
This time nine out of ten people would NOT throw the man in front of the trolley, sacrificing the five to save the one. The brain area that lights up is used not for calculation but for emotion and empathy.
Those who chose to throw the switch justified their action with utilitarian reasoning: greatest good to the greatest number. Those who chose not to throw the large man from the bridge appealed to their gut feeling, their intuition: it just didn’t feel right.
Researchers frequently use this contrast to suggest that the intuitionists have made the poorer moral choice, in other words, that the utilitarian credo of greatest good to the greatest number is what defines a good moral choice. That bias is understandable: as Marx observed, utilitarianism is the philosophy of the market, and the market saturates the air we breathe. It also turns out to have some unsavory associations.
The people whose brains light up in resolving moral dilemmas with a utilitarian calculation are those prone to analytic thinking (such as philosophers and scientists), men, and people who score high for antisocial characteristics on tests for psychopathology and Machiavellianism.
But that information doesn't answer the question of whether it's right to throw the switch or the large man. Of course, it would be hard to suggest that it's wrong to throw the switch at all. Or would it? What if the one worker who gets killed is your child? Faced with that kind of dilemma, people split half and half; it is no longer nine out of ten who would throw the switch.
So context is critical. And so is your emotional distance from the subjects of your actions.
Consider the following dilemma. A scientific panel has decided that Teflon causes testicular and kidney cancer. You are an executive at DuPont, responsible for Teflon. Do you stop producing Teflon, throwing hundreds of people out of work?
Or consider this dilemma. You are a public official. Before you is an application for construction of a cell tower. Most people want the better cellphone service that is promised. A small number of people fear for their health and the health of their children. Do you vote for or against the application?
The moral dilemma is not just about the possible outcomes. It’s also about how you create the context in which the dilemma is resolved and how you feel about the people who are affected.