From The Happiness Hypothesis: Finding Modern Truth in Ancient Wisdom, an account of an academic study showing how people are frequently biased toward helping themselves first, so long as they can maintain a public image of being morally free from bias. Aside from being an example of a well-designed laboratory scenario, this study illustrates the difference between shame and guilt.
The simplest way to cultivate a reputation for being fair is to really be fair, but life and psychology experiments sometimes force us to choose between appearance and reality. Dan Batson at the University of Kansas devised a clever way to make people choose, and his findings are not pretty. He brought students into his lab one at a time to take part in what they thought was a study of how unequal rewards affect teamwork. The procedure was explained: One member of each team of two will be rewarded for correct responses to questions with a raffle ticket that could win a valuable prize. The other member will receive nothing. Subjects were also told that an additional part of the experiment concerned the effects of control: You, the subject, will decide which of you is rewarded, which of you is not. Your partner is already here, in another room, and the two of you will not meet. Your partner will be told that the decision was made by chance. You can make the decision in any way you like. Oh, and here is a coin: Most people in this study seem to think that flipping the coin is the fairest way to make the decision.
Subjects were then left alone to choose. About half of them used the coin. Batson knows this because the coin was wrapped in a plastic bag, and half the bags were ripped open. Of those who did not flip the coin, 90 percent chose the positive task for themselves. For those who did flip the coin, the laws of probability were suspended and 90 percent of them chose the positive task for themselves. Batson had given all the subjects a variety of questionnaires about morality weeks earlier (the subjects were students in psychology classes), so he was able to check how various measures of moral personality predicted behavior. His finding: People who reported being most concerned about caring for others and about issues of social responsibility were more likely to open the bag, but they were not more likely to give the other person the positive task. In other words, people who think they are particularly moral are in fact more likely to “do the right thing” and flip the coin, but when the coin flip comes out against them, they find a way to ignore it and follow their own self-interest. Batson called this tendency to value the appearance of morality over the reality “moral hypocrisy.”
Batson’s subjects who flipped the coin reported (on a questionnaire) that they had made the decision in an ethical way. After his first study, Batson wondered whether perhaps people tricked themselves by not stating clearly what heads or tails would mean (“Let’s see, heads, that means, um, oh yeah, I get the good one.”). But when he labeled the two sides of the coin to erase ambiguity, it made no difference. Placing a large mirror in the room, right in front of the subject, and at the same time stressing the importance of fairness in the instructions, was the only manipulation that had an effect. When people were forced to think about fairness and could see themselves cheating, they stopped doing it. As Jesus and Buddha said in the opening epigraphs of this chapter, it is easy to spot a cheater when our eyes are looking outward, but hard when looking inward. Folk wisdom from around the world concurs:
“Though we see the seven defects of others, we do not see our own ten defects.” (Japanese proverb)
“A he-goat doesn’t realize that he smells.” (Nigerian proverb)
Proving that people are selfish, or that they’ll sometimes cheat when they know they won’t be caught, seems like a good way to get an article into the Journal of Incredibly Obvious Results. What’s not so obvious is that, in nearly all these studies, people don’t think they are doing anything wrong. It’s the same in real life. From the person who cuts you off on the highway all the way to the Nazis who ran the concentration camps, most people think they are good people and that their actions are motivated by good reasons. Machiavellian tit for tat requires devotion to appearances, including protestations of one’s virtue even when one chooses vice. And such protestations are most effective when the person making them really believes them. As Robert Wright put it in his masterful book The Moral Animal, “Human beings are a species splendid in their array of moral equipment, tragic in their propensity to misuse it, and pathetic in their constitutional ignorance of the misuse.”
If Wright is correct about our “constitutional ignorance” of our hypocrisy, then the sages’ admonition to stop smirking may be no more effective than telling a depressed person to snap out of it. Curing hypocrisy is much harder because part of the problem is that we don’t believe there’s a problem. You can’t change your mental filters by willpower alone.