Self-Esteem, Cheating, Voting and Confirmation Bias

Posted by Beetle B. on Sun 23 October 2016


Is self-esteem about how we perceive ourselves, or about how we perceive others' perceptions of us?

Levy Study: Subjects were asked to rate how much they cared what others thought of them, and were split into high and low groups. Each subject spoke into a microphone and was told that the numbers on the screen in front of them indicated another person's willingness to keep talking to them. Both groups were equally affected by declining numbers.

Psychopaths do care about what others think of them - but only for the purpose of manipulating them.

Confirmation Bias

The book emphasized the need to discuss your thoughts with others. You are very prone to confirmation bias about your own ideas; others, evaluating ideas that are not their own, are far less prone to it.

Perkins's Study: Does more education correlate with less confirmation bias? He asked people to write down an initial (i.e. not deeply thought out) judgment on a social issue, then to come up with reasons for both sides. Regardless of education level, people provided many more reasons in favor of their initial judgment than for the opposing viewpoint.

The only difference across education levels was the total number of reasons produced (which also correlates with IQ).

High-IQ people are just as prone to confirmation bias. However, they are better at arguing in favor of their own stance. In a sense, their IQ is “wasted” on this endeavor.

In the study, there was no incentive for self-preservation or self-gain.

The Prevalence of Cheating

In another study, people were asked to complete a task and were given a slip to redeem for payment. They had to take the slip to a cashier in another room, and the cashier would deliberately overpay them.

Only 20% of the subjects pointed out the error. However, when the cashier asked, “Is this the correct amount?”, that figure rose to 60%. Being asked directly removes some deniability: it makes the lie explicit.

The correlation between rating oneself as honest and actually returning the money is poor.

Dan Ariely showed that most people will cheat, but only a little bit. When the error goes above a threshold, more people speak up.

Cheaters who left the room did not rate themselves any lower than before on the honesty scale.

Confirmation Bias

Must vs. can: people respond negatively to “must,” even when it applies to an activity they enjoy!

If we want to believe something, we look for any possible supporting reason and stop there: we ask ourselves, “Can I believe it?” If we do not want to believe something, we ask, “Must I believe it?” We search for contrary evidence, and once we find any, we stop searching.

Personal anecdote: I’ve often seen people become very analytical when confronted with evidence that opposes their viewpoint, yet very lax when given data that fits their worldview.

Voting and Polarization

It has been widely studied that people generally do not vote out of self-interest; self-interest is a very poor predictor of how one will vote:

  • Parents of school-age children are no more likely to support raising taxes for schools.
  • The uninsured are no more likely to support government-run health insurance.

The exceptions are when the benefits are substantial, immediate, and well publicized.

However, people do support policies that seem to support the group they identify with.

Attitude polarization occurs when the same fact or study is shown to groups with opposite leanings (e.g. a study showing that the death penalty deters crime). Even though the study/fact is unambiguous, it produces more polarization, not less.

So more facts and knowledge are not a way to bridge gaps in understanding between two polarized sides.

This effect is not always present, but is more common when the person is passionate about the topic.

Another study: people show conflicting emotions when their preferred candidate demonstrates hypocrisy, but none when the opposing candidate does. The judgments of the two candidates are not equal (even though the act is).

When they are then shown an explanation for their own candidate’s hypocritical behavior, their pleasure centers are activated.

Strongly partisan people go through this cycle so often (seeing hypocrisy, then a rationalization for it) that being partisan is possibly addictive. The mental gymnastics repeatedly activate their pleasure centers, delivering a high each time; perhaps their bodies ultimately become dependent on that high. This may explain why partisans can be incredibly stubborn and can believe the most outlandish things.

tags : trm, morality, Haidt