"No man, for any considerable period, can wear one face to himself and another to the multitude, without finally getting bewildered as to which may be the true."
—Nathaniel Hawthorne, The Scarlet Letter
Cognitive dissonance is a psychological term that describes the uncomfortable tension resulting from holding two conflicting thoughts at once, or from engaging in behavior that conflicts with one's beliefs. It also describes the behaviors that allow people to soothe, override, or otherwise overcome such dissonance.
A very simple example of this involves the act of giving blood. You are there and it is uncomfortable, but you know it is a good and necessary thing to do. So when asked "Are you comfortable?", you lie without thinking and say "everything is fine." You may even exaggerate and say out loud, with a smile on your face, "it's great", even though there is a large needle in your arm. Then, having said you are fine, your brain subconsciously begins to convince you that you really are fine, to alleviate the cognitive dissonance. This entire process is studied under the rubric of "cognitive dissonance".
The concept of cognitive dissonance was developed and tested by observing cults and how they reacted when their beliefs (in the end of the world) were shattered (by the world simply not ending), first and most famously in Leon Festinger, Henry Riecken, and Stanley Schachter's When Prophecy Fails.[1][2] Because the sensation of dissonance is very unpleasant, most people tend to resolve it by adjusting their knowledge, beliefs, behaviors, and perceptions so that they are consistent with one another. This sounds logical enough, but there is a catch: the resolution usually follows the path of least psychological resistance. When the cults' prophecies were proved wrong, the followers' faith did not diminish; on the contrary, it strengthened, because it is much easier to disavow pieces of evidence as "false", put up an excuse, and keep on believing than it is to change a belief that has grown into an individual's entire soul, fiber, and character. Even their memories were distorted: one such person claimed that the date of the world's end had never been given with certainty, and showed genuine surprise when his own words were played back saying that the world would absolutely, totally, for sure end on that date.
A prominent political example of resolving dissonance in this way can be found in the various smear attempts made against US President Barack Obama throughout 2008 and 2009. WorldNetDaily editor Joseph Farah, a key player in the Birther movement, firmly believed that Obama was not a natural-born American citizen, and dismissed any evidence to the contrary as insufficient or fraudulent. Evidence for such beliefs, on the other hand, is usually accepted uncritically by way of confirmation bias.
In sum, the fact that many people can be persuaded to accept a (poorly constructed) argument, despite, or even because of, its being riddled with logical fallacies, can often be explained by accepting the argument producing less cognitive dissonance in the audience than rejecting it would.[citation needed]
Many social engineering techniques, such as milk before meat and foot-in-the-door salesmanship, owe a large part of their success to the exploitation of cognitive dissonance.[citation needed] For example, cognitive dissonance is a large part of why hazing builds loyalty: if you go through a rough initiation to get into a fraternity, you will go to great lengths to convince yourself that the organization is awesome enough to have been worth it. Similarly, end-of-the-world cultists often give away everything they own shortly before the appointed date, and will go to great lengths afterwards to avoid concluding that it was all for nothing.
Creationists have particular problems with scientific concepts that conflict with (read: flat-out disprove) a 6,000-year-old Earth and a global flood occurring 4,300 years ago. The resulting dissonance leads them either to propose silly ideas such as baramins, a changing speed of light, or supposed problems with radiometric dating, or to dismiss the Big Bang and evolutionary theory as mere "beliefs", claiming for example that plants could appear before the Sun because the light of God stood in for the daystar, or that no other life-bearing planet exists because the Bible does not mention one. Yet at the same time they have no problem with galaxies billions of light-years away, which the Bible also fails to mention, or with a universe far bigger and richer than anything the biblical authors could fathom.
"People don't like to think. If one thinks, one must reach conclusions. Conclusions are not always pleasant."
—Attributed to Helen Keller (possibly erroneously)
Most people will (eventually) change their beliefs on a subject once enough contradictory evidence emerges, because sometimes that evidence is so solid and undeniable that it is easier to give up a complex worldview than to keep generating excuses for why the evidence against it must be false. Other individuals, especially those with support networks reinforcing a delusion or worldview, will go to such great lengths to rationalize away dissenting ideas that, past a certain point, an admission of error would cause the collapse of an entire web of mutually supporting beliefs. This would leave the mind unable to do its work, since everything it thought it knew would now be useless, resulting in agony and an extreme, death-like fear, and triggering emergency self-protection mechanisms. Those mechanisms push the individual either into an introverted reaction, all-encompassing (willful) ignorance and withdrawal from any contact with the conflicting parts of the real world, or into an extroverted reaction of attacking and trying to destroy the sources of the conflicting information as heresy.
By comparison, a person who actually went through such a collapse without those protections would end up completely unable to accept himself or herself, or to choose the "right" action in even the simplest situations, making it impossible to continue living. So the two protective reactions above, ugly as they are, remain the least bad of three very bad options, because they at least mean survival.