
What to do when the facts fail to convince?

Why a worldview undermines the evidence

Have you ever noticed that when you present people with facts that contradict their deepest beliefs, they immediately change their minds? Neither have I. In fact, people seem to cling to their beliefs even more tightly in the face of overwhelming evidence against them. The reason is that people feel the conflicting data undermines their entire worldview.

Creationists, for example, reject the fossil and DNA evidence supporting evolution because they worry that secular forces are gnawing away at religious belief. Opponents of vaccines distrust the big pharmaceutical companies and believe that money corrupts medicine, a view that leads them to believe that vaccines cause autism despite the truth, inconvenient for them, that the one and only study claiming such a link was retracted by the journal that published it and its author was accused of fraud. Proponents of conspiracy theories about the 9/11 attacks fixate on specific details, such as the melting point of the structural steel in the World Trade Center towers that supposedly caused their collapse, because they believe the US government lies and conducts false-flag operations to create a "new world order". Climate change deniers pore over tree rings, ice cores and greenhouse gas concentrations because they are passionate advocates of freedom, especially the freedom of markets and industries to operate without restrictive government regulations weighing them down. The doubters of Obama's birth in the USA scrutinized the president's long-form birth certificate for evidence of forgery because they believed that the first African-American president of the USA was a socialist determined to destroy the country.

In all of these examples, believers felt that the skeptics threatened their deepest worldviews, so the facts became an enemy to be destroyed. The power of belief over evidence stems from two factors: cognitive dissonance and the backfire effect. In the classic book by the psychologist Leon Festinger and his colleagues, published in 1956 under the title "When Prophecy Fails", they describe what happened to members of a cult who believed in aliens when the alien mothership failed to arrive on the appointed date. Instead of admitting their mistake, "members of the group frantically tried to convince the world of their beliefs, desperately trying to relieve the dissonance gnawing at them through repeated prophecies in the hope that one of them would come true." Festinger called the phenomenon cognitive dissonance: the nagging tension that accompanies holding two contradictory opinions at the same time.

In the book "Mistakes Were Made (But Not by Me)", published in 2007 by two social psychologists, Carol Tavris and Elliot Aronson (a former student of Festinger), they document thousands of experiments demonstrating how people "twist" facts to fit their preconceptions and reduce dissonance. In a metaphor the researchers developed, the "pyramid of choice", they describe two people standing side by side at the top of a pyramid and show how, once each chooses to defend a particular position, the two quickly drift apart and end up in opposite corners at the bottom of the pyramid.

In a series of experiments conducted by Professor Brendan Nyhan of Dartmouth College and Professor Jason Reifler of the University of Exeter, the researchers identified a related factor: the backfire effect. According to their findings, "corrections of misconceptions actually strengthen the group's hold on them". Why does this happen? "Because the correction threatens their worldview or their self-concept." For example, subjects were given fake newspaper articles confirming common misconceptions, such as the presence of weapons of mass destruction in Iraq before the Second Gulf War. When the subjects were then given a corrective article stating that no such weapons had ever been found in Iraq, the liberals among them, who had opposed the war in the first place, accepted the corrective article and rejected the earlier ones. Conservatives, who had supported going to war, tended to do the opposite. What is more, they reported that after reading the corrective article they were even more convinced that such weapons had been found in Iraq; in their view, the article only proved that Saddam Hussein had hidden the weapons or destroyed them. Indeed, Nyhan and Reifler note that among many conservatives, "the belief that Iraq possessed weapons of mass destruction immediately before the US invasion persisted long after even the Bush administration itself had reached the opposite conclusion."

If corrective facts only make things worse, what can be done to convince people that their beliefs are wrong? From my personal experience: 1. Keep emotions out of the conversation. 2. Discuss the substance of the matter and do not attack your interlocutors (no ad hominem arguments, and no comparing their views to Hitler's). 3. Listen carefully and try to articulate the opposing position accurately. 4. Behave respectfully. 5. Acknowledge that you understand why someone might hold a different position from yours. And 6. Try to show that accepting the new facts does not require a change of worldview. These methods will not necessarily change people's minds, but now that the American nation (and the world at large) is caught up in a political battle over facts, they may help reduce unnecessary partisanship.

About the writer

Michael Shermer is the publisher of the journal Skeptic (www.skeptic.co). His new book, "The Moral Arc", was recently published. Follow him on Twitter: @michaelshermer

6 comments

  1. An important article, especially in its central idea. That said, in the first paragraph, where several examples illustrate the automatic denial by people of a certain opinion, it seems to me that some of the examples are strongly colored by a personal viewpoint, and therefore undermine the article's message.

    The most striking example in my eyes was the one about "denial of climate change".
    Even if we grant that at one extreme there are people who ignore any climate change, at the other extreme, well entrenched in the media, there is another group that focuses on one scientific theory among many and makes it predict everything, mixing politics into what should be pure science.

    Familiar claims against Greenpeace and company:
    Nuclear energy and even hydroelectric power are much more efficient than solar panels and wind turbines. Why not consider using them?

    How much of global warming is indeed man-made and not a natural phenomenon?
    Those who want to dig deeper are invited, for example, to listen to Prof. Yoni Dovi, an expert in scientific methodology.

  2. I would add that, from my experience and understanding, what keeps people from being able to change their minds, as the article says, are their internal beliefs at the level of identity (beliefs like: "I'm stupid, I'm a nobody, I'm not good enough", etc.). As soon as some external event occurs (including the presentation of contradictory facts) that threatens a person's perception of his own identity, and he holds one of these limiting identity-level beliefs, then admitting a mistake amounts to admitting that the belief is correct.
    Since such people spend most of their lives bending over backwards to prove that this belief about themselves is wrong (through compensation patterns: behavior patterns whose main purpose is to prove to the world and to themselves that the belief is false), they are heavily invested in their opinion.
    The feeling of cognitive dissonance threatens the very same thing, and if they hold more than one such belief, all the more so.

    These limiting beliefs usually form in childhood, and if they are not released or replaced with empowering beliefs, the person develops compensation patterns, including one called a "winning image": a specific compensation pattern that brings him many results in life (and hurts him elsewhere).

    Conversely, people who have released this inner belief will find it much easier to change their minds and/or deal with cognitive dissonance because their identity level is not threatened.

  3. In my opinion, processing new information in the brain ought to support a change of opinion, but the problem is that a person's worldview sits deep in his consciousness. It is the product of many experiences and the many interpretations given to them over the years, which formed an opinion stored in the person's semantic memory. Therefore, even if the new facts we are exposed to support a different opinion, they can only affect short-term memory, which cannot change the person's worldview all at once. The moment the person switches on his so-called automatic pilot, it is driven by the memory in the subconscious. For a person's opinion to change, he has to go through many events that will assimilate the new interpretation into his memory.
    Sometimes one can even see a sudden change in a conviction, when it is accompanied by a deep emotional experience equal in energy to the experience underlying the opposite conviction in the subconscious.
    For example, when a person who deeply hates someone takes a DNA test and sees that the one he hates is his brother, in one second his mind about him will change completely, because of the deep experience it caused him.

    As for why a person tends to close ranks around his old opinion when exposed to information supporting a different one, the reason may be that as soon as he sees that his counterpart is eager to convince him of the other opinion, he feels he must represent his own. We know that even someone trying to convince us of an opinion different from ours is speaking from his subconscious, and as a counterweight I am supposed to present mine. Indeed, as soon as the would-be persuader presents the relevant information neutrally, without his personal opinion, the listener treats the information the same way.

  4. I'm one of those who change their mind from one extreme to the other without blinking, and I'm not ashamed of it at all.
    In politics, for example, I have never voted twice for the same party. I am constantly disappointed with the politicians, and I have zigzagged sharply from right to left and back again... and lately I don't bother going to vote at all, because I think the game is already rigged: anyone who makes it into the Knesset in Israel must be a manipulative type... and there is no connection between what a Knesset member declares while in the opposition and what he does later in the coalition, so there is no one among the 120 worth standing in line at the polls for...

  5. The opposite happened to me: a change in the facts made me change my view. I am referring to my political position.
    For many years I advocated, preached and voted for a certain camp, but a combination of well-known, drastic events convinced me that my view was wrong and that I had to examine its validity and its fit with the constraints of reality.
    Of course, it was a process that took some time, and as far as I could tell it had four stages:
    Stage 1 - Acknowledging that my view does not fit the data of reality.
    Stage 2 - Deliberation and examination: what is the right way?
    Stage 3 - Forming and consolidating the new point of view.
    Stage 4 - Presenting and defending my new, updated view to my social environment, including a difficult, and sometimes emotional, confrontation with people who, as described in the article, did not identify with the change I presented and stuck to their position (which was, as mentioned, my old position), despite the facts I presented to them that justified and (in my opinion) compelled the change in political position.

    From this I conclude that, as the article indicates, the same data causes some people to change their minds and others to stick to their opinions even more strongly.
    It would be interesting to know whether these two groups can be characterized as two "types":
    the type who changes his mind according to the changing data, and opposite him, the type who sticks to his position despite the changing data.
    And what are the other personal characteristics (if any) of each type?

  6. In my opinion, there are people who cannot be convinced to change their minds, but there are also people who can.
    The problem is that people who tend to change their minds are seen as weak in character, weak in conviction, unreliable, disloyal, zigzagging... and all the rest of the traits that in our culture are, for some reason, considered negative.
    And why are the opposite qualities, conviction, reliability, loyalty, considered positive?
    Because it is convenient for the leaders of our herds to rule over a herd made up entirely of such people, so our rulers and leaders make an effort to educate us in that direction.

    Note - the author is Avi Cohen and not the site manager. (Because when I change my nickname, the message automatically appears as coming from the moderator.)
