
The Truth About the Human Brain: Why Facts Don’t Change Our Minds


In the 1970s, researchers at Stanford University conducted a study to determine how flexible human beings are when it comes to changing their beliefs. The study involved undergraduate students, who were told it was about suicide.

Each of them received two suicide notes, one fake and one genuine, and had to identify the genuine one. The participants were then divided into two groups based on their supposed results: the first group was informed they had done very well, while the other group was told they had done much worse.

In stage two, the participants were told the results were fake. They were then asked to guess how many notes they had actually identified correctly, and how many an average student would get right.

Strangely enough, the students who had been told they’d scored high in the first stage of the experiment estimated they had done very well, even though they knew the scores were fake. The students from the other group estimated their performance as poor, based on the equally fictitious initial results.

In another study conducted at Stanford, a different group of students was given sets of information on two firefighters, Frank and George, including the firefighters’ results on the ‘Risky-Conservative Choice Test’.

In one version, Frank was described as a good firefighter who chose the least risky option when tested. In the second version, Frank was described as a poor firefighter who had also chosen the least risky option on the test.

Again, the participants were then told that the information they had received was not true. Afterwards, they were asked to give their own opinion on how a good firefighter would react in high-risk situations. As in the first study, the students who had received the version describing Frank as a good firefighter believed he would stay away from risks.

Those who had received the version describing Frank as a poor firefighter thought he would be inclined to take risks. Just as in the first study, the initial information shaped the students’ opinions, even though they knew it was fake.

These two Stanford studies confirmed a shocking truth: no matter how intelligent they are, people find it hard to change beliefs once they have formed them.

Why are rational human beings so irrational in situations like these?

In “The Enigma of Reason”, Dan Sperber and Hugo Mercier try to answer this question. The book strives to explain reason, the trait that supposedly makes us human, and why it doesn’t work the way we expect it to.

The two cognitive scientists claim that our reasoning has developed in the course of evolution and should be regarded in that light. They explain humans’ inclination towards biased conclusions with the concept of cooperation.

Cooperation, not reason, is what sets us apart from other species, yet cooperation is difficult to establish and almost as difficult to maintain. According to the book, reason did not develop so that we could draw logical conclusions on our own, but to facilitate cooperation with other individuals.

But even though our reason aspires towards the truth, it often ends up leading us away from it. Our tendency to embrace information that supports what we already believe, and to dismiss information that contradicts it, is called “confirmation bias”. It is often described as a form of wishful thinking: we select the data that makes us feel good and ignore the information that might change our view.

One such study on confirmation bias was conducted at Stanford in 1979. It involved two groups of students with opposing opinions on capital punishment: one group was against it and believed it did not affect crime, while the other was in favor and thought capital punishment could decrease the crime rate.

The participants were given two studies, both of them fabricated, and asked to evaluate them. The first contained data supporting capital punishment, while the other challenged it. The statistics in both were designed to be equally compelling.

As expected, the pro-capital punishment participants rated the first study as highly reliable and dismissed the one that questioned capital punishment as dubious. The other group of participants did the exact opposite.

When the experiment finished, the participants were asked to express their opinion on capital punishment again. Those who’d been in favor of capital punishment at the beginning now supported it even more. Those who’d been against it at the beginning were now even more opposed.

Hugo Mercier and Dan Sperber view confirmation bias as a serious flaw, one that could be downright dangerous for humans. The analogy they use to explain this is a mouse that thinks the way humans do.

That mouse would like to live in a world without cats and would eventually persuade itself that it does. A mouse that refuses to believe in a very real threat will probably end up dead.

The fact that humans have survived despite being prone to ignoring evidence of real threats is strange. The scientists believe that our adaptability and our “hyper-sociability” are the most likely explanations for this evolutionary irony.

Strangely enough, humans have no problem being objective about other people’s opinions. We tend to be stubborn only when it comes to our own beliefs. This is what Mercier and Sperber call “myside bias.”

To test this idea, Mercier conducted an experiment in which the participants were asked to answer several reasoning problems. They were then given an opportunity to change their answers. Most participants felt there was no need to do so.

Finally, the participants were shown one of the reasoning problems they had already answered, followed by their own answer and a different answer given by another participant. At this point, they could once again change their response.

However, there was a catch: the answers presented as their own had actually been given by another participant. Only about half of the participants noticed that something was off, and by the end of this final stage almost 60% had changed their original responses.

This evolutionary ‘mistake’ worked well enough in the past, when people lived in small groups of hunter-gatherers. For our ancestors, detached logical reasoning was of little use; what mattered was being seen to be right in order to keep your social status. In other words, people were expected to stick to their opinions and not let others deceive them.

Modern society is very different. Today, we’re surrounded by fake news, fabricated studies, and conflicting information.

Clearly, our reasoning capacity can’t keep up with this fast-changing world. Put in that context, the fact that reason constantly fails us doesn’t come as a surprise.

If reason merely confirms what we already want to believe, why does it even exist?

In their book “The Knowledge Illusion: Why We Never Think Alone”, the cognitive scientists Steven Sloman and Philip Fernbach also argue that our nature as social beings has hugely influenced how we reason.

“The human mind is both brilliant and pathetic.” From mastering fire to walking on the moon, humans have achieved so much. And yet, many of us don’t even know how a toilet actually works!

Yale University conducted a study in which students were asked to rate their understanding of how simple devices like toilets and locks work. Afterwards, they were asked to describe that knowledge in detail, and then to rate themselves again.

By the end of the study, the students had realized how little they actually knew about these devices.

How is it possible that the same creatures who’ve managed to sequence genomes don’t know how a lock works? The answer lies in the way we build our intelligence on the community around us.

The authors call this phenomenon the “illusion of explanatory depth”: we believe that we know more than we actually do. From the moment our ancestors learned how to collaborate when hunting, we’ve been dependent on each other’s knowledge.

In fact, we cooperate so well that we are unable to draw a line between what we know and what other people know.

According to the scientists, this ‘borderlessness’ is the key to explaining our progress. If every individual tried to learn everything alone, the development of humanity as a whole would be much slower.

However, this becomes problematic in certain areas, such as politics. Not knowing how a lock works won’t cause any trouble, but when it comes to our understanding of the wars our country wages, things are rather different.

In a 2014 survey, conducted shortly after Russia annexed Crimea, participants were asked to locate Ukraine on a map and to say how they thought the USA should react. Most of them didn’t know where the country even was, and the further off their guesses were, the more they favored military intervention. Sloman and Fernbach concluded that strong feelings about an issue don’t necessarily stem from deep understanding.

This can become dangerous in some cases. When individuals whose reason is clouded by personal feelings band together, they feel even more strongly about their opinions. Now imagine that such a group is made up of politicians deciding the country’s fate.

In conclusion, the authors encourage us to focus less on expressing personal opinions and more on understanding things thoroughly. According to them, this is the only kind of reasoning that will change people’s views.

The good news is that science works as a corrective. It moves forward no matter what and is bound to drag us along with it.

The psychiatrist Jack Gorman and his daughter Sara Gorman, a public-health specialist, examine the gap between scientific facts and personal beliefs in their book “Denying to the Grave: Why We Ignore the Facts That Will Save Us.” They focus in particular on health-science denial, raising questions like “Why do some parents refuse to vaccinate their children?”

Although there is more than enough scientific evidence to prove such beliefs wrong, we persist in holding opinions that could prove fatal!

One possible explanation is that we get a rush of dopamine, a chemical that triggers feelings of pleasure, when we take in information that supports our opinions. Basically, it’s wrong, but it makes us feel good!

The biggest challenge is changing such strong beliefs. Scientific information doesn’t seem to help, whereas appealing to people’s emotions would mean drifting away from scientific facts. What is the best approach? It’s a question that remains unanswered, at least for the time being.

Source: The New Yorker