The Science Behind Why Your Facebook Friends Ignore Facts

In this new edition of our Monday posts, we bring you interesting tidbits about things in everyday life. In today’s post, Mike Fishbein explains cognitive bias and how it affects you.


You may find yourself wondering: Why is the world so divided on religion and politics? Why do people support Donald Trump? Or Hillary Clinton? Why can’t I convince my friend to change his mind?

In this article, I share how our brains deal with information overload — and the associated cognitive biases that prevent us from correctly understanding the facts.

1. The Availability Heuristic: We Believe What’s Top of Mind.

The availability heuristic is a mental shortcut that relies on the immediate examples that come to mind when we judge whether something is true. When it comes time to make a decision, we lean on what is already top of mind, give that information greater credence, and tend to overestimate the probability of similar things happening in the future.

This shortcut is helpful in decision-making because we often lack the time or energy to investigate complex issues in greater depth. The availability heuristic allows people to arrive at a conclusion more quickly.

However, like other shortcuts, it can lead us astray. Just because something is in our mind doesn’t necessarily mean it’s true.

For example, after Donald Trump referred to Hillary Clinton as “crooked”, we were primed to keep interpreting her behavior as crooked. That doesn’t necessarily mean she isn’t crooked; it just means our brains are more likely to reach that conclusion because it’s easier than evaluating the situation from scratch.

2. Attentional Bias: We Believe What We Pay Attention To.

Attentional bias is the tendency for our conclusions to be shaped by our recurring thoughts. It also predicts that we preferentially allocate attention to threatening stimuli over neutral or positive ones.

If you think what you see is the whole story, you’re displaying attentional bias. To arrive at a more accurate conclusion, you also need to consider the things you don’t see.

For example, when someone looks at only one or a few economic data points and concludes that the economy is strong or that the government is doing a great job, he is forgoing the time and energy needed to gain a more complete picture.

3. The Illusory Truth Effect: We Believe What’s Repeated.

Repetition is another way that misconceptions can enter our knowledge base. Per the illusory truth effect, repeated statements are easier to process and are subsequently perceived as more truthful than new statements. Our brains spend less time and effort processing information that’s been repeated and take it as truth simply because it’s familiar.

The reverse is also true: people interpret new information with skepticism and distrust.

Take the topic of nutrition as an example. For decades we’ve repeatedly been told that eating fat is unhealthy. Despite recent studies proving the contrary, our diets continue to be high in sugars and processed carbohydrates.

It doesn’t always matter whether what we’re told is true or false; we’ll believe it as long as it’s repeated enough. It’s the frequency, not just the plausibility, that matters.

4. The Mere Exposure Effect: We Believe What’s Familiar.

Not only does repeated exposure make us more likely to believe something, it also makes us more likely to view it favorably. Per the mere-exposure effect, also known as the familiarity principle, we tend to like things more when they’re familiar to us.

Repeated exposure to a stimulus increases perceptual fluency, the ease with which information can be processed. Perceptual fluency, in turn, increases positive sentiment. Familiar things require less effort to process, and that feeling of ease signals truth.

We are attracted to familiar people because we consider them safe and unlikely to cause harm. We can even adapt to like objectively unpleasant things, as when prisoners come to miss prison.

If people were rational, there would be no need for emotional marketing, advertising, or political speeches; we would simply educate people about the facts.

When I first began realizing that people were irrational, I was confused and frustrated. I felt hopeless. I wished things were different. But that only caused anxiety.

Accepting that most people — myself included — are irrational most of the time actually eased the stress. Now I don’t have to wonder about why my Facebook friends believe half the crap the media espouses, why they endorse a political candidate that I don’t agree with, or why they believe in their respective religion.

We live in an incredibly complex world. Familiar information is easier to understand and repeat. You can see how the cognitive biases above help us make decisions faster and therefore stay alive, but they don’t necessarily help us find the truth. In a weird way, being irrational actually seems rational.


Original article can be found here.

Did you like the new content? Let us know here at our blog or over at our forum!