Dunning-Kruger Effect

The Dunning-Kruger Effect is a cognitive bias where people believe they are more capable than they actually are. A person with low ability is more likely to overestimate themselves. This is due to their limited perception of the task at hand and their lack of previous experience, both of which lead them to miscalibrate their judgement. A person who has tried to shoot a basket 1,000 times is better placed to judge their own skill than someone who has tried 10 times.

Neural networks exhibit similar behaviour during learning. When a neural network is trained on a dataset, it optimizes its own parameters to produce a best guess for the problem at hand. However, if there is not enough data to learn from, the network will often fail to generalize: it has failed to capture an “average” of the problem, which causes it to misread real-world data.
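As a minimal sketch of this failure to generalize (not from the original text; a polynomial fit stands in for a neural network, and the function and sample sizes are made up), a model with enough parameters can fit a handful of training points perfectly and still misread the underlying pattern:

```python
import numpy as np

def true_fn(x):
    # The "real world" pattern the model is supposed to learn.
    return np.sin(2 * np.pi * x)

rng = np.random.default_rng(0)

# Tiny training set: only 5 noisy samples of the underlying pattern.
x_train = np.linspace(0, 1, 5)
y_train = true_fn(x_train) + rng.normal(0, 0.1, size=x_train.size)

# A degree-4 polynomial has enough parameters to pass through all 5 points,
# so it can "memorize" the training data almost exactly.
coeffs = np.polyfit(x_train, y_train, deg=4)

# Near-zero error on the data it has seen...
train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)

# ...but a much larger error on unseen points: it failed to generalize.
x_test = np.linspace(0, 1, 200)
test_mse = np.mean((np.polyval(coeffs, x_test) - true_fn(x_test)) ** 2)

print(f"train MSE: {train_mse:.2e}, test MSE: {test_mse:.2e}")
```

The model scores near-zero error on the five points it saw, but a much larger error on fresh points drawn from the same function; more training data would pull the two errors closer together.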

Everyone is susceptible to this.

In the original study, Dunning and Kruger ran tests on a group of students. Those who scored low overestimated their performance, while the competent ones underestimated theirs. This pattern is common whenever we set out to learn something new. Initially, with little knowledge, we believe we know a lot more than we actually do. But as we progress and see the big picture, how hard or how vast the topic really is, we tend to doubt ourselves. Psychologists call this the valley of despair.

This curve resembles the hype cycle. Even at a large scale, especially in the adoption of new technology, large groups of people tend to follow the same steps.

Self-serving Bias

This is a bias that leads a person to take credit for positive outcomes while blaming external factors for negative ones. In other words, anything we do that we perceive as good, or that leads to a good outcome, gets positively reinforced by self-serving bias. The hidden problem is whether the action is actually good for us in the long term, and whether it aligns with our own big picture and with that of society. Assessing each action on its own merits and demerits before the outcome is known can be more helpful, so that the outcome does not cloud our judgement.

A concept tied to this bias is the locus of control: the degree to which a person believes that outcomes are driven by their own actions rather than by external forces. In everyday life, a person who attributes outcomes to their own thinking and actions is said to have an internal locus of control. Studies show that people develop a more internal locus of control with age.

Action Bias

What is it?

Action bias is the tendency to do something in a situation, driven by factors like the need for a sense of control, social norms, and peer pressure. We feel compelled to act even when there is no evidence that acting will lead to a favourable outcome. Making a decision without processing all the information can lead us to take less effective action. In society, the general bias is against inaction, which gives the impression of not doing what is necessary or not putting in the effort. Against that backdrop, taking premature action can look much better than it actually is. This bias can be considered a survival instinct: our innate drives to hunt and find shelter have carried over, despite the very different environment and lifestyle of today.


An example of this bias is a person choosing to undergo a medical treatment because it feels better than no treatment at all, even if the treatment hasn't yet been proven to work. In meetings and conversations, we have a tendency to “say something”. It doesn't have to actually add to the conversation, but it gives a sense of contribution nevertheless. Charlie Munger calls it the Say Something Syndrome in his famous talk. This is common in investing too: activity from your peers, general market sentiment from the news, and market activity can all lead to suboptimal action.

Extrapolation Bias

Extrapolation bias is the tendency to take a recent experience and project that it will continue into the future. We tend to think in a straight line, and it is very hard for us to imagine or perceive future anomalies. One common place this occurs in finance is in valuing a company. If the company grew earnings at 25% for the past 3 years, we are easily drawn to choosing 25% as the forecast for the coming years as well. But that is far from certain. The ability of a business to create value in the past does not guarantee its ability to do the same in the future. It is like correlating a past coin toss with one that is about to take place: we know the previous outcome does not influence the next one. The worst part of this bias is how easy it is to make a decision with the underlying assumption that the status quo will hold in the future.
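As a toy illustration of how much rides on the status-quo assumption (the company, numbers, and horizons here are all hypothetical), compare a straight-line projection of the recent growth rate with a scenario where growth fades:

```python
# Hypothetical company: earnings of 100 grew 25% a year for three years.
past_growth = 0.25
earnings_now = 100.0 * (1 + past_growth) ** 3  # about 195.3 after the run-up

# Extrapolation bias: assume the recent 25% simply continues for 5 more years.
naive_forecast = earnings_now * (1 + past_growth) ** 5

# Alternative scenario: growth fades to a more ordinary 5% a year.
faded_forecast = earnings_now * (1 + 0.05) ** 5

print(f"naive: {naive_forecast:.1f}, faded: {faded_forecast:.1f}")
# The straight-line projection comes out at more than twice the faded one,
# so the valuation hinges almost entirely on the assumption that the
# recent past keeps repeating itself.
```

The arithmetic is trivial, which is exactly the trap: plugging in yesterday's growth rate feels like analysis while quietly assuming the status quo.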

Freudian Slip

A Freudian slip is an error that occurs due to an internal train of thought; it is said to happen through the interference of the unconscious. It is characterized as a slip because it indicates what you are thinking behind the scenes. Sometimes these slips can be quite revealing, to you and to others: a peek into what you believe, what you wish, and what you haven't addressed yet. At other times, a slip can be your mind's way of telling you it is sitting on something. Verbal errors are not random; they are puzzles to be solved.

Imagine a text engine like GPT-3 trained on a large body of text concerning one person, perhaps all the speeches of a famous personality, and then asked to write a speech posing as them. If some parts of the speech, or even a single sentence, didn't make sense, they would easily be discarded as outliers. But what if that is also an indication of a Freudian slip? Can this trait be replicated in systems like these?

Availability Bias

Availability bias is the tendency to over-value something just because you were able to recall it well. The underlying assumption is that if we remember it so clearly, it must be really important, and that can be far from reality. If we see two recent news stories about airplane crashes, we perceive a higher probability of another one occurring. One part of this bias is how well you can recall a similar incident or related memory: the more recent the memory, the better the chances you'll recall it. The second part is frequency: how many such incidents you can recall has a direct effect on what you perceive. In both cases, the ease of retrieval is what matters. You would remember and retrieve an accident you were part of far more easily than one that was narrated to you.

Anchoring Bias

Anchoring bias causes you to be swayed by the information that was presented first, even if it is not relevant to the task at hand. Information that follows is judged relative to what came first; that first piece of information has become anchored in your thought process. This happens when you look at the price of a stock. When you buy a stock at a certain price and it eventually gives you a higher return, anchoring kicks in: even though the fundamentals of the business haven't changed, you won't want to add to your existing position, because the initial price is what you are anchored to. When looking at deals, we are trained to compare prices to see whether something is a bargain, and by default the first price is what the comparison is made against. This happens a lot in sales as well. The salesman almost always starts the pitch at a much higher price, anchoring the buyer to it, so any discounts made afterwards look like a relatively good offer.


Metacognition

Metacognition is “thinking about thinking”. It is being aware of what and how we think, and why we think in a particular way. This is a skill that can compound very well if used properly. Metacognition is critical to learning: we need to engage different ways of learning for the different types of knowledge we are trying to acquire. Reflecting on the very steps of learning, and on the acts involved, is part of metacognition. Spending time on reflection after tasks that particularly tested the limits of your ability is a good way to practice it.

One key aspect of metacognition is self-regulation, which can be split into three parts: planning, monitoring, and evaluation. Planning involves trying to understand the problem and choosing the right set of strategies and tasks. Monitoring involves using your own awareness to judge, during the process, how effective you are. Evaluation refers to comparing the outcome of the task against the original goal and figuring out how it can be improved the next time around. This fits well into Charlie Munger's latticework of mental models: the mental models are the set of tools you use to solve a problem, and each time you go through this self-regulation loop you sharpen some of those tools bit by bit.

First Conclusion Bias

This is a bias where there is a tendency to reach a conclusion based on the first stimulus we get. It may also be an energy-saving behaviour. Once we have formed our first conclusion, we keep giving it more ammunition. This leads to two things: first, we stop accepting newer ideas and viewpoints into our thinking; second, we try to reason out why the first conclusion is the right one. A mental routine to counter this bias is to acknowledge it and think inversely, assuming the first conclusion HAS to be wrong. This allows you to ask more objective questions rather than settling for the first conclusion.

For example, we see an empty shelf in the supermarket. Our first conclusion might be that the product is currently trending and being “over-bought”. But there could be other explanations as well. Maybe the product was recalled due to a defect. Or the shelves were emptied because the product wasn't selling at all and was closing in on its expiration date.

Concision Bias

This is a bias that occurs mainly in media. With shrinking attention spans, media institutions are forced to summarize and condense the news to make it more appealing to readers. But summarization comes at a cost: the content loses context, and that can have a big effect on how the news is perceived. Summarization also means that something has to be omitted. Different viewpoints can portray a news story in different ways, and the choice of what to omit can depend on the institution and the biases built into it. At the other end of the spectrum are books, which offer far more context and background on a particular topic or issue.

This bias has a political application as well. Politicians and campaign managers can focus on certain issues and leave out the nuances. This then trickles down to other media like tweets and sound bites. Concision bias is at the center of the whole “right-leaning” and “left-leaning” media discussion: outlets lean by leaving out some parts of the story.