What are cognitive biases?
Whether because our brain has limited capacity, because we do not always have all the information we would like, or because we are overwhelmed by uncertainty about the consequences of one decision or another, on many occasions we take mental "shortcuts" to solve problems. These shortcuts, which we take unconsciously, are called "heuristics" in psychology; they help us simplify the enormous number of mental processes we constantly carry out and make our daily lives more manageable.
The fact is that our brain cannot process all the information it receives through the senses, so it needs to make a selection. When our mental shortcuts, or heuristics, lead us to erroneous conclusions, we call them cognitive biases.
The main known cognitive biases
Memory bias
We all know that our memory is not perfect: it fades with time and easily leads us into unconscious errors. Research reveals that when we evaluate memories in order to make decisions about our future, they are often biased toward events that were very positive or very negative, and we tend to remember unusual or infrequent events rather than everyday ones. The reason is that the brain gives much more weight to extraordinary phenomena, probably because of the importance they had for learning throughout evolution. As a result, this bias in our memory distorts our predictions about the future.
To avoid this bias, it is recommended to try to recall as many similar events as possible; in this way we avoid basing our judgment on extremes, which are often unrepresentative.
The planning fallacy
This bias refers to our tendency to underestimate the time it takes to complete a task. Apparently we tend to plan projects without enough detail to estimate the individual tasks. The planning fallacy causes not only delays, but also cost overruns and reduced benefits due to erroneous estimates.
As the American scientist Douglas Hofstadter puts it: "It always takes longer than you expect, even when you take into account Hofstadter's Law."
Illusion of control
This bias lies behind many superstitions and irrational behaviors. It is the tendency to believe that we can control certain events, or at least influence them. It is this kind of thinking that has led humans, since time immemorial, to create rituals and superstitions that give us a sense of security. A present-day example can be seen in athletes who repeat certain behaviors hoping they will influence things like their ability to score goals, which obviously depends on many other objective factors.
Choice-supportive bias
Once we choose something (from a partner to an item of clothing), we tend to see that choice in a more positive light, even if it has clear defects. We tend to magnify its virtues and minimize its flaws.
Effect of environmental perception
Although it may seem strange, the environment around us exerts a great influence on human behavior. A deteriorated, chaotic, and dirty environment leads people to behave less civically and inclines them toward more vandalism and criminal acts. This effect underlies the "broken windows theory" studied by psychologist Philip Zimbardo.
Availability bias
The availability bias, or availability heuristic, is a mechanism the mind uses to assess how likely an event is to happen. The more easily an event comes to mind, the more likely it seems to us: more recent information is easier to remember, and the more vivid it is, the less random it appears.
This cognitive bias applies to many areas of our lives. For example, it has been shown that doctors who have diagnosed two consecutive cases of a certain uncommon disease believe they perceive the same symptoms in the next patient, even while aware that it is very unlikely (statistically speaking) to diagnose three consecutive cases of the same disease. Another example is a person who claims that smoking is not so harmful to health on the grounds that his grandfather smoked three packs a day and lived past 80, an argument that overlooks the possibility that his grandfather was an atypical case from a statistical point of view.
The bottom line is that we overestimate the importance of readily available information (and therefore draw erroneous conclusions). Lotteries, for example, exploit the availability bias: if people understood the real odds of winning, they would probably never buy a ticket in their lives.
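To see how far intuition can drift from the real numbers, here is a minimal sketch computing the odds and expected value of a hypothetical lottery. The figures (100,000 tickets, a single 400,000-euro prize, a 20-euro ticket) are purely illustrative assumptions, not taken from the text:

```python
# Hypothetical lottery: illustrative figures only.
ticket_price = 20.0
num_tickets = 100_000
jackpot = 400_000.0

# Probability that one ticket wins the single prize.
p_win = 1 / num_tickets

# Expected value of buying one ticket: average win minus its cost.
expected_value = p_win * jackpot - ticket_price

print(f"Probability of winning: {p_win:.5f}")
print(f"Expected value per ticket: {expected_value:.2f} euros")
```

Under these assumptions every ticket loses 16 euros on average; the vivid, easily recalled image of a winner crowds out that unglamorous arithmetic.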
The Dunning-Kruger Effect
The Dunning-Kruger effect consists of a distorted self-perception in which individuals with poor skills or knowledge believe exactly the opposite: they consider themselves more intelligent than other, better-prepared people, and are certain that they are in some way superior, thus rating their ability above its real level. This distortion is due to the subject's cognitive inability to recognize their own ineptitude, because acknowledging it would undermine their confidence and self-esteem. Conversely, competent individuals falsely assume that others have ability or knowledge equivalent to, or even greater than, their own.
The authors of this discovery, David Dunning and Justin Kruger of Cornell University, tried to find out whether there was any remedy for the inflated self-assessment of the least capable. Fortunately, there was: education. Training and teaching can help these individuals realize how little they really know.
Charles Darwin already said at the time: "Ignorance breeds more confidence than knowledge."
The halo effect is a cognitive bias whereby the perception of one trait is influenced by the perception of earlier traits in a sequence of interpretations. That is, if we like a person, we tend to attribute favorable characteristics to them even though we may not have much information about them. For example, we think of someone as friendly, and this makes us assume we already know other, more specific characteristics: that they are also intelligent, for instance.
Media stars (actors, singers, celebrities...) demonstrate the halo effect perfectly. Because they are often attractive and friendly, we almost automatically assume that they are also intelligent, kind, and of good judgment. The problem appears when these assumptions are wrong, since they are often based on superficial traits.
This tendency seems to be present at all social levels, even in contexts where objectivity should be paramount. For example, on average, attractive people receive shorter prison sentences than others convicted of similar crimes.
Bias of corrupt power
Many will recognize the reality behind this bias: there is a documented tendency for individuals with power to become easily corrupted, especially when they feel they face no restrictions and enjoy total freedom. Sound familiar? Politics, business, celebrity, elite sport, and even royalty are full of corruption cases.
Projection bias
This bias describes the unconscious tendency to assume that others have thoughts, beliefs, values, or positions similar to our own, as if they were a projection of ourselves.
Lake Wobegon effect or better than average effect
It is the human tendency to describe oneself favorably, emphasizing one's good qualities and believing oneself to be above average in intelligence, cunning, or other traits. Unless, of course, the person in question has self-esteem problems.
Impact bias
This bias refers to our tendency to overestimate our emotional reactions, overestimating the duration and intensity of our future emotional states. Research shows that most of the time we do not feel as bad as we expected when things do not go our way. This bias is one of the reasons we are often wrong when predicting how future events will affect us emotionally. Studies have shown that months after a relationship ends, people are usually not as unhappy as they expected, and that lottery winners eventually return to the level of happiness they had before winning the prize.
False consensus effect
The false consensus effect is similar to the projection bias described above: most people judge their own habits, values, and beliefs to be more widespread among others than they really are.
Representativeness heuristic
This heuristic is an inference we make about the probability that a stimulus (a person, action, or event) belongs to a certain category. For example, suppose we say that Alex is a methodical young man whose main hobby is computers. What do you think is more likely: that Alex is an engineering student, or a humanities student?
When asked questions of this kind, most people tend to say that Alex surely studies engineering. Such a judgment results, according to psychologist Daniel Kahneman, from the automatic (immediate or nearly so) application of the representativeness heuristic: we assume Alex studies engineering because his description fits a certain stereotype of an engineering student. But this ignores facts such as, for example, that humanities students are far more numerous than engineering students, so it would be much more likely to find humanities students matching the description.
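The base-rate point can be made concrete with Bayes' rule. The numbers below are illustrative assumptions, not data from the text: suppose humanities students outnumber engineering students three to one, and the "methodical computer fan" description fits 20% of engineering students but only 10% of humanities students:

```python
# Illustrative priors and likelihoods (hypothetical figures).
p_eng = 0.25              # prior: share of students in engineering
p_hum = 0.75              # prior: share of students in humanities
p_desc_given_eng = 0.20   # description fits this share of engineers
p_desc_given_hum = 0.10   # ...and this share of humanities students

# Total probability of meeting someone matching the description.
p_desc = p_desc_given_eng * p_eng + p_desc_given_hum * p_hum

# Bayes' rule: P(engineering | description).
p_eng_given_desc = p_desc_given_eng * p_eng / p_desc

print(f"P(engineering | description) = {p_eng_given_desc:.2f}")
```

Even though, under these assumptions, the description is twice as likely for an engineering student, the much larger pool of humanities students makes humanities the better bet (0.60 vs. 0.40), which is exactly the base-rate information the stereotype leads us to ignore.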
This bias is not merely anecdotal; it underlies certain social prejudices. For example, when we judge the behavior of a member of a particular group, such as immigrants, we tend to rely on supposedly representative stereotypes, ignoring objective frequency and probability data.
This bias refers to the fact that when people believe they hold a certain status, they tend to deny and defend themselves against any comment that contradicts it, even at the cost of deceiving themselves.
Hindsight bias (retrospective bias)
It is the tendency to see past events as having been predictable. When we evaluate how well we predicted something, we distort our memory of what we actually knew at the time; this is really a memory error. In the same way, we also tend to assess past events more positively than we actually experienced them.
Fundamental Attribution Error
It refers to the tendency we show to credit our personal skills for our successes and to attribute our failures to external circumstances. When it comes to another person, the tendency reverses: we attribute their successes to luck or outside help, and their failures to internal causes.
Disconfirmation bias
It is the tendency to criticize information that contradicts our ideas while readily accepting whatever is congruent with our beliefs or ideologies. This produces a selective perception in which people see what they want to see in the messages of others or of the media, and in general interpret things through their own frame of reference. We are also more likely to seek out information favorable to our ideas than information that challenges our ideology or line of thinking.
Forer effect or subjective validation effect
The Forer effect is the tendency to accept vague, general personality descriptions as exceptionally applicable to oneself, without realizing that the same description could apply to almost anyone. This effect seems to explain, at least in part, why so many people believe that pseudosciences such as astrology, card reading, palmistry, and divination work: they appear to provide accurate personality analyses. Scientific studies show that these pseudosciences are not valid personality-assessment tools, yet each has many adherents convinced of their accuracy.
Anchoring and adjustment heuristic, or focusing effect
This heuristic describes the human tendency to rely too heavily on the first piece of information obtained when making decisions: the "anchor". During decision making, anchoring occurs when people use an initial piece of information to make subsequent judgments. Once the anchor is set, the rest of the information is adjusted around it, incurring a bias.
For example, if we ask students 1) "How happy are you with your life?" and 2) "How many dates have you had this year?", the correlation between the answers is nearly zero (according to the answers, having more dates would not alter well-being). However, if the order of the questions is reversed, students with more dates now report being happier. It defies logic, but apparently focusing attention on dating makes them exaggerate its importance.
Frequency illusion
Apparently, when a phenomenon has recently caught our attention, we think it suddenly appears or happens more often, even if that is statistically unlikely. In reality, this happens because we now perceive it differently (we paid no attention to it before), and we therefore mistakenly believe the phenomenon occurs more frequently. The same happens with objects.
Illusion of confidence
This bias consists of confusing the confidence of those who speak to us with their credibility: we perceive a person as more credible the more confidence they show in their arguments. In reality, research has shown that confidence is not a good indicator, nor a reliable way to measure, a person's ability or aptitude.
Reference point or status quo bias
Apparently, the same prize does not have equal value for two different people. For example, if I have two thousand euros and win a hundred in a bet, I value the win less than if I had five hundred euros and won the same hundred. The reference point matters a great deal. Its implications go even further, because the reference is not only my own initial wealth but also the wealth of the people close to me. If a stranger wins a hundred thousand euros in the lottery, it does not affect me; but if my co-worker wins it, I feel poorer and more miserable, even if I had not played the lottery at all.
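One standard way to formalize the first part of this idea, not mentioned in the text but common in economics, is a concave utility function such as the logarithm, under which the same 100-euro gain feels smaller the richer you already are:

```python
import math

def utility_gain(wealth: float, gain: float) -> float:
    """Increase in log-utility from winning `gain` on top of `wealth`."""
    return math.log(wealth + gain) - math.log(wealth)

# The same 100-euro win, from two different starting points:
print(f"From 500 euros:  {utility_gain(500, 100):.3f}")   # larger felt gain
print(f"From 2000 euros: {utility_gain(2000, 100):.3f}")  # smaller felt gain
```

Logarithmic utility is only one modeling choice among many; the point it illustrates is simply that value is judged relative to a reference level, not in absolute euros.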
Bandwagon effect or drag effect
This error consists of the tendency to do (or believe) things just because many other people do (or believe) them. Apparently, the probability of a person adopting a belief increases with the number of people who already hold it. It is a strong form of groupthink.
It is the predisposition to systematically contradict the ideas or proposals of a person with whom we do not sympathize, for that reason alone: we do not want them to be right, so we are predisposed not to believe their words.