Cognitive biases and decisions: when facts are interpreted...

Article · Recruitment · Jul 16, 2025 · 4 minutes · PerformanSe

Psychology shows us that decision-making situations are always complex: they involve many factors, carry high stakes, and are uncertain by nature, with real risks of error. In these contexts, our thinking relies on mental shortcuts, known as heuristics, which can impair our judgement. The risk of cognitive bias is therefore particularly high.

These biases affect our reasoning and distort our perception of situations, filtering information through the prism of our own mental workings, which are anything but objective. Let's take a look at four of the most frequently observed classic biases.

Because forewarned is forearmed, and understanding these mechanisms is the first step towards making better-informed decisions!

What is cognitive bias?

Think of your mind as a powerful computer, sometimes plagued by little bugs known as cognitive biases. These subtly influence our daily decision-making, like a filter distorting our perception of reality. One of the most fascinating is authority bias, which pushes us, almost in spite of ourselves, to defer to authority figures, sometimes muting our precious critical thinking.

The famous psychologist Stanley Milgram brilliantly highlighted this phenomenon, demonstrating how the argument of authority can make us accept situations that we would normally reject. A single person in a position of authority can exert an inordinate influence on our choices, without even having to prove the validity of their arguments. What's even more disturbing is that confirmation bias reinforces this tendency, causing us to naturally dismiss points of view that don't fit in with our pre-established vision.

These fascinating mechanics of the mind were of particular interest to Daniel Kahneman, the brilliant Nobel Prize-winning researcher who devoted his career to understanding how these cognitive biases govern our decisions, sometimes leading us into a spiral of escalating commitment from which it becomes difficult to extricate ourselves.

Confirmation bias: I always find what I'm looking for

In psychology, confirmation bias is certainly one of the most natural mental shortcuts in our thinking. As Daniel Kahneman (one of the founding fathers of research on cognitive biases) has shown, it influences our reasoning by causing us to unconsciously filter information according to our preconceptions, creating a veritable illusion of validity. The way our brain processes the information we receive profoundly affects our judgements: without even realising it, we use these heuristics to select only the questions and data that confirm our initial beliefs.

Faced with each new event, we tend to validate our original hypotheses, without really taking the time or trouble to test them. This phenomenon is particularly acute in emergency situations and under pressure, where our judgement becomes even more vulnerable to these cognitive distortions.

Authority bias: the boss is always right

Psychology shows us how our thinking uses astonishing mental shortcuts. Take a classic situation: an important meeting in which your judgement has to take multiple elements into account. This is where our reasoning can play tricks on us through authority bias, one of those heuristics that makes us believe we are making a perfectly rational decision.

When a crucial event occurs, we tend to give more weight to information coming from authority figures. Our judgements are then influenced by the way a recognised expert, charismatic leader or experienced manager expresses themselves, creating a dangerous illusion of certainty.

This reflex is particularly strong in times of crisis, when we instinctively look for reassuring points of reference. It's natural, but beware: even the most competent manager may have a limited view of the situation. The best ideas sometimes come from the most unexpected people, like a colleague in the field who knows the day-to-day reality of the problem.

To put it simply: it's like systematically choosing the restaurant suggested by the most eloquent friend, while ignoring the advice of the more reserved food enthusiast!

Conformity bias: daring to be different?

In the vast world of cognitive biases, the conformity bias reveals a fascinating social dimension of our brain. The behavioural sciences show us how each individual, confronted with group pressure, can see his or her thoughts subtly transformed.

Imagine the situation: you're in a meeting and your information contradicts the prevailing opinion. Despite your expertise, you have a strong tendency to align your ideas with those of the majority or influential people. This cognitive distortion is particularly powerful when we doubt our skills: faced with a united group or a charismatic manager, we risk making the mistake of silencing our vision, even if it is relevant and justified.

Because in the reality of group cognitive dynamics, opposing a dominant opinion requires much more than the simple certainty of being right. You have to navigate the complex waters of power plays, social alliances and established hierarchies. It's like swimming against the current: even if you know the right direction, the force of the current can seem insurmountable!

Expertise bias: sure to know... sure to be wrong?

Here's a fascinating paradox that shows how our brains can play tricks on us: in many fields, the most seasoned experts are sometimes victims of their own biases! Their thoughts, forged by years of experience, can lead them to make surprising errors of judgement.

Why? Their expertise can create a form of cognitive overconfidence. Whereas a less experienced professional will approach each situation with caution, taking the time to check their information and develop their reasoning, experts can sometimes jump to conclusions too quickly. Their cognitive abilities, exceptional though they may be, can paradoxically do them a disservice when they believe they 'know it all'.

It's like a great chef who, on the strength of his experience, no longer tastes his dishes: often his intuition will be right... but the day an ingredient is different, he risks missing it! This situation becomes particularly critical in times of crisis, when each case is unique and crucial information can be hidden in a seemingly insignificant detail.

Halo effect and other biases: how they influence us...

Cognitive biases are like invisible filters that subtly colour our day-to-day decisions, far beyond those we have just listed.

Take the halo effect, that devious mental shortcut that leads us to form an overall judgement of a person based on a single quality that appeals to us, eclipsing any genuinely critical analysis. It's as if a single shining star prevented us from seeing the rest of the constellation!

Our minds also have a fascinating tendency to look for connections where there aren't always any. This is correlation bias, which leads us to see cause-and-effect links between events that simply occur together, such as thinking that a certain tie brings good luck because you succeeded in an important presentation while wearing it.

In our quest for confirmation, we collect arguments that reinforce our pre-existing beliefs, like a collector who only keeps the pieces he likes. This distortion can have a significant impact on our work, particularly in companies, where it risks diminishing the influence of atypical profiles or experts who dare to challenge the consensus.

Yet there is a powerful antidote: a team that cultivates a diversity of opinions and is rooted in rigorous scientific methods develops a collective brain that is sharper and more effective. The most enlightened decisions emerge from a friendly confrontation of perspectives.

The solution? More awareness, more collective thinking!

Understanding how our brain works is the first step towards controlling our cognitive biases. While our thoughts and judgement are naturally influenced by these tendencies, we can learn to recognise them and limit their effect.

Of course, it is impossible to be perfectly objective: our reasoning will always be coloured by our experience. But we can develop a finer cognitive awareness, like a detective examining their own information and perceptions for potential distortions. It's a bit like adjusting the focus of a camera to get a sharper image!

The key also lies in sharing, comparing points of view and getting to know ourselves. Modern tools (personality tests, cognitive assessments, 360° feedback) offer us a valuable palette for multiplying perspectives, getting to know ourselves better and thus reducing the risk of errors of judgement. It's fascinating to see how these different approaches can complement each other to create a richer, more balanced vision!

Because let's not forget: collective intelligence is our best defence against individual bias. It's as if each member of a team contributed a unique piece of the jigsaw, enabling us together to see the whole picture with greater clarity.

Would you like to learn more about our psychometric tests?
