April 9, 2024

A Guide to Understanding Human Behaviour

What is Behavioural Science?

Behavioural scientists study how and why people do the things they do and seek to understand the factors which influence these. We delve into the intricacies of motivation, decision making and communication, exploring how our judgements translate into actions. We examine the effect our behaviour has on others, and the impact other people's behaviour has on us. And we seek to understand how the world around us shapes all these things.

Two Systems of Thinking

At the heart of behavioural science lies dual process theory, popularised by Daniel Kahneman and Amos Tversky, which posits that we use two different processes to make decisions and that these processes can yield quite different outcomes even from the same information. The reason for having two systems is simple necessity. We face a staggering number of decisions each day (tens of thousands, by some estimates). Naturally, we can’t analyse every single one, so our minds have evolved two processes which we employ according to the complexity and importance of the decision. We call these System One and System Two.

System One is a fast and intuitive mode of thinking. It mostly operates beneath our conscious awareness and demands minimal effort. When we encounter a simple task, like opening a door, System One kicks in effortlessly. We rely on visual cues, such as the position of the door handle, to decide whether to push or pull. While these decisions are usually quite accurate, System One tends to settle for solutions which are merely "good enough." It lacks precision. What's more, it is highly susceptible to external influences, which means it can be unreliable.

System Two, on the other hand, is a deliberate, analytical mode of thinking. When we are faced with complex or high-stakes decisions, like manoeuvring a car into a tight parking space, System Two takes charge. We carefully assess the gaps between the car and potential obstacles, constantly monitoring the situation. System Two shines when it comes to abstract ideas and generally produces more accurate outcomes. However, it comes with a drawback: it is slow and mentally demanding.

System One is driven by pattern recognition. When confronted with a decision, we instinctively search for familiar reference points and match what we see in front of us with existing patterns stored in our subconscious memory. We call these pattern-recognition processes heuristics. They are mental shortcuts, or rules of thumb, which have evolved to enable rapid decision making. Research suggests these heuristics are common to all of us and therefore identifiable and predictable.

Heuristics – Algorithms for the Brain

Heuristics are essentially the algorithms our minds use to help us make faster and easier decisions. Without these decision-making rules we could never cope with the volume of choices which confront us every day.

Three of the best-known heuristics are availability, representativeness and anchoring. Contrary to popular opinion, these are not biases; they are tools we use to make decisions quickly or under pressure. The outcomes we get from them can be good or bad.

Availability and Representativeness – Where Stereotypes Come From

The availability heuristic describes our tendency to make judgements based on how readily information comes to mind. It is based on the notion that if we recall something easily, it must be more significant than a less easily recalled alternative. The availability heuristic is amplified by highly emotive or newsworthy information. For example, people will often cite shark attacks as a more likely cause of death than common diseases, even though deaths from disease are vastly more frequent.

The representativeness heuristic describes the way we estimate the likelihood of something based on how similar it is to our mental image of the thing in question. It leads us to form a preconceived impression of what somebody or something should look like. One research study showed that people rated someone wearing a suit as more likely to be a lawyer than someone in a bathing suit, even when both photos were of the same person.

It's easy to see how both these heuristics can disrupt our decision making and the way we treat people, because they rely on reference points which may be random or inaccurate. They involve making judgements and decisions with only partial information, and they cause us to overlook or ignore new information once we've formed an initial impression.

These heuristics are the root of stereotypes. If we've only ever seen a certain type of person in a role, we readily assume others in the same role will conform to its stereotype. If we have a pre-formed idea of the type of person in a certain role, we may assume only those with the same characteristics are suitable for the role.

The Power of Anchors

Anchoring is one of the most researched concepts in behavioural science. It is the process by which we use an existing reference point to estimate the value of something else. If we are asked to estimate how long a project will take to complete, we will use a previous project as a reference point and then adjust from it. Economists and financial analysts use anchoring all the time when making forecasts.

We might use anchoring when deciding how much we are willing to pay for something. It's also a tactic commonly used by companies trying to sell a product or service. Anchoring is a great tool if the reference point is relevant and accurate, but it can lead to errors if the reference point is skewed. A classic example of misleading anchoring comes from a 1997 study in which participants were asked to answer two questions:

- Half of the group were asked whether Gandhi was older or younger than 9 years when he died, and then asked to estimate his exact age at death.

- The other half were asked whether he was older or younger than 140 years when he died, and then asked to estimate his exact age at death.

Unsurprisingly, all the participants got the first part of the question right. However, the group given the anchor of 9 years guessed, on average, that Gandhi was 50 years old when he died, while the group given the anchor of 140 years guessed he was 67. Even though both anchors were obviously misleading, they had a significant effect on the estimates.

The effect of anchoring is particularly powerful in negotiations and in forecasting. This is because counter-offers and re-estimates remain anchored to the original reference point: new information is only used to make adjustments; it does not override the first piece of information.
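Since this article frames heuristics as algorithms, anchoring-and-adjustment is easy to sketch as one. The Python snippet below is a minimal, illustrative toy model, not a model from any cited study: the `anchored_estimate` function and its `adjustment_rate` parameter are assumptions chosen to show how an estimate that only ever moves part-way toward new evidence stays pulled toward its starting anchor.

```python
def anchored_estimate(anchor, evidence, adjustment_rate=0.3, rounds=5):
    # Toy model of anchoring-and-adjustment: each round moves the
    # estimate only a fraction of the way toward the new evidence,
    # so the final answer stays pulled toward the starting anchor.
    estimate = anchor
    for _ in range(rounds):
        estimate += adjustment_rate * (evidence - estimate)
    return estimate

# Echoing the Gandhi study above: two anchors, same underlying fact
# (Gandhi died at 78). The low anchor drags the estimate down, the
# high anchor drags it up; neither group converges on the truth.
print(round(anchored_estimate(anchor=9, evidence=78)))    # ~66
print(round(anchored_estimate(anchor=140, evidence=78)))  # ~88
```

The under-adjustment is the whole point: increasing `rounds` or `adjustment_rate` models a more deliberate System Two revision, which eventually converges on the evidence and washes the anchor out.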

Cognitive Biases - The Result of Outdated Evolutionary Preferences

Our mental frameworks have evolved over millions of years. For most of this time, survival has been our primary concern and the threats we faced were often physical. To ensure our survival, System One responds rapidly to perceived threats with an in-built "freeze, flight or fight" heuristic. This has led us to develop some deeply ingrained preferences:

First, we focus on the immediate present rather than the future. Second, we pay more attention to risks than rewards. Both made sense when the immediate situation might have represented a life-or-death threat, but in today’s world they can mean ignoring or underestimating a long-term benefit relative to a short-term risk.

We also seek safety in numbers, which means we may find ourselves saying or doing whatever is necessary to avoid conflict.

These preferences are not random; they are systematic deviations from rational decision-making. When they deliver sub-optimal outcomes because we have used the wrong frame of reference, we call them cognitive biases. Biases can negatively influence our decision making in many ways:

- They encourage us to follow the herd, whether it's right or wrong.

- They lead us to be swayed by the views of certain influential individuals.

- They may prevent us from seeing our mistakes, and they can encourage us to keep repeating them.

- They discourage us from speaking up, or asking for help, when things are going wrong.

- They can freeze our decision processes and stop us acting when there's a problem.

In a world where most of our challenges no longer threaten our survival, a slower, more deliberative System Two approach would often yield better outcomes. Unfortunately, we aren’t always good at discerning which decisions to assign to System One, and more importantly, which decisions to pass over to System Two. While we would like to believe we predominantly rely on our analytical System Two, our natural inclination is to conserve energy. Consequently, we often let System One take the reins, then retrospectively craft a narrative to justify our decision and make it appear analytical.

Mitigating the Risk of Cognitive Biases

Heuristics and biases are deep-rooted and powerful. They permeate every part of an organisation, influence decisions, and shape the way people communicate and collaborate.

It isn’t possible to eliminate heuristics, nor should we try to. They are the foundations of decision making in a complex world, and they are highly effective when speed matters more than accuracy. But, like algorithms, heuristics require training: they only work if we feed in the right information and use them in the right context.

Fortunately, because they recur so frequently, heuristics and biases are recognisable, and we can learn to predict them. This means we can introduce circuit breakers into our processes, prompting ourselves to engage System Two when the stakes are high. We can also design our own heuristics to deliver the outcomes we want from high-frequency, everyday decisions.
