Seven deadly shortcuts: cognitive biases and aviation

The habits and tricks your brain uses to get you through everyday life become a problem when they run riot in the cockpit.

We do no end of feeling and we mistake it for thinking. Mark Twain

The Nobel Memorial Prize for economics had an unusual winner in 2002. Daniel Kahneman became the first psychologist to win the world-renowned award, with a version of an idea he had first developed 30 years earlier with Amos Tversky – cognitive bias.

If his life’s work can be summed up in a proverb, it would be that we are not as smart as we like to think we are. As well as changing economic thought, Kahneman and Tversky’s insights have unsettling ramifications for aviation safety.

In 2 minds

A widely quoted but unsourced statistic says the average person makes 35,000 decisions a day. Whether you agree or disagree with this extraordinary number – which works out to a decision about every 2 seconds – is not the point. Each of us makes a huge number of decisions every day, starting when we get out of bed.
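
For what it’s worth, the arithmetic is roughly right: 16 to 18 waking hours is about 58,000 to 65,000 seconds, so 35,000 decisions comes out at one roughly every 1.7 to 1.9 seconds, close to the figure quoted.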

Most of these decisions are automatic. The idea of the unconscious mind has been recognised since the time of Sigmund Freud in nineteenth-century Vienna. Kahneman and Tversky’s contribution was to show, by experiment, how the unconscious mind can influence conscious decision-making in ways that are irrational – and potentially dangerous if they happen in a cockpit.

Kahneman proposes our brain has 2 operating systems, which he calls System 1 and System 2.

System 1 is the fast-thinking mind:

  • unconscious, automatic, effortless
  • no self-awareness or control
  • assesses the situation, delivers updates
  • does 98% of thought.

System 2 is the slow but thorough part of our thinking:

  • is deliberate and conscious, effortful, controlled and rational
  • has self-awareness and control, logic and scepticism
  • seeks new/missing information, makes decisions
  • does 2% of thought.

The automatic System 1 lightens the load on the deliberate System 2, in 2 ways:

  • takes care of our more familiar tasks by turning them into automatic routines, also known as habits
  • rapidly and unconsciously sifts through information and ideas, prioritising whatever seems relevant and filtering out the rest using mental shortcuts, called heuristics.

Your first few flights as a pilot use System 2 as you struggle with the effects of controls, radio procedures, engine management and your instructor’s helpful suggestions. Remember how exhausted you used to be after an hour of circuits? But, as you learn, the task of flying is transferred to System 1 – and gets easier. The problem for safety is that System 1’s network of decision-making shortcuts can ‘leak’ into our System 2 thinking. These leaks are biases.

The number and scope of our cognitive biases are still matters of discovery and debate, but at least 7 are potentially troublesome for aeronautical decision-making:

  1. confirmation bias
  2. continuation/sunk cost bias
  3. outcome bias
  4. anchoring bias
  5. expectation bias
  6. framing bias
  7. ambiguity effect.

1. Confirmation bias

Just as I’ve always thought.

Confirmation bias is the tendency to seek out information that is consistent with one’s existing beliefs or expectations, especially when confronted with unusual situational factors. In the setting of VFR flight into IMC, confirmation bias might lead a pilot to subconsciously search for environmental cues suggesting the weather conditions are slightly above the required minimum, steady or improving, when the opposite is true.

2. Continuation/sunk cost bias

Stepped in so far that, should I wade no more, / Returning were as tedious as go o’er. Macbeth, William Shakespeare

Continuation/sunk cost bias is the tendency to continue with a course of action in order to preserve an investment of money, effort or time. As the goal – such as arrival at the destination – draws closer, people tend to become more reluctant to abandon the plan. The mid-point of a flight can be a significant psychological turning point for pilots faced with adverse weather decisions, regardless of the distance flown. An analysis of 77 general aviation cross-country accidents in New Zealand between 1988 and 2000 found weather-related accidents occurred further from the departure aerodrome and closer to the destination than other types of accidents.

3. Outcome bias

No harm, no foul. Chick Hearn

Outcome bias is the tendency to judge a decision based on its outcome rather than on an assessment of the quality of the decision at the time it was made. Outcome bias can arise when a decision is based on the outcome of previous events, without taking into account how the past events developed. Outcome bias can contribute to the organisational phenomenon of normalisation of deviance, where obvious hazards are ignored or downplayed because they have not so far contributed to an accident.

4. Anchoring bias

I can see clearly now the rain is gone. Johnny Nash

Anchoring bias is the tendency for a person to rely substantially on the first piece of information received (the anchor) and to make estimates or judgements based on it. This first piece of information becomes an arbitrary benchmark for all other information. A pilot may perceive a ceiling of 500 feet as good after many days of 200-foot ceilings, or bad after many days of CAVOK. A more relevant reference point for decision-making would be fixed minimums, whether personal, operator or regulatory.

5. Expectation bias

Believing is seeing. Karl Weick

For the second time that afternoon, the first officer ran the pre-take-off checklist. The first departure had been abandoned after an anomalous reading in an engine temperature probe. Now, with the problem fixed, they were going again, not a minute too soon. But 3 seconds after take-off, the MD-82 began rolling and yawing. Spanair flight 5022 crashed near the boundary of Madrid Airport on 20 August 2008, killing 154 people.

Expectation bias occurs when a pilot hears or sees something they expect to hear or see, rather than what actually may be occurring. This bias likely played a role when the MD-82 first officer called out a flap setting of 11 degrees during both the take-off briefing and the final check before take-off, even though the flaps had not been extended.

‘There is a natural tendency for the brain to “see” what it is used to seeing (look without seeing),’ the final accident report said. ‘In this case, the first officer, accustomed to doing the final checks almost automatically, was highly vulnerable to this type of error. … The captain, for his part, should have been monitoring to ensure that the answers being read aloud by the first officer corresponded to the actual state of the controls.’

6. Framing bias

Good advice is one thing, but smart gambling is quite another. Hunter S. Thompson

Framing bias is the tendency for a person to respond differently to the same information and choices, depending on how the information is presented to, or received (framed) by, the decision-maker. A decision can be framed as a gain or a loss. Kahneman and Tversky showed that when a decision is framed positively, as a gain, a person is likely to be more risk averse. When the same decision is framed as a loss, people tend to exhibit more risk-seeking behaviours. They called this prospect theory.
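
One way to picture this asymmetry is the value function from prospect theory. The form and parameter values below are the commonly cited estimates from Tversky and Kahneman’s later work, not figures drawn from this article:

  v(x) = x^α for gains (x ≥ 0)
  v(x) = −λ(−x)^β for losses (x < 0)

with typical estimates α ≈ β ≈ 0.88 and λ ≈ 2.25. Because λ is greater than 1, a loss is felt roughly twice as keenly as a gain of the same size, which is the asymmetry at work in the diversion example below.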

In the setting of VFR flight into IMC, the framing effect plays a role when pilots faced with adverse weather are deciding whether to divert or continue. If a pilot perceives a diversion as a gain (safety is assured), they are more likely to adopt a risk-averse decision and divert.

7. Ambiguity effect

Better the devil you know. Kylie Minogue

The ambiguity effect is a cognitive bias that describes how we tend to avoid options that we consider to be ambiguous or to be missing information. We dislike uncertainty and are, therefore, more inclined to select an option for which the probability of achieving a certain favourable outcome is known.

An example of the ambiguity effect is when a pilot decides to fly an approach in questionable weather, rather than diverting to an alternate airport which may have better weather but may have other unknown issues.

Battling your biases

CASA Sport and Recreation Aviation Branch Manager Tony Stanton, who wrote a PhD thesis on the hazards of biases in general aviation, says the first safety step is to understand biases are real. ‘They are natural responses to the environment and the volume of information we receive,’ he says.

Biases also apply at an organisational level, and Stanton is inspired by the success of a select group of high-reliability organisations. Individually, we can steal some of these concepts and steel ourselves with them, he says.

Stanton nominates 3 of Kathleen Sutcliffe and Karl Weick’s high-reliability principles as particularly apt:

  • Preoccupation with failure rather than success. ‘Think, “everything is going well, so what do I need to examine?”’ he says.
  • Reluctance to simplify. ‘Realise not everything is as simple as it seems, and you bring your own perspective to what you are looking at or thinking about.’
  • Deference to expertise. ‘Ask someone else whose biases are different.’

A Trekkie solution

As a doctor, CASA Deputy Principal Medical Officer Tony Hochberg sees cognitive biases as a result of the interplay between the brain’s subsystems: the prefrontal cortex, the seat of rational thought, and the limbic system (the hippocampus and amygdala), the centre of emotion, sensation and reward.

Like Stanton, Hochberg has no doubt biases are real, and deep-seated. Controlling their influence over your thinking is analogous to the struggle of instrument flight, where pilots are taught to believe their instruments instead of their vestibular systems.

Adherence to checklists, written personal minimums and standard operating procedures are the best tools for keeping biases at bay, Hochberg says.

He agrees with the usefulness of seeking a second opinion and cross-checking. And he has a novel mental shortcut to help identify and counter biased thinking. As an easily recalled example, he invokes the extraterrestrial, emotionless and implacably logical intelligence of Star Trek’s Mr. Spock, who was never afraid to contradict the mercurial Captain Kirk.

In aviation, it’s sometimes good to be like Spock. Ask yourself, “What would Spock do?”

Want more on this topic?

Also see ‘I’m good, just ask me: The killer factor in weather crashes’ by Anthony Stanton and Robert Wilson (January 2022). It provides a psychological perspective on VFR-into-IMC accidents.
