Flying is all about decisions. From the moment you wake up and look at the weather, to the moment you close the throttle and wait a handful of seconds for the wheels to touch the runway, there are literally hundreds of decisions in a typical flight. But how do we make those decisions, and how can we make them better? Psychology has some interesting observations on this, but no easy answers.
The traditional model of decision-making sees it as a rational process, beginning with a session during which options—the more the better—are developed and considered. Each option is ideally assessed, compared and prioritised against evaluation criteria. In the end, a solution is proposed that promises to be the best possible choice. But this takes hours, or even days, of deliberation. Pilots don’t have the luxury of such a process.
Most in-flight decisions are routine: slowing from cruise speed, descending, putting down the flaps, to name three. They are made often, within a predictable context, and become habitual. However, when an emergency arises, which is better: this naturalistic decision making (NDM) or a more formal process? Or is some combination of both the ideal?
Gary Klein outlined his recognition-primed decision (RPD) model in 1985, after several years of observing and conducting experiments on pilots in simulators. He also studied firefighters and found they did not compare different options in critical situations, but were able to recognise specific patterns in a situation and react appropriately. Klein found experts evaluate the first option that comes to mind for its feasibility: if the option seems feasible, they choose it; if not, they reject it and evaluate the next option that comes to mind.
Klein has described the RPD model as a blend of intuition and analysis. ‘The pattern matching is the intuitive part, and the mental simulation is the conscious, deliberate and analytical part.’
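For readers who like their models concrete, the contrast between RPD and the traditional compare-everything process can be sketched in a few lines of code. This is a loose illustration only; the option generator and mental-simulation check are hypothetical stand-ins, not anything from Klein’s work:

```python
# Loose sketch contrasting recognition-primed and classical choice.
# 'options_by_recognition' and 'mentally_simulate' are hypothetical
# stand-ins for Klein's pattern matching and mental simulation.

def rpd_choose(situation, options_by_recognition, mentally_simulate):
    """Take the first option that survives mental simulation (satisficing)."""
    for option in options_by_recognition(situation):  # intuitive ordering
        if mentally_simulate(situation, option):      # analytical check
            return option                             # workable: act now
    return None  # no recognised option survived simulation

def classical_choose(situation, options, score):
    """The traditional model: score every option and pick the best."""
    return max(options, key=lambda option: score(situation, option))
```

The RPD loop stops at the first workable option rather than ranking them all, trading theoretical optimality for speed, which is the point of the model.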
The most elegant and famous application of the RPD model is the landing of US Airways Flight 1549 on the Hudson River in January 2009, after the Airbus A320 hit a flock of geese. Pilot Chesley Sullenberger’s rapid, decisive actions can be heard in the air traffic control recording of the ditching, and his conclusion, ‘we’re going to be in the Hudson’, comes relatively early in the 208-second sequence between birdstrike and water landing.
A grimmer example of how a recognition-primed decision was implicated in a disaster was the crash of British Midland Flight 92, near Kegworth, England, in January 1989. The experienced crew of a Boeing 737-400 evaluated and weighed up environmental cues following a loss of engine power and smoke on the flight deck. But their evaluation was based on earlier 737 models, which drew air from the right engine for flight-deck ventilation; the 737-400 used a different system. The crew shut down the good engine and continued on the damaged one, which coincidentally began running more smoothly when the autothrottle was disengaged during the shutdown of the right engine.
Recognition-primed decision making appears to be a relatively rare skill, practised only by the most experienced pilots. A German study conducted in a simulator found only about one-third of the pilots made recognition-primed decisions.
Another decision-making model is Rasmussen’s skill, rule and knowledge (SRK) model, which describes three levels of cognitive activity during task performance and decision making.
- Skill-based level: Performance and decision making occur at the subconscious level, as a largely automatic response to a particular situation. People who usually make skill-based decisions are very experienced with the task at hand.
- Rule-based level: People operate at this level when they are familiar with the task but not experienced enough for automatic responses; they look for cues or rules they recognise from past experience to make a decision.
- Knowledge-based level: When the task at hand is novel and no rules from past experience are stored, people resort to analytical processing using conceptual information: defining the problem, generating solutions, and determining the best course of action or plan before making a decision.
In this model, a person may operate at one, two or even all three levels, depending on the task and on how experienced the person is, as the sketch below suggests.
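One loose way to picture the SRK model is as a fall-through from the cheapest level of processing to the most expensive. The sketch below is a hypothetical illustration, not part of Rasmussen’s model; the task fields and rule format are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    kind: str                               # what kind of task this is
    cues: set = field(default_factory=set)  # cues the person can perceive

def srk_respond(task, automatic_responses, known_rules):
    # Skill-based: a practised, near-automatic response already exists.
    if task.kind in automatic_responses:
        return automatic_responses[task.kind]
    # Rule-based: a stored if-then rule matches a recognised cue.
    for cue, action in known_rules:
        if cue in task.cues:
            return action
    # Knowledge-based: the task is novel, so fall back to slow, effortful
    # analysis: define the problem, generate solutions, plan.
    return "analyse from first principles"

# A rule-based example: no automatic response, but a cue matches a rule.
task = Task(kind="rough-running engine", cues={"vibration", "smell"})
rules = [("vibration", "reduce power and monitor instruments")]
print(srk_respond(task, automatic_responses={}, known_rules=rules))
```

Experience moves responses down the stack: what once demanded knowledge-based analysis becomes a rule, and eventually a skill.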
In Thinking, Fast and Slow, Nobel Prize winner Daniel Kahneman summarises a lifetime of research (conducted with Amos Tversky) into how people really think. The book focuses on two modes of thought: System 1, which is fast, instinctive and emotional; and System 2, which is slower, more deliberative and more logical. There are clear parallels with the Klein and Rasmussen models of decision making.
Kahneman’s System 1 allows us to make rapid, approximate decisions without expending too much mental effort. We use a variety of cognitive shortcuts to help make these decisions and, unfortunately, these heuristics and biases are prone to error, as the Kegworth example shows. (A heuristic, in this context, is a rule of thumb or shortcut.) Kahneman fills an entire book with outlines and examples of the biases and heuristic traps that distort our decision making.
System 2 is a slower, cognitively taxing system that requires a lot more effort but is more likely to come up with the correct decision. System 2 is ‘nutting out’ the answer rather than blurting it out. Kahneman says System 1 is always running in our heads, and we need to find ways of decreasing workload so that we free enough mental resources for System 2 to confirm or correct what System 1 is telling us. In short, thinking is hard work, but we need to make time for it, because intuition has well-known traps that deliberate thought avoids.
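Kahneman’s famous bat-and-ball problem, from the book, makes the two systems concrete: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball. System 1 blurts out ‘ten cents’ for the ball; System 2 nuts out the algebra. A few lines of Python, purely for illustration, are enough to check both answers:

```python
# A bat and a ball cost 1.10 together; the bat costs 1.00 more than the ball.

# System 1's blurted answer: the ball costs 0.10.
intuitive_ball = 0.10

# System 2's worked answer: ball + (ball + 1.00) = 1.10, so ball = 0.05.
deliberate_ball = (1.10 - 1.00) / 2

print(f"System 1: ball = {intuitive_ball:.2f}, "
      f"total = {intuitive_ball + (intuitive_ball + 1.00):.2f}")   # 1.20: fails
print(f"System 2: ball = {deliberate_ball:.2f}, "
      f"total = {deliberate_ball + (deliberate_ball + 1.00):.2f}") # 1.10: checks out
```

The intuitive answer fails the check by ten cents; the deliberate one passes. That confirm-or-correct role is the one Kahneman assigns to System 2.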
An Australian review of aeronautical decision making by Peter A. Simpson argues that heuristics are quite useful and highly economical, but can sometimes lead to systematic and predictable errors. ‘Such errors may include sampling only salient data, ignoring data that conflicts with initial situation assessment, and basing situation assessment and diagnosis on recently occurring events because they “come to mind” easily.’
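Simpson’s last error, basing a diagnosis on events that ‘come to mind’ easily, is the availability heuristic, and a toy simulation shows how it can skew an estimate. All the numbers below are invented for illustration:

```python
import random

random.seed(1)  # fixed seed so the toy run is repeatable

# Pretend some system fails 5 per cent of the time (invented figure).
history = [random.random() < 0.05 for _ in range(1000)]
true_rate = sum(history) / len(history)

# An availability-style estimate samples only what comes to mind:
# the handful of most recent events.
recent = history[-10:]
available_rate = sum(recent) / len(recent)

print(f"failure rate over the full record: {true_rate:.1%}")
print(f"rate judged from the last 10 only: {available_rate:.1%}")
```

Depending on which failures happen to be fresh, the small recent sample can sit well above or below the long-run rate: the kind of systematic, predictable error Simpson describes.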
Simpson concludes ‘it appears possible to teach novice pilots at least the basics of NDM, which they are then able to develop.’ But he is also interested in improving the reliability of NDM through metacognition training. Metacognition is awareness of your environment and of your own heuristics and biases, and of how these may affect your decisions.
‘Metacognition training may allow pilots to become more aware and regulative of their cognitive processes and resulting behaviours. Learning to monitor decision-making processes may mean that pilots are better able to manage time and realise when they have a workable solution, negating the need to search further,’ Simpson says.
‘Although only speculative, it could be suggested that metacognition training may help [keep] inexperienced pilots from getting bogged down with information and options, allowing for faster, more intuitive decision processes.’
Simpson quotes Means et al. (1993) on how metacognitive skills cannot be taught in isolation: they must be taught using relevant situations and examples, or students do not learn how to incorporate these skills into NDM situations.
Further information:
Evans, J. (1990). Bias in Human Reasoning: Causes and Consequences. London: Lawrence Erlbaum Associates.
Klein, G. A., & Woods, D. D. (1993). Conclusions: Decision making in action. In Klein, G.A., Orasanu, J., Calderwood, R., & Zsambok, C.E. (Eds.), Decision Making in Action: Models and Methods. Norwood, NJ: Ablex Publishing.
Means, B., Salas, E., Crandall, B., & Jacobs, T. O. (1993). Training decision makers for the real world. In Klein, G.A., Orasanu, J., Calderwood, R., & Zsambok, C.E. (Eds.), Decision Making in Action: Models and Methods. Norwood, NJ: Ablex Publishing.
Simpson, P. A. (2001). Naturalistic Decision Making in Aviation Environments. Air Operations Division, Aeronautical and Maritime Research Laboratory, Defence Science and Technology Organisation.
http://dspace.dsto.defence.gov.au/dspace/bitstream/1947/3813/1/DSTO-GD-0279%20PR.pdf