Russian roulette

A harmless but disturbing incident highlights how vulnerable complex systems and procedures are to misunderstanding and miscommunication.

The worst thing that happened to the 422 passengers on Emirates Airline flight EK131, from Dubai International Airport to Domodedovo Airport, Moscow, was that they landed about 20 minutes late.

There had been a go-around, not unusual in itself, and a longer-than-normal delay before touchdown; only the most observant among them might have noticed that the delay included a second go-around. They did not know—and might not have cared to know—that the intricate web of safety systems woven around them had come close to breaking.

On that clear evening, Sunday 10 September 2017, with the lights of the Russian capital bright in the north, the Airbus A380 with 448 people on board had descended to 395 feet above ground level (AGL), activating the ‘Glideslope!’ and ‘Terrain Ahead – Pull Up!’ alerts. During the second approach, the flight plan disappeared from the flight management system (FMS), leaving a blank screen. When the captain selected the runway waypoint using the ‘direct to’ function in the FMS, the aircraft levelled off at 2600 feet QNH (about 2000 feet AGL) and a third attempt had to be made.

The report, published in April 2020 by the United Arab Emirates General Civil Aviation Authority, examined the event in granular detail. It presented a picture of human-machine interface breakdown, less-than-ideal communication and shortcomings in training. This was duly reported by the media as pilot error, a phrase that does not appear anywhere in the official document.

The report describes how the flight had approached Moscow’s southern airport at Domodedovo from the south-west, turning left onto a base leg for runway 14R. The flight deck became quite a busy place during this phase because the radar controller had vectored the aircraft closer to the runway than on the published standard terminal arrival (STAR) route, and away from their initial approach fix (IAF). The IAF was a non-compulsory waypoint and the controller had discretion to vector aircraft away from it. The radar controller also requested the aircraft to maintain 170 knots, if possible, to avoid losing minimum separation with the traffic behind it. This situation lasted about a minute.

ATC communication with other aircraft was in Russian, which hampered the Emirates crew’s situational awareness. Practising the pilot’s art of projecting their minds a few minutes ahead of the aircraft’s present situation, the captain and first officer (commander and co-pilot in the GCAA report) began to form a mental picture that had them high and possibly fast. The captain discussed this with the first officer, who was the pilot-flying.

But communication soon broke down. From the report: The Investigation believes that their shared mental model of ‘becoming high on profile’ started to diverge when the commander suggested to request a descent to 500 metres QFE, as a prevention strategy to avoid being high on profile (altitudes below flight level transition altitude are given in metres in Russian aviation). This, probably, happened since the commander did not verbalise or explain to the co-pilot the intent of his suggestion, which was to prevent the aircraft becoming high on profile.

The first officer did not recognise that the descent to 500 metres would place the aircraft on the desired vertical profile. Thinking they were still high, he attempted to intercept the glideslope from above.

Why would he do this? The report mentions ‘confirmation bias, which resulted in testing his beliefs or hypothesis in a one-sided way by only searching for evidence or information that supported his belief.’ The first officer was watching the glideslope indicator, but its deviation information was invalid because the aircraft position was outside the azimuthal (side-to-side) coverage of the ILS glideslope signal. And when the aircraft came within the glideslope signal’s azimuthal coverage, it was outside elevation coverage. The report notes the first officer could have cross-checked the indicator against pressure altitude, vertical and navigation displays and the DME distance table in the approach chart, but concludes he did not do so.
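
The geometry matters here. As a rough illustration (not drawn from the report), the sketch below applies the nominal glide path coverage limits published in ICAO Annex 10 (roughly 8 degrees either side of the approach course out to about 10 NM, and between 0.45 and 1.75 times the glide path angle in elevation) to show how a vectored aircraft can sit well outside the volume in which glideslope deviation indications mean anything. The glide path angle, the positions tested and the helper function itself are illustrative assumptions, not data from the EK131 investigation.

```python
import math

# Notional ILS glide path coverage limits, based on ICAO Annex 10 nominal
# values and assumed here for illustration only; the actual Domodedovo
# installation may differ.
GLIDE_PATH_ANGLE_DEG = 3.0              # assumed promulgated glide path angle
AZIMUTH_LIMIT_DEG = 8.0                 # either side of the approach course
RANGE_LIMIT_NM = 10.0                   # guaranteed coverage range
ELEV_MIN_DEG = 0.45 * GLIDE_PATH_ANGLE_DEG
ELEV_MAX_DEG = 1.75 * GLIDE_PATH_ANGLE_DEG


def glideslope_signal_usable(distance_nm: float,
                             off_course_deg: float,
                             height_ft: float) -> bool:
    """Rough check of whether a position lies inside the volume in which
    glideslope deviation indications are protected.

    distance_nm    -- distance from the glide path antenna
    off_course_deg -- angular offset from the final approach course
    height_ft      -- height above the antenna site
    """
    # Outside the azimuthal or range coverage, the needle is meaningless.
    if distance_nm > RANGE_LIMIT_NM or abs(off_course_deg) > AZIMUTH_LIMIT_DEG:
        return False
    # Elevation angle of the aircraft as seen from the antenna.
    distance_ft = distance_nm * 6076.12
    elevation_deg = math.degrees(math.atan2(height_ft, distance_ft))
    return ELEV_MIN_DEG <= elevation_deg <= ELEV_MAX_DEG


# Well off to one side of the approach course, as an aircraft being
# vectored onto base leg would be, there is no valid deviation information:
print(glideslope_signal_usable(distance_nm=8.0, off_course_deg=25.0,
                               height_ft=2500.0))   # False
# On the approach course at a sensible height, the signal is usable:
print(glideslope_signal_usable(distance_nm=8.0, off_course_deg=2.0,
                               height_ft=2500.0))   # True
```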

Meanwhile, the aircraft continued to descend. ‘No action by either flight crew member took place to stop the aircraft from descending,’ the report says of this period. The radar controller called with a lengthy (17-second) instruction in non-standard phraseology ‘not to descend further’. Several seconds later, at 504 feet, the crew began a go-around. The A380’s considerable inertia meant it went down to 395 feet AGL before climbing.

Things only got worse for the crew. As they prepared for the second approach in accordance with standard operating procedures, the flight plan sequenced through multiple waypoints in a row, because the aircraft was at a location where several waypoints satisfied the FMS’s geometrical waypoint-sequencing rules. This caused the captain’s FMS to reset automatically.

How welcome the sonorous ‘50, 40, 30, 20, 10!’ intonation of the radar altimeter must have sounded at the end of their third approach.

Rather than condemn the crew’s performance in human factors, situational awareness and the human-machine interface, the report put it into context.

It quoted Key Dismukes, a NASA human factors expert, and Ben Berman (co-authors of the 2007 book The Limits of Expertise: Rethinking Pilot Error and the Causes of Airline Accidents), who wrote in a 2010 paper:

Even though automation has enhanced situation awareness in some ways, such as navigation displays, it has undercut situation awareness by moving pilots from direct, continuous control of the aircraft to managing and monitoring systems, a role for which pilots are poorly suited.

Also, the very reliability of automation makes it difficult for pilots to force themselves to stay in the loop.

Research is needed to develop ways to help pilots stay in the loop on system status, aircraft configuration, flight path, and energy state.

These new designs must be intuitive and elicit attention as needed, but minimise effortful processing that competes with the many other attentional demands of managing the flight.

The incident report also noted the Flight Safety Foundation’s four reasons why monitoring is difficult for pilots:

  • the human brain has difficulty with sustained vigilance
  • the human brain has quite limited ability to multitask
  • humans are vulnerable to interruptions and distractions
  • humans are vulnerable to cognitive limitations that affect what they notice and do not notice.

Emirates reviewed its procedure for intercepting the glideslope from above. It published a flight safety update in 2018 that discussed the problems of using ILS outside its certified envelope. And recognising that this bar story also belonged in a classroom, it incorporated the tale of flight EK131 into recurrent training for its crews.