Three-ring mistake

photo: © urbancow | iStockphoto

By Mike McGrath

My name is Mike McGrath. I am (or was) a professional skydiver with over 12,000 jumps (7,000 tandem jumps) and was the chief instructor at Newcastle Sport Parachute Club.

Had you asked me before 12 October 2014 how I would deal with discovering an inverted three-ring on a tandem deployment, I would have answered that it could not happen to me.

On that day, my gear checks failed to pick up an inverted three-ring packing error on the left-riser assembly. The load of the opening shock was transferred to the loop and Bendix cable, which were damaged and could have broken at any second. (I couldn’t cut away and had to land it.) I find it hard to reconcile the fact that I missed this equipment defect during both of my pre-jump checks.

But after reviewing the video evidence of the jump, I can only conclude that I missed it: the misrouted three-ring is clearly visible on the ground, in the aircraft, and in freefall. It happened, and as such I accept full responsibility.

A critical situation

It was a very busy day with 37 tandems, six students who wanted to do as many jumps as possible, and many fun jumpers. Before donning my equipment I did my usual visual and physical check of the back of the equipment, and a visual check of the front. My logic was that any issue with the front of the equipment would be picked up during my final 10-point check.

I did not become aware of the problem until the opening sequence, when I heard an unusual sound in my left ear sometime during line stretch. After deployment, I checked the left three-ring (this is where the parachute is clipped to the harness, using an arrangement of three metal rings) and found it was not as it should be. The canopy was open and flying in every other respect.

I could not ascertain exactly what was wrong, and decided that if it was possible to cut away (I was now at 2800 ft) then I should try. Reasoning that I would rather be under a perfect reserve than an imperfect main, I gave it a go, first with one hand on each handle, then with both hands on the cutaway. It became obvious that this was not an option.

My next step was to further examine the three-ring. It soon became clear that there was a catastrophic failure of some form. I could see that the two smaller rings, the yellow Bendix cable and the riser loop were a long way from the riser. I suspected the riser loop had gotten hitched around the small ring, and I tried, gingerly, to dislodge it. It wouldn’t move. At this point I stopped pursuing the option of cutting away and began exploring the consequences of flying this through to landing.

I suspected my customer and I were suspended by the riser loop and/or the Bendix cable. I was hopeful, but not at all confident, that it would hold our weight all the way to the ground.

I decided to leave the customer’s lower attachment points connected until we were below 1000 ft so that, if the riser gave way, I could execute emergency procedures. I flew with minimal, very light toggle input so as not to increase wing loading, and remained ready to cut away and deploy the reserve at a second’s notice if the left riser assembly gave way. I left my cutaway and reserve handles unstowed for quick use if needed.

Once this plan was formed, I decided, just in case the worst happened, to leave a short message for my family on my GoPro hand-cam. I asked the customer if he wanted to do the same, and explained our situation as best I could. To his credit, the customer stayed calm.

Once below 1000 ft I disconnected the lower right attachment point, but could not disconnect the lower left attachment point. We did a practice legs-up for landing, and then set up for a straight-in approach.

After landing I continued to film the three-ring system and invited other senior instructors and a packer to inspect it with me. I believed with total conviction that I had just been dealt a random catastrophic equipment failure, and that I had handled it with relative calm, integrity and professionalism. As far as I was concerned, something had happened to me and I had dealt with it.

It wasn’t until I checked the video that I was confronted with the stark reality that an inverted three-ring was the actual cause and that I had made a mistake.

To say that this was a shock to my system is the biggest understatement of my life. I was shaken to the core. It was like the scene in Fight Club when you realise that Tyler Durden and the narrator are the same person.

I have since found out that the whirlpool of negative emotions I experienced when I was unable to reconcile my subjective belief about what happened with the objective facts in the video is called cognitive dissonance.

I had believed that if I followed certain procedures, checked my gear and had a safety-first attitude, then I could jump out of a plane in relative safety. This belief had kept me alive and skydiving safely for over 20 years. I now have to reconcile that belief with the video evidence in front of me, showing me making one of the biggest and most clear-cut mistakes in the business: failing to properly check my gear.

Lesson learnt

Cognitive bias, or ‘we’re only human’, is a pattern of deviation in judgment in which we draw illogical inferences about other people and situations. Individuals create their own ‘subjective social reality’ from their perception of the input.

An individual’s construction of social reality, not the objective input, may dictate their behaviour in the social world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.

The execution error I made when I checked my gear but failed to pick up the inverted three-ring packing error was likely due to one or more of the following cognitive biases:

Optimism bias—almost every human being who is not in a state of depression could be said to be suffering from optimism bias in some form. If they weren’t, it would be hard to get out of bed in the morning.

I suspect we skydivers have a lot of optimism bias baked into our very culture! It may be our optimism bias that actually sets us apart from a lot of people who don’t skydive. You need to be a pretty positive person to risk your life jumping out of a plane, no matter how much you know, or don’t know, about the objective risks of skydiving.

As skydivers we don’t just underestimate the risk of something bad happening to us; we also overestimate our ability to deal with a problem if it arises. This bias alone has some serious implications in terms of the potential to cause a fatal incident.

But here’s a twist—scientific studies suggest that, to lead a happy and fulfilling life, optimism bias is a positive delusion to have.

Confirmation bias—if optimism bias is doing a tandem and ‘thinking it is going to be OK’, then confirmation bias is having thousands of jumps and ‘thinking that you were right’. If you believe something is true, you notice information in the world supporting that worldview more than you notice information that doesn’t.

Expectation bias—after performing a checklist hundreds of times, an individual can become predisposed to ‘see’ the item in the correct position, even when it is not, i.e. ‘looking without seeing’. Expectation bias, or expectancy, is a factor that can influence the visual system, including how and where people look for information.

Inattentional blindness—also called the ‘looked-but-failed-to-see’ effect, this is a failure to perceive what would appear to others as an obvious visual stimulus. It occurs when your attention is engaged on another task, and does not necessarily mean you were ‘not paying attention’, but that your attention was occupied elsewhere. Research suggests that inattentional blindness can be influenced by workload, expectation, conspicuity and capacity.

Automaticity—while not a bias in the strictest sense, automaticity refers to the fact that humans who perform tasks repeatedly eventually learn to perform them automatically. That is advantageous, but the flip side is that repetition strips the novelty and individual attention from the experience, which can lead to a lapse-type execution error: automatically performing a function (such as a checklist item) without actually being aware of the task itself.

As a result of this incident, my understanding of the frailties and shortcomings of the human mind has undergone a significant re-adjustment. We are fallible—very fallible. And it would appear that sometimes we can’t even trust our own senses.

As I write, I still haven’t jumped since the incident. I have a two-year-old son, and that changes how you see the world, especially when it comes to managing personal risk. I will be undergoing some pretty comprehensive retraining, as well as psychological counselling, in the hope of jumping again in the near future, but how I will actually feel when I get back in the air remains to be seen.

14 COMMENTS

  1. Thank you Mike for sharing. It is brave of you to share this information, with the intent, I believe, to keep others safe.
    I would like to ask if you would mind if we share this in our recurrency meeting we have every spring; I think it would help others.
    I hope you do get back into the air, with a new awareness, but also a new appreciation of our Gift of Flight.
    Share, with others, your message, and know you have been the strong one to pass this forward.
    Enjoy this Gift, and the new life you share this world with, may he see you living life to the fullest, and sharing this Gift with others.
    Yours, in safety,
    Blue skies, soft winds
    :)

  2. Excellent article!

    I have taken the liberty of sharing this with the TIs in Norway, who will hopefully benefit from your experience.

    Best regards,
    Øyvind Nikolaisen
    Tandem Instructor / Examiner

  3. This is a great article, I particularly like the neat summation of cognitive bias and your recognition of the contribution to your own incident. Would you mind if I replicated part of your piece for a newsletter article, with full attribution of course?

  4. Great article & illustration of human factor weaknesses/dangers, many, many thanks for sharing, will stay with me.

  5. Great article, exploring cognition where it’s critical. I’m not a big-time jumper, but a bloke told me when checking a rig to actually place your finger on the point you are checking, the physical connection improving the cognitive link. He said he had seen pilots doing it on pre-flight checks with plane dials. I used to do it.
    I stopped jumping when I made a mistake I wasn’t happy with myself about. We’re all human, except for the gods haha.
