Fire and fury: The destruction of Piper Alpha

An oil platform disaster 30 years ago has grim, but abundant, parallels and lessons for aviation safety.

The first sign that something was wrong was when the ceiling fell on Ed Punchard. He was diving coordinator on the Piper Alpha oil platform in the North Sea, east of Scotland, and it was 10 pm on the evening shift of Wednesday 6 July 1988.

Punchard was 28, well paid, in love with his job, and fitter than he had ever been. He was also blessed by the location of the diving office—immediately above the dive skid on the lower level of the platform. It would give him a survival advantage over most of the other men on the installation, 167 of whom would soon die. But he didn’t know that yet. No-one knew anything.

‘We found ourselves with all the filing cabinets fallen over, the ceiling down and the lights out,’ he recalls. ‘I helped the diving superintendent into a breathing set and he told me to go and find a way out.’

‘I made my way up towards the control room. After a couple of flights of stairs, I could go no further—all the routes up were blocked by smoke. By the time I had made my way back down, the diver in the water had been recovered and the dive team was gathered on the one corner of the platform that was free of smoke. There would have been about 20 of us.’

He remembers the group wondering what to do. ‘There was no obvious way up or down.’

Restless, Punchard noticed a ladder to a small platform and climbed down it to explore. After a few rungs, he found he could see right underneath the platform. Thirty years later he inhales, then adopts a deliberately undramatic tone to describe the view as ‘an alarming sight’.

‘On the other side there was an enormous fire. Anyone who works on a platform knows a fire round the well heads is an extremely dangerous thing. I called up and said, “we’ve got to get off!” There was a rope near where we were standing so we threw it over the side.’

Punchard was first to slide down the 25 metre (80 foot) drop. ‘I reached out with my toe to grab the corner of the spider deck (just above sea level). I pulled the rope in and people came down and followed me. By that time a standby vessel, the Silver Pit, had come close and launched an inflatable fast-rescue craft. It came in, and we climbed down a small ladder and got picked up.’

Punchard was one of the first of the 61 men to escape Piper Alpha. But his ordeal was not over. On board the Silver Pit he took part in further rescues, was nearly drowned, nearly burned, and saw at first hand the terrible explosions when the Tartan and Claymore gas ‘riser’ pipelines feeding Piper Alpha ruptured. They fed the fire with 30 tonnes per second of gas pressurised to 1800 psi.

Other survivors told equally harrowing stories. One comes from Scottish journalist Stephen McGinty’s definitive account, Fire in the Night, which describes Joe Meanen’s leap from the helideck:

Meanen thought he was going to die. And yet this single thought was to be the spark that ignited a burning will to live. The next 30 seconds were as if a machine had taken over his body. He backed down the steps, squeezed through the bars and began to run across the helideck. When he reached the metal poles that stuck out and supported the safety net, he slowed down, took off his life jacket, stepped out onto the metal poles and looked down into the water 170 feet below. He then threw the jacket over, backtracked, ran and jumped. It was only when his back foot took off and his entire body had left Piper Alpha for the final time that, hanging in the air, in that fraction of a second before gravity took grip, consciousness returned. At that exact second running through his mind was a single sentence: ‘What the f*** have I done?’ He had six seconds to contemplate his actions, and as he fell he burned.

Later escapees from the inferno described hearing unearthly groaning and scraping noises as the structure of the platform melted and buckled from the tremendous heat. At 11:50 pm, the accommodation block, where most of the men had gathered to wait for rescue, in accordance with their drills, fell into the sea. All within it died. It had been less than two hours since the first explosion.

The operational role of aviation that night was limited. Helicopter rescue was made impossible by frequent explosions and flames which soared to over 500 feet. But the themes of the disaster should be clear to anyone with a realistic understanding of aviation safety. The most sobering lesson is that the disaster was the product not of intentional malice, incompetence or negligence, but of a culture, ‘the way we do things round here’, which disguised and normalised such failings. The platform had been anchored to the seabed, but in an organisational, metaphorical sense, it drifted to its destruction.

Communication and procedure

In fire engineering terms, an oil installation and an aircraft have more similarities than differences. Both involve placing people in hostile environments in close proximity to large amounts of volatile hydrocarbon fuels. And both rely on redundancy and cross-checking for safe operation.

These had broken down on Piper Alpha.

About the time the shift changed, one of the platform’s two condensate (natural gas liquid) pumps failed. This was a serious threat to production. The night crew turned on the second condensate pump. Unknown to the night shift, a pressure safety valve had been removed from the second condensate pump and a hand-tightened blind flange had been installed in its place. Paperwork outlining these changes was sitting unseen on a manager’s desk. The flange promptly failed under the high pressure, resulting in what survivors remembered as a banshee scream, before the first in a series of explosions.

Design and drift

Engineering risk analyst Marie-Elisabeth Paté-Cornell described the platform as being ‘decapitated’ by the first explosion, which damaged the control room and killed or disabled senior staff. This was a consequence of how the platform had evolved in use beyond its design brief.

Punchard sums up: ‘Normally the pumps would be switched on manually from the control room. But Piper had been designed to process oil, not to process gas. The wall that protected the control room was only a firewall, not a blast wall. When the explosion happened, there was a high degree of devastation in the control room which meant that it wasn’t possible to switch the pumps on. It was a combination of bad design and lack of analysis.’

Paté-Cornell also noted the closeness of the living quarters to the production modules, which allowed the rooms to fill with smoke, and the poor planning of exits and passageways, which were blocked early in the emergency and left the lifeboats inaccessible.

She adds a chilling surmise: ‘The offshore installation manager probably knew this, which may have contributed to his state of panic and his inability to function and give orders.’

Culture: Lord Cullen’s displeasure

A Scottish judge, Lord Cullen, produced the official report into the tragedy. He was scathing about how the offshore oil industry’s culture had both tolerated and created human shortcomings, and about how the systems that were meant to assure safety had been allowed to decay.

‘Before I got to grips with the inquiry I imagined it would be concerned with hardware,’ Lord Cullen told a commemorative conference in 2013. ‘But I quickly realised that fundamental, and running through everything, was the management of safety, and as I dug down into the background of what happened I discovered it was not just a matter of technical or human failure. As is often the case, such failures are indicators of underlying weaknesses in the management of safety.’

‘Management shortcomings emerged in a variety of forms,’ Lord Cullen said. ‘There was no clear procedure for shift handovers. The permit to work system was inadequate, but so far as it went, it had been habitually and frequently departed from.’

‘Training, monitoring and auditing were poor, and the lessons of a previous relevant accident had not been followed through.’

How had such a state of affairs evolved? Lord Cullen recognised the importance of culture and leadership in creating, or destroying, the preconditions for safe work. ‘No amount of requirements for safety improvements can make up for deficiencies in the way in which safety is actually managed.’

The originator of the Swiss cheese model of accident causation, Professor James Reason, linked Piper Alpha’s cultural deficiencies to specific deadly practices:

  • ‘Front-line errors are more likely in organizations that are insufficiently concerned about the working conditions known to promote the slips, lapses and mistakes of both teams and individuals,’ Reason said.
  • ‘Second, an inability to appreciate the full extent of the operational dangers can lead to the creation of more longer-lasting holes in the defences. These may arise as latent conditions during maintenance, testing and calibration, or through the provision of inadequate equipment, or by downgrading the importance of training in handling emergencies.’
  • ‘Perhaps the most insidious and far-reaching effects of a poor safety culture, however, will be evident in an unwillingness to deal proactively with known deficiencies in the defences in-depth. In short, defensive gaps will be worked around and allowed to persist.’

Lord Cullen’s report made 106 specific recommendations, all of which were accepted by the United Kingdom’s offshore petroleum regulator (and many of which were adopted by Australia’s offshore regulator).

But his most far-reaching recommendation was for the offshore industry to manage risk by presenting safety cases rather than solely through adherence to regulation. A safety case is an argument about how a proposed activity can be done with acceptable safety. In effect, it is the regulator saying, ‘So you say you are safe. Prove it!’ NASA has a useful definition:

A risk-informed safety case is a structured argument, supported by a body of evidence, that provides a compelling, comprehensible and valid case that a system is or will be adequately safe for a given application in a given environment. This is accomplished by addressing each of the operational safety objectives that have been negotiated for the system, including articulation of a roadmap for the achievement of safety objectives that are applicable to later phases of the system life cycle.

Lord Cullen summarised a safety case as ‘Asking and answering the what-if questions and avoiding making do with preconceptions.’

With 30 years of hindsight Punchard says Piper Alpha was a tipping point. ‘I think it was the moment at which modern systems of health and safety were instigated in a new way that requires corporations and individuals to be much more self-motivated and self-proving.’

Oil and gas industry safety analyst Professor Patrick Hudson makes a similar point. ‘Looking back, we can see that safety has undergone a development from an unsystematic, albeit well-meaning collection of processes and standards, to a systematic approach specific to safety. Piper Alpha served as the catalyst for this major change,’ Hudson wrote 10 years after the accident.

Hudson also saw in Piper Alpha the need for safety to become a cultural, rather than management imperative. ‘In a managed organisation it is still necessary to check and control externally. In a safety culture it becomes possible to find that people carry out what they know has to be done not because they have to, but because they want to,’ Hudson said.

‘Advanced safety cultures can only be built upon a combination of a top-down commitment to improve and the realisation that the workforce is where that improvement has to take place.’

Punchard elaborates on Hudson’s point when he says the disaster showed the difference between complying with regulations and actively striving for safety. ‘I describe it as the difference between traffic lights and roundabouts. Both do the same thing, but roundabouts require a certain skill level, traffic lights don’t, and there’s a danger in relying on traffic lights and an important vitality in skill and self-motivation.’

The lesson? ‘Don’t instruct your workers in such a way that they zone out, stop thinking. It’s important to keep your workers engaged, skilled and motivated.’

Aftermath

Ed Punchard never went back offshore. Within hours of the blast, he found himself experiencing unexpected emotions, the first of which were feelings of joy and invincibility when the rescue helicopter brought him ashore. A newspaper photographer captured a shot of him grinning broadly as he stepped onto solid ground. Later, in grief and shame, Punchard stole the photo from a press archive. Still later, he returned it when he realised his reaction to the experience had been entirely normal.

‘In the aftermath of Piper Alpha, I, like many of the survivors, suffered with post-traumatic stress disorder,’ he says. ‘Especially for somebody who was a diver and regarded himself as pretty resilient … that sort of thing can take you by surprise. Now when I look back, I can see it was a completely unsurprising thing to have occurred. None of us were trained to be in what was in effect a combat zone, where you were surrounded by explosions and had to deal with an escape and then a rescue organisation, like the escape, largely coordinated by survivors.’

‘The reassuring thing is that with some quite simple and now well-established techniques of counselling and recognition of the condition, the vast majority of us recovered very well.’

Punchard says for anyone involved in a traumatic event it’s important to accept a high probability of PTSD, and also to realise that it need not be a lasting condition. ‘There’s no reason to believe you won’t recover but it’s important to take it seriously.’

For Punchard, Piper Alpha was the catalyst to a new life. He moved to Australia and became a successful television and film producer. A diving helmet sits in the corner of his office.

Further Information:

Cullen, Lord. (1991). The public inquiry into the Piper Alpha disaster: Vol. 1. HMSO, London.

Cullen, Lord. (1991). The public inquiry into the Piper Alpha disaster: Vol. 2. HMSO, London.

Hudson, P.T.W. (2001). Safety Management and Safety Culture: The Long, Hard and Winding Road. In: Pearse, W., Gallagher, C. and Bluff, L. (eds.) Occupational Health and Safety Management Systems. Crown Content, Melbourne, Australia, pp. 3–32.

McGinty, S. (2008). Fire in the night. Pan Macmillan, London.

Paté-Cornell, M. E. (1993). Learning from the Piper Alpha accident: A postmortem analysis of technical and organizational factors. Risk Analysis, 13(2), 215–232.

Punchard, E. and Higgins, S. (1989). Piper Alpha: A survivor’s story. W.H. Allen, London.

Reason, J. (1998). Achieving a safe culture: Theory and practice. Work & Stress, 12(3), 293-306.
