The idea that people should be treated fairly is important for developing a culture of reporting safety issues. But what is fair, and who gets to decide?
What is justice? What is culture? How long is a piece of string?
Because there will always be different ideas about the meaning of ‘justice’ and ‘culture’, the foundations of just culture are necessarily vague. No single, universal definition will ever be accepted by everyone. But just culture is an important idea that is central to strengthening aviation safety, and in any consideration of ‘just culture’, it is important that everyone knows what is being discussed, even if they don’t all agree.
CASA has adopted a rational approach to just culture that aligns fully with international best practice. Adopting the language used in the new European legislation, CASA’s Regulatory Philosophy defines just culture as, ‘an organisational culture in which people are not punished for actions, omissions or decisions taken by them that are commensurate with their experience, qualifications and training, but where gross negligence, recklessness, wilful violations and destructive acts are not tolerated’.
Professor Sidney Dekker, of Griffith University, expanded awareness and debate on the concept in his book Just culture: Restoring trust and accountability, now in its third edition.
While aviation has been progressing towards a working definition of just culture for decades, he says, there are unsettling developments in some regions. ‘The reason for embracing the concept now, though you see this more in Europe than Australia, is an increasing prosecutorial assertiveness in the wake of accidents and incidents. Prosecutors jump into the aftermath of an accident, appropriating evidence. There have been cases where it was almost a tussle over who walked away with the black boxes from the aircraft: air safety investigators or the police.’
Dekker acknowledges that just culture will always be a contested term because justice itself is not, and can never be, fully and finally defined.
‘Smart, reasonable people can discuss it for hours and still leave with sharp disagreements. Justice is an essentially contested category,’ Dekker says. ‘It’s like love or beauty; these things are difficult to agree on. The real question for us is, how do we keep learning?’
CASA’s General Manager of Legal Affairs, Regulatory Policy and International Strategy, Jonathan Aleck, makes a similar point. ‘The concept is well embraced. It’s easy to say people shouldn’t be punished for honest mistakes. There should be learning opportunities and people should be encouraged to be candid, without fearing the consequences.
‘The more contentious issues involve just what honest mistakes are. Just culture has always excluded protection for deliberate, reckless, wilful and grossly negligent conduct. These kinds of things have always been “off the table”. But someone has to decide how to characterise the acts that are being assessed for just culture purposes.’
Another contentious just culture issue involves deciding what does, and what doesn’t, constitute punishment. Modern approaches to just culture have effectively resolved an important aspect of that issue, Aleck says, and CASA’s definition of ‘just culture’ reflects this.
‘If someone is simply sacked for having made an error, and sent away with the warning “never darken our door again”, such action has a clear and distinctively punitive quality,’ he says.
‘But what about saying, “You clearly have displayed some deficiencies in your job-related skills or competencies, which need to be addressed and rectified—through training, education and practice—before we can permit you to perform tasks with safety-related implications.”’
The difference between such preventive, corrective and remedial measures and a punitive or disciplinary response is that limiting actions (taking a pilot or engineer off the line for retraining, or curtailing or restricting an organisation’s activities) remain in place only until the person or organisation demonstrates they can perform their duties safely, Aleck says.
That is why CASA’s definition of ‘just culture’ expressly recognises that: ‘Requiring a person to undertake further training and, where necessary in the interests of safety, to refrain from exercising the privileges of a relevant authorisation pending the successful demonstration of competence where deficiencies have been identified, shall not be regarded as discipline or punishment.’
‘For these reasons,’ Aleck says, ‘CASA’s regulatory philosophy rejects the suspension of an authorisation as a punitive sanction. In the case of a pilot licence, for example, this kind of action should only be taken when it’s unsafe to allow someone to fly; and then, they should only be suspended for as long as it takes to regain their competence, if that’s the issue.’
There is no question that the personal experience of clearly safety-related actions of this kind can be unpleasant, embarrassing and inconvenient. ‘These are not unimportant considerations, to which CASA and other members of the aviation community will continue to turn our constructive attention,’ Aleck says. ‘In the interest of safety, however, such experiences are sometimes unavoidable.’
Aleck points out the important relationship between maintaining and improving safety. ‘If general action is taken to reduce the likelihood that certain kinds of events happen again in the long term, that’s improving safety. At the same time, however, it is vitally important to maintain safety, by recognising that particular action may need to be taken for more immediate and specific corrective or remedial purposes, so that the individual or organisation involved doesn’t make the same “honest mistake” again in the near term.’
These different, but complementary, considerations are reflected in CASA’s regulatory philosophy, as they are in the new ICAO Annex 19 standards and the European legislation.
‘The distinction is really a matter of perspective,’ Aleck says. ‘Accident investigators tend to look forward into the distant future, with a general view to preventing events similar to the particular accident or serious incident they are investigating from recurring. Safety regulators, like CASA, have an obligation to focus on immediate and potentially imminent events, with a view to preventing accidents or incidents tomorrow or the next day.’
‘In most jurisdictions, occurrence reporting is managed by the regulatory authority. In Australia, most of these reports go to the ATSB, and people have sometimes taken that to mean all occurrence (or incident) reports get the same kind of protection as accident and serious incident investigation reports get. That is not, and has never been, nor was it ever intended to be, the case. Limitations on the use to which the safety information conveyed in certain occurrence reports may properly be put have always been subject to exceptions,’ Aleck says.
‘This is explained on the ATSB’s website, and in the joint CASA/ATSB policy statement that appears on both agencies’ websites. Using occurrence reports and other kinds of information for the demonstrable purpose of maintaining safety,’ he adds, ‘is not, and shouldn’t be seen as, exceptional.’
Just culture is ‘not meant to be a “get-out-of-jail-free card”. But CASA is not about putting people in jail,’ Aleck says. ‘The key word is “accountability”, which means you’re answerable for what you do, and prepared to take the steps necessary to reduce the risk of harm that can result from even the most honest mistakes.’
Dekker’s latest thinking on just culture refines the model with alternative, complementary forms of justice. He contrasts restorative and retributive models of justice. ‘For an internal process inside an organisation, I’ve recently proposed a restorative rather than retributive justice model, as a way to restore trust and accountability. Restorative justice is as old as humanity itself, as old as Solomon. It asks, “Who was hurt? What do they need? And what can be done to meet that need?”
‘An airline had an issue with an aircraft leaking oil four hours into a flight. Afterwards, the engineer involved spontaneously grabbed his iPhone. He filmed himself on the ramp, explaining what he did and how things went wrong. It was a confession, repentance and a request for forgiveness, done by someone who was probably not very used to thinking about these matters. It impressed me that the organisation had created a climate in which this restorative gesture was possible. And I cannot imagine a better learning opportunity for other engineers.’
Dekker notes that South Africa did this on a national scale, with the truth and reconciliation commission that followed the apartheid years. ‘In statecraft there are trade-offs between justice and peace; in aviation, health-care and high-consequence industries, the trade-off is between justice and learning,’ he says.
‘There are certainly people who might think that the engineer got off too lightly but the payoff was not in the punishment and the feelings of righteousness that accompanied it. It was in learning, and improving—avoiding making the same mistake again.’
What I find amazing about the whole issue of action deemed punitive against those found to have contributed to an incident is this: why aren’t the actions of those further up the chain of command, whose decisions had a bearing on the failure further up the causal path, dealt with as well? While this continues, the people at the pointy end will never feel they have been treated fairly.
Just culture is just a couple of BS made-up words to appease the ever-increasing power of HR. Don’t forget any department within an airline is run by humans, the species that is selfish, vindictive & corrupt!
Walter, that is an unfortunate point of view.
Just Culture has had enormous benefits in organisations that have adopted it. When properly executed, it works and creates not only a safer workplace but a better one for everyone…
Understanding how people work best within an organisation helps in far more ways than just safety. I would recommend you do some reading on the real world effects of a Human factors/Just Culture system.
As a leader in aviation, I have found this approach invaluable and the results undeniable.
Just Culture is absolutely essential in building the trust, on which successful organisations run.
I personally led a ramp safety audit and intervention program for a major US airline and my going-in position was that we were not going to ‘take names’, as had been done in the past.
We were going to record objective details (aircraft tail #, time etc) and then try to determine why safety deficiencies existed and the context in which they arose.
That last part is critical because no action can be understood outside of its context.
I had a team of supervisors working 24/7 and when we turned up at a gate, we would correct whatever procedure hadn’t been completed, help the crew turn the aircraft and then, after the departure, talk about what we had noticed and ask them to help us understand the situation.
When they realised that we wanted data and understanding and were not going to punish them, they began to trust us and we discovered all manner of incredibly valuable information about why things happened the way they did.
In most cases, they weren’t being lazy or careless, they were under enormous pressure and operating with equipment and procedures that were not always optimised for the task.
Humans in a complex system often create workarounds because of competing pressures.
Work as imagined (the procedure) is often not ‘work as done’, for many reasons.
Turning a wide-body in 40 minutes (sometimes in horrific weather) creates a range of competing priorities and these must be understood and managed properly.
By using Human Centred Design/Human Factors principles and engaging the workers in the solution, we were able to redesign processes, get new or more suitable equipment and engage them in the mission of being efficient and safe.
The result was a huge success. Aircraft damage, injuries and even some types of delays dropped dramatically. More importantly, people started coming to us to point out issues before they caused problems.
The simple fact is that if you want machine reliability, get a machine; but if you want the flexibility, adaptability and problem-solving ability humans offer, you have to accept the vulnerabilities that go along with them.
If you work with humans, you cannot eliminate mistakes but you can make your system resilient, so mistakes don’t result in negative outcomes.
Most humans are well-intentioned in that they don’t intend to get things wrong. Honest mistakes can happen for myriad reasons, many of which can be system or environment related (remember SCHELL?) and so it is critical that we understand these reasons, otherwise someone else will have the same problem.
Having a workforce that is comfortable self-reporting errors, knowing they will not be punished, is absolutely critical in building a predictive and proactive culture that can prevent incidents and accidents from happening.
And that is good for everyone.