Following a serious accident in his workshop, where one of his most experienced maintenance team members had both of his legs crushed, the workshop manager confided to me that it felt 'out of the blue'. His workshop had an excellent safety reputation.
The event occurred during a crane lift of a large steel pipe. The pipe dislodged, knocked a nearby team member to the floor and rolled onto his legs. He would never walk again.
Afterwards, when the investigation was complete, what frustrated the workshop manager most was that he already knew most of these issues. Similar incidents had happened, but with no bad outcome … They had a purchase order in the system to get the right lifting equipment … He himself had observed that, as they got busier, there were too many jobs going on with too little space … The end-of-month bonus did encourage the team to take shortcuts to complete jobs on time … The list went on.
Investigations are helpful for bringing out vital learnings from incidents. But in my experience investigations also cause good business leaders anguish over actions not taken in hindsight – not to mention the devastating impacts of the incidents themselves. So rather than waiting for people in our group to suffer before we take stock and act, we can instead understand the techniques that best-in-class businesses use to prevent serious accidents altogether.
There are organisations out there that manage to maintain near-accident-free performance over many decades, despite operating in high-hazard, complex environments. They are called High Reliability Organisations (HROs). One of the five characteristics that helps them to achieve this feat is called chronic unease, or a preoccupation with failure.
How long has it been since a major accident or failure in your business? Long periods of success can result in us taking our eye off the ball – we might even start celebrating our success. What we know about major accidents is that when organisations are in this mindset, they are often drifting towards failure. Accidents don’t just happen when things are going badly; they often happen when things appear to be going well.
Chronic unease is a strategy to combat this. It is a psychological state where individuals at all levels of an organisation feel a sense of constant discomfort and healthy scepticism about how risks are being managed. This leads them to relentlessly hunt for warning signs of potential failure. Then the trick is to make those warning signs more vivid, and act on them to prevent those failures occurring.
In September 2021, I delivered a webinar discussing chronic unease and how it can be applied in practice. There was a lot of interest in the topic; a lot of people are keen to incorporate chronic unease in their organisations.
A common misconception is that chronic unease is just about combatting complacency at the frontline. But exhibiting chronic unease is not the responsibility of any one group. And in practice, chronic unease can only flourish in the long run when the organisational environment is structured to support it.
There are four areas that are essential for an organisation to work on to create chronic unease and sustain it (see Figure 1):
- A questioning attitude.
- Psychological safety.
- Risk competence.
- Systems to detect and capture warning signs.
Adopting a questioning attitude
A questioning attitude is a curiosity about the signs in front of you and a commitment to looking deeper. It helps you explore risks, uncover warning signs, and understand what those warning signs might mean. It provides a clearer picture of what your organisation’s real performance is and where the real issues are. You should question your assumptions, any unintended outcomes (positive or negative) and anomalies.
Ask questions like:
- Do we understand why we got that result?
- What could be the worst outcome?
- How could that control fail?
- What is our backup plan if it does?
Creating psychological safety
Psychological safety is a cultural environment where people feel they will not be personally judged or punished for speaking up about warning signs or issues – especially by their seniors in the organisation. Psychological safety matters because it gives you the best chance of receiving the benefit of everybody’s observations. You don’t want anything left unsaid, such as:
- People’s own mistakes and errors;
- Near misses;
- Things that look strange or different.
Ever heard the phrase 'that’s career limiting'? If your people feel they could be personally judged or punished for reporting incidents, making mistakes, or challenging decisions or directions they perceive as unsafe (especially to more senior people), they may choose not to raise issues – and rarely is it obvious if people are holding back.
Psychological safety is cultivated when leaders reward and recognise those who speak up, even if it turns out to be nothing, or even incorrect. Punitive approaches often discourage speaking up.
Improving risk competence
Everybody from the board to the frontline needs to have a clear understanding of the hazards and how they are managed – in other words, a high level of risk competence. If we cannot visualise what could go wrong, or if we do not have a clear picture of what our hazards look like when they are being well controlled, then it is easy to assume that things are going well.
I am not talking about classroom training in risk management. Risk competence is improved by a combination of technical knowledge and experience, gathered from a wide range of sources.
The more risk competence your organisation possesses, the better it will understand any weaknesses in how risks are controlled, and the more easily warning signs can be noticed.
Capturing warning signs
Our systems for detecting and capturing warning signs may include:
- Systems to monitor physical risk such as strata, dust or gas monitoring systems;
- Reliability systems such as maintenance and inspection systems;
- Reporting systems for your people to tell you when unexpected things happen.
These systems should support the process of capturing, analysing and taking action on our warning signs. Even if we collect a lot of data, if the data is not turned into meaningful information or does not reach the right people, it cannot inform our decision-making.
Creating chronic unease
Practical ways to create chronic unease include:
Storytelling. This is hands down one of the best ways to create chronic unease. People are more likely to retain information if they hear it as part of a story. Also, as big failures do not happen very often, storytelling is the best way to recreate an experience for people that they can learn from and relate the details to their own situation. You can use stories or findings from your industry or other industries, use your own stories, or invite your technical experts to share theirs.
Technical experts taking a teaching role. Invite your experts (internal and external) to share their technical knowledge about a particular aspect of a risk or a control. You could have regular 15-minute presentations from these people instead of a toolbox talk.
Exploring your data. Take time in groups to explore your data (maintenance, incident, quality, production) with fresh eyes. Get 'fidelity' on the numbers by reading incident descriptions and asking experts. For example, start with your hazard reports and get a deep understanding of the types of reports you are getting. Are they all related to one type of hazard? Are there blind spots elsewhere? Do all the reports come from one team?
Leaders spending time in the field. Field time is all about learning for everybody involved. Listen to those doing the work about what makes it successful and what makes it challenging. Take an expert or a set of fresh eyes with you (not a pack of people).
Using Lean continuous improvement techniques, such as 'learning teams' and Kaizen approaches, to explore successful work. Leaders can use the knowledge discovered through these techniques to increase learning.
Using bowtie diagrams as shared risk knowledge. Organisations need a clear picture of their major hazards and something to anchor conversations to. Bowties enable everybody to speak the same language and to have a common understanding of hazards and controls – from the boardroom, to the planning team, to frontline workers.
There are many factors working against us practising chronic unease. That is why telling people to 'report more', 'notice more', 'care more' or to 'have more chronic unease' does not work. Without the right organisational environment, chronic unease cannot survive the long haul.
Leaders must address the organisational factors that suppress chronic unease, and build practical influences into their organisation to encourage its continued presence at all levels. Organisations must ensure they are set up to incentivise wanted behaviours through their actions as well as their words.
The way we message, measure and reward within our organisations, the level of our understanding of hazards and controls, and the systems we put in place to capture information all have a profound effect on whether warning signs can rise to the surface, receive attention, and be acted upon.
Chronic unease takes effort to build into your organisation so that it sticks and becomes chronic. It is worth it. In the end it is all about finding multiple ways to amplify your warning signs and make them more vivid, so you can act on them before they lead to failure.
Author: Jodi Goodall is head of organisational reliability at Brady Heywood, Brisbane, Queensland. Brady Heywood has also developed the Queensland mining and quarrying podcast series 'Rethinking Safety' as part of the Brady Review. Sean Brady and Jodi Goodall’s thoughts about serious accidents, mine safety and High Reliability Organisations can also be found at bradyheywood.com.au/insights. Check out the recording of the webinar 'What does chronic unease look like in practice?' on the Brady Heywood YouTube channel, where Goodall digs deeper into practices for risk competence and systems that capture warning signs.