Controls risk

ESSENTIALS OF SAFETY BLOG 10/14

Controlling risk is all about maintaining a balance. You need to consider: why people are doing what they’re doing; what their level of situational awareness is; what the level of risk awareness is; what their mental models are; what they put into their planning activities; the risk control measures they choose (hierarchy of control for example); their expectations regarding failures; preserving options; being mindful; and, of course, what tools and equipment, procedures, and systems they need to use. And this is, of course, all led through authentic leadership.

This is the tenth of a dozen or so blogs covering the Essentials of Safety that I talked about in the first blog of this series. So far we have covered: an introduction, which we called Essentials of Safety; Understands their ‘Why’; Chooses and displays their attitude; Adopts a growth mindset – including a learning mindset; Has a high level of understanding and curiosity about how work is actually done; Understands their own and others’ expectations; Understands the limitations and use of Situational Awareness; Listens Generously; and Plans work using risk intelligence.

The other blogs in the series are:

  • Applies a non-directive coaching style to interactions.
  • Has a resilient performance approach to systems development.
  • Adopts an authentic leadership approach when leading others.
  • Bonus – The oscillations of safety in modern, complex workplaces.

Controls Risk

Controlling risk is a vast field – so vast that one could write a book on this topic alone. This is, in fact, what Jim Wetherbee, along with many other fine authors, has done. All I can hope to achieve here is to highlight the elements of controlling risk that do not always get the amount of attention I feel they deserve. This is not in any way to belittle any of the other methods of controlling risk, of course.

I also want to highlight the difference here between managing risk and controlling risk. It is management’s task to manage risk, and it is the workers’ task to control risk. I will leave the creation of the systems and the high-level management of risk out of this section. It is covered in other parts of the book.

In this context, controlling risk is a balance of people trying to keep in mind several different things: why they are doing what they’re doing; what their level of situational awareness is; what their level of hazard and risk awareness is; what their mental models are; what they put into their planning activities; the risk control measures they choose (hierarchy of control for example); their expectations regarding success and failure; preserving options; being mindful; and of course what tools and equipment, procedures, and systems they need to use.

The elements of controlling risks I mainly want to highlight are:

  • Trigger steps.
  • Critical steps.
  • Preserving options.
  • Telegraphing deliberate action.
  • Shared space/safety signs.
  • Risk Intelligence.
  • Critical control verifications.
  • Procedural compliance/adaptation.
  • Drift.

The first two topics that I want to explore, which I feel are really important in controlling risk, are both more about what we need to do to make sure everything goes right rather than about minimising things going wrong. These important topics are ‘trigger steps’ and ‘critical steps’. What matters is not so much whether they are clearly described in a procedure or task-based risk assessment, but that the team doing the work very clearly knows them and is watching out for them as they do their work.

The difference between a trigger step and a critical step is subtle but important. A trigger step is one that has immediate consequences. Whether you get it right or whether you get it wrong, there is no going back after the trigger step. Once you have cracked the egg into the soup, you cannot get it back into its shell. It is like crossing the Rubicon – the point of no return. A critical step, meanwhile, is one that we must get right each and every time. If we do not get a critical step right each and every time, it could result in someone getting seriously hurt or killed. Both trigger and critical steps need to be identified prior to starting the work. They should be talked about as a part of the sharing of mental models and monitored throughout a task.

When decisions are made during the task, especially related to changes in the work method – such as on-going task planning – we can make sure the teams have preserved their options for action in the event of things going pear-shaped. An example is always making sure that there is a second exit option if something goes wrong, or another control that can be implemented easily if one fails. In the outside-of-work world it could look like keeping an eye on the sides of the road so you know where you could go and what you could do if a car coming the other way swerves into your lane. Another example is riding out wide on your bicycle when you pass parked cars in case someone decides to open a car door on you.

It is useful to spend time thinking about, and talking with leaders about, how to set up workplaces that are ‘error tolerant’ and whether it is possible to ‘fail safely’. These ideas/ideals are about the workplace and are super-important goals. They also recognise that humans will fail. And we need to recognise that we cannot build systems that are human-proof. It is also worth mentioning here that systems are not things that are ‘applied’ to people, but that systems include people. This tends to be forgotten when we talk about system development. We should go a bit deeper than simply ‘humans will fail’ and start thinking along the lines of:

In addition to setting up the workplace for success, what about the human bit – the person doing the work, or the work crew around them? What can they do to help maximise getting it right and minimise mistakes, slips, lapses, et cetera?

Jim Wetherbee and David Marquet gave me the answer. I believe that their discussions about taking ‘deliberate action’ and ‘telegraphing’ actions are brilliant and something that should be explored and shared.

The ideas of taking deliberate action and telegraphing actions are so similar that I felt it was okay to combine them and talk about telegraphing deliberate action.

Telegraphing deliberate action is all about getting into the habit of not only stopping and thinking about what you are about to do, but also physically pausing just before you do the action and, at the same time, verbalising your intent to yourself and to those around you.

So what does telegraphing deliberate action look like in anger? Here is a story that may explain it in a bit more detail:

Scenario: An Elevated Work Platform (EWP) operator is manoeuvring an EWP so that the basket, with the operator and a workmate in it, moves away from a hot furnace that they are working on. Moving the EWP in the wrong direction may result in the people in the basket of the EWP receiving serious burns. Adopting ‘telegraphing deliberate action’, the EWP operator holds his hand over the ‘B’ lever and announces to no one in particular ‘I am moving lever “A” in order to lower the basket’. The other worker, who is standing next to the EWP operator in the basket, sees the hand movement and hears the intention of the operator. He immediately alerts the EWP operator of the discrepancy. Alternatively, the EWP operator himself, upon saying lever A and seeing his hand over lever B, stops the action and corrects it himself. Either way, the potential burn is avoided and safe work is achieved. This is a fairly typical mining or construction industry example, but I hope you can see the potential for using ‘telegraphing deliberate action’ in your workplace – whether that be a hospital ward, operating theatre, maintenance workshop, office, or mine site.

When I talked with operators and leaders about this idea, they tended to say that it made sense and they could see how it could help, but they generally felt that it would not work so well for them personally. Some said it would feel weird to go on chattering as they did their work. They said they would feel uncomfortable doing it. This led to what I feel is the biggest problem with getting to an effective use of telegraphing deliberate action – we do not generally think out loud as we do work. We do not share our ideas as we get things done in a normal work situation. In fact, the ‘safe to speak up’ concept that is often driven by businesses and often seen in industries may not extend past the cultural ‘need to be seen to know what you are doing and just getting on with it’ way of working. The benefits of ‘telegraphing deliberate action’ only manifest when people are actually speaking up as they are doing their work, and this may present some difficulties during implementation. One way around this is for leaders to practise telegraphing deliberate action in their own normal daily activities – in their meetings and in their workplace interactions. They can then get into the habit of thinking out loud: talking through what they are thinking and doing, and encouraging their teams to listen and help poke holes in their logic. Of course, imperative in establishing ‘telegraphing deliberate action’ is helping others understand the ‘Why’ of the activity. If they get it – if they deeply understand why telegraphing deliberate action may help them minimise mistakes and undertake safe work – they will do it. If they don’t get it, they won’t do it.

What I love about the idea is that even if other operators are not present, the application of ‘telegraphing deliberate action’ benefits the operators themselves, because the act of pausing and the vocalisation of intent force the individual to be mindful of the present situation and of what they are trying to achieve. It gives us another opportunity to get it right. And getting it right in the first place is much more satisfying than doing an investigation afterwards.

You need to make sure the message is not ‘we telegraph deliberate action for the benefit of the observer, the leader doing some field leadership conversation, or the boss’. The consistently shared message needs to be that ‘telegraphing deliberate action is purely for the benefit of the person doing the work’.

Jim Wetherbee says this about telegraphing action:

‘When the practice of telegraphing actions becomes automatic between crew members, the operating effectiveness of the team improves dramatically. When executed properly, this practice contributes to error-free operations, allowing the team to achieve better performance, with higher-quality results’.

This is a reminder that telegraphing deliberate action is not only about safety but is also about production and quality.

Whether you are a manager, an operator, a nurse, a doctor, or an engineer, telegraphing deliberate action makes a huge difference to the level of mindfulness and situational awareness in the workplace and helps you to get it right the first time. So, it is not only about reducing mistakes but it is also very much about operational excellence. It also helps keep our workmates up to speed with what it is we are about to do – or at least up to date with what we think we are about to do. By sharing our intention, we are sharing our mental model of what the work situation is and what we are about to do within it.

I have a request of you. Do a micro-experiment (to borrow a Dekker idea). Go away and have a play with the practice of telegraphing deliberate action. Talk to people about it. Get them to have a play with it. Do it yourself for a while and see how it feels – there will be benefits.

A great example of telegraphing deliberate action, although he does not call it that, appears in Richard Mullender’s Dispelling the Myths and Rediscovering the Lost Art of Listening. Mullender talks about how police drivers are trained to describe what they see and do as they drive in order to build better situational awareness and become better drivers. Part of an example he uses sounds like this: ‘I am driving at 40mph. In the distance I can see a road joining me from the left and the junction is clear. There is no-one behind me and there are two cars approaching’. And a bit later: ‘As I turn into the junction I can see there is a car coming towards me and there is a junction on my left hand side that is clear. I check my mirror and there is one car behind me travelling fast.’

Making telegraphing deliberate action a routine prompted me to look closely at the impacts of routines themselves.

Even though having routines is a great way to be consistent, absent-minded slips are more of a problem for experienced people than they are for newbies to an activity. Slips are more common during habitual activities undertaken by people who know what they are doing, and there is no doubt that routines fall under the banner of habitual activities. Watch out for small departures from the routines and habits that you may have developed. Keep an eye on them, and also on any small changes in what is going on.

As we establish routines, or changes to routines, we need to keep a very close eye on what is going on and what could go wrong, including anything we have not seen before that may pop up unexpectedly. Once again, always have an escape plan. Always have an option for when something unexpected happens. Never paint yourself into a corner. I remember when I used to ride my Ducati down the highway. I never sat the bike in the blind spot of a car or a truck. I always watched cars on side roads and had an opt-out strategy just in case some inattentive car driver did something unexpected.

How we set up a workplace can also give a false sense of safety and security. Safety signs are a classic example of this, and one that I feel requires a few words on its own. Safety signs can be important in some circumstances. More often, however, they are the bane of a leader’s existence. They are routinely everywhere and usually add very little, if any, long-term benefit. There can also be overconfidence in their effectiveness and efficacy as risk controls. I suggest instigating two activities to address these issues: a safety sign war, and a risk control wariness campaign based on the idea of shared space.

Firstly, let’s talk about a safety sign war.

Establish regular workplace inspections to make sure you get rid of any sneaky posters and safety signs – ones that add zero value such as: (on mirrors) ‘You are looking at the person responsible for safety’, ‘Be careful – Do not get hurt today’, ‘Safety is YOUR responsibility’, ‘Remember to Take Care, Be safe’, ‘Electricity can kill’, ‘Work safely’, a sign on a railway crossing that reads ‘Vehicles must not become immobilised on this crossing’, and this really smart one I saw on a hot water tap: ‘CAUTION – The water in this tap is hot and can burn you’. Imagine … hot water coming out of a hot water tap – who’d have thought!

I have experienced, as we removed signs in a workplace, people telling us that they did not know that half of them were actually there. They had become such a part of the furniture and were no longer even being seen. People also said that they were glad to be treated as adult humans instead of as children.

Try getting rid of the safety messages on front gate electric smart signs – the number of days since the last injury and the injury frequency numbers along with them. Everybody will be happy with that one. These are absolutely not useful measures of whether work is being done safely or not. They are just easy to measure.

To further expand beyond your war on safety signs, take a look at other controls from the perspective of shared space.

‘Shared space’ is a way of thinking that was derived from a traffic-management philosophy created by Hans Monderman, who hails from the Netherlands. Even though he was focussed on traffic management, the idea is equally useful in a workplace. It originally involved the removal of traffic controls such as stop signs, traffic lights, speed limits, pedestrian crossings, barriers and gates, caution and warning signs, and painted lines from a section of the road. The idea was to increase the unease of road users and encourage mutual respect and communication – either direct or indirect, including eye contact and visual cues – in order for all road and intersection users to interact safely and effectively.

I am not encouraging the wholesale removal of controls. I am merely stating that you need to look – before an incident, during day-to-day or normal work, as well as after an incident – to check whether you have overdone the risk controls. You do not want to dumb down your people by encouraging them to be reliant on signage and controls without clearly understanding them and their intent.

Encouraging people to think before they act is a good thing. As a leader, ask yourself if allowing people to have some level of discomfort and encouragement to think before they act is such a bad thing. It is not.

As usual, a balance needs to be struck. I often hear complaints that the practice of ‘safety’ means wrapping people up in cotton wool, with a long list of procedures and processes that cause workers to stop thinking for themselves about what could go wrong and what they must do to prevent it from going wrong. This is essentially where the thinking behind shared space intersects with the designing and creating of safe work. We want to provide enough information to help workers make decisions, but not overload them with ‘protective’ controls. Once again, it is all about getting the balance right. I also talked about this with respect to framework and critical sections of procedures earlier. We also need a prompt that helps us think about whether overprotection played a part in an incident or an LNW review.

We all know that work is not risk-free. We also all know that it is the task of leaders in our business to apply risk management and it is the task of frontline workers to apply risk controls.

The typical view in people’s minds is that risk is all about hazards and danger, but for those who understand the concept of risk, danger is only one side of the risk coin – the other being opportunity and success.

I believe that one of the secrets to risk management from the perspective of a leader lies in the concept of Risk Intelligence and in setting up risk assessment processes – helping people to understand what risk is actually all about.

Treating risk assessment processes for fatal/material/catastrophic risks in a very similar way to that associated with risk assessments prior to the creation of procedures and the like is sensible and beneficial in terms of creating safe work. The two processes are different in terms of outcomes but the inputs are pretty much the same.

The other difference is that, in bowties and material/fatal risk registers, you highlight the critical contributing causes and also the critical controls. These are controls that must be in place each and every time the task is done, or else the hazard may manifest. You should always try to keep the critical controls to a couple and then make them very specific. Then you need to carry out verification activities on these critical controls in the field as a part of field leadership conversations. One common term for this is a CCV (Critical Control Verification).

The purpose, or why, of the CCV is to verify that the critical controls for fatal and material risks are in place and effective. CCVs are a very conversation-based approach and provide a strong level of assurance for the lead team in the business with respect to critical control implementation.
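To make the idea concrete, here is a minimal sketch of how a critical-control check might be recorded. The control names, fields, and pass rule below are hypothetical illustrations, not drawn from any real risk register or CCV standard:

```python
# Minimal sketch of a critical-control verification (CCV) record.
# Control names and fields are hypothetical, for illustration only.
from dataclasses import dataclass


@dataclass
class CriticalControl:
    name: str
    in_place: bool    # is the control physically present?
    effective: bool   # is it actually doing its job?


def ccv_pass(controls):
    """A task may proceed only if every critical control is both in place and effective."""
    return all(c.in_place and c.effective for c in controls)


controls = [
    CriticalControl("Isolation verified at source", True, True),
    CriticalControl("Second exit kept clear", True, False),  # present but blocked
]
print(ccv_pass(controls))  # False: one control is in place but not effective
```

The point of the `effective` field is the one the text makes: a control that exists on paper but does not work in the field should fail the verification, not pass it.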

Before we start identifying causes, consequences, probabilities, and controls in a risk assessment workshop, we need to take the team through the risk issue with the intent of getting the team to build a shared mental model of the system and its behaviour, as well as a mental model of the specific risk issue itself. This step is routinely missed in risk workshops I have been subjected to.

It is important to do a lot of work before and during any risk assessment workshop. This includes training the participants in Risk Intelligence and reminding them that all risk is subjective and is interpreted through the lenses of our own experiences, expertise, imagination, Risk Intelligence, biases, heuristics, recent incidents, perspectives, etc. We then need to fill the risk assessment workshop with people who provide diversity in their thinking, experience, and knowledge of the risk issue – and of course some ‘real’ people. I use this term in incident investigations/learning studies as well. These are the people who actually do the work – the operator, maintainer, nurse, plumber, construction worker, etc. Real people know the Work-As-Normal – how the work really gets done.

As for Risk Intelligence, it is the ability to estimate probabilities accurately, whether we are talking about the probabilities of various events occurring in our lives, such as a car accident, a workplace event, or the probability that some piece of information we’ve just come across is actually true. We often have to make educated guesses about such things, but 50 years of research in the psychology of judgment and decision-making show that most people are not very good at doing so. Many people, for example, tend to overestimate their chances of winning the lottery, while they underestimate the probability that they will get cancer at some time in their life.
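One hedged way to make ‘estimating probabilities accurately’ concrete is calibration scoring: how closely stated probabilities track what actually happens. A minimal Python sketch, using made-up forecasts rather than real data, scores two hypothetical assessors with the Brier score:

```python
# Minimal sketch: scoring how well-calibrated probability estimates are,
# using the Brier score (lower is better). All forecasts and outcomes
# below are hypothetical illustration data.

def brier_score(forecasts, outcomes):
    """Mean squared difference between the stated probability and what happened (1 or 0)."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)


# Each pair: probabilities the assessor gave, and whether each event occurred.
well_calibrated = ([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0])
overconfident = ([1.0, 1.0, 0.0, 0.0], [1, 0, 0, 1])

print(round(brier_score(*well_calibrated), 3))  # 0.025
print(round(brier_score(*overconfident), 3))    # 0.5
```

The overconfident assessor who says ‘certain’ and ‘impossible’ scores far worse than the one who leaves room for doubt – which is the blog’s point about most of us not being very good at these educated guesses.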

In addition to the use of Fermi questions (which we previously discussed) during risk assessment workshops, one way of improving Risk Intelligence is to gather wider information on the subject, and to look for things that may disprove our view rather than things that support it. This helps us remove some of the confirmation bias.
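A Fermi question works by breaking one hard estimate into factors that are each easier to judge. A minimal sketch of that decomposition – every number below is a made-up illustrative assumption, not data from any site:

```python
# Minimal Fermi-style sketch: decompose a hard estimate into factors that
# are each easier to judge. All numbers are made-up illustration values.

# Question: roughly how many hand injuries might a site see in a year?
workers = 500            # hypothetical site headcount
tasks_per_day = 20       # hand-intensive tasks per worker per day (guess)
working_days = 230       # working days per year
chance_per_task = 1e-6   # rough guess at injury chance per task

expected_injuries = workers * tasks_per_day * working_days * chance_per_task
print(round(expected_injuries, 1))  # 2.3
```

The value of the exercise is not the final number but that each factor can be challenged and refined separately by the workshop team – which is exactly where diverse views and disconfirming evidence help.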

So why is Risk Intelligence an essential skill for our leaders? Because we want to help them learn the ability to assess situations in which normal work is done and also explore what could go wrong (and right) and the likelihood of that happening. In order to be able to do this, they need a reasonable level of Risk Intelligence. It is this view of risk that we want people to think differently about.

A simple approach to improving Risk Intelligence is to expose ourselves to a greater diversity of opinion, and especially to seek out views that are opposed to our own. This manifests in the workplace as the requirement that task-based risk assessments are best completed as a team, involving those new to the team and others as much as possible. The young and the newbies often ask some great questions and make a real difference to the risk assessment process and outcomes.

It is a good idea to remind the risk assessment participants to keep an eye on their availability heuristic as this can impact how they think about and perceive the risk.

The availability heuristic works well enough when we have to estimate the probability of things that are entirely within the realm of our personal experience, such as missing the bus or finding a ten-dollar note on the footpath. But when we’re gauging the likelihood of things that are reported in the media, the correlation between ease of recall and likelihood breaks down, and the availability heuristic leads to biased estimates. TV news and social media present us with rare and dramatic disasters that we might never otherwise see, such as plane crashes, tsunamis, terrorist attacks, and so on. The images sear themselves into our memories and are recalled rather too easily. If we rely on the availability heuristic when estimating the probability of such dangers, we will tend to think they are more common than they really are.

What can we do to avoid being led astray by the availability heuristic and imagination inflation? The most obvious remedy is simply to be cautious when estimating the probability of dramatic events. If images of such events come easily to mind, we can ask ourselves if it is because we have personally experienced many of them or because we have read about them on the internet or seen them on TV, or been exposed to them through some social media platform. Likewise, we can ask ourselves if such events loom large because they have occurred in our lives or because we have previously allowed our imagination to give them full rein.

The availability heuristic leads us both to overestimate the probability of dramatic events and to underestimate the likelihood of situations that are not so easy to picture.

Another one we need to watch during risk discussions, especially in risk assessments, is probability neglect. This results when a hazard stirs a strong emotion in the individual. What happens then is those involved in the risk discussion will tend to overestimate the likelihood of the hazard manifesting. They will assess risk in a way that reflects their wish to avoid the hazard manifesting.

A conversation about controlling risk would not be complete without a quick discussion on the hierarchy of control and engineering risks out of the equation. I make the assumption that you are aware of the concept of the hierarchy of control. My point here is simply: do not forget it.

A final area that is often misunderstood is the use of procedures. I talked about them in detail under the expectations section of this chapter, but it is worth briefly revisiting them now. Procedures are rarely written in such a way as to be foolproof. They require their users to adapt to the unique situation in which they find themselves at work. Work practices also drift over time, with the way the work is being done today differing from how it used to be done. People pick up ways of working that may be easier or quicker, and over time that becomes the way the work gets done. Drift can also appear as procedural drift in Work-As-Written, where we have tweaked procedures over time – through periodic reviews or changes after workplace incidents – to the detriment of the level of control of the risks. We can end up with a procedure that misses the mark on critical controls and may actually add risk to a task.

To help with understanding and keeping in mind what needs to go right and what might go wrong, we can prompt people to ask themselves questions similar to those below:

  • How do I normally get this trigger step or critical step right?
  • Do I have a plan to make sure everything continues to go right?
  • Am I doing enough telegraphing deliberate action?
  • What is in the line-of-fire that can seriously bite me?
  • How do I work within the housekeeping as it currently is?
  • What is going on around me – including any simultaneous operations (SIMOPS) – and will I impact it, or will it impact me?
  • What is going on in relation to Material or Fatal Risks – critical controls that I should be considering? Is anything I am doing related to a Bow-Tie or Material Risk/Catastrophic Risk?
  • Are there any anomalies in the workplace that do not appear to have been there before? What stands out as different?
  • Do I have prompts to help me understand and be aware of the critical risks?
  • What devices or elements provide me with alarms and warnings when danger is imminent?
  • Do I have a plan if something does go wrong?
  • What if … ?
  • What usually goes right here, but may not today?
  • What will happen to me and the team if we do not control the risks?

Key Takeaway: Controlling risk is all about maintaining a balance. You need to consider: why people are doing what they’re doing; what their level of situational awareness is; what the level of risk awareness is; what their mental models are; what they put into their planning activities; the risk control measures they choose (hierarchy of control for example); their expectations regarding failures; preserving options; being mindful; and, of course, what tools and equipment, procedures, and systems they need to use. And this is, of course, all led through authentic leadership.