
Proving safety (3): a three-part series

June 21, 2024


This is the third in a series of three podcasts with Greg Smith, the author of the book Paper Safe. In this podcast, Greg and I discuss the notion of critical risks and critical risk management from a legal and strategic risk management perspective. We hope this podcast will make you more mindful and critical about how you think about critical risks in your organisation.

Further information

 

Proving safety part 3

SPEAKERS

Nippin Anand, Greg Smith

 

Nippin  00:01

Welcome to another episode of Embracing Differences with me, Nippin Anand, a podcast aimed at understanding and promoting transdisciplinary ways of living and thinking, meaning assimilating different viewpoints, different subjects, different disciplines, but focused on a very simple question: how do we human beings learn, unlearn, relearn and make decisions, and how can we tackle risks in an uncertain world? This is the third in a series of three podcasts with Greg Smith, the author of the book Paper Safe and, more recently, Proving Safety. In this podcast, Greg and I have a very interactive discussion about this whole idea of critical risk controls and critical risk management, and why it is so important to think critically about the whole idea of critical risks, because in my understanding and in my experience, most companies don't quite understand what it means when we talk about critical risk and critical risk management. I hope you enjoy listening to this podcast as much as we enjoyed creating it. Greg, I just can't get enough of you. As a last piece of conversation before we say "see you later" to each other, as I never say goodbye, I have a question for you. I heard you talking about the idea that organisations should get better at demonstrating how they manage their risks, and that one way of doing that is through this idea of critical risk controls. Would you like to give me a little bit more on this, and then let's have a conversation about it? I think that's a really good topic to cover.

 

Greg Smith  02:00

Where that has come from, for me, is basically analysis of major accident events over the years, and we see it repeatedly coming out of major accident events. You go back to BP Texas City, where the focus on personal injury rates was criticised and the organisation wasn't focused on critical risks in the business, and you jump forward to something like the Pike River Royal Commission, which looked at a fatal underground coal mine explosion in New Zealand, where the Royal Commission criticised the board for focusing on injury rate data and said the board appears to have received no information proving the effectiveness of crucial systems. So we have this very strong current through major accident inquiries and through prosecutions: organisations focusing on a whole range of issues in safety, but not on the things that can lead to fundamental disasters and critical failures. And I just think that, as a matter of governance and assurance, organisations need to understand what their critical risks are, or their crucial systems, or, there's a New South Wales District Court decision that talks about fundamental processes. What are the actually important things in our organisation that are going to prevent a catastrophic event, and how do we focus on those and get assurance that they're working well? And that can be problematic. We know, in an Australian context, that workers' compensation claims can be a big driver of cost in businesses in Australia. But most of the things that lead to workers' compensation issues are not critical risks that are going to result in catastrophic events. So it's very easy to lose sight of the critical and focus on things that may be important but not critical, if that makes sense. And I also think we have a problem. I suspect it's a global problem, from what I'm seeing around the place, but certainly an Australian problem. We have this overwhelming emphasis, in some organisations, on psychosocial safety. Again, it's important, but there's an issue of criticality, and lots and lots of safety departments are investing lots and lots of energy at the moment in trying to come to terms with this new brief they've been given by the regulator. And it's not new, that's a misnomer, but everyone's perceiving it as new, and I'm just concerned that we are going to see an uptick in serious physical harm because of resources being applied to this other area in ways that many organisations are simply not equipped to deal with.

 

Nippin  05:29

There are quite a few things I would like to unpack with you, but let's just pick two or three of them. One is that I could probably share my experience on critical risk controls. And I think there is something to be said about the whole idea of a critical risk, because there's a book here in front of us where we talk about the Costa Concordia accident and the trouble with the idea of critical risk. If you take the shipping industry, for example, there are not too many critical risks; we have seen time and again collisions, allisions, groundings, fires, explosions and so on. So in some ways the consequence, if that's what you mean, is very clearly understood: this is what risk means. That's one problem. A related problem, as a result, is that very quickly those kinds of risks become normalised in the organisation; at least, what the organisation is doing about them, and how far it pushes the boundaries of its operations, becomes very, very normalised. So it's very hard for people working within that context to see what is critical and what is not critical until it actually shows up. And I think maybe there is something there to be unpacked.

 

Greg Smith  06:50

I think, and I hadn't thought about it until you just raised it, that in part that normalisation is influenced by the way that we measure safety. If we're measuring safety by counting the amount of activity, the amount of activity doesn't tend to change even though the risk might be changing before us. We're doing the same amount of activity, so the information the organisation gets is business as usual. Because the way we measure and count safety doesn't reveal much about the risk, it simply normalises activity. So there might be something to that; I hadn't thought it through until you just mentioned it. But again, because we don't have good narrative explanations of what we're seeing, it becomes very easy to disguise drift or normalisation behind activity.

 

Nippin  07:56

But that's a really important point from my point of view, because there are instances when you can go and interview people. Essentially, how do you get that information on board a ship? You can get it through navigation equipment, by talking to people, by collecting information and triangulating between sources. But one of the big problems is that the people on the ground who are supposed to give you that information don't see it as a risk at all. So you need a level of imagination, a level of critical thinking, discernment and listening to get to that point. And I think that is something we see as by and large missing in the way we do audits and investigations.

 

Greg Smith  08:42

I think, again, going back to some of my preliminary ideas, it's because the process becomes the purpose. We're doing audits, or having conversations, or doing inspections, not to understand but to complete a process, to a greater or lesser degree. I'm not saying that every time somebody undertakes a safety activity it's completely mechanistic, of course, but parts of it very often are. And it strikes me that in most organisations there's no real, overt agenda to understand or to get uncomfortable. We're much better at driving comfort.

 

Nippin  09:37

I think there's another element to it, which is that if you're walking the same street each day, every day, you don't see the problem at all. So it's very, very hard for people to see something objectively and say that yesterday it wasn't a risk, but today it has become a risk. I think that's a difficult thing to grasp. That's one thing.

 

Greg Smith  09:57

Yeah, so one of the interesting things that I've been talking about lately is the number of assumptions that underpin what we do in safety, or the information we receive, and the assumptions are never overt. If we're saying we've closed out all the corrective actions, there's a tremendous number of assumptions that have to underpin that as a measure of anything, starting from: what does "close out" mean? Does it mean something's actually been done, or has it just been ticked off as closed out? All the way back to the competence of the people to form the view that these were good corrective actions in the first place. And I don't see much challenge to our underpinning assumptions.

 

Nippin  10:46

And it reminds me of an exercise we did yesterday, a workshop, as we were mapping the conversation on the whiteboard. Something as subtle as: this person had the practical experience, and despite the practical experience, this thing happened. And the person who was listening actually chased the idea of what "practical experience" is, and underneath that there was a huge assumption. The point I'm trying to make is that we are not good at listening, we are not good at discerning, we are not good at opening up those conversations. That was one thing.

 

Greg Smith  11:22

Yeah, I was just going to say that most organisations structurally aren't designed to do it. This was a conversation we had yesterday: most organisations have something like a 21- or 28-day time limit to do an investigation. So you just don't have the bandwidth or the space to open up those conversations anyway.

 

Nippin  11:43

And the thing that I find interesting, and this happens a lot in safety audits, is that you could go and issue 20 non-conformities as part of your safety audit, but you could actually just issue one and really open it up well and do a good job there. And it could be a combination of both, a combination of the qualitative and the quantitative.

 

Greg Smith  12:08

Yeah, I think there is a compelling argument, and I've touched on this with Professor Drew Rae before as well. There is a compelling argument in safety that one of the best things we can do to start shifting a whole lot of dynamics is to do a lot less, but do it better.

 

Nippin  12:30

Yes, absolutely. And that's my point: you go and issue one non-conformance if you have to, open it up, have a good conversation around it, and if there is a systemic issue, like you said yesterday, it is bound to come up in that one.

 

Greg Smith  12:45

Yes, I think that's right. I believe that.

 

Nippin  12:49

So the other thing, Greg, that I find very interesting with critical risk controls is that when critical risks become the focus, let's say, in my world, an explosion or a fire or the capsizing of a ship, then the whole thing that drives us is consequence. It's no longer a risk we are discussing; it's a consequence. Now the trouble with that is that, first of all, we know risk is not consequence. Risk is risk; consequence is the outcome of a particular activity that has gone terribly wrong. In my view, to understand risk takes a huge amount of imagination. It takes imagination to understand what might go wrong in this particular instance, which by its very nature requires us to take a slightly different approach. Because if we are so consequence-driven, it stifles imagination, and we become myopic in our view of what risk is. Do you have any thoughts around that? Or have you actually thought about it that way?

 

Greg Smith  13:54

No, I haven't thought about it in that way, because from my practice and my worldview, consequence is everything. There is a vastly different legal consequence between a person losing the top of their finger and a double fatality, even though you might be talking about the same risk of a dropped object, for example. So consequence in that context matters. But the question of criticality is one that I think organisations grapple with. For many years, when doing work with leaders, I would say: from a regulatory perspective, your real problems start when people die. But then you start doing work with nursing homes and healthcare providers, and criticality looks really different there. It's very easy to be critical of organisations that focus on people who get sore backs, except if you're a healthcare provider with an ageing nursing workforce and you're getting lots of workers' compensation claims and lots of people who can't work, suddenly that is business critical, because you can't provide the service. So I've learnt, slowly, that criticality is not as objective as we might like to think it is.

 

Nippin  15:33

Absolutely, and this is a classic problem in my world, at least: the whole idea of consequence-driven criticality can put a whole organisation into a very myopic kind of view of what risk is all about.

 

Greg Smith  15:51

Yep, I think that's right. And I think that rings true around incidents as well, because the tendency to classify the level of investigation you do by the potential consequence is, I think, problematic. You can have a terrible outcome where there's not much opportunity for understanding or learning; sometimes it really is quite simple and linear and mechanistic.

 

Nippin  16:22

Suicide is one of them. It's a terrible thing to talk about, but what could you do, potentially? Well, you can support the family after an event. What could you do to prevent it? Anyone who has studied suicide would know that when somebody has made up their mind, there is not much you can do. And yet what happens, in my world at least, is that you're expected to produce a 200-page report, or a 100-page report, and it's expected that in that report you must have some kind of preventative and corrective actions. I struggle to understand how you could make something up for something you have so little control over.

 

Greg Smith  17:03

Well, I'm fascinated by that, because I haven't had a lot of experience in that area. But I suspect it's going to become a prevalent theme in Australia. We see it a lot already in the construction industry; there's lots of noise about the number of young men who die by suicide there. There was a report several years ago in the WA mining industry as well, and there would be plenty of people out there, far more expert than me, who could talk about the ability of a workplace to impact on an individual's suicidal tendencies.

 

Nippin  17:42

And one final point, Greg, on critical risks: often the idea of critical risk control is linked with the notion that risks should be prevented, that the consequence should be prevented, that nothing should go wrong. And I think that's a good goal. But often what we see happen is that when things go out of control, your emergency processes, your business continuity plans, your disaster recovery plans, you come to understand that people haven't even imagined that something of this sort could happen. Any thoughts on that?

 

Greg Smith  18:27

I haven't had a lot of involvement in that. I mean, when I say I haven't had a lot, what I find when I have to attend fatalities and the critical incident response team has been convened is that there are two parts to it. One is the reasonably comfortable, mechanistic completion of the log, following the "what do we do". But the other part of it is the people who are face to face with the crisis, and it never goes according to plan. I don't think anyone reasonably expects that it would. When something goes badly wrong, most plans go out the window pretty quickly, and you spend the first 48 hours almost in that fight-and-flight response, dealing with the situations in front of you.

 

Nippin  19:21

It does. And the point I'm trying to make is that sometimes you have to be very, very careful about what it is that we push for. If the end goal is just to avoid even talking about the consequences of things going wrong, about imagining what happens and how prepared people are to actually cope with that reality, then it goes terribly, terribly badly. I've seen people freeze. They're in trauma. They have no idea how to work through that emergency or that uncertainty. It's something we see a lot, and I think a lot of it has to do with how an organisation understands what a critical risk is and how to actually address those risks.

 

Greg Smith  20:03

So emergency response, in and of itself, would be what the Pike River Royal Commission referred to as a crucial system, I think. But underpinning a lot of this, for me, is the thing that I have been bumping up against for so many years: that question of, well, how do you know that it works? What assurance do you have about the level of efficacy or response that you're going to get? And that, to me, is still an important missing piece.

 

Nippin  20:36

And it trickles down to the level of a seafarer or a worker. It's so interesting that sometimes you are standing there reviewing a risk assessment with a seafarer, and you ask the question: well, nowhere in this risk assessment does it say, if the wire fall parts, what would you do? What is the next course of action? And the response to that sort of question is: well, that's precisely why we are doing the risk assessment, so that it doesn't happen.

 

Greg Smith  21:08

Yeah, yes. I think there's a lot of that false confidence in our risk processes, isn't there?

 

Nippin  21:16

Greg, what a wonderful chat. Is there anything else?

 

Greg Smith  21:19

No, no. It was really good to catch up with you again, Nippin. Great to have you over here running your course. And hopefully, if we get all our ducks in a row, I can be over in England in April and we can do some stuff together over there. We are planning, very deliberately, for Nippin and I to get together in the UK, initially in April next year, to see if we can run some programmes together.

 

21:46

All right, we'll be back with some more news. Very good. Thanks, Nippin. Take care.

 

Nippin  21:55

If you enjoyed listening to this podcast, many more podcasts are available on our website, novellus.solutions, under Knowledge Space. The podcast Embracing Differences is available on Spotify, Podbean, Apple Podcasts and Anchor. You can also subscribe to our YouTube channel, Team Novellus; that way, every time we publish a new podcast, you will get to know. If you want to find out more about our work, visit us at novellus.solutions, or simply write to us at support@novellus.solutions. Thank you for wanting to learn more than you knew yesterday, and until we meet again, goodbye and have fun.