
Near misses: Origins, triangles and challenges

December 18, 2020


In recent years, near-miss reporting has become a crucial topic of discussion amongst safety professionals. While some think near-miss reporting no longer serves its intended purpose, others still find it a useful tool for measuring and improving safety. In this episode, Carsten Busch shares his perspective on near misses, both as a tool and as an approach for improving safety. The discussion is based on Carsten's forthcoming book, “Preventing Industrial Accidents: Reappraising H. W. Heinrich – More than Triangles and Dominoes.”

[00:00:00] Nippin Anand: Hello everyone! Welcome to the second episode of Embracing Differences with me, Nippin Anand. We have been talking about safety reporting systems; in our last episode I left you with some thoughts on near-miss reporting systems in discussion with Oessur Hilduberg. My guest tonight is Carsten Busch. I'll let Carsten introduce himself in a minute, but what I really like about Carsten is his unique ability to draw from a rich historical perspective and to connect his research with practice, which basically means that people like yourself can intuitively recognize some of what he says but also feel challenged enough to learn something new in a constructive manner. Now, I have no idea what Carsten will be saying today, but I'm sure every minute listening to Carsten is very worthwhile.

[00:01:05] Carsten Busch: Carsten Busch, for those who don't know me. I have worked in various safety positions in the Netherlands, in Norway, and a bit offshore in the UK. Railways – that's my main background – and currently in the police. I tend to describe myself as a safety nerd, and a historian digging through a lot of old books. I am going to tell you a bit from a new book that's coming, which deals with the work and also the life of Heinrich, the safety pioneer.

[00:01:50] Thank you, Nippin, for inviting me to do this and to follow up on Oessur's presentation, the podcast from last week. Oessur took up some really important points, I think – good critical points – and he raised some challenges about near-miss reporting and reporting systems. It's good to follow up there, and I hope to add a couple of points to the things that Oessur told us. My aim is to give an even richer understanding of the subject; maybe it can help us deal with this better.

[00:02:44] Near-miss reporting and incident reporting is one of the most basic safety tools – one of the things that probably everybody who starts out in safety is taught to do. Yet, as Oessur also mentioned, it is little understood how it really should work and what's behind it, and I think that leads to a lot of suboptimal and sometimes even bad, harmful practices. So, I'm going to take you through some thoughts of mine. Almost a year ago, I sat down to write the most difficult chapter of the forthcoming Heinrich book, chapter 9, which deals with the triangles. I really had a problem getting started and proceeding, so I decided to use the Christmas holiday to read through about 200 papers, book chapters and other sources, to get a better understanding of the matter and put it in place – then the book would follow, and it did. I want to share just a small snippet from that chapter with you.

[00:04:22] So, near misses – what are near misses? In its most basic form it is the realization: “Man! This could have been really bad.” I live in Norway, and a great example is driving your car at this time of the year. If you start too late, or if the weather changes too quickly in October, you will find yourself driving around on summer tyres when there's already ice on the road, and you may get this weird movement of your car, slipping a bit in the corners. I think that's a very typical and good example of a near miss: you realize, “Oh shit! I could have had an accident here; I should do something to prevent this from happening next time” – you drive slower, you take the corner slower, you change the tyres if you haven't already, and so on. So how did it get into safety? I would say – whether you like it or not – thanks to Heinrich. It's one of his main and original contributions to safety.

[00:05:58] What I mean by that is that people tend to know Heinrich's work and assume it was him who came up with all these things, like the dominoes – the accident sequence – and so on. He did not, really. What Heinrich did was take the best of good safety practice in the late 20s and early 30s and put it together in a coherent framework, packaged with some nice numbers and pictures. His main contribution, which is mostly original work, is the triangle and the idea of learning from minor events, near misses and so on. There were authors who had mentioned it before, but none of them followed through and built out the idea the way he did. What was really funny, when I did my research, was that he says in one of his early papers that the discovery was a coincidence – incidental to other research, research into the hidden costs of accidents. They found that some accidents had major costs, and that there had been similar accidents before that didn't have those costs. So people could have intervened earlier, and that was when the triangle started.

[00:07:37] Ten years before that is the earliest source I have found on near misses in the safety literature. There may be earlier ones, but I think this Mr. Richardson was quite ahead of his time. What he says is: “There's probably no source that's greater than lessons drawn from observation of minor events.” They happen often, almost everybody knows them, and most people don't pay attention because there are no consequences – but there are major lessons hidden in them. That must have been groundbreaking in 1916, when he wrote this in a safety engineering magazine, and yet nobody picked it up for ten years; it was just a tiny article on one page, I think. It wasn't until Heinrich rediscovered it, with his first mention in 1926, which says that for every actual injury that occurs there are several other near accidents resulting in minor things.

This was the very first mention. Two years later he wrote a whole paper on it, and you can see the very first picture here on the slide: it is not a pyramid, it is not a triangle, it's not even an iceberg. Heinrich never used an iceberg either, at least not in this context. It's just a simple representation of the idea that there's one major thing, some minor injuries, and then a lot of stuff just going right. He developed it from there until his last work in '59; he would think it over and make changes and improvements. I don't have the time to discuss the research or the underlying principles, which are actually quite essential in interpreting the triangle, or the different ways of reading it, but I will touch on some of those in a minute.

[00:10:18] What I want to do is discuss a couple of the main messages I take from the triangle – messages that Heinrich intended. This is one of them: the importance of any individual accident in prevention lies in the potential in the event, not in the outcome. He said we should judge events not on the outcome – the fact that a lot of people died or got injured or whatever – but on the potential. That is an introduction of risk-based thinking into safety, which wasn't there to that extent before, I think.

[00:11:13] I've written a little formula – we can discuss it – but according to Heinrich the outcome is mostly coincidence, mostly luck, and I would agree: it's a function of the scenario, the energy involved, and some randomness. Those who see the slide will see a train on a level crossing having hit a car. There's a lot of energy involved here, so everybody knows the damage can be bad; there's a lot of potential. The scenario is a car on a level crossing, and then there is a huge random factor determining the outcome. If the car is half a meter further forward, nobody is hurt – maybe just a scratch. If it's a meter back, in the middle of the tracks, there's a full-on hit and people will probably die. And whether there is one fatality or five is not a function of the accident; it's a function of how many people are in the car. Is it just a guy driving to the shop, or a soccer mom with the football team in the back? Then the outcome is much worse.
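The point – one scenario and one energy level, yet wildly different outcomes driven purely by chance – can be sketched as a toy simulation. Everything here (the position threshold, the occupant distribution, the fatality rule) is made up for illustration; only the structure, outcome as a function of scenario, energy, and randomness, comes from the discussion above.

```python
import random

def crossing_outcome(position_m: float, occupants: int) -> int:
    """Fatalities for a fixed scenario: a car stopped on a level crossing,
    hit by a train. Scenario and energy are held constant; only the random
    factors vary – where exactly the car sits relative to the tracks, and
    how many people happen to be inside. Thresholds are purely illustrative.
    """
    if position_m >= 0.5:   # car is half a meter clear of the tracks: a scratch
        return 0
    return occupants        # full-on hit in the middle of the tracks

random.seed(1)
outcomes = [
    crossing_outcome(
        position_m=random.uniform(-1.0, 1.0),       # random stopping position
        occupants=random.choice([1, 1, 1, 5]),      # usually one driver, sometimes a full car
    )
    for _ in range(10_000)
]

# Identical event potential every time, yet recorded outcomes span the whole
# range – which is Heinrich's argument for judging potential, not outcome.
print(min(outcomes), max(outcomes))
```

Under these assumptions the same event is logged anywhere from "no injury" to "five fatalities", so a reporting system that ranks events by outcome alone would scatter identical risks across the whole severity scale.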

[00:12:50] Heinrich drew a rather interesting conclusion from this, and I think it's valid even today. He says that once you see and realize this, you realize we misdirect our efforts in safety, because we wait for the big bang to happen. Then we start the big investigation and try to learn, and we ignore all the minor events from which we could have learned – basically learned the same thing.

[00:13:32] I think that was quite ahead of his time, and it's what we still see today, because accident boards react when something bad – really bad – happens. Nowadays they will probably also act on a major near miss without fatalities, but mostly accident boards react when something bad happens.

[00:14:00] What I like best in Heinrich's representation and writing about the triangle is that it's about opportunity, and he repeats this in many papers: it's an opportunity to do something long before something bad happens. If you have read some of Heinrich's work, he was an optimist; he was encouraging, writing self-help books for managers on safety. He framed these things as opportunities and chances – well, you can do this, so get to work; it's a call to action. Just a few words on misunderstandings, because I think Heinrich's idea was great, but as things go, when somebody has a good idea, others adopt it, interpret it their own way, maybe change it a bit – and we see three common misunderstandings. At the bottom you have the interpretation of the triangle as some kind of metric: you use it to measure how your organization is doing. It's not meant for that, and I think it's not useful there, unless you want to compare how many accidents you have, of what type, this year versus last year, and so on.

[00:15:42] For that you could also use tables – why use a triangle? Then there is proportional reduction, which I won't go into because that's a complex discussion, but there's a real misunderstanding where people think that you just saw away a bit at the bottom and then you get rid of the bad stuff at the top, the really big accidents. It doesn't work that way, for a number of reasons.

[00:16:12] Then of course you have the issue where people mix various triangles: they are very good at managing slips, trips and falls, they get awards for several years without an LTI, and then their facility blows up. There are several examples of that. Andrew Hopkins wrote a very good book where he discusses this problem and says you have to use separate triangles for different kinds of events, not mix them all together. Those are the most common misunderstandings.

[00:16:58] In general I am positive about near-miss reporting systems and the thought behind them. What I've come to realize in recent years – probably something to do with professional maturity – is that this stuff doesn't work the way we imagined. There are some major challenges and limitations, and it's really important to think about them. We probably can't solve them all, but realizing they are there will make it easier to get more out of the systems that probably every organization has. There's a checklist in my book; I've just picked some highlights from that chapter here. The first is the problem of recognition and identification. Near misses are by definition weak signals, and weak signals are, by definition, weak.

They are hard to see, and easiest to see after the fact, after something really bad has happened. Everybody can then point out what people should have realized and acted upon, because there were weak signals before the big bang – and that applies to any event, I think. There are a lot of accident reports in which it is written that this had happened before and people should have seen it. That works perfectly in hindsight; it is not so easy in foresight. In foresight, it depends very much on the one looking at the situation: it requires experience, knowledge, a critical mind and, of course, a willingness to speak up.

[00:19:23] So there are a lot of input problems in these systems. I think Oessur raised one very interesting and very common one – that you see in the system one type of accidents and incidents and not others – which was just brilliant. It illustrates one of the problems of input. How do you communicate this up in the system, or sideways in the system? Does my colleague understand that I've raised a problem that I've seen, in my opinion? Does he or she interpret the situation the way I do? Is there a common understanding of what we should do? Or does the one who gets the signal think, “Oh, but this was a success; this works, because nothing bad happened before; this is an easier way of doing things; this is acceptable” – and we may be increasing our risk, because we interpret the signal wrongly.

[00:20:44] Then there's the issue of prioritization. We've got some stuff into the system – maybe we are really good at getting stuff into our system. What should we select to work with, because we can't solve everything at once? So we have signals, but probably also a lot of noise, especially in systems that put a bonus on reporting. There will be a lot of reporting for the bonus, not for the safety.

[00:21:24] Then there is the manager who will be in doubt: “What shall I do? Shall I pick the low-hanging fruit, easy to clear, or should I really get to work on the high-potential stuff that may really matter?” And there's always the danger of the system getting swamped with information – over-reporting, which is not much discussed in safety, I think – alongside under-reporting. Both are going on at the same time, because we get a lot of reports of one type and almost none of the other, and that's an interesting problem. Anyway, the question of effectiveness: we have stuff in the system, we have made a prioritization, and we ask ourselves – if I act, what's the effect of my action? If I act on a weak signal, on a near miss, will it be successful? If there are no accidents, will it just increase the hassle, because we've put in an extra safety barrier or procedure, or spent money on something? On the other side there is the question: if I don't act and somebody dies, what kind of problems do I have then? It's a balancing of many possible outcomes, and of side effects – think of Perrow: making the system more complex until it gets out of control.

[00:23:33] A further major group of challenges: if you observe a near miss, you will probably want to deal with it, but higher layers of the organization will want to deal with near misses en masse, aggregating them and reacting to trends. Then there is the question of the scenario: are they really similar? Aren't you just acting on the big numbers and ignoring the small, high-potential stuff? It's a difficult question, because you can make a scenario so specific that you have to treat every near miss, every incident, on its own; or you can make scenario groups so abstract that they don't function either – like “events on rail tracks”, which doesn't help, if you get what I mean. Heinrich worked in a period when the world was fairly linear, manual and steam-driven, and much more tractable than today's processes and systems.

[00:25:21] The question is: does this really work in today's complex systems? I don't have an answer for that; I am just putting it out there. My gut feeling is that it works to a certain degree, because we still have energy, and energy says a lot – but energy doesn't say everything. Maybe, instead of the precursors we acted on before, a modern-age triangle should act more on patterns, because in complex systems we see patterns emerging, and those are probably the weak signals to act on in this day and age. It's a nice subject for future research, I think.

[00:26:19] So, wrapping up, I would like you to remember when you see the triangle that it wasn't intended as a metric; it's a heuristic, an opportunity: you can learn, you can improve, maybe you can prevent. But as with any model, it comes with limitations, and you have to act within those. If you don't, the model will not do what you want it to do – it may actually be bad.

[00:27:07] There has been critique of near-miss reporting systems in recent years, and a lot of critique of the triangle. I want to suggest: be critical, but don't throw out the triangle with the bathwater. Thank you! Well, if you have a lot of money and want to give yourself a good Christmas present, check out the forthcoming book “Preventing Industrial Accidents”, which deals with all this in many more words than I have been able to share with you now. Thanks!

[00:28:03] Nippin Anand: What did you think? I'm still thinking. I thought the idea of putting patterns before precursors was pretty powerful, as was the concept of adding hashtags to reporting systems. Fascinating ideas, and very achievable. What Carsten has offered us are some very low-hanging fruits that we can reach even with limited use of technology, in a way that integrates technology with human intelligence. Mind you, so much of what we design today disintegrates humans and technology. There's so much to learn from Carsten, and I promise you that's not the end of it. In the next episode we will have a rich conversation with Steven Shorrock about his view on reporting systems from a very practical perspective. If you want to connect with me, I'm on LinkedIn. You can also email me at nippin.anand@novellus.solutions – my first name dot last name at Novellus dot solutions. Until then, have a beautiful festive season, enjoy your Christmas in these weird times, and I'll be back with you once again.