Welcome to our very first episode! In this podcast, Oessur will demystify the concept of near-miss reporting, both as a construct and as a tool for organizational learning, as it is used across safety-critical industries. Listen to this podcast for some food for thought.
[00:00:00] Nippin Anand: Hello everyone! Welcome to this first episode of Embracing Differences with me, Nippin Anand. Why Embracing Differences? Why come up with such a title? The term Embracing Differences has intrigued me for many, many years. The reason for that is very simple, very basic: we are not sheep, we are humans. We have a mind, which means we can think independently and reflect on our thoughts, and that sets us apart from other living beings. The way we learn and grow is by looking from different perspectives. Now, we don’t have to believe everything we hear or read, but those differences should give us some food for thought to broaden our perspectives – especially so as we grow older, and that is precisely the purpose of this podcast series. So, expect a range of topics. As I always say, “Don’t believe a word of what you hear” – just think, reflect and take away what suits your purpose. In my world, thinking is a lot of fun. I really enjoy it.
[00:01:22] In this episode, we are joined by Oessur Hilduberg, a great friend of mine and head of the Danish Maritime Accident Investigation Board (DMAIB). Oessur takes us on a journey to understanding near-miss reporting – a concept that is deeply embedded across safety-critical industries. Let’s hear it from Oessur.
[00:01:41] Oessur Hilduberg: I must declare upfront that I am somewhat skeptical of the concept of near-miss reporting in general – as a tool for managing hazards, for reporting accidents, and as a tool for organizational learning. I’m not going to use any presentation today, so I’m going to keep it as simple as I can. I’m going to divide it into two areas. First, the near miss as a construct – what is a near miss? I’m going to talk a little bit about that. Then I’m going to talk about organizational learning, because what we find when we talk to stakeholders is that near-miss reporting is seen as some way of getting to know ourselves as an organization, of learning something about ourselves. So, I’m going to talk about the near miss as a construct, and about organizational learning. What stands out when we talk about near misses?
[00:02:43] Before I move forward: when I talk about near-miss reporting, I talk about it from a marine perspective. I know near-miss reporting can take different shapes and forms depending on the industry – in mining, in aviation and so on. There are different approaches to it. So, I’m going to talk about it from a marine point of view.
[00:03:09] What we find when we look at near-miss reporting systems, or at the near-miss concept in general for that matter, is the vagueness of the definition of a near miss. A near miss is defined in a particular way – namely, the definition is based on what we would call a counterfactual: something that could have happened, that should or would have happened if we had not done anything. The definition also often contains something potential – something hidden that might break out. So, there is this counterfactual that something could have happened. Now, that vagueness brings about different kinds of problems.
[00:04:05] Number one is that it somehow collides with a more traditional view of safety, namely that something goes wrong, and that is what we are looking for – we are looking for things that can go wrong. This is what Hollnagel would call a Safety-I perspective: we are very preoccupied with the view that things are not working as expected. More importantly, there is a problem with counting and categorizing events that have been defined vaguely. Once we get a lot of reports about near misses, which bucket should we put each piece of data into? The same event can fall into various different categories because of the subjective nature of these counterfactuals. This is one issue with the near miss as a construct: the definition is so vague that it is difficult to work with, not only for the individual who reports the near miss but also when we categorize and treat the data.
[00:05:27] The second thing is the idea of common causality – the idea that there is a relationship between near misses, events, observations, accidents and consequences. Now, anyone who has had the time on their hands to read the book by Herbert Heinrich would learn that this triangle, this hierarchy between near misses, incidents, accidents and consequences – that relationship is highly problematic. Herbert Heinrich, who introduced this idea of the hierarchy in his book Industrial Accident Prevention, claimed in his early work that there was causality between near misses and accidents: if we get rid of the near misses, we will have no accidents.
[00:06:30] Now, he revised that later on. But there is no data to be found that underpins this hypothesis, the common cause between incidents and accidents. What is the implication of that? The implication is that if we find a lot of near misses and address them, we will not have fewer accidents – we will only have fewer near misses, because there is no observable causation between near misses and events with serious consequences. So those are two things: I have talked about the vagueness of the definition and about the common-causality relationship between accidents and incidents.
[00:07:34] The third thing, which is also quite important, is that when we look through near-miss reporting systems, we find a lot of one certain category of incidents or events, and none of the others that we know are there but are not reported. So, there is some sort of secrecy going on. One could describe that in any number of ways. One way of putting it is that we get a lot of technical events: fall/trip events, people getting stuck indoors, almost falling downstairs, fire hoses with a hole in them – something of a technical nature.
[00:08:25] Other events never get reported in the maritime industry even though they are highly safety-critical – for example, an officer of the watch (OOW) feeling tired and almost falling asleep, or an engineer who put machinery together and almost started it, thereby nearly causing an accident. Those kinds of events are rarely, if ever, described or reported. Now, why is that? One way of putting it is that we get the incidents that can be forgiven by the organization. Anyone can forgive personnel on a ship for finding a hole in a hose. They may even be rewarded for it, praised for their diligence in carrying out inspections, whereas falling asleep, which touches upon the professionalism and the core values of the individual, will not be reported – thereby creating some sort of secrecy about what is going on. So, there are three things about the near miss as a construct: the definition, the common causality, and what does and does not get reported.
[00:09:44] The final thing I will touch upon is near-miss reporting systems as a means of organizational learning – the idea that we will learn something about ourselves as an organization through these reports. What we find is a tendency to oversimplify what organizational learning is; it is assumed to appear somehow by itself once we have the data. But organizational learning is a highly complex process: how we get knowledge into the organization, how we learn to appreciate it, and how we use that knowledge to facilitate change while simultaneously balancing conflicting goals within our own organization.
[00:10:38] The specific problem we find in organizations is that we have reporting systems holding tens of thousands of reports. The system can handle and store the data, but there are problems with what to do with the data. How can we transform the data into knowledge that can facilitate change or improvement? That link has been broken somehow, or has never been there at all. So, we get the data but have no well-developed methodology to analyze it and transform it into knowledge.
[00:11:25] So, the frustration of back-office managers will typically be: “Well, we have all this data, we have all these Excel sheets, but what do we do with the data? What does it tell us? We don’t know! We have not developed a methodology.” Now, if we are unable to learn from it, and the data that we get is not of high quality, then what is the purpose of it? Why do we do it?
[00:11:51] If we as an investigation board ask back-office managers, “Why do you have near-miss reporting systems?”, the typical response will be, “Because we have to.” Because it has been written into maritime legislation, it is a mandatory requirement. They have to do it, but they have no means of developing the system or a methodology for analyzing the data, because the legislation does not tell them how, or what the outcome of near-miss reporting should be. Therefore the data is not being used, and that is the problem.
[00:12:37] Now, does it matter? Can’t we just have the system, so that we have something, and create KPIs to show our customers? Well, one could argue that, but the main problem is my final point: near-miss reporting systems are at best useless, a waste of time. At worst, they can act as a learning decoy, meaning that we get a lot of information that, if we believe it to be true or representative of the real world, gives us a distorted view of what is going on and what our problems are. That makes the whole exercise counter-productive. So, that’s the positive outcome of my presentation. Thank you!
[00:11:24] Nippin Anand: What did you think? I thought it was brilliant, the way Oessur articulated the concept of near-miss reporting. I enjoyed every bit of it. How did you like it? Or did you like it at all? If you did, please share it with others in your network who you think may benefit from it. If you have any questions, please feel free to email me at firstname.lastname@example.org. I am also on LinkedIn if you want to join me there.