
Safety reporting systems: Insights from Steven Shorrock

December 22, 2020


In this podcast, Steven Shorrock shares his views about safety reporting systems. Reflecting on his experiences and research, Steve provides some practical insights for understanding and improving our existing approach to safety reporting.

Link to Hindsight magazine: https://www.skybrary.aero/index.php/HindSight_-_EUROCONTROL

Further information

[00:00:00 – 00:00:50] Nippin Anand: Hello everyone, welcome back to another episode of Embracing Differences with me, Nippin Anand. We are still on reporting systems – a topic that has interested me for quite some time now and, as I'm realizing, one that actually interests a lot of you. Maybe because it's a very common tool used across organizations, particularly in safety-critical industries. Not many of us have the unique capability to present research in an accessible and simple way. My guest tonight, I believe, is recognized for this, among the many other achievements in his life. Welcome, Steve Shorrock, and I'll let Steve introduce himself.

[00:00:51] Steven Shorrock: Sure, thanks Nippin. My name is Steven Shorrock. I'm a human factors engineer and a psychologist, a work psychologist really, and I've worked for the last 23 years or so mostly in high-hazard industries, and mostly transportation within that. Much of my career has been spent in air traffic, but otherwise it's been the railways, chemical manufacturing, border control security. In the last few years I've been working on a more voluntary, more collaborative basis, mostly out of curiosity, in healthcare. So that's just a bit of background about me. Currently I work at EUROCONTROL, which is an intergovernmental organization for air traffic management in Europe, so in that context I've worked in around 30 or so countries.

[00:01:56] Nippin Anand: You also edit the magazine called HindSight, don't you, Steve?

[00:02:00] Steven Shorrock: Yeah, so that's a passion of mine. I'm the editor-in-chief of HindSight magazine, which is a EUROCONTROL magazine that is now basically about human and organizational factors in operations in air traffic management, but it also includes other industries. You, Nippin, have of course written for HindSight, and so has Anders, who I can see is also present, thank you Anders. Anders does a remarkable job of checking what I write, as well as what other people write, and does a sterling job of it, so much appreciated. We have a different theme for each issue of HindSight. The last one, which will be published in a few days, is on learning from everyday work. The one before was on wellbeing, and so on. You can find it on Skybrary. If you just Google "Skybrary HindSight", you'll find the magazine.

[00:02:55] Nippin Anand: Great! It's a very genuine attempt to bring some cutting-edge knowledge into practice, I must say. Really good work! OK Steve, so we'll get started and see where the discussion goes from there. My first question, and I have honestly struggled with this question, is: what is the purpose of a reporting system?

[00:03:19] Steven Shorrock: Well, I think there are many purposes, so I think the question is really plural: what are the purposes of a reporting system? Because there's a difference between purpose and function. A purpose is quite intentional; a purpose is what someone or somebody wants from a system. So, for instance, what I want from a tree and what a bird wants from a tree are two different things, but the tree is the same. Similarly with a reporting system: what one member of staff, let's say a frontline member of staff, wants from the system may be different; even between frontline staff it may be quite different at different times, and it may be different again for the management, the safety department and so on. I think there's also a difference between the rhetorical purpose and the reality. We may say that it has a particular purpose, but the reality might be quite different, and you have to combine those things. I guess the ideal purpose is something to do with learning and improvement, so ultimately learning and change, which go hand in hand.

But there are other purposes for staff. There are compliance purposes, of course; in many ways the function of a reporting system may even simply be compliance. But there are other purposes which are perhaps not quite so productive when it comes to safety, such as, for instance, revenge. Reporting systems in the healthcare industry are known to be used for a kind of revenge, and there's even a term in the UK, being 'Datixed' – Datix being the reporting system – so if you've been Datixed it means you've basically been reported. So there are many purposes; some of them are more productive and useful when it comes to organizational functioning, and others are not.

[00:05:35] Nippin Anand: Great! I like the analogy of being 'Datixed' with somebody being reported. Generally speaking, and I see what you're saying, how do these multiple purposes actually play out in the design and implementation of a reporting system? Can you talk us through some examples of the problems it creates when you have different purposes?

[00:06:00] Steven Shorrock: Well, if you look at what a reporting system actually does, how it functions: what a reporting system does is generate a lot of data. That's kind of its function. It doesn't produce learning by itself, because people are necessary to translate data into information, information into knowledge, and knowledge, let's say, into some kind of wisdom. So that's what it does, and in some organizations there is just so much data that you don't know what to do with it; in other organizations not quite so much, depending on the criteria for what is reportable.

Now, in air traffic we have different kinds of events. Unlike in some industries, there are clearly definable mandatory occurrence reports. There's no doubt that those have to be reported; an example would be a runway incursion (unauthorized entry to the runway), a runway excursion, or a loss of minimum separation between aircraft, the standard separation being, for instance, 1,000 feet and five nautical miles. So there is a small range of events that simply have to be reported, and then of course you have accidents, which pretty much report themselves, and then you have voluntary reports, which are not mandatory. There are other kinds of reports, like overloads, where an air traffic controller feels that his or her capacity is either exceeded or really at its maximum. So there are a number of kinds of events. Now, some of the problems that can occur: let's say you have a loss of separation. Say it's meant to be five nautical miles and it's, I don't know, 4.8 nautical miles. Well, separation has been lost in that case, but it's a kind of arbitrary limit, because you could have separation of five nautical miles but the aircraft could actually be on conflicting tracks and only just avoid a loss of separation. That could actually be a more dangerous event than when you just about lose separation, at 4.5 nautical miles for instance, or something like that.

But actually the situation was fully under control. So the criteria for what is reportable and what is not reportable are objective, but that doesn't mean they are necessarily useful. So, to answer the question, I think the problems are not so much with the reporting itself. In my experience working with many different air navigation service providers, the problems typically occur with what happens next. One of the biggest problems that we've found, having done research into this through safety culture questionnaires and focus groups for instance, is feedback following reporting. That's typically one of the biggest problems, and then the next problem after that is visible change.

Following up the whole process is another problem. There's also a huge burden of investigation, and then there's a huge burden on communication. One of the kinds of problems that can occur there: let's say you are an en-route or area controller, let's say a radar controller, you're not working at an airport, and it's a rule in the organization that you have to read incident reports, or at least the summaries, and you have to sign that you've read and understood them and so on.

But then you find that many of the reports you're getting are about bird strikes. They really have no relevance to you, and this takes up a lot of your time. So I think reporting systems can create a kind of burden when they're not managed well. Another problem, I think, is that they focus on episodes of unsafety at the event level. So we're really at the top of the triangle, rather than looking at things that are going on below the surface in terms of, perhaps, the way that things work on a day-to-day basis, or the conditions of work, or the structures in the organization.

So we're really looking at kind of tokens of unsafety, rather than, quite often, getting at what Reason, or Rasmussen rather, would have called the underlying conditions. So I think those are a few of the problems. Reporting systems can just become an inefficient way of understanding a situation or a problem.

[00:11:24] Nippin Anand: Yeah, and utilized the right way it does provide some very unique insights into what's happening within the organization. You mentioned the idea of feedback, and one of the things we hear in much of my own research, when you talk to the management, is that 'we simply do not have the time'. So on the one hand you want people to report, and to report a lot of information, but on the other hand you say that you do not have enough time to deal with it. How does that tension play out, in your view?

[00:11:58] Steven Shorrock: Well, I think the whole system of reporting in large organizations just creates a vortex that sucks in lots of data. So you've got to have the time to report, and that in itself is not straightforward, because quite often there is no scheduled time for it, and controllers, or anybody else for that matter, might have to do it on a break, in their own time, after work, or something like that. Then there are the regulatory requirements on reporting in terms of time targets, which create a kind of efficiency-thoroughness trade-off in some cases, because there is a certain number of hours within which there has to be regulatory compliance with the notification. So that can create a bit of a trade-off. But then there's the time to do all of the analysis of the information and the investigation and so on, and the time to actually feed back. Certainly in my industry that takes a lot of the safety department's time, and it's done on an individual basis as well, so that's another issue where I think we could make some improvements: all of the investigations are done kind of one-to-one, the interviews are done one-to-one, and we don't employ some of the more innovative methods of learning as teams that are used, for instance, in web operations and engineering, where they have what they used to call blameless post-mortems.

[00:13:39] You know, to more rapidly get many insights, not just about the incident but about the kind of work that is going on when a particular incident happens. That deviates a bit from your question about time, but I think it all comes down to time at every stage of the process – it takes a huge amount of time. A small amount of time for the initial report, but then the investigation itself can take many weeks.

[00:14:11] Nippin Anand: And then of course there is time and resources involved, but if there is value in it, then an organization is willing to invest in it, isn't it? So what does it take to create that value, so that the organization is convinced to put the time and effort into it?

[00:14:29] Steven Shorrock: I think you would have to go beyond safety to really see widespread value. I think safety reporting and investigation is a little bit like auditing in the accounting world. You're looking backwards at something that's already happened, and people are not that interested in that; it's already happened, and people are focused on the future. So one thing is that the insights would probably have to go beyond safety, towards system effectiveness more generally, which is what I'm interested in, certainly as an ergonomist and psychologist. We need to integrate a bit of systems thinking for that, and if there are insights from investigation that concern system functioning as a whole, and you're able to say to senior people in the organization that this investigation has revealed new insights into our operation and organization beyond safety, then immediately that's quite interesting, because then it's not just something for the safety department.

So I think that's one thing, but to do that you have to look beyond just reporting incidents. Even the term 'safety reporting system' is not very helpful, because reporting isn't the purpose of the system; that's just the input to the system. If we would aspire for learning to be the purpose, then it should be a learning system, and maybe not just a safety one, but that could be one focus. So I think a focus more on learning from everyday work, and of course incidents are a big part of that, but I think incidents can be treated as an invitation to understand how things work on a day-to-day basis. Again, very often there's nothing so interesting about the incident itself; it's the kind of thing that can just happen at any time. I mean, somebody forgets about an aircraft that's low and slow, moving through a sector, for instance; in my industry these are just errors that are always there, always going to happen. What does that reveal about the nature of everyday work, the patterns, and what does that reveal about the system structure? I think that's the kind of thing we want to be getting at.

[00:16:50] Nippin Anand: And that's actually an interesting point, Steve. One of the things that comes to mind when you talk about incidents is the famous case of the Wakashio accident, where the ship ran aground, and the news floating around is that this is a classic case of the captain looking for a phone signal and using an inappropriate chart, coming close to the land, which then explains the accident. But as you rightly said, if you go beyond the outcome of the accident, which is a huge ship running aground, the reputational risks and whatnot, it gives you a very unique insight into the everyday tensions between looking after the wellbeing of the crew and trying to make sure that you maintain a safe distance from the land. That's very interesting if you want to go beyond safety and look at those conflicts. But the reality is that once you have an accident or an incident, nobody is really interested in that side of things. So the question I suppose I'm asking is: is there a way to segregate between what is important from a regulatory, compliance and reputation perspective, and what can be sensibly used for organizational learning and improvement? Is there a divergence that you see between reporting systems and learning systems in your world?

[00:18:16] Steven Shorrock: There is certainly a divergence there. Obviously regulators are typically further distanced from the operation compared to the safety department, and the management is further distanced in some ways compared to the safety department. So, going from the sharp end to the blunt end, you've got regulators more toward the blunt end, and then behind them you've got government and the media and the judiciary and so on. But I think we should be constantly working together within the industry to try and work together with the regulator, so that we have a common, agreed purpose. Then we're not having to separate things out into two parts, because otherwise I think you have some unintended consequences: you get a kind of split compliance culture, and then something else which is really not helpful. Regulators do want to learn, and if a reporting system can give them better insight into how their interventions affect the system, then that's a great thing. But that takes a lot of courage on everyone's part, because who wants to know the unintended consequences that they can have on a system? So it takes courage, I think, and a bit of humility and curiosity, to really want to understand what goes on on a day-to-day basis.

[00:20:00] Nippin Anand: And you're absolutely right. It's no longer a safety reporting tool; it becomes a business or operational tool which has meaning across departments and hierarchies. We've covered a lot, but lastly a very hypothetical question, something that came up that is very specific to one part of the discussion: what do you do when trust is broken because of some past bad experience? How do you engage people to report things from that point onwards?

[00:20:49] Steven Shorrock: Well, I think it's the same in any relationship really. When trust is broken in any relationship, you start by saying sorry and acknowledging your mistakes. It takes quite a long time to come back, but it will certainly not come back unless the wrongs that have been done are acknowledged. In various industries I certainly know of cases, and I've seen cases, where people have been blamed personally and shamed publicly for essentially making a mistake. People get caught up in paradoxical situations where they cannot do right for doing wrong. But without going into the details of examples, I think the first step is to acknowledge when trust has been broken, and that may be through a breach of confidentiality or, more often, through an inappropriate punishment, either formal or informal. Then the first thing that has to be done is a genuine apology, and you have to make amends.

I mean, this is just a basic thing in human nature and society that we know, but it sometimes gets forgotten in organizations. I think it's saying sorry and a commitment not to repeat it; then people will trust you by your actions following that. In my experience the worst case is when people are inappropriately blamed for what are basically mistakes and errors, which is simply not appropriate, because it is the organization's role to protect the operation from things that will happen, such as errors. It's also the organization's responsibility to protect the train driver, the pilot, whoever else, from his or her errors. So yeah, there's a lot more that we could unpack there, but I think the brief answer is to acknowledge the transgression, apologize for it, make amends for it, and ensure that it really doesn't happen again. These kinds of things are usually quite emotional when someone is blamed; it's a kind of loss of control on a human level.

[00:23:24] Nippin Anand: And that apology goes a long way in tightly knit communities, I believe. People are just watching what the next reaction from the management will be when somebody screws up.

[00:23:34] Steven Shorrock: I think people just respond very well to an apology. We're all human, and when something wrong has been done to us in any way and someone offers a genuine apology, unless it's really something super serious, it may take a while, but mostly people respond; people are reasonable.

[00:23:55] Nippin Anand: Great point, Steve. There's so much more, but there's so little time and I don't want to take up any more of yours. You've offered such good insights into just the reporting side of things; maybe in future discussions we can talk about analysis and how you translate that into some sort of meaningful learning for the organization, and I hope you can join us then. Once again, thank you Steve for giving us so many wonderful insights. It's been such an educational 25 minutes spent with you.

[00:24:22] Steven Shorrock: Thanks, all. I'm unable to answer all of the questions or have a conversation with you all, but you can certainly email me, and maybe you will include that afterwards, or reach me via my website, humanisticsystems.com, and I'm more than happy to chat.

[00:24:28] Nippin Anand: You can find Steve on Twitter as well, he's very active there, so please, by all means, get in touch if you're curious about finding out more.

[00:24:52] Nippin Anand: So what did you think? I think there's a lot to think about and reflect upon, which is exactly the idea of this podcast. I was actually intrigued when Steve asked, "is reporting really the purpose of the reporting system?" I kind of agree with that; he makes such good points. At one point he says "you have been reported": look at the negative connotation attached to that term. What message does it carry? He also talks about the multiple and often competing purposes of a reporting system. You want to learn something new, but you also have to comply, and they're not necessarily compatible, although he offers a way forward to bring regulators in line with the learning motivation. Very ambitious, but it's a great point!

[00:25:55] Steve also makes a very interesting point about allowing people more time and resources to report issues, but more importantly he doesn't see that as the main issue; the issue he sees is more about the processing of the reports. The true value of a reporting system, Steve believes, lies in asking what new knowledge we have gained as an organization by analyzing reports, which goes far beyond just counting unsafe events. And it's the dull, the mundane, the obvious, explored through teamwork and interaction, that brings us closer to that new knowledge, not necessarily the high-potential events at the top of the iceberg. Very interesting. I leave you with these thoughts. Well, we have just started, and there's a lot more to learn, and I promise I will bring more perspectives to the table. A word of caution: don't believe a single one of these, don't believe a word of what you hear from me or from anyone else. As I said, find your own truth, but yes, do keep an open mind. If you want to connect with me, you can email me at nippin.anand@novellus.solutions. Until next time, take care and have a healthy start to the New Year; everything else will follow.