Why Organisations Fail: Discussions and Reflections
Episode 5 - Featuring Professor Lee Clarke
Professor Lee Clarke is based at Rutgers University in New Jersey and is the author of various books on disaster and emergency management from a sociological perspective. Lee reminds us that organisations are designed for a purpose and consistently rationalised to maximise efficiency in pursuit of profitability. And when we say an organisation has failed, it is often because the failure sits outside the narrow purpose for which the organisation was designed.
[00:00:00 – 00:01:05] Nippin Anand: Hello everyone, welcome back to the fifth episode of Embracing Differences with me, Nippin Anand. My guest tonight is Prof. Lee Clarke. Lee is based at Rutgers University in New Jersey and is the author of various books on disaster and emergency management from a sociological perspective. One of my all-time favourites from Lee Clarke is his book Mission Improbable: Using Fantasy Documents to Tame Disaster. It's a book I have recommended to many people who are interested in the idea of emergency and disaster management – in to what extent emergency plans and procedures actually make sense when things go out of control. So today we are going to explore the topic of organizational failure. The key questions we will ask are: What is organizational failure? And why, in many ways, is failure inevitable in an organization?
[00:01:09] Nippin Anand: What intrigues me when I read some of your work is the notion of an organization – what is an organization? To me, Lee, it appears that the two terms, organization and failure, are very incompatible, at least in the way it's usually seen, because in an organization, the only thing that stands between you and profitability is control. You need to have a sense of control.
[00:01:36] Lee Clarke: Yeah.
[00:01:37] Nippin Anand: And failure, in one sense, is loss of that control, and in that sense I find your work very interesting.
[00:01:44] Lee Clarke: Well, I'm pleased that you do. I see them – organization and failure – as two words that naturally go together, because formal organizations, the kind that Max Weber wrote about, 'bureaucracies', are chiefly designed to solve problems: education, oil distribution, growing flowers, whatever. Now, whenever there's a big task to accomplish, an organization usually doesn't know the answer to the question: how are we going to do this? This is an insight that comes from Herbert Simon in the late 1940s. We've got a big problem, and it's going to be beyond the capabilities of you or me or even Superman to resolve. So, what do you do?
[00:02:42] You take the big problem and you break it up into smaller problems, and you put those smaller problems into offices which, in principle, can be coordinated, and you can address the problem. Take some examples from my university life, or any big university: we have students who need to be traced for COVID. But they also need housing, and food, and libraries, and classes to be scheduled, and financial aid, and all these things that ultimately make up a thing called a university or college, so that at the end there's some semblance of what we call research and/or education.
My students are like, "This place is terrible," and I push them hard.
Well, what is your problem? "Well, my financial aid cheque didn't come on time."
Well, who did you talk to? "I talked to an officer, I talked to professors." And the professors will say, "Well, I don't know anything about financial aid." Why not? "Well, that's not part of my job, not part of what's been broken up and put into my office."
[00:04:09] So, when they start to see this connection between the systematic organization of a problem and its distribution through offices, then they can understand how that's the essence of forming an organization. Now, when the failure part comes, there are many sources of failure, but at bottom we have to start by saying: organizations are organized to do some things well and other things not so well. So I don't really expect the US army, or anybody's big army, to be really good at pursuing goals of social justice. That's not what they do. That means they should be fair in how they assign people tasks – not let skin color, for example, or sexual preference or eye color get in the way.
[00:05:22] It's not their fundamental goal to pursue social justice. So when they don't do it, people will call it a failure. And it is, in some larger sense, but in the day-to-day, get-the-job-done, get-through-the-day, maintain-order kind of way, that's not what the army is supposed to do. And all organizations are like that. If we expect organizations to be perfect, first of all they're always going to disappoint us, and sometimes they are going to disappoint us in ways that kill us – inadvertently, or sometimes on purpose. But generally we are concerned with accidents, where ships go bumping in the night, or the ship zigged when it should have zagged. With the Exxon Valdez spill in 1989, we could identify the individual errors that happened along the way.
[00:06:20] Now, what's the system supposed to do? You're supposed to get the oil out of that place as fast as reasonably possible. I mean, it needs to be safe, and they want to be safe, but there's a cost to safety. We kind of lost that realization in the present pandemic – safety at all costs! It's hard to say out loud, but in fact there are real costs to keeping people at home over long periods of time.
[00:07:09] Anyway, organizations are designed to do some things well, and by definition they're not going to be able to do everything well. So failure is kind of built in. In my world, in our world, if I mess up in class, nothing really bad happens. There are a lot of ways I can figure out that I've stated some fact incorrectly, and I can correct it easily because I'm going to see the students again. So I send an email: "Oh, I got that wrong." It's not that a split-second, lightning-quick response is required. But if you are in a fighter jet and you pull the trigger and shoot the wrong thing, it's hard to recover quickly. So it matters how many buffers there are between the potential agent of harm and the thing that becomes a target. I just don't deal with dangerous stuff in a sociology class too often.
[00:08:23] Nippin Anand: I suppose there's a lot in what you said just now, Lee, and what I picked up is the structure of an organization. The way an organization is organized is mainly around what it is supposed to fulfil, and in very lay terms, if you reference the work of Max Weber, that would be to turn raw material into some sort of commodity or value. Now, in that sense the organization is structured very well. That's the formal organization you're describing: the moment you fall outside the remit of the key purpose, the reason why it exists, that structure falls apart; it's not conducive to everything else around it.
[00:09:06] Lee Clarke: That’s right.
[00:09:07] Nippin Anand: So, in a way, taking the oil from point A to point B – yes, every component of the organization supports that key function or role. The same goes for a passenger ship that takes passengers from one port to another. But the moment the ship capsizes, the structure of the organization is not conducive to bringing things back to anywhere close to normal.
[00:09:32] Lee Clarke: Yeah, that's right. You've done a lot of research on ship accidents.
[00:09:34] Nippin Anand: That's right, I mean, I'm talking about a very specific experience, which is the case of Costa Concordia. I find what you just said intriguing. You place a lot of weight on this notion of the formal organization and control around it. But one could also argue that very little actually happens within the space of the formal organization. Take the student who complains that things are not working – that happens all the time. And what you're saying is: yes, it happens all the time, but I have the capacity to correct it without having to call upon every formal structure of the organization; I have a self-correcting mechanism. But some situations are more unforgiving than others, and when that happens it becomes difficult. What Perrow would call a very 'tightly coupled' situation.
[00:10:30] Lee Clarke: Yeah, I mean, you said nothing that I disagree with. I would just add a few things – safety is expensive! Deeper than that, when safety procedures work, nothing happens. So if you worry about the bottom line, passengers are ferried from point A to B happily. And in the Concordia situation, you mentioned something the last time we spoke that I wasn't aware of: that they were trying to get close to the shore so they could get this officer off. Well, that's not the standard, right? That's not part of the standard operating procedures. But it probably happens a lot, and they get away with it, and nothing much happens. But suppose they follow the rules from here on out: let's not take chances close to the shore just to make it convenient for some officer to get off the boat.
[00:11:38] Fine, and nothing will happen. And if nothing happens, it's hard to see that as evidence. What is it really evidence of – that we're doing things safely, or that we're doing unnecessary things? Here's another example from the Exxon Valdez case, in the early days. I don't know if you've ever been to Alaska, but Alaska is just majestic! It's an incredibly beautiful place. Everything is big: snow and ice, seals and crystalline waters. I'm not a religious person, but it's a sin to befoul a place like that with oil in the water. The people who live there knew the risk and they would complain bitterly. Why do we have to drill here? Let's get oil from somewhere else. And the Coast Guard responded well.
[00:12:46] The US Coast Guard – a terrific organization for lots of reasons – their response was to send two tugboats out from the pipeline terminus, the end of the pipeline in Valdez, Alaska. A tanker would load up with 10 to 11 million barrels of oil and then head out to the Gulf of Alaska, and through that point called the Valdez Narrows there's not as much room for error. Once you get to the Gulf of Alaska it's wide-open sea, so they had two tugs escort these supertankers out to the Gulf of Alaska. And they never had an accident for – I've forgotten how long it was, but a long time. And the reasons were pretty mundane: more eyes on the problem, more vessels looking out for potential danger, more radar, more pieces of the system watching for things that could go wrong.
[00:14:19] Well, after a while it was expensive to do that, and people started to interpret that safety as unnecessary: not as "what we're doing is making this a safe system", but "what we're doing is superfluous, just symbolic". It turns out it wasn't. But the point is, systems run and nothing happens. Most of the time big airliners go from point A to point B and nothing happens. But if you talk to pilots, a lot of things go wrong and they are able to recover from them. We had a recent catastrophe, and it will happen again.
[00:15:19] Nippin Anand: Yes, and that's a very good example you gave just now about the tankers on the Alaskan coast. In anthropology, there's a beautiful saying: nothing never happens. In the sense that you may think the system is safe because everything is working fine, the rules are working as intended, but there's an awful lot of adaptive capacity at play. There are a lot of things happening that are keeping the system safe which you probably don't see, because you don't ask the right questions. The questions you are asking give you the illusion that everything is under control, because people are doing as you imagine they would do.
[00:16:06] Lee Clarke: Yeah, and when things go wrong, we always call it a failure of imagination. This was a big thing in the 9/11 report, and it's a big thing people have been talking about since – whatever you want to call what happened in Washington two weeks ago. Failure of imagination. Somebody imagined it, but it seemed so far out there.
[00:16:36] Nippin Anand: I think one good example here is the Costa Concordia case. People often ask: why was the captain going so close to the shore, why did he not have enough people on the bridge, why did he zig instead of zag? Almost always my response is that these are all really good questions, if you have the imagination to ask them before the accident happens.
[00:17:03] Lee Clarke: Sure, right. I mean, it seems particularly foolish because we all know those big boats don't turn quickly, and when you get too close to the shore, that's when trouble comes. I'm not an expert in maritime operations, but that's a pretty obvious thing to do or not do – you would know better than I. You've done it before and gotten away with it. A good captain is going to say, "Let's not take that risk", but a good captain also says, "I can take that risk and control it, I can get away with it".
[00:17:46] And let's not forget the problem of production pressure that is usually operative in these systems. When somebody higher up the chain says, "I want you to do this", how do you say no to that? I mean, you can, but sometimes you can't.
[00:18:06] Nippin Anand: I think there are two things you've pointed out here: how do you say 'no' to that at an individual level, but also at the level of what the purpose of an organization is. That's very clearly manifested in this book: if the very idea of undertaking the risk assessment is to get the job done, and you get up and raise your hand and say this job cannot be done, then the purpose of that assessment is in many ways defeated, because getting the job done is what you're supposed to be doing.
[00:18:38] Lee Clarke: Yeah, at the organizational level, how do you admit that some things are just not possible, or not controllable? At the cultural level, at least – if I can use the phrase 'western culture' – we don't say it's impossible. Engineers have a favorite saying: "I can make a system as safe as you want, just tell me how much you want to spend. We can build dams, we can do anything you want, just tell me how much money you have. There is no problem we can't solve." And a lot of the time they're right. There are 9 billion people on the planet now and a lot of them are suffering every day, but people are alive. We probably won't solve it, but we will adapt; the world is not really coming to an end because of it. But there are people who say, "We can control this. Trust us, we can control this."
[00:20:06] I am skeptical. I won't say they can't, because ultimately I'm an expert in sociology, not epidemiology, and certainly not virology; I'm not an expert in those things. And that's another part of what an organization does. It says, "You're an expert in this, but not in that; that is not your job." So there's always a tension there. I couldn't be at the helm of the Concordia or an airliner – I don't know how to fly those things!
[00:20:41] Nippin Anand: I'm just thinking as you're speaking, Lee. From what I gather so far, what else can an organization say in the face of uncertainty, if not what you just stated? What is the alternative here?
[00:20:57] Lee Clarke: That's a really important question, and the answer is easy for an academic sitting in his office to say, but hard to implement in practice. What I'm about to say is that they can say it forthrightly. Take the BP oil spill: part of the problem there is that it was way too deep. They had information about what would happen at shallow depths, and they just assumed that information would operate in the same way at really great depths of the ocean, and they were wrong. So what you could say is: we don't really know how to do this; things can go catastrophically wrong.
[00:21:55] Here's what's going to happen, here are the worst cases – this is where my next book comes in. And some things, when they have consequences that are widespread, that's where communities should have a say in where risks are taken. Not "what do regular people know?" or "what do I know about whether a nuclear power plant should be built in my backyard?" I'll complain about that, because what if it goes wrong, if it melts down, or if oil keeps gushing into the Gulf of Mexico or the North Sea or wherever? It affects my livelihood and my interactions with the people and things that I know and love, and that's relevant. So in an ideal world, when organizations are going to make choices that have the potential to go way outside their borders, other actors should have a role.
[00:23:15] It's not an argument for socialism or anything like that. It's just that where other people are being asked to bear the risks for somebody else's gain, why should they not have something to say about it? Now, implementing that is very difficult; there are fewer books written about those kinds of success stories.
[00:23:40] Nippin Anand: I don't even want to go outside the peripheries of an organization, but I'll take a simple example that comes to mind today. There's been a lot of talk about these mammoth container ships that can carry 20 to 25,000 containers from one part of the world to another. Having studied economics, been on these ships for many years, and followed the trajectory of how these ships are increasing in size, I do not see the economic logic of making ships the size they are today. But that's another argument. The argument is that even within the peripheries of the organization, there is clearly a realization that if a ship of this nature ends up in a catastrophe – a fire or flooding or whatever – we simply do not have the capacity to deal with it. Yet we continue to do it. This information is very transparent. You're talking about making it transparent to the outside world; I'm saying this information is already very transparent to the insiders, and yet we believe we can carry on with it, and we have certification and regulation in place to prove that.
[00:24:51] Lee Clarke: You get away with it most of the time. I don't mean "get away with it" in an evil way, but most of the time you do. 25,000 containers, I don't know. How many containers do they usually carry?
[00:25:04] Nippin Anand: Well, when I used to sail, 800 containers on a ship was a big number, and I'm talking about 1995. That number has consistently grown to 20 to 25,000 containers. It's a huge number. We're talking about ships 400 metres long, 50 to 60 metres wide, sitting 15 to 20 metres in the water, manned by a handful of people, maybe 10 or 12, often with misdeclared or undeclared cargo which people have no idea about.
[00:25:36] So you could end up loading dangerous cargo which you have no idea about, and you're carrying that risk with you all the time – and you also know that it happens a lot. Now, how do you deal with this? And one of the mitigation measures in place is a firefighting system which everyone is sceptical about and knows is not going to work. I would like you to tell me, what's the key message of your book, The Worst-Case Scenario? Oh, maybe I didn't get the title right; I apologise if so.
[00:26:10] Lee Clarke: Worst Cases – The Worst-Case Scenario was taken. You can have one, yeah: the Airbus A380, with 500-odd people in some configuration. You've got to have a crash before people pay attention to it, and likewise with one of those 25,000-container vessels. We are very reactive as a – I don't know – a general culture; we're very reactive. If we haven't blown up a city, we see no real problem here. That's a pretty glib answer; the truth is I don't know.
[00:26:51] We need more humility among the experts, and certainly among the people who have the resources, the people at the top levels of these organizations who are making choices. They need the space to be able to say, "We can't control this when it goes wrong." They still do this with the big oil spills. They always go out there and put down all these dispersants, which are terribly poisonous themselves, and they don't get rid of the oil; they just make the oil break down into smaller pieces so it's more biodegradable. But they have to do something, I guess. What we really should do in a case like that is walk away from it and say, well, that's the cost of doing business – but do that beforehand, so that the cost can be covered. Not just the cost to the organization, but the potential costs to communities. I mean, if they're carrying 25,000 containers and they don't know what's in them, how can they do that? There could be nuclear weapons in there!
[00:28:09] Nippin Anand: Could be. So let me explain this to you – I think it would be an insight for you as somebody who researches this area. Container ships load and discharge a lot of cargo from different parts of the world. In many ports around the world, when you load a container onto the ship, you have no idea what's inside; you can only go by what the shipper declares as the weight of the cargo. There are technologies available today – as soon as the crane lifts the container, it should be able to weigh it – so technology does allow it, but we haven't really ventured into that space, because in many parts of the world we don't see an incentive to do so. The other issue is that just before the ship sails, there is a mad rush to get a lot of last-minute containers on board, and that's where you get a lot of misdeclared and undeclared cargo. Again, you have no idea, because the container is sealed and you are just the carrier of that risk. You're told: here's the paperwork and here's the container. You're not supposed to open it or question what's inside; you're supposed to trust the manifest and go with it. Most of the time that's OK, except it's proven that a lot of the time there are surprises.
[00:29:22] Lee Clarke: Like what? Drugs?
[00:29:23] Nippin Anand: Huge! It's documented many times. But I want to go back to what you mentioned about humility amongst experts when they are designing regulations and controls. Is it really a question of humility? Because I see this as more of a structural issue, Lee. This discourse has become very popular these days – for leadership to become more curious, more humble – but how the structure really supports that curiosity is another question. It's a much bigger question, actually.
[00:29:57] Lee Clarke: Yeah, I don't know the answer to that. I just don't know the answer, because as soon as you try to structure this into normal operations, risk becomes routinized.
[00:30:15] Nippin Anand: Indeed. I'm very cautious of this new thinking that has emerged these days about psychological safety. I wholeheartedly agree with leadership being more curious and humble. My question really is: how do we sustain that humility within an organization? Because if you read a recent article on the BBC, it says that the CEO of a company can earn in three days of work what a worker earns in one full year. If that's how we structure an organization, how do we really get to that humility?
[00:30:51] Lee Clarke: There's the obvious – put them in jail! Instead of saying there are no crimes here, put them in jail, as we do to operators. We could have more regulation, but these are facile answers. If I had confidence that I had a meaningful answer to your question, I would be a consultant instead of a professor of sociology.
[00:31:23] I consult a little bit, but I really wonder about the stuff that I truly understand. How to institutionalize the concern with safety? I've been thinking about that for a couple of decades now, and I just don't have an answer to it.
[00:31:41] Nippin Anand: Neither do I, but that's fair. It's good to keep searching for these things, for deeper meanings. It's been a great conversation, Lee; I enjoyed every bit of it. From the first time I connected with you, it's been such an honor to know you and to learn so much from you. It's remarkable.
[00:32:19] Nippin Anand: What did you think? Again, there is no reason to believe Lee or anyone else, as I keep reminding you throughout this series. But I think it's very important to try to understand somebody who has spent his lifetime trying to understand this very important question. In many ways, what Lee is saying is that failure is inevitable because organizations are designed for a very specific purpose – to extract maximum value out of resources – and in doing so we sometimes rationalize costs to the extent that it turns into a failure.
[00:32:57] Now, failures are not always that easy to recognize in a setting like a university, an example that Lee gives. Why? Because there is enough self-correcting mechanism there: you're not going to kill somebody if you give the wrong information in a lecture. But on an aircraft, a car carrier, or another safety-critical asset, the consequences of a decision can be extremely significant. Of course, safety is possible and can be achieved in many of these settings, but the question we need to ask is: at what cost? Are we really willing to invest to make the system safe? This is the question engineers ask all the time – how safe is safe? Give me enough resources and I can make a really safe system.
[00:33:48] So what is the solution that Lee offers? I think what he's saying is: let's stop claiming that our systems are ultra-safe when deep inside we know they're not, and start making absolutely transparent what the implications of our decisions and actions can be, especially for people who may have nothing to do with our business, our actions, our decisions. This is where he talks about communities. For example, what are the implications for a community that lives beside a river or near a port when an oil spill happens? They gain nothing from our operations, but they have everything to lose if there is an oil spill in the river, because it affects their livelihoods.
[00:34:35] The most interesting part of Lee's interaction with me was his immense sense of humility. There were a couple of times when he turned around and said, "No, I don't know. I don't know what the answer is." For somebody of his standing, that is remarkable, because most of the time when you ask an expert a question, the expectation is that they know the answer – that's why they're the expert. But part of being an expert, part of being recognized as an authority in your area of work, is to know your limitations, and I think that was a great acknowledgment of humility. I quite enjoyed talking to Lee, and I learned a lot more than just disaster management in this short interaction with him. So yes, it was a very enjoyable conversation and I hope you liked it. If you are interested to learn more, you can follow me on LinkedIn. You can also email me at email@example.com.