Why don’t workers speak up? A conversation with Amy Edmondson
Episode 17 - Featuring Professor Amy Edmondson
Over the years, I have become concerned with how the concept of psychological safety is used and abused across work sectors. In my view, the dominant narrative that ‘a submissive employee felt terrified to speak up because of an abusive boss’ is sometimes over-stretched. It masks the deeper, systemic problems within society. In this podcast, I have the honour of speaking with Amy Edmondson, Novartis Professor of Leadership at Harvard Business School. Together we explore what makes it difficult for people in lower positions to speak up to their boss.
Based on a review of accidents, I have come to realise that when people don’t speak up, it could be that a problem of competence is undermining trust amongst team members. In such instances, lack of trust should not be mistaken for an absence of psychological safety.
[00:00:06] Nippin Anand: Welcome to another episode of Embracing Differences with me, Nippin Anand. This podcast series is meant to bring you different perspectives and concepts in safety. The idea really is to create space for thinking and reflection, not to reinforce any grand theories or our prior knowledge of a subject. The aim is to learn and grow, not to remain stagnant. And of course, as I keep saying, there is no reason for you to believe me or any so-called expert. But keep an open mind and be prepared to challenge your beliefs if you truly want to learn more than what you knew yesterday.
[00:00:44]: Let me start with a story. Just before the start of the pandemic, I was riding a cab in London.
The driver was Romanian, an immigrant in the UK (like I am), and so we found some common things to discuss. At one point the conversation turned to Brexit, and the driver said, “You know what, I don’t feel psychologically safe these days because of this Brexit thing”.
I wasn’t too sure what he had said, so I asked him to repeat it – what do you mean when you say “psychologically safe”? He said, “Well, you see, I have to drive at odd hours, you get all sorts of passengers, and it’s not difficult for someone to find out my nationality and where I come from when I speak. With this political environment and people with divided opinions, it’s not uncommon to hear unpleasant things about immigrants from the mainland in the back seat. And sometimes it feels like it’s a personal attack on me.”
Ah! I see so you mean that you don’t feel safe about it? “Yes, that’s what I mean” he said.
What is the point of me telling this story? The point is to understand how certain concepts, at a certain point in time, become so pervasive, so unquestionable, that we don’t think much when we use them. It’s like the use of the term ‘security’ – how it has been used in public discourse after 9/11 – or, in more recent times, ‘social distancing’ – even though we all know it’s about physical distancing.
But back to the topic. I am referring to the use of the term ‘psychological safety’, which has been so uncritically embraced in society that we have come to believe it is the key to all our problems.
So, what is psychological safety?
Here’s a quote from Amy Edmondson:
“Psychological safety is a belief that one will not be punished or humiliated for speaking up with ideas, questions, concerns or mistakes.”
It would be safe to say that when people don’t speak up, voice their concerns or opinions, go along with the most powerful voice in the room, feel humiliated, ridiculed or abused – these are all considered signs of a psychologically unsafe space.
So why did the cab driver use the term without thinking much about what it really means? I think we all do this, many in the safety-critical industries do, because we are meaning-making beings. We have to put labels on everything for the sake of understanding.
But the irony of meaning making is that sometimes it is the same labels, the same words, the same categories, that obscure the meaning of concepts that we are trying to understand.
Over the course of a few years, I have become concerned with how the concept of psychological safety has been used and abused to understand how people relate to each other when they work in teams. My concern is that the narrative that “a docile employee was terrified to speak up because of an abusive boss” is sometimes taken too far, to explain away a lot of problems that could otherwise be better understood if we stopped labelling everything as a problem of psychological safety.
In this podcast I have the honour of speaking with Amy Edmondson, Novartis Professor of Leadership at Harvard Business School. After all, who could help us understand the concept of psychological safety better than someone who has lived and breathed psychological safety and team relationships almost all her life, and has become an internationally acclaimed voice on this subject?
The question I want to ask Amy in this podcast is this:
What makes it difficult for people in lower positions (pilots, surgeons, captains, offshore installation managers) to speak up or challenge their boss?
My argument, based on years of research, is that people don’t speak up because we have a problem of trust – trust in the competence of employees in lower positions – which creates tensions within the teams. And the problem of competence is not an individual’s problem – it’s a systemic problem.
Make no mistake, I am not rejecting the problem of PS, but my concern is that if we do not tackle this deep-rooted problem, no amount of PS will help to improve trust between team members.
[00:05:44] Nippin Anand: First of all, I’m so honoured to have you. It’s such a wonderful feeling to hear from you and to actually see you in person.
[00:05:55] Amy Edmondson: Well, you’re actually doing this topic that’s absolutely near and dear to my heart. This is the stuff that got me interested in doing a PhD in the first place. So, it’s unusually focused and exciting.
[00:06:10] Nippin Anand: So, let’s start from there, because I think an introduction from your end is probably not needed – the world knows you so well.
[00:06:17] Amy Edmondson: Not true.
[00:06:19] Nippin Anand: OK, in that case, let’s start with a small introduction from your end.
[00:06:23] Amy Edmondson: I’m Amy Edmondson and I’m a professor of leadership and management at Harvard Business School and I study teams and teaming and psychological safety and I guess my broad interest is in organizational learning, like what makes organizations able to keep improving and innovating in a changing world.
[00:06:47] Nippin Anand: Beautiful! So, a little bit about myself. I originally come from India and am based in Aberdeen, in the north of Scotland. I spent 12 years at sea, then came to the UK to do a Masters in Economics, which I absolutely didn’t like, and then ended up doing a PhD in Social Sciences, which is where my heart is. For seven years I served as a safety inspector in the North Sea area, and in the last three years one of the things that intrigued me was exactly what you say: how do organizations learn? On that journey I came across a life-changing accident: the passenger ship Costa Concordia, which ran aground and capsized off Italy. The stories that emerged as a result of the accident were very different from the reality that we saw when we went to visit the captain. We spent four days with him trying to get his perspective on the accident. So, we collected a lot of material. We went into the black box recorder, held lots of informal conversations, and recorded lots of video footage with him. The head of the Danish Maritime Accident Investigation Board (DMAIB) and I then started to put the story together. But this is, again, our story. It’s not his story, it’s not the public view, it’s our view.
[00:08:37] What we saw from this accident – and this is the interesting bit when you look at the black box – is a ship heading into the rocks, full of competent people. Everyone is absolutely certified competent, and what you see in the minutes before the accident is complete silence as the ship is heading into the rocks. Nobody is speaking about it. At one point you actually hear the captain very gently say to the helmsman – the person who is steering the ship – steer carefully or we will go on the rocks, and that is kind of a joke from his end. The ship meets with an accident, she capsizes, and everything that followed, followed.
[00:09:17] So, what interested us was: how could a system that seems so stable go from complete silence to absolute chaos within a few seconds, and what explains this? I started to get lots of phone calls and e-mails from people saying, “Don’t trust this captain, he’s kind of a monster!” – that he had a very powerful presence on the bridge whenever he would come, and the harbour pilots – the people who take the ship in and out – had their own views about this captain. It was only in 2017 that we had the opportunity to actually go and visit him. What was interesting about that visit, amongst many other things, was that we kept hearing about this mistrust in his officers. To share a small story with you here that really changed the course of this research: before the accident, the captain goes to the jacuzzi. The jacuzzi on a passenger ship is supposed to contain salt water. That’s what it’s supposed to have, and he finds out very quickly that this is not salt water, it is fresh water. So, he calls his junior officer and asks, why do we have fresh water here? Why not salt water? The junior officer says it’s because the salt water pump isn’t working, so they filled it with fresh water. So, the captain says, “OK, fine, why don’t you do something. Why don’t you get some salt tablets from the hospital?” – and a ship’s hospital has plenty of those, because people get dehydrated and there are lots of people on board. So, he says, “Why don’t you put some salt tablets in this jacuzzi.” He comes back to the jacuzzi the next day and the water still tastes absolutely the same. So, he calls this officer and says, “What’s happening here? Why do I not see salt water? How many tablets did you end up putting into the jacuzzi?” The officer says: three.
There is complete silence there, and the point really is that, in his world, somebody who does not understand basic principles such as the salinity of water is not fit to be an officer of the watch – and there comes the trust issue.
[00:11:30] We heard about that consistently, all the time, and then we went back to look at what is really happening in the labour market. There are pretty interesting patterns: a huge expansion within the cruise industry and lots of people being recruited, though not trained to the level that you would expect, with lots of fast promotions in a culture which is traditionally very hierarchical. It lends itself to very interesting patterns within the teams: you have one person who is the lone expert, and the rest of them are just novices. Seeing how this relationship plays out gave us an opportunity to see a completely different world from his perspective. Obviously, we looked into many other accidents from that point onwards to see what is common between the Costa Concordia and MH370, the Ethiopian Airlines crash and so on. We came to understand that there is something happening here that we are not quite grasping. And there is something else that interested us in high-risk systems…
[00:12:23] Amy Edmondson: There’s a tension, because especially in very hierarchical systems people are quite worried about being and looking competent, and at the same time their real performance, their real safety, their longer-term survivability is based on the ability to keep getting better and to be almost unnaturally observant and attentive to failures, little mishaps, process weaknesses. So, high-risk organizations need to be just unnaturally attentive to the things that go wrong, but that can feel very much at odds with looking good and looking like performance. One could actually say the real tension is between the appearance of performance and learning – real performance, particularly over time, is in fact dependent on learning. It’s almost one and the same as learning when you’re facing any kind of uncertainty or change, and who isn’t? So, the pattern you describe has been frequently described; it reminds me of Jan Hagen’s book, “Confronting Mistakes”, written well before the 737 MAX accident that you mentioned. It’s about airline accidents and black box analysis, and some of the most famous and deadly airline accidents in history fall exactly into this pattern, with junior officers unwilling to speak up to senior officers in the cockpit, and blind spots around their assumptions about what people understand, do and speak up about.
This is a pattern that plays out in healthcare and in all sorts of settings – aviation, shipping and so on – where the risks to human safety are very real.
[00:14:44] Nippin Anand: But what do you think about the idea that a junior officer or subordinate in this instance is probably not even able to comprehend the situation?
[00:14:53] Amy Edmondson: You mean by which situation? The technical situation or the organizational and interpersonal situation?
[00:14:57] Nippin Anand: Yeah, so let me explain this a little bit more. When we talk about the captain and the officers, as the ship is heading into the rocks, the junior officer is not able to understand the complexity of the situation. In his or her mind, this happens all the time: the captain is an expert, he knows how to handle the situation, so he will be able to recover from it. James Reason says one person’s error is another person’s expertise – let’s put it this way. What I’m trying to allude to is that you will only speak up if you understand the complexity of the situation, and where we are today in many high-risk systems is that there are far too many entry-level people working alongside people who have a significantly higher level of expertise. For a team in this situation, it becomes extremely difficult for a couple of reasons. There is very little trust between the team members, from both sides: one side perceives that there is very little value in the input coming from the subordinate, and the subordinate thinks there is very little value in providing any input because of the fear of being seen as incompetent. So, there is an awful lot of cognitive imbalance happening here.
[00:16:17] Amy Edmondson: Yeah, I mean, that is an interesting and possibly unique pattern in the industry that you’re most familiar with. If you have a big gap in competence – and from what you’re saying it sounds like there are people in roles who actually don’t have the competence to deliver effectively in those roles – then yes, that’s certainly a very real problem and a real challenge, and one that is systemic in nature. There are educational deficits, there are labour market deficits, there are deficits in the industry as a whole. From what you said, it sounds like the market got crowded in a way that can dilute the ability to pay for the talent you need, or the education you need, or the capacity of the unit. So, there are a lot of vicious cycles in here, it sounds like. Certainly, the issue isn’t entirely one of speaking up and interpersonal fear; it is also one of plain old ordinary skills.
[00:17:36] Nippin Anand: When you look at some of the recent airline accidents you see some very similar patterns. In the case of the Ethiopian Airlines crash, you have a co-pilot with just 200 hours of flying experience as against the pilot – a very large skill differential, or a steep skill gradient. In MH370, the co-pilot was on his first flight on board the Boeing 777 as a fully approved pilot. And then there is the Costa Concordia, where the most senior officer on board the ship had very little experience compared to the captain, who was almost 60 – 59 years old at the time of the accident. The point I’m trying to make is that this is a systemic problem that runs across many industries, where you have, as you rightly said, this rationalization of cost: wanting to source labour from the cheapest possible markets, with questionable training standards, and putting those people alongside people who are highly trained. That creates very interesting dynamics within the team.
[00:18:42] Amy Edmondson: Yes, indeed – I mean, ‘interesting’ is a funny word for it, but I think what you’re saying, which I would agree with from what I know, is that it’s somewhat unprecedented. This kind of gap comes after decades of relatively easy access to well-trained pilots: all of the airlines that needed well-trained pilots could get them, whether from training schools or from the military, and they would have the skill sets and some of the routines for communication that they need.
[00:19:38] Nippin Anand: What interests me is that these are two very different problems: a monstrous captain – a terror of a captain – and a systemic problem. Often what we see in the maritime world is that we try very hard to solve these problems through the tools of psychological safety, but also crew resource management: how can we create intervention tools for the junior officers to actually speak up when they see a problem? It goes back to the same thing: unless junior officers are really able to understand where and when they need to speak up, it will be very difficult for them to intervene. So, we try to solve the wrong problem with the wrong tools, and I just wondered what your thoughts are on it. Have you had any thoughts on this at all?
[00:20:32] Amy Edmondson: You mean about the tools used for speaking up?
[00:20:36] Nippin Anand: Yes. What companies are doing is giving them training along the lines of crew resource management and psychological safety, whereas the problem lies somewhere else – at the heart of it is the trust between team members.
[00:20:48] Amy Edmondson: Trusted competence.
[00:20:50] Nippin Anand: Yes, that’s the thing I was looking for.
[00:20:52] Amy Edmondson: I wouldn’t say it’s the only thing; I would say it’s potentially a two-by-two. What you actually need is both skill competence and CRM-like routines, norms and behaviours, because you need people who are competent enough to do their jobs and understand what’s going on in the flight, and who are also willing and able to speak up. Many of the historically documented accidents involve very competent people. But they just didn’t – it felt impossible somehow to speak up against their captain. I think competence is slightly and importantly different from trusted competence. The word trust covers a lot of ground, and it includes some of those interpersonal perceptions: how will you respond when I say something, what will you do? What you’re talking about is very much adding the competence piece to the mix in the diagnosis of these accidents and the prevention of them.
[00:22:27] Nippin Anand: I totally agree with you that there is a lot of place for psychological safety. The only issue I see is in the high-risk situation – in Charles Perrow’s words, a ‘very tightly coupled’ situation – where you don’t have the time and space for reflection. Obviously, it is very difficult to practice psychological safety in those particular situations. You have lots of traffic on the bridge and the captain has to make a decision; that is not the time to encourage people to discuss, make mistakes and talk. But there is a lot of place for all that in meetings, in debriefings, in pre-briefings, and in the way you normally communicate on the ship.
My point really was just to seek some simple clarification from your end – from somebody who has done so much work in this area – that there is a distinction between psychological safety and trust based on competence, as you rightly say.
[00:23:28] Amy Edmondson: Of course, the sort of fascinating problem or puzzle that I’ve studied most often is how you can in fact have gone out of your way to hire the right people, to have the right competence, to have the right training, even to be clear about the goals of the mission and so forth, and yet still be faced with case after case after case of people not speaking up in moments where they had something potentially relevant right there. It might be a safety issue, it might be a question, it might be an idea for innovation, it might be all sorts of things; the observed phenomenon is that there is an awful lot of holding back – people will err on the side of holding back. So, the puzzle was no more, no less than: OK, you’ve got the competence, you’ve got the talent, but you might not be using it right, or you might not be in a position where you’re able to put it to the highest and best possible use for customers or for the mission.
That just struck me as wasteful. You are bringing in something else that is equally important, which is that competence needs to be there, and I can imagine the lack of competence exacerbating the phenomenon that I have studied. Because if I don’t have the competence, I usually – not always, but usually – know that I don’t have it, and that makes it all the more interpersonally threatening and puts me in even more of a bind when I’m not sure; when something urgent or strange or different is happening and I just don’t know. So, am I going to speak up? Probably not. So, you’ve got double trouble here.
[00:25:28] Nippin Anand: It’s so interesting you say that, because we’re seeing a lot of technology being implemented to simplify work, and that creates a very interesting pattern where you think you can get away with replacing skilled people with semi-skilled or unskilled people who were trained to do only certain things. But on the other hand, ships are getting bigger, ships are getting more complex, and they are entering ports and spaces which carry so much more uncertainty. So, uncertainty is pushing you in one direction while cost is pushing work towards simplification, and you almost end up in a situation where you have one very highly skilled person, the captain, and the rest of them are semi-skilled. Those dynamics are what make it so interesting; they create that tension around trust in the competence of people. I’ve heard so many times people say, “I don’t trust this officer when he goes on the bridge. I’m constantly up at night just wondering how he’s going to navigate.” So we see a lot of that, and it creates a lot of tension in the workplace. What’s also interesting is that demand for experienced people, at least at the top, is a categorical and regulatory requirement, but nobody is looking at the skill differential that results.
[00:27:04] Amy Edmondson: It is surprising. I mean, it seems like it should certainly be added to the short list of things that need to be considered and regulated – in some industries more than others. Some countries more than others have real regulatory requirements on who can do jobs where people’s lives are at risk, whether in medicine or in shipping or any number of other settings.
[00:27:44] Nippin Anand: One could say that there is one, because the Ethiopian Airlines co-pilot with 200 hours of flying experience did have a certificate.
[00:27:52] Amy Edmondson: But inadequate.
[00:27:57] Nippin Anand: What has been your experience in knowledge-based work, which is where most of your work lies?
[00:28:03] Amy Edmondson: Well, it’s people, and people are everywhere, and the patterns are quite similar. The time frames are often different – sometimes they are just elongated. There aren’t these moments of physical danger or systemic breakdown where a ship is going to have a collision or a nuclear power plant is going to have a meltdown. Knowledge-based settings usually lack those kinds of physical and temporal intensity, but they show a similar pattern: people who in fact have the necessary competence but just don’t feel that the environment is conducive to expressing themselves. The pattern I see most often nowadays, when I ask people about these moments where they’ve held back – where they had something they wanted to say but didn’t – and what the reason was, is a range of answers: well, you know, the boss is just too dismissive of bad ideas; or the boss’s boss was present; or this is a very political issue, so I didn’t want to get on the wrong side of the wrong people. But probably the saddest one to me is, “Well, nothing would have happened anyway” – that’s where you’ve kind of given up. You just don’t believe that you matter enough, so why bother?
Sometimes, by the way, I think that one is fear, but with an unwillingness to admit the fear – so it’s a little more cynical and blasé. But the one I’m hearing most often is this: I just wasn’t confident that what I was going to say was important or right or would add value. And this lack of confidence, I think, is on the rise, because our knowledge-based work is less and less certain. There’s more uncertainty, more complexity, more interdependence. So, the reality is that the world in which so many people, so many knowledge workers especially, are operating is not one where you can actually expect to have confidence. If you’re overly confident, you’re not really sizing up reality quite accurately, because reality today is, in a way, forcing us all to be quite humble.
[00:30:05] So, when you tell yourself, well, because I’m not confident that I’m right, I’ll hold back – you’re actually doing your colleagues and yourself a disservice, if that’s going to be the new normal. The new normal is: you’re not going to be confident, do it anyway! That idea might be a game-changing idea for all you know, especially because very few innovations come from a single person with a single idea. It’s more like: I have an idea, then you add to it, then someone else tests it out – it’s got to be a team sport. But it will never happen if we stay quiet in the first place.
[00:31:46] Nippin Anand: Absolutely. It’s only in the last few years that I’ve started practising this myself. The moment you say you don’t know something, it feels deeply uncomfortable. But the reaction is quite the opposite of what we mostly imagine – people end up trusting you so much more as a person.
[00:32:03] Amy Edmondson: Ironic, isn’t it? Not only do I think they trust you more, they actually like you more. Often, I think, there is an intuition of respecting you more too, because you’re confident enough to say “I don’t know”. It’s quite attractive, really. I mean, if you said “I don’t know” to every single thing all day long, that might wear out its welcome, but having the honesty – which is why trust is relevant there – having that willingness and almost ease with yourself to say “I don’t know” for the things you don’t know is also potentially a stance of curiosity: I don’t know, and you’re expressing that you might be interested in learning more.
[00:33:01] Nippin Anand: Yes. We live in a very professional world, and most people will find out how much you know anyway; admitting it only authenticates their trust in you. Even going back to the shipping example: a seafarer who has just joined the ship and, when the captain asks him how he feels about keeping an independent watch, says, “Well, I think it would be nice if you stayed with me for a couple of days until I get the confidence” will be much better received than somebody saying, “Oh, I’ll be fine, don’t worry about it”, because people will find out the depth of your knowledge.
[00:33:35] Amy Edmondson: Ed Schein uses the term ‘situational humility’. If you don’t have situational humility in the kinds of situations where you might be in over your head, that’s really not very wise.
[00:33:53] Nippin Anand: Yes, absolutely, situational humility I’ll remember that one.
That’s great, Amy. My idea really was just to get together with you and see what your thoughts were on this idea I’ve been grappling with for a few years now.
[00:34:13] Amy Edmondson: I think you could map this out through the two different ideas you have. One is that competence must not be left out – clearly you need the kind of learning environment, learning culture and competence for excellence. That’s the upper right-hand quadrant in that two-by-two. The other idea is the vicious cycle of industry forces that lead to growth, which led to maybe a shortage of skills, and then maybe some of the price issues as well, creating accidents waiting to happen around this key dimension that you identify. I think there’s lots of room to work with both of those, absolutely.
[00:35:05] Nippin Anand: Great! Well, thank you very much!
[00:35:13] Amy Edmondson: You’re very welcome.
It’s been such a pleasure talking to you.
[00:35:18] Nippin Anand: What did you think?
So I’ll tell you what I think and what we should be doing. The next time you hear complete silence on the black box or data recorders, slow down and stop making too many assumptions about why people don’t speak up. Rather, step back, question the silence, and look beyond the obvious and convenient explanations.
Silence is not always the sign of an abusive captain, a bully pilot, a monster surgeon, or, on the other hand, a meek subordinate or a docile junior officer. It’s not always about power distance or power struggle. When people don’t speak up (speak up to the hierarchy), that could mean many different things. Maybe the co-pilot had no idea what to say, or the captain had lost all hope that his officer would have anything meaningful to contribute in a critical situation. The problem may be systemic: it could well be that we have lowered the standards for entry into the cockpit so much that the captain has lost trust in his team. The Ethiopian Airlines co-pilot had clocked a mere 200 hours as a pilot. How do you speak up, and what do you say, in times of crisis? Is this really about psychological safety?
But there is another, even more immediate, problem with this simplistic approach.
Imagine sending the captain on a training course on PS when you find out that no one could speak up and raise their concerns in his presence.
It is depressing for the captain, an insult to the profession, and not only a drain on resources but also counter-productive.
So, before you think you have a solution to the problem, slow down and listen to the sound of silence.
[00:37:22] Nippin Anand: Thank you for taking the time to listen to this podcast. I hope the time you spent was worthwhile. If the podcast has made you think, slow down and reflect, I have achieved my purpose. Please share it with others in your community so that the message reaches far and wide.
I spend a lot of time thinking, researching and producing meaningful content. If there is a specific topic you wish to know more about, please let me know. If I can, I will make every attempt to create something meaningful and valuable to you. If you have a topic that you would like to discuss with me, please feel free to get in touch – particularly if there is something you don’t agree with. Disagreements are always fun.
I also wish to remind you that all my podcasts, related reference material and transcripts are available on my website, www.novellus.solutions. You can also get in touch with me on the same website, or through LinkedIn, Twitter or my personal website, www.nippinanand.com.