Chris Clearfield
EP 15

The Causes of Safety System Failure and How You Can Prevent It

This week on Safety Labs by Slice: Chris Clearfield. Chris shares his systems and change management expertise to help HSE professionals understand the “rules of the game”. Reassessing safety management through a systems lens can enable you to mitigate risks, implement change and create a more positive and collaborative safety culture.

In This Episode

In this episode, Mary Conquest speaks with Chris Clearfield, a coaching consultant, speaker, and co-author of “Meltdown: Why our systems fail and what we can do about it”.

Chris studies disasters, and his research into systems failure provides many valuable lessons for EHS professionals who rely on systems every day to maintain safety.

He explains how understanding system complexity and coupling (the amount of slack) can help HSE professionals keep their safety systems out of “the danger zone” and prevent accidents.

Also, he highlights how improving systems - so they provide more feedback - is key to making better safety decisions.

Using real-life disasters such as Deepwater Horizon, Chris provides lots of practical advice on implementing HSE change and creating more curious, experiential and inclusive safety cultures.

Transcript

♪ [music] ♪ - [Mary] My name is Mary Conquest. I'm your host for "Safety Labs by Slice," the podcast where we explore the human side of safety to support safety professionals. We move past regulations and reportables to talk about the core skills of safety leadership, empathy, influence, trust, rapport, in other words, the soft skills that help you do the hard stuff.

♪ [music] ♪ Hi, there. Welcome to "Safety Labs by Slice." What do PR disasters, the Deepwater Horizon accident, and an overcooked Thanksgiving dinner all have in common?

At first blush, not much, but they're all the result of systems failures, which, at their core, are surprisingly similar. Safety professionals design, evaluate, and improve systems every day. The vast majority of the time, these systems do their job of preventing injuries or worse. But as we all know, sometimes the system fails. Our guest has read research from decades of study and analysis of system failure.

Chris Clearfield joins us today to help answer the question, what causes system failure, and how can we prevent it? Chris helps leaders guide transformational change. He works with engineers, software developers, attorneys, and C-suite executives. As a coach and consultant, he supports leaders in attending to the social and technical aspects of their work to solve their unique and complex challenges in creative and collaborative ways.

Clearfield is the co-author of "Meltdown: What Plane Crashes, Oil Spills, and Dumb Business Decisions Can Teach Us About How to Succeed at Work and at Home," which was named a best book of 2018 by the "Financial Times" and won the National Business Book Award. He works on risk, strategy, and innovation with leaders at some of the world's most interesting companies, from major oil producers and professional service firms to tech companies, like Etsy, Netflix, and Microsoft.

He's a graduate of Harvard University, where he studied physics and biochemistry. While training as a pilot, Clearfield began to research why some pilots successfully manage emergencies while others crash despite their best efforts. And when the Deepwater Horizon oil rig exploded in 2010 and the accident investigation revealed the same interaction between systems and teams, Chris found his purpose, teaching leaders how to build organizations that thrive in complex and uncertain environments.

As part of that work, Chris is the co-author of the book, "Meltdown," and he joins us today from Seattle. Welcome.

- [Chris] Thanks, Mary. I'm super excited to chat with you today.

- So, today, we're going to focus on safety-related systems, but the book isn't exclusively about safety systems. So, I just wanted to give our audience an idea of how broadly these principles can be applied. So, you name all of these four seemingly unrelated things as systems, preparing Thanksgiving dinner, the expansion of Target into the Canadian market, running a nuclear energy facility, and summiting Mount Everest.

So, before we get into system failure, in what way do you consider those examples as systems? In other words, what is your working definition of a system?

- That's a great question. I think my working definition of a system is something where you've got this combination of people, process, and technology that need to come together in a way to make things work. And in an organization, we have lots of different levels of systems that are always operating, right? A system could be a small team, a system could be the whole organization.

It could be a department. And as you kind of focus on these different…you know, could be a plant. As you focus on these different levels of system, you see these commonalities that come from, you know, the fact that, A, it's people involved in all of them, and, B, there's always this element of technology. There's always this element of tools that you need to actually get things done.

And so, you've got everything from the human level and what are their skills, what's kind of affecting them, you know, intra-personally inside them, all the way up to all these technologies that need to work together to make something successful. So, I know that's a pretty broad definition of system, but I think it just kind of goes to that so much of our world really is driven by systems rather than these kind of isolated behaviors.

That's sort of how I think about it.

- Yeah. I would agree. It's broad, but, I mean, that's the way it is. The literature around system failure identifies a few key concepts that we need to understand in order to evaluate a system. So, those are coupling, complexity, and the danger zone. You talk extensively about these. As I understand it, these features were identified by Charles Perrow when he studied the causes of the nuclear accident at Three Mile Island.

So, I'd like to break down each of these ideas because they're pretty fundamental for understanding the rest. What is coupling, and do you have examples of tightly or loosely coupled systems?

- Yeah. You know, coupling is…it's sort of how much slack there is in the system or how much of a buffer there is in the system. So, a tightly coupled system is a system where one process kind of leads directly into the next and you don't have time, or space, or capacity to pause and intervene, as an example.

So, one way I like to think about this is, you know, a tightly coupled system is like taking a flight where you have to change planes and you've only got, you know, 45 minutes between your connection. So, if there's kind of any disruption on your way into, you know, that airport where you're changing planes, then you're going to miss your connecting flight.

That's a system that's tightly coupled. That's sort of a back-of-the-envelope way that I think about it.

- Yeah. Kind of a margin for error, like.

- Yes.

- I didn't expect to trip and my luggage to go open and… - Yes.

- Yeah. You know? And also complexity. So, complexity and simplicity, people talk about all the time, but in terms of systems, what is complexity?

- So, a complex system, there are a couple of different ways that I think are useful to think about it. One is a complex system is one where there's sort of a lot of interconnections in the system, and it's those interconnections that matter as much, if not more than the individual parts of the system themselves.

So, in a complex system, you know, you need a lot of things to go right for the overall process to go right, to get the outcome you expect. And the more complex a system is, you know, the more likely a kind of small misunderstanding or a problem will affect other parts of the system.

So, it's this kind of measure of, yeah, the kind of interconnectedness of the system. But you can also think about it as sort of how easy it is to understand your system. You mentioned Three Mile Island, and I'm sure we'll talk about different aspects of that, but one of the fascinating things about that accident was just how many interactions there were that were unexpected inside the plant.

And so, you know, in a nuclear power plant, for example, you can't send somebody into the core to see what's going on. You have to rely on these indirect measurements. And that's one of the things that showed up in Deepwater Horizon, for example, too. You know, because it was such a deep well in the Gulf of Mexico, you couldn't send somebody down to just sort of see what was going on.

So, you had to rely on these indirect indicators. So, you know, back to our kind of, you know, family vacation analogy, what would make the trip complex is if you had, you know, three or four layovers that you had to make. And so, a complex and tightly coupled trip, which is the kind of combination of these two things, would be one where there are lots of opportunities for disruption and you have very small margin for error.

And so, when you have both of those things, when you're in a system that's both complex and tightly coupled, that's when you're in what we call the danger zone, which is this place where small disruptions can kind of spiral and create these big effects in the outcome of your system and big effects that are not expected.
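
To make that concrete, here is a minimal sketch in Python (an illustration for this page, not a tool from the book) of placing a system on Perrow's two axes. The scoring scale, threshold, and example systems are made-up assumptions; the point is simply that the complex, tightly coupled quadrant is the one Chris calls the danger zone.

```python
# A rough sketch of Perrow's complexity/coupling matrix.
# Scores are subjective judgments on a 0-10 scale, e.g. agreed on by a team.

def classify(complexity, coupling, threshold=6):
    """Place a system in one of the four quadrants of the matrix.

    complexity: how interconnected and hard to observe the system is (0-10).
    coupling: how little slack or buffer there is between steps (0-10).
    """
    is_complex = complexity >= threshold
    is_tight = coupling >= threshold
    if is_complex and is_tight:
        return "danger zone: small disruptions can cascade in unexpected ways"
    if is_complex:
        return "complex but loosely coupled: surprises happen, but there is time to intervene"
    if is_tight:
        return "linear but tightly coupled: failures propagate fast, but causes are easier to trace"
    return "linear and loosely coupled: easiest to understand and to recover"


# Hypothetical scores echoing the examples in the conversation.
systems = {
    "assembly line": (3, 7),
    "45-minute flight connection": (4, 9),
    "deepwater drilling operation": (9, 8),
}
for name, (cx, cp) in systems.items():
    print(f"{name}: {classify(cx, cp)}")
```

Even a rough exercise like this can be useful: whatever lands in the danger zone quadrant is where attention, transparency, and extra buffer pay off most.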

- Yeah. I think…you talk about it, too, in the book in terms of linearity. So, like an assembly line, right? It may be super long, there might be a lot of parts, but it's very clear, if something goes wrong, cause and effect is very clear as opposed to this giant web of factors, right? And you were talking to me about the danger zone.

And all Kenny Loggins jokes aside, how does that relate to wicked environments? Because you talk about that, too, in the book.

- Yeah. So, a wicked environment is an environment where it's hard for us as humans to develop intuition around. It's environments where we might make decisions and we don't get a lot of feedback on our decision. So, we kind of are relying on intuition and we tend to be a little overconfident. And you can kind of contrast that with what we call in the book a kind environment, which is one where, you know, you learn a lot by doing, you get a lot of feedback.

And so, safety is kind of an interesting example of this, right? Because there's lots of times in a system, when you're thinking about it from a safety perspective, where you may not get a lot of data on how well your system is operating. You know, I mean, even if you're measuring incidents, even if you're kind of measuring things like that, the connection between that and the potential for catastrophic failure in your system is pretty tenuous.

And so, one of the things that we thought a lot about in the book is, you know, how can people learn to operate in these complex systems better? And a complex system is often a wicked environment, right? Things can look like they're going right for a long time until you have just the right alignment of factors that create these kind of massive disruptions or cause these massive disruptions.

And so, yeah, there's really this relatedness when you think about how easy it is to learn from your system. And, in fact, I think that's one of the things that people can take a real active role in a complex system, is really building processes to really learn, learn about how the system is actually operating rather than how you think it's operating.

And that feedback, in turn, can help give you clues about, you know, what might go wrong and how you learn from it and things like that. And that's true, whether it's software or deepwater drilling, right? I mean, I think it's the same…it's kind of the same set of principles, regardless of really what industry you're in.

- Yeah. In safety, a lot of guests have talked about, you know, work as imagined as opposed to work as done, right?

- Yes.

- I'm sure you've heard that before. So, okay. So, now that we understand sort of what to look out for, what is kind of dangerous, and maybe what not to do when you are designing a system, obviously, many of us, like, safety managers sometimes get to design the system from scratch, but many of us inherit or are sort of working within systems.

And I'd like to spend the rest of our time talking about general solutions and specific tools that we can use to improve systems. So, to start with, it's clear that you can't always influence both the coupling and the complexity of a system.

You gave the example of summiting Mount Everest, right?

- Right.

- In terms of, like, you can't…it's going to be complex. You can't control the weather.

- Exactly. You can't control the weather. You can't control what the other people on the mountain are doing. You can't control, you know, the particular conditions on the mountain at any given moment, but you can think about how to manage those. And you can be very intentional about how to manage those in a different way. And, you know, some of the learnings after the Everest disaster in '96, I think it was, were about how do you build more of a buffer in the system?

How do you reduce the tight coupling? So, you give people, you know, more time to adjust. You give the guides more time to adjust on the mountains so they're fit for altitude. You take logistics off their plates. So, somebody's in charge of, you know, making sure the right food is at camp and things have cleared customs, so these guides can focus their attention on the things that matter and be able to make, you know, the best decisions from a risk perspective in that moment.

- Yeah. Okay. So, you'd say maybe that's a good general approach to a scenario, is if you can't control one of them, the coupling or the complexity, see what you can do, see where you can find some space to change things.

- Exactly. Yeah, exactly. And I think, you know, one of the things, too, is just to recognize, to have the awareness, that your system is both complex and tightly coupled. If you have that awareness, it really can give you the opportunity to start to almost treat the system in a different way, to treat it with a different kind of respect.

- Yeah. For sure. I think it gives you a new lens. Like, certainly, since I've been reading the book, I'm, you know, reevaluating things like, "Oh, okay. How does coupling fit into this?" And it is a really helpful way to look at things.

- Yeah. Tell me, how has the concept shown up for you? What's… Yeah.

- Oh, well, workflows at work, honestly, like setting up workflow systems, right? Okay. First, this has to happen, and then that has to happen, but not until this has been completed. Can we do that? And then this person needs to...etc. Right?

- Right.

- In my case, it's communications, not safety, but it's the same idea, the same thing for safety, right?

- Right.

- And so, I'm thinking like, "If I built in a little bit more slack here, then there wouldn't be, like, a cascade failure in…" - Right.

- Right?

- Yeah.

- As they call it with computers. Yeah.

- No, exactly. That's exactly it. And, you know… Right. So, you can build in more slack. And, you know, sometimes you can't make the system less complex, right? You often need a bunch of people to touch a process. You know, the process is often dictated by either the kind of demands of the organization or by the physics of the situation, right?

I mean, I work with a lot of engineers, a lot of safety professionals, and sometimes the process is just the process. But what you can often do is try to…if you can't simplify it, try to inject some transparency into it so you can at least, you know, very clearly see, like, "We're at a part of the process where, you know, somebody hasn't responded to the task that's been assigned to them or the comment that's been given to them."

And, like, if you can see that, then that gives you an opportunity to just be curious about what's happening and see if you can go and intervene. And, you know, maybe they haven't seen it, or maybe they need some help because they're swamped with something else. And if you can inject a little bit of transparency, then you start to get a better feel, overall, for the system for what might be going wrong or what might be helpful to kind of modify or adjust in real-time.

- Yeah. And I like what you say about getting curious, too, because that speaks to kind of setting down assumptions. Typically, I think humans will see, "This went wrong. Well, that's because..." blah, blah, blah. We just jump to we know that this is the reason, right? And I think getting curious is a fantastic way to just gently remind ourselves that any number of things could be going on here.

- Yes, totally. I have a friend who…he's written a really interesting book that's called "Agile Conversations," which is all about kind of transforming the way you approach conversations. And they have a technique in there that they call coherence busting, which is, like, when you find yourself really making assumptions or, like, really having a strong story about what is going on, you know, "That person's not showing up at work because they're unreliable," or whatever it is, you try to come up with an explanation that fits the facts but is wild, like, "That person's not showing up to work because they were abducted by aliens."

And it's not that you actually need to believe that they were abducted by aliens, but just kind of…it helps to kind of round out your view of things and helps to expand your mental model so you're not so focused on, you know, something that's very plausible but may or may not be true at the expense of other things that are also plausible. And one may be more true than the explanation you're coming up with.

So, you can start with kind of being a little bit silly and playful about it.

- Yeah. I mean, that's…it's like a fun way to kind of push yourself out there a little bit, which I think we all need. We all fall into our ruts of assumptions, and routines, and habits. So, what I'd like to do now is actually go through some of the tools and the techniques that you mentioned in the book and sort of ask you, you know, what are they, how can they be used, how do they fit in, and maybe, particularly from a safety perspective.

- Great. That sounds great.

- So, I'm quizzing you on your book. No, I'm kidding. Perrow's matrix to identify the most vulnerable parts of a system or a project, can you sort of…it's a visual thing, but can you explain?

- Yeah. Well, I mean, in some sense, it's sort of what we've been talking about. It's, you know, looking for parts of your system that are complex and tightly coupled and looking for parts of your system that have a lot of interactions, maybe there's some ambiguity and a lack of clarity, and you know that you are not going to have the time to recover from something that goes wrong. So, you know, one thing about every leader I work with, regardless of what they do, or what kind of organization they're in, or what their profession is, whether they're engineers, or attorneys, or safety professionals, everybody has a scarce amount of attention.

Everybody has, you know, a million things coming at them and, you know, not the bandwidth to process it. And so, really recognizing that if part of your work is in this part of the danger zone where it's complex and tightly coupled, then that deserves some attention and that deserves some willingness, again, to be curious about it, to really try to build some processes to learn about the system and to learn, particularly when things don't go as expected, even if you don't get a negative outcome, to focus on, "Well, okay."

You know, "This was a near miss, what could have happened here and what led up to this." So, it's just kind of, I think, for me, a cue to focus on what matters and focus on the parts of your systems that not only are more likely to kind of spiral out of control and have this cascading failure that we've been talking about, but are also, in some sense, more likely to be, you know, not as well designed as they could be, more likely to be, you know, inefficient.

Not inefficient in the sense that you want to drive the tolerances to zero and up the coupling, but inefficient in the sense of, you know, "Is this really how we need to design this system? Is this really how it needs to be working?" So, in some sense, it's this cue for attention, and then you can decide, "Well, can we make it simpler? Can we make it more transparent? Can we make it less tightly coupled? Can we inject some buffer into this process so we're less likely to get these bad outcomes?"

- Yeah. And it actually reminded me of the Eisenhower Matrix, which I'm sure you've heard of. But it's essentially, right, two axes, one is urgency and one is importance. And, again, it's a way to prioritize. But this is a little more closely aligned with safety, right?

- Yes.

- SPIES, that's an acronym.

- Yes.

- What is that? Yeah.

- The subjective probability interval estimate. Which is a mouthful.

- Systems, right?

- Yeah. I don't remember what the S is. You said you weren't going to quiz me, Mary. You know, SPIES is this idea that we humans are…we're just overconfident and kind of continually overconfident. Even when we try to take into account the fact that we're overconfident, we remain overconfident. And so, you know, whenever you have a…we tend to estimate things too narrowly, right?

So, how long is a project going to take? "Well, this is going to take between, you know, four and six weeks." Well, what's your kind of 95% confidence interval for how long it's going to take? You know, like, really, 95% of the time, it'll land between 4 and 6 weeks, say. What you find when you ask people this and then you measure the actual outcomes is that the 95% confidence interval is actually between, like, 4 and 12 weeks or whatever it is.

Like, it's sort of...we need to have much healthier respect for the actual uncertainty in our systems. And so, SPIES is a method of estimation that instead of kind of thinking about it from the perspective of going, like, "What's the interval here?" you kind of look at all of the intervals, and you can use a very simple tool, and we have a link to it in the book, where you can actually kind of…you sort of basically vote for different time intervals.

And when you've voted for these different time intervals that something is going to take or could be, you know, the height of a protection that you need or something like that, when you do that, you actually get a much, much better estimate of the likely range of outcomes that you're working with.

So, it's sort of a kind of way to tap in to sort of sidestep a little bit our overconfidence but still tap into our intuition and really come up with these better estimates for, you know, how long something's going to take, how much it's going to cost, you know, the height of a barrier we need for a dynamical process. We write about it in the context of the seawall around the Fukushima Nuclear Plant being too low, and really that their estimates were kind of narrowly focused around a distribution of recent experience rather than around the set of possibilities that are out there.

And so, it's kind of a way of unanchoring us from our biases and our overconfidence.
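
For anyone who wants to experiment with the idea, here is a minimal sketch in Python of the general SPIES approach described above (an approximation for illustration, not the exact tool linked in the book): you spread subjective probability "votes" across bins that cover the whole plausible range, then read an interval back off the cumulative distribution. The bins, votes, and interpolation choice below are all assumptions.

```python
# SPIES-style estimation sketch: derive an interval from per-bin probabilities
# instead of asking someone to state the interval directly.

def spies_interval(bins, votes, confidence=0.90):
    """bins: list of (low, high) ranges covering the whole plausible span.
    votes: relative weight assigned to each bin (any positive numbers).
    Returns an approximate central interval at the given confidence level."""
    total = sum(votes)
    probs = [v / total for v in votes]      # normalize votes to probabilities
    tail = (1.0 - confidence) / 2.0         # probability trimmed from each end

    def percentile(p):
        # Walk the bins and interpolate linearly inside the bin where the
        # cumulative probability crosses p.
        cum = 0.0
        for (low, high), prob in zip(bins, probs):
            if prob > 0 and cum + prob >= p:
                frac = (p - cum) / prob
                return low + frac * (high - low)
            cum += prob
        return bins[-1][1]

    return percentile(tail), percentile(1.0 - tail)


# Hypothetical project-duration estimate, in weeks. The bins deliberately
# span a much wider range than the gut-feel "4 to 6 weeks".
duration_bins = [(2, 4), (4, 6), (6, 8), (8, 10), (10, 14), (14, 20)]
duration_votes = [5, 30, 30, 20, 10, 5]

low, high = spies_interval(duration_bins, duration_votes)
print(f"90% interval: roughly {low:.0f} to {high:.0f} weeks")
```

With these made-up votes, the 90% range comes out to roughly 4 to 14 weeks, noticeably wider than a gut-feel "4 to 6 weeks" estimate, which is exactly the widening Chris describes.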

- It sounds to me like a good tool when you're trying to introduce slack or trying to make sure that there's maybe enough, right, in terms of time or, in this case, like, wall height. Right?

- Yes.

- Yeah. So, another one I really like is the premortem. Yeah. Tell us about that, and then I'll tell you why I like it so much.

- Great. Well, you know, I mean, a lot of us…well, you know, some organizations do postmortems quite well, where they really focus on learning. You know, others, it's kind of a ritual or a way to really attribute blame. And that's not so positive. But even for organizations that do a postmortem quite well, you know, the chief limitation of the postmortem is that the failure has already happened, right?

So, you're looking retrospectively, but the idea of the premortem is we can tap into what researchers call…they have this great wonky term for it called prospective hindsight. So, we imagine that we are at the end of the event or the project and we put ourselves in that space and we say, "Now, let's imagine that things have gone terribly wrong, that it's been a disaster.

"What are the conditions that led to this project being a disaster, or this outcome, this, you know, negative outcome that we didn't want?" And what that does is it really frees people up to share in different ways. You know, it kind of changes the bias from, "We want to be cheerleaders of this project and say how successful it's going to be," to now the contribution is, you know, "How can we contribute ways that the project might fail, not because we're fatalists, but just so we can learn from it and so we can think about it?"

You know, I mean, this is something that can be common in a safety profession when...you could add this in easily, for example, if somebody...you know, when you're doing a job brief at the beginning of the day, right. You can ask, "All right. Let's imagine that things go really wrong here. You know, what are the things that led to this negative outcome, sort of given that they've gone wrong?" And what that can do is that can also create a kind of shared understanding of what the key hazards are that a work team might be facing.

And that shared understanding, we know from the research, can actually really lead to better, safer outcomes. So, it's a really powerful technique. And you can use it in, you know, non-safety contexts, too. I had a client that was doing a pretty intensive technological migration from…they hosted all their software on campus, on-premises, and they were moving to a cloud provider.

And it was, you know, a big complex kind of multifaceted project. And before they made the switch over, that's what they considered, you know, "Let's imagine that this goes horribly wrong, what has been the cause of that problem?"

- I think it also...you know, you have a...not a love-hate, but positive, negative view of intuition that it can be positive, but it can also get in our way. But I think this is a positive example of that because at the beginning of a project, if you say to everyone, "Okay, this has been a disaster, let's imagine. What went wrong?"

Everyone has a pretty good idea pretty quickly. Like, it comes into your head like that. You know, right away, "Oh, well, either it was late or," whatever, "it exploded," whatever the situation is. And then you work backwards. And the other reason I like it is because it... You were talking about blame.

I think it encourages people to speak up and to take control a little bit. I think that if you feel like rather than, "I have to take responsibility for the thing that went wrong," you're saying, "I can take control of the conditions so that it'll go right," in advance. And I think you're much more likely to speak up, which leads me to something else that you talked about a lot, which is encouraging people to speak up, including skeptics.

- Yes. Especially skeptics. Especially skeptics. I mean, there's just a huge and rich body of research around this, you know, Amy Edmondson, psychological safety, pioneering work, another colleague of hers, Jim Detert, who also…just really incredible stuff on what it means for people to voice things and what has to be kind of present in the environment, present in the culture for people to be able to share things, particularly things that are, you know, taboo or, right, speak against the success of something.

But, you know, I will say, we have this framing in the book a little bit, but the more I've done work since "Meltdown" has come out, which was, you know, four years ago now, almost exactly, the more I've seen for myself and as I work with leadership teams, a lot of what I'm doing is coaching the leaders to listen rather than coaching people to speak up. Because, you know, there is an element of power in this, and you need to create a listen-up culture rather than a speak-up culture.

And a listen-up culture is when a manager, a leader, a leadership team really has the ability to hear critical input about what they're doing and what they're thinking and respond to that instead of shutting it down. And I think that's a really important aspect of this that I think if I were writing the book again today, I think we would emphasize that more.

- Okay. Well, that's good to hear. And I think that this is related to…you talk about diversity, building diverse teams. And what was interesting to me was that you said that typically, efforts to create diversity have actually failed and resulted in less diversity. So, first of all, talk about, I guess, why is it important?

Where is the proof, and how can we do things right in that sense?

- Well, it's incredibly important. And the proof is all around us from teams that really make bad decisions and often the uniformity of both from a kind of surface-level diversity perspective, you know, ethnic cultural backgrounds, but also professional diversity.

You know, I work with a lot of teams that are exclusively engineers, or exclusively attorneys, or exclusively software people. And there is a way of thinking that comes from training, comes from, you know, being part of the vocation, comes from selecting into the profession, that is really self-reinforcing. And I think to really make big shifts, to really make big, important shifts, one of the things that you need is the ability to inject.

It's not about creativity, but really the ability to inject kind of fundamentally new ways of thinking. You know, a team will be approaching a question as, like, "Should we do A or B?" And it's very easy to get locked into a discussion of, "Well, here are the benefits of A, and here's why I don't like B. Here's why I like B. Here's why I don't like A."

But if you have the right mix of people in the room, and you have the right creativity, and you have the right willingness to struggle, then you can get to a C, right? You can get to something that is a both/and, that brings in both sides of that. And what the research shows very clearly about diverse teams is that they uncover more information about the problem, they consider more possibilities, they consider those possibilities in different and creative ways.

And part of why they do that is because they are less comfortable with each other. There is more friction. You know, the lesson is that, as leaders, as leadership teams, we both need to incorporate diverse talent from lots of different dimensions in our teams and our work, but also, we need to really raise our tolerance for dissent and discomfort and really encourage that and build that into our process rather than trying to move past that to get to a kind of clean, clear outcome, which, I think, is an easy thing to try to do, particularly as you and I were talking about a moment ago, when everybody is, you know, being pulled in a million different directions at once and they have really scarce bandwidth and all of this stuff.

So, yeah. It's so many different levels. Not to mention the level of, you know, morally, it's the right thing to do, right?

- Yeah. Of course.

- It's morally the right thing to be inclusive. But it's got this just this incredibly positive business case behind it.

- Yeah, exactly. It's not just the right thing to do. If that's not enough for you, it's actually also the successful thing to do.

- Right. Exactly. Exactly.

- I like that you mention diversity in terms of career as well because I think that expertise can be a bit of a damper on curiosity, like, or on bringing things up, right? Like someone can say, "I have a question, but, you know, I'm a senior manager of..." yada, yada, "I'm supposed to know this already."

Whereas if you don't have the expertise, you can be the curious outsider. You can make explicit what people are kind of all secretly assuming, that everyone has a shared understanding, and maybe they don't.

- Yes, exactly. And I think you get the kind of...you know, nobody wants to be the one to look foolish, right? And I think there's really… Gosh, I was doing work a couple of years ago with a client team, big oil and gas company. They were working on safety and reliability. And I realized, after working with them for a couple of weeks, I realized that I was feeling this pressure to really show up and have an answer for them.

And I was like, "That's very interesting because I don't know the answer. You know, I don't know how to do this incredibly complex piece of work that they are thinking about doing." But noticing that I had that feeling, I went into our next work session together, and I said, "Hey, I'm feeling this. And I just want to say, like, I don't know the answer. And even if I did know the answer, you guys wouldn't believe me because, you know, you all have been doing this for decades. And here I am. And, you know, I don't know. I'm not an expert in this industry. I bring these skills in this complexity, but it also makes me wonder, do you all feel a pressure to know the answer?"

Because I think a lot of us do, particularly experts. A lot of us experts, we really feel the pressure to know the answer. And, you know, the more complex the problem you're working on, the less there is an answer to even know, right? The more you have to rely on a process that's oriented around experimentation and, you know, building coalitions of people who are willing to work with you on something.

- And formulating questions. Right?

- And formulating better questions. Exactly.

- Yeah. As you were saying that, I was thinking, "This is why consultants exist." Right? If experts could solve, like in an organization, a business could solve something on their own, they wouldn't need a curious outsider to come in and ask the "dumb questions" and sort of reframe and get people to make explicit all their sort of assumptions, like, "Well, now that it's all on paper," that, you know, "solutions will present themselves."

- And, you know, I'll just share a little bit about my own journey in consulting because at the beginning… Because I think a lot of consultants really are expert consultants. You know, they come in and they know a process. "Here's how you can optimize the speed of, you know, your chocolate-making process." Right? "Here's how you can do this." I mean, there is space for expertise, but what I've realized as I have developed my work and deepened my practice is that I'm showing up very much in the latter, in the category you talked about, like, I am a curious outsider.

It's very clear. I'm never an expert in what my clients are experts in. But what I do bring is the perspective about complexity and the ability to ask questions and a lot of tools to help the leaders I work with shift from their own expert stance and themselves become more curious, become more open, become more willing to be vulnerable and say…you know, saying you don't know the answer is really scary in a lot of organizations.

And for a lot of us, as leaders who have been kind of, you know, promoted over time for knowing the answer, for solving the problem, and then, at some point, you get to a problem that is not…it's not solvable in the same way.

- Well, I think it's the combination, like, yeah, you aren't the expert, however, because you're helping them reframe and then they do have the expertise probably to solve the problem once they kind of look at it differently. So, I wanted to move into warning signs and learning from small failures. So, you talked about the Aviation Safety Reporting System, and it's actually related, like, why does this work?

First of all, what is it, I suppose?

- Right, right. Yeah. So, the Aviation Safety Reporting System, which I can't help but call ASRS, is this effort run by NASA to kind of gather and look for trends in what is going on in commercial aviation and actually general aviation as well, mostly in a U.S. context.

And the observation is really that there's a lot of things that go wrong in a system before you have catastrophic failure. And so, you know, most…I mean, you know, the Swiss cheese model is, of course, overly linear, but we do know that there has to be…there's almost always more than one cause for a disaster or a catastrophic outcome.

And the idea with a system like ASRS is to…and even more than that, the way that the airlines run their safety management systems and the way that they do safety, their kind of safety auditing, the idea is that safety is really this collaborative endeavor, and you have the people who are on the front lines of the operations and you need to hear their voice, you need to hear what's going on, and you need to take that information in and make changes, you know, fix problems.

If there's a confusing note on a chart, you need to fix that. If there are trends of people, you know, having deviations in certain ways, you need to understand that and make the community aware of that. So, it's everything from the kind of technical to the interpersonal. And it is one of the things, I think, not just the ASRS, but the shift from, "We're going to blame the operator," to, "We're going to look at the system," I mean, that is part of why aviation is so safe today.

And I would say that safety was not an inevitability. I mean, that safety was hard-fought by people who really had the courage to start to think of this in a different way.

- Yeah. It's consistently brought up. I mean, maybe because it's so dangerous, potentially, right? It's consistently brought up as an example of an extremely safe industry because they've taken it very seriously. They've learned from failures. The reporting system is an excellent way of sharing data. If one airline is finding something, well, it's really nice if the other airline also discovers that that may be an issue before something happens, right?

- Right. And, you know, like, a lot of this learning is rooted in tragedy. I mean, there's a very specific accident where a plane crashed into a mountain just northwest of Washington, D.C. There was some confusion on the approach plate and another aircrew from a different airline had had the same problem happen just a couple of weeks before, and they had their own internal memo about it, but it never reached the flight crew at a different airline that ultimately lost their lives in the crash.

- Yeah. I think safety managers are acutely aware of the potential. You know, they get into the field because they care about people and, you know, they're trying to prevent this kind of stuff. We don't have too much more time, but I wanted to ask…so, a lot of this advice is information that we may have heard before in various forms, like, you know, listen, encourage people to speak up, and various things, but why is it so hard for us to make these changes?

Why do humans avoid taking the good advice that we maybe know already?

- You know, it's a great question. And it's actually…really, since the book has come out, which, as I said, was four years ago, it's the direction that my work has really gone in. I mean, what I do now is I work with leadership teams who are really trying to create fundamental change in the way that they work.

And part of what I learned is that, you know, something like openly talk about your mistakes, right? I mean, that's something you and I have been talking about, openly talk about the things that go wrong so you can learn from your system. You know, for some organizations, that represents a huge cultural change, you know, to shift from a culture of blame to a culture of learning.

And so, then the question becomes, "Well, how do you do that?" And I think, "It's really hard," is the first thing to say. And I think leaders who are leading change are often in a position that feels very lonely. You know, they see a new way to do things, they see a way of changing the way that they work, but they need help to shift there and they need some tools, and some scaffolding, and they often need some support because of that loneliness and because a lot of what they are doing is, again, dropping their tools of knowing the answer and moving towards this experiential, experimental, exploratory mindset.

And so, you know, change happens, I think, in short, when people see what's in it for them, when people understand how the change is going to affect them. And it doesn't mean every change has to be positive for everyone. But what it does mean is that you really need to be sensitive to the fact that when you're making people change, you're pushing them to a threat state.

You're pushing them towards the unknown. And so, they need to know, "What's wrong? What's the cost of the current system as it is now?" Because, for most people, what's working now kind of works, right? I mean, otherwise, it wouldn't be what was working, right? I mean, all of our…what we do now is adaptive, in some context. And, often, the system changes and it gets more complex, people need to think about more different things, but one of the things that…so, one of the things I often will coach leaders and leadership teams on is you really have to start by…it may seem obvious, but you have to start by talking with people about the problem.

You have to kind of talk with them about the problem, share with them what you're seeing, and then get curious about what they're seeing. So, you know, "Here's what we're seeing, here's how this, you know, rule," etc., whatever it is, this way of working, "here's how it's affecting the company. How's it affecting you? You know, how does this impact you?" And once you start, once you can kind of…I sort of use the metaphor of, like, buckets. Once you can kind of fill up somebody's bucket so that they really understand the problem and they also feel like they had a voice in helping to define and identify it, then you can start to think about how do you get started with a change?

You know, what direction do you want to go? How are you going to actually implement that? And it's the same thing there, it's you have to suggest to people, "Here's what I'm seeing. Here's what I'm thinking." Or, "Here's what we as a leadership team are thinking. What do you all think? What are we missing? How would this affect you?" And it's really this dance. And it's a very different way of engaging with people. It involves a lot more curiosity.

And it's a real contrast to, I think, the kind of…what leadership teams often do, which is they'll go away on their own, they'll discuss things, they'll think about it a lot, and then they'll kind of come out with a memo or, you know, a procedure and, you know, back to work as imagined versus work as done, I mean, they're often targeting work as imagined and they don't…because they haven't gone out there and done the work to work with the people who are on the front lines and understanding the context, there can be a lot that gets missed.

So, yeah. I don't know. Is that, you know…I could talk for hours about this.

- Yeah, absolutely. I'm going to move into some questions that I ask every guest that are just not necessarily related to the topic, but just interesting. So, this first one I'll call the University of Chris. If you were to develop a safety management training curriculum, where would you start when it comes to non-technical training?

So, we're talking about core skills, human skills, what are the most important ones to teach tomorrow's safety professionals, in your opinion?

- I think it's coaching. I think it's how to be more coach-like. I know, for me, my own practice and view of the world has changed as I have deepened my ability to get curious and ask people questions and to not so much show up with an opinion, but to clear the way for them to do their best work.

And so, I think that's it in a word. And there's actually some really interesting data on this, too. Linde, the chemical manufacturer, did a really interesting program where they taught their safety managers to be more coach-like, and they saw some really...you know, not just improvements in safety, but improvements in their kind of whole process.

- So, that actually reminds me of what I had been thinking of before, which is one thing that we hear from safety managers a lot is one thing that's very hard for them is to get buy-in. And so, I think that when you were talking about presenting problems and getting curious as opposed to imposing procedures from on high, you know, going away and saying, "This is the solution," I think everything that you described there and here is excellent in terms of getting buy-in, right?

Just getting people invested in making change.

- Totally. And a phrase I use in my work is anytime you're looking for buy-in, you've already lost, right? Buy-in is when you give somebody an idea and kind of force them to match it. And I think that the way forward in change is almost always co-creation. You have to have the people at lots of different levels being part of building the solution.

It's a nuance, but I think there's a real difference there.

- Yeah, no. I see that, and that kind of goes with your coaching approach that you're talking about, too.

- Yes.

- So, a couple more. If you could travel back in time and speak to yourself at the beginning of your career and you could only give young Chris one piece of advice, what would it be?

- Gosh, that's a good question. I don't know if I could have changed any part of the path that I'm on without...like, even when it's been a struggle, even when it's been hard, I mean, each of these…every moment has been a moment that's been designed for my own growth, and education, and experience.

And so, I don't know if there are shortcuts for young Chris. I mean, I think I would've...you know, if I could go back, say, start therapy younger, right? Start therapy earlier, like, get on this path earlier. But yeah, I think that's…you know, it's a really interesting question, and I think…I don't know if I have an answer because I think that even when I've struggled, those struggles are part of what has given me the ability to do what I can do today and work with people who

[crosstalk] do today.

- I had another guest who sort of gave the same answer, is that, "If I'd gone back, I wouldn't have listened to myself. I couldn't have given myself that shortcut because I had not experienced the struggles or the situations that brought me there." Right?

- Right.

- You've given us a ton of tools and ideas here, but I do always ask at the end, are there any resources that you highly recommend for safety professionals? So, books, or websites, organizations, even just concepts. It's tough because that's basically all we've talked about.

- Right. But, no, it's…I mean, there's a ton of stuff. I'm just trying to think of what…well, here's a book that one of my clients who's a safety professional in the film industry, she recommended to me as part of our work together, which is, "The Infinite Game" by Simon Sinek.

And it's a very easy read, and it's an interesting book, but what it talks about is basically that we need to take care of people and that the...you know, we can't have…business doesn't exist without the people that do it. And to get people at their creative best, you need to engage them and you need to care about them.

And I think that there's a whole lot of books like that. You know, the great example of that is the transformation at Alcoa that Paul O'Neill brought about, using safety as the lens. And, of course, that's written about brilliantly in "The Power of Habit." I think the other thing I'll throw out there…I just recommended this book to a friend of mine. It's a book called "The Art of Possibility."

It's written by an orchestra conductor and a therapist. And it's just a really interesting book if you're interested in kind of growing your own personal mindset. And, for me, it's made a big difference in my work, and my life, and my relationship with my partner, and just kind of how I see and navigate the world, because I think being able to manage ourselves and manage our own reactions, and uncertainty, and anxiety, I think that's a huge part of being a capable leader these days, again, because the problems are so challenging, because the problems are so hard.

And a book like that, I think, is a really nice first step.

- I love books that, as I said, make me look at things differently, and I will put yours in that pile, right, where it's like, "Oh, let's use a new lens." It's fantastic.

- Thank you.

- Where can our listeners find you on the web?

- Well, you can go to chrisclearfield.com. That's probably kind of an easy place. And I think, you know, the… I'm active on LinkedIn. So, Chris Clearfield on LinkedIn as well. I write a lot on LinkedIn. If you go to my website, though, you can sign up for my mailing list, which is all about how to create transformational change in your organizations.

And I think that's really kind of a nice touchpoint. So, sometimes I'll...you know, I send out stuff every week or so. It's not spammy. I mean, it's always…I spend a lot of time thinking about, you know, what it is that the leaders I work with, the clients I work with, what it is they're asking about and what they need, and then I kind of put that out in the world.

We had a discussion today on LinkedIn about something I posted: what it means to break the rules, when you can learn from breaking rules, and what does that look like? But yeah, so, chrisclearfield.com is probably the best place to go. Get on the mailing list and get content like this. And then sometimes I send special offers. I have a class on leading change that I run.

Sometimes I send out discounts for that, things like that. So, that would be probably the best way.

- Awesome. That's great. Well, that is all the time we have for today, unfortunately. Thank you for joining us, Chris, and thanks to our listeners for tuning in.

- Mary, this was great. Thank you very much for the interview and, yeah, for the really skilled way you guided us through all these different concepts. I appreciate it. Thank you. ♪ [music] ♪ - Safety Labs is created by Slice, the only safety knife on the market with a finger-friendly blade.

Find us at sliceproducts.com. Until next time, stay safe. ♪ [music] ♪

Chris Clearfield

Co-author of MELTDOWN: Why Our Systems Fail and What We Can Do About It and co-founder of Clearfield Group

To find out more about Chris’ work, visit: https://www.chrisclearfield.com/

Chris’ book, “Meltdown: Why our systems fail and what we can do about it” - Financial Times' best business book of the year, 2018