Bob Edwards
EP
53

Managing Complexity in Workplace Safety

In This Episode

In this episode, Mary Conquest speaks with Bob Edwards, a Human and Organizational Performance (HOP) coach, who has a Bachelor's Degree in Mechanical Engineering and an MSc in Advanced Safety Engineering Management. Bob is also the co-author of ‘Bob's Guide to Operational Learning’ and a member of The HOP Hub, a consortium that guides people on their HOP journey.

HOP considers how humans and organizations interact to accomplish work. It's a thought process that allows us to build more error-tolerant systems and teaches us that expecting perfection from workers, processes or procedures is not realistic.

In this thought-provoking interview, Bob helps safety professionals explore the complex nature of work and learn how to gain a deeper understanding of their operations.

He explains why modern work is so complex with a multitude of constantly adapting interconnecting interactions and reminds us that humans aren’t just reliable machines.

Bob suggests leadership doesn’t usually appreciate this complexity as they focus on high-level metrics and urges HSE professionals to be more open about the actual context underlying these numbers.

He reveals the limitations of simplification, the 5 Whys, making predictions and trying to fix everything. And highlights the virtues of listening, curiosity, respectful relationships, psychological safety and learning from success (as well as failure).

HOP is often accused of lacking accountability, and Bob addresses this challenge and explains how HOP will actually increase co-worker responsibility and commitment.

Using many real-life examples of the benefits of operational learning, Bob teaches us that understanding complexity coupled with collaboration, empathy and mutual respect can create more reliable and resilient workplace safety.

Transcript

♪ [music] ♪ - [Mary] My name is Mary Conquest. I'm your host for Safety Labs by Slice, the podcast where we explore the human side of safety to support safety professionals. We move past regulations and reportables to talk about the core skills of safety leadership: empathy, influence, trust, rapport.

In other words, the soft skills that help you do the hard stuff. ♪ [music] ♪ Hi there. Welcome to Safety Labs by Slice. Many of you have probably heard about human and organizational performance, sometimes called HOP or H-O-P.

In fact, I interviewed Andrea Baker about what HOP is back in episode 13 of this podcast. She and our guest today co-wrote a book called "Bob's Guide to Operational Learning." In the book, HOP is defined in this way. HOP is about learning how humans and organizations interact to accomplish work. It's a thought process that allows us to build more error-tolerant systems and teaches us that expecting perfection from workers, processes, or procedures is not realistic.

So today we're going to zoom in on one of the most difficult and interesting facets of operational learning, and that is complexity. To help explore this, I have Bob Edwards with me. Bob is a human and organizational performance (HOP) practitioner. He has a bachelor's degree in Mechanical Engineering and an MSc in Advanced Safety Engineering Management.

His work experience includes time as a maintenance man, a soldier in the U.S. Army, a design engineer, a maintenance and technical support leader, a safety leader, and an assistant plant manager. And of course, he's the co-author of the book "Bob's Guide to Operational Learning." Bob helps organizations gain a deeper understanding of their operations and the complex nature of work.

Through collaboration and mutual respect, Bob helps organizations to become more reliable and resilient. And he joins us today from Soddy-Daisy, Tennessee. Welcome.

- [Bob] Ah, thanks, Mary, it's nice to get a chance to chat with you. And you said it properly, Soddy-Daisy. It kind of rolls off the tongue, right? It's a little town near Chattanooga, Tennessee, at the base of a mountain, so it's actually quite pretty.

- It kind of sounds very exotic, I'm sure, especially for our Australian listeners. Chattanooga is like… - Chattanooga is a great place to live. It's not too big, but it's got everything you need. It's beautiful. A lot of mountains and creeks and beautiful places to go.

- Okay, well, I have got a lot of questions for you today, so let's dive right in and let's talk about complexity. So, the following was a new term for me. What is the field of complexity science and how does it apply to HOP? So, I'm especially interested in the idea of emergent properties.

Can you just give a rundown for our audience?

- Yeah. So, Mary, it's interesting to me because I've been in the industry since about 1989, when I graduated from college, Tennessee Tech, and I've been working in the industry making stuff and designing stuff and repairing stuff for a long time. And we have all these different things that we've done along the way to try to make things better, make them safer, more reliable, and a whole lot of good stuff has been done, but we felt like something was missing. The tools that we had, the systems, the way we looked at our systems and the way the interaction of humans and systems happened, we were treating everything as though it had a definite root cause if there was a problem.

That there was a linear path of failure, that if you just would have stopped at step four, it wouldn't have happened. And yet we knew there was something else going on. So, on this HOP journey, when we first started it around 2013 or so, complexity wasn't a big part of the conversation. Actually, it was more about sort of the work as planned versus work… I think we call it black line, blue line.

Work in reality. And then, because I'm a practitioner of this stuff and have just done a ton of stuff in this space and I do a lot of operational learning, the more we understood how work really gets done. And I know this as an old maintenance guy: work never goes to plan. And I have twelve adopted children, so nothing ever goes to plan with them either. And so there was something that we needed to understand differently.

And what was missing from the conversation, at least in my world and seems to be in most of manufacturing and most of industry really, is a deeper understanding of what work truly looks like, how all these things interact and connect and couple the human and the system together. It's not one or the other.

It's the coupling effect of all this and the fact that when bad things happen, it's actually just a whole lot of, I would just say, normal stuff out there. We call it coupling of normal variability. But what it means to me is just that there's a lot of stuff out there, and when a bunch of it doesn't work well at the same time, it can lead to a bad day. The interesting thing is that stuff is out there all the time.

So on a good day, a bunch of those things sort of over-perform and we sometimes will have a great production day. We're like, "What did we do today that was different?" Nobody can put their finger on it. Well, that was a complex success. There's a bunch of variables that overperform. So the more we started looking at, like, Holland and David Snowden and these different guys on the complexity side, because they are super smart people.

We just would study their thoughts around complexity and then we would just apply it practically into the world of work. We're like, "Holy cow, this is what we've been missing." This is a huge part of how work gets done. Most work, I think, is actually what we would call complex. Lots of pieces all interconnected and coupled together, and if several of them or many of them do not work well, that interconnectedness can lead to a failure.

And so it takes us to a place where we're not focused on root cause anymore, at least not when it comes to humans and systems failing together. I think root cause still has a place for sure in, like, an ordered system like an engine that fails or a circuit board that the trace blows out or whatever. But for the world of work, it's much more involved than that.

So, I don't know if that helps a bit or not but we call it complex adaptive work or complex adaptive systems because the human is actually very adaptive. I say this all the time. Amazingly adaptive, really. The reason we're successful is not because we have great systems, sorry, we don't. I mean, nobody, really.

I work with over 130 companies and, I mean, we have some good stuff, but nobody's perfect, and the processes and the equipment are all falling apart, but we continually try to keep them running. Procedures are incomplete. Nobody has perfect procedures either. And the reason, to a large degree, that we're successful is the incredibly adaptive nature of humans. And so humans are very adaptive. We're brilliantly resilient, right? A tornado rolls through here, we're out there rebuilding the next day. But we're not very reliable.

Humans just aren't that reliable, and yet we keep expecting them to be. And so we're expecting sort of the impossible, right? We're expecting a human to be more like a machine or robot. That's not how we operate. So that's kind of where we're at right now: realizing that work, for the most part, is complex, even in factories where you might think it's not, because there's all this interconnectedness and the humans in there adapting and figuring it out and, for the most part, making us successful.

- So you said that something has been missing and I wonder what you think. How do people in general and safety professionals, in particular, tend to respond to complexity?

- Boy, that's a Six Sigma answer: it depends. Some people, I tell you this, I think the people closest to the work tend to do this. When we start talking about this stuff, they're like, yes, we've known this all along, right? And thank you for acknowledging that this work is a mess, that we really are out here… A lot of operators are struggling out here.

We're trying to get this stuff done. I think sometimes it's the people higher in the organizations who, unfortunately, don't get to see enough of the sort of the context of work. Unfortunately, we do this to ourselves. We provide high-level metrics, KPIs, dashboards, scorecards, but no real context behind all that.

So sometimes when people hear this, it can be uncomfortable for them. They're like, "Well, wait a minute. Our numbers are really good." Well, yeah. So are the numbers at a lot of companies that have fatalities. So the numbers being really good doesn't mean anything to me. It's the context behind those metrics. And I think we've got to get much, much better at being sort of open and transparent with the context.

What does it really take to get that done? Best production day ever. Yeah. We almost blew a unit up. Nobody knows that because nobody's talking about it. But the metric said best day ever. So, it could be hard to hear, but I can also tell you this, once people sort of grasp this, then they're not satisfied with simple answers anymore.

Well, Mary, she just pushed the wrong button. Well, okay, but let's look at the control panel that she's interfacing with, right? Let's look at the production pressure that she was under. Let's understand all the stuff that was going on that she was managing in that moment. And all of a sudden, it becomes pretty obvious that the reason we don't have a lot more events is because Mary normally doesn't push the wrong button.

And so that can be hard for some people initially. Some people even resist it. I had a manager one time say in a workshop, he said, "Well, what if I don't accept this human error thing?" I'm like, "I don't care whether you accept it or not. It's there." You accepting it or not accepting it, you know, that doesn't really change anything. It's still there.

So, yeah. I'd say you get the whole spectrum. And this notion of emergence you mentioned earlier, things emerge, and what's interesting to me is that things will emerge and sort of, for lack of a better term, bubble up in the system, and the operations people, the operators, for example, a lot of times they're really good at managing that. And then we look at it from the outside, if we're not close to that work, and it sometimes seems like, "Well, why didn't they see that?"

Or why didn't they just do a stop work, or why didn't they just push pause or whatever? But in the middle of it, they were actually managing through all that. As things were coming forward, they'd seen something similar before, so they're trying to manage their way through and adapt their way through it. So I think that things emerging and surprising us tends to happen further away from the work. The people close to the work are like, "Yeah, I've had a bunch of situations similar to this before."

I could tell you this, Mary. I've done literally hundreds of learning teams and all kinds of operational learning stuff, which is a big umbrella that learning teams kind of falls under. And I've never actually seen something completely new and unique. I've never seen an alien abduction, right? I mean, it's always just like, "Yeah, this and this and that at the same time, this shifted left instead of right, and next thing I know, we got a problem."

And the operations people can just talk you right through that stuff. They're like, "If it's okay to tell you, I'll tell you the truth, and the truth is this is what we deal with a lot."

- Yeah. So they have that context just through experience, right? And it makes sense that people higher up don't have the context. It's not their job. And it makes sense to some degree that they rely on data as a bit of a shortcut. Yeah, but shortcuts can lead to problems.

- Todd says this: "Great leaders make great decisions if they have great information." A lot of times it's a data input problem. That's context. If I had the best production day ever, and I don't know that they nearly blew the unit up, all I see is that it's the best. I'm probably buying pizza for everybody, right? Because I don't know what they actually had to do, because nobody's talking about that, not higher up in the organization, because they want to see those numbers and they run by the numbers, all that stuff, right?

And I'm not saying get rid of metrics. I'm just saying metrics without context can be deadly. It's certainly dangerous. Maybe deadly is a little strong, but they actually can be. I have to know the context. So the more curious leaders get about context and the complexity of the work, which is where all that context sort of is, the less satisfied they'll be with just, "Hey, we made good numbers this month."

They want to know, what did it actually look like to get those numbers?

- So would you say that HOP-based learning or operational learning is the art or science or craft, whatever you want to say, of diving into complexity, like leaning into it as opposed to trying to simplify?

- You do have some great questions. That's a great question. I've heard arguments on both sides. I've heard people say, we have to simplify. I worked for a really big company for 16 years, and we had a lot of simplification efforts. I'm not opposed to those.

But I actually think where my brain is right now, Mary, I think we actually need to get much better at understanding complexity and managing complexity. I don't think the world is getting less complex. I think it's getting more complex. So I'm not opposed. I guess I'm a little bit concerned that in an effort to simplify, we lose context. And if we lose context, then all of a sudden now we don't know where we really are.

So if I had to lean one way or the other, I would lean more towards, let's get better at understanding complexity and knowing how we manage it and how we need to manage it, which means transparency is going to be huge. I've got to be able to tell you, Mary, what it really took to do this without fear of retribution, because if I think you're going to write me up, put me in time out, I'm not talking, right?

And so I need to be able to have… I mean, Martha Acosta talks about psychological safety. She's quite brilliant, actually. But what it is, is that it's got to be a safe place for me to tell you, because we're changing from this sort of adult-child relationship to an adult-adult relationship here at work. It's just brilliant. We needed to have done this a long time ago. But now I can tell you this because I know that you don't think there's any way I would have done that on purpose.

I didn't mean for that to happen. I came to work to do a good job.

- I think probably the first step is just accepting complexity.

- Yeah. Yeah. Yeah.

- Again, it's there whether you accept it or not. But I think if you accept it, then that's step one.

- We can have, like, a HOP Anonymous out of it, right? So, "My name is Bob Edwards. I finally accepted that the world is complex." Right? "Thank you for sharing, Bob." I guess I'm just supposed to give my first name. Sorry, I'm not very good at that.

All right, go ahead. Sorry.

- Yeah. So let's talk a little bit about how one "does HOP". So there's a framework that you mentioned in the book for learning called five whys. So can you just quickly explain what that is for our listeners who don't know? And also why in the context of HOP, you say that five whys is insufficient?

- Yeah. Well, the five whys is better than what we had before, which was the one who: who did this, right? At least with the five whys, it's not a name like Wayne, Mary, or Sam or something. But the five whys was really built, actually decades ago, to try to go deeper into the system.

I think Sakichi Toyoda came up with the idea anyway. It does help you look deeper into the system, but it's really, really linear. Like step one led to step two, led to step three. And if you had just stopped at step four, but it really wasn't step four. It was actually just work. In hindsight, it looks very linear. Like, this led to this, led to this, led to this, but in reality, it was just work.

There's no ominous music playing at step four. You ever watch a scary movie? Like, don't open the door, listen to the music, right? They open the door and die. So, five whys is fine for, like, looking at how a pump failed or a circuit board blew a trace or something like that. But when you introduce a human with a complicated ordered system and all that interaction that goes on, it's just a very linear tool.

It's way too simplistic for complex failure. Now, I also know that some companies are like, they're fixated on, we must use a five why and everybody is using it all the way through the organization, or some version, a fishbone or whatever you want to call it, the Ishikawa. But I also understand that. I mean, I have worked in industry most of my career, and so what I have seen companies do is they do good operational learning.

They have rich context, and then they kind of, for lack of a better term, sort of force-fit it into a five why so that people can see it in a familiar framework. The one I saw done had brilliant context above each one of the five whys, right? And the people that saw it were like, "Well, that's the best five why I've ever seen." It's not a five why.

It's operational learning, but we force-fit it back into that tool because it's what people want to see. So realizing we get comfortable and we want certain things. It's just those tools…it's just like the chain reaction stuff we talked about in the military. Like, you break a link of the chain, it won't happen next time. But it actually was not a chain of events. It was the mission.

I mean, we were doing what we were doing, with all the stuff we have to manage, and yeah, it all sort of coupled and connected and led to a bad event or a bad day. So, I think it's okay to sort of let go of those tools a little bit and not be so fixated on the tool. I also don't want the tool to drive the conversation. I don't want somebody asking me five times why, why, why?

I mean, I don't. I used to do it myself. It's just we want to really move back away from the event or the problem and say, "Teach me what it takes to operate that piece of equipment or run that process or whatever." And just look at what it takes to run that and understand normal work. We say this all the time: failure happens in normal work. And so if you study and understand normal work, you'll see all the brittleness, the challenges, the frustration.

You could talk about the event. We're not shy about talking about events. We just don't start with the event, as in why did this event happen? If it's an event-based operational learning, the event brings us together. And we say that, but then we back up and say, okay, tell me first of all, what does it take to operate this, or what does it look like in a day of running this process?

Because that's how we're going to get a lot more of that context. Because if we fixate on that event, we'll fixate on where Mary pressed the wrong button, right? We'll miss all that rich context that was all around that. So we actually kind of flipped that equation around, back away from the problem, work our way towards it. If we've done a good job of operational learning, whether it's a learning team or in production meetings or however we're doing this, the event becomes not very interesting.

It becomes kind of obvious.

- Yeah. So, I wanted to talk about this interplay. So there's hindsight bias, I think, which is the why didn't you hear the ominous music?

- Yeah, exactly right.

- In the book, it says, "The goal of any operational learner is to listen and learn until you understand the logic of everyone's decisions and how you could have made the same decision in the same circumstances." So is it fair to say that if you truly understand the context, the event seems inevitable, or is that going a little too far?

- No, actually, yeah, I would almost say "could" versus "would," but you're right on it there. I think for me, Mary, when I do really good operational learning, I'm surprised they don't have a lot more events. You're right. It almost looks like, "Yep, that one was there." I mean, all the ingredients were there for that to happen.

So, I think I want to reach a point where I think I probably would have done the same thing. And it includes good work, too. Just for me personally, as an operational learning guy, that's what I do a lot of, right? I want to keep learning until I'm like, yep, I think I would've done the same thing. That's when I've gained an understanding. Because if I'm still thinking, "Yeah, but Mary, you should have paid attention to the button."

You pushed the wrong…well, that's hindsight bias, right? In hindsight, we're all geniuses, right? I mean, we are. Afterwards, I was smarter than Einstein in hindsight, but the reality is that I didn't know that was about to happen or I would have done something different.

- I want to read a short quote from the book, and it says, "The art of asking questions is really about the art of listening." So this is a cornerstone of operational learning. Can you expand on that and explain what that means?

- Yes. Listening's the word, right? What do we all do? I'm bad about it myself: listening to respond as opposed to listening to learn. And not some cheesy reflective listening. "So, what I hear you say, Mary, is…" Sorry, if anybody out there loves that kind of approach, that's fine.

Don't be offended by me saying that. But it's not that. It's genuine curiosity. If we can notch up that curiosity to where, when we're talking to somebody and listening to somebody and they're talking, we're actually listening to learn something. And I saw a plant manager recently, Mary, he had, like, a moment there. He's like, "Oh, my goodness." He said, "I've been trying to connect with people in the shop, and when something happens or whenever I'd be out there or just watching the work, I'd say, 'I used to do that job.'"

And he said, "I realized how that's coming across is I think I know more than you do." And so now what he did in this journey is he's changed his approach. He realizes he needs to learn something from them, and so he says something along these lines, "I used to do that job, but I don't do it today. What's it like now?" Now, that's setting somebody up to share with him to where he can listen and actually learn something instead of just saying, "Yeah, I used to do that.

I always did it this way," or "You should have done it that way." It's like, you know, "I used to do it, but I don't do it now. Tell me about it. What's it like now?" That's true appreciative inquiry, right?

- So curiosity is one of my favorite topics.

- Which is why you're good at doing this, right? You're in the right job then, Mary, because this stuff is built on curiosity, right?

- Yeah. So I think one thing that you mentioned was that people often come to you and say, "Okay, well, if I want to operationally learn, what should I ask? Please give me a list of questions that I should ask." And you point out in the book, if you're genuinely curious, you kind of don't need a list because the next question presents itself.

- Yeah, really. It comes from what you just learned, right? But I also learned that from screwing it up. So I was at a site, and they said, "Would you give us a list of questions?" What I thought they meant was the types of questions to start these things out. And at the time, I thought, "Well, yeah, no problem." It took me, I don't remember, 10, 15 minutes.

I remember it was 21. I came up with 21 different questions. I was back a couple of months later, and they're like, "Bob, we need to talk to you because some of the questions you told us to ask don't always make sense." I'm like, "What are you talking about?" They're like, "You gave us a list of questions." I'm like, "Give me those back." No, those aren't the questions.

There is no list of questions. That's just the type of questions that come to my mind as I'm learning. But you're spot on. If I'm listening to learn and I'm curious, then what I just learned actually sort of sparks my next point of curiosity. Just like we were sitting there talking about: if you told me you just bought a 1965 Jaguar out of a field that had a tree growing through it, and you're thinking about restoring it, I've got a million questions.

Obviously, it's like, "Wow. Does it still have the engine in it?" and blah, blah, blah. Here we go. So just being curious in the conversation. And I think it's Ajahn Brahm, he's a monk that does a lot of public speaking. He said what he tries to do is the most important person to him is the person he's talking to right now. Man, that's some pretty wise words right there.

I need to work at that because we're all so distracted, right? We're talking to somebody and we're messing with our phone and taking selfies, right? We're not really right there in that moment. So I think that has helped me a lot, too, is just to be in the moment with them, listening to learn something new, and then being, this may sound kind of goofy, kind of energized by the fact that you're learning something.

I mean, I love learning. You're curious, right? It's actually kind of cool when you're like, "Whoa, I don't think I ever thought of it like that before."

- I'm going to move on. One of the questions or one of the quotes that jumped out when we were reading this that, I don't know, resonated with me was "We don't have to fix everything we discuss." So, let's discuss that.

- All right. Well, first of all, I don't think we can. We don't have the resources, right? But I also don't think we need to. I think when we do good operational learning, people close to the work… And I'm using that broader term rather than just learning teams, because the book is about operational learning, with actually an appendix on learning teams themselves, which is a thing. We created it because there was a need, but that has grown, and it's operational learning of all kinds now.

And so the stuff that we learn about…we actually want to get ideas from the people close to the work. We also want ideas from engineers and managers and whatever. But we want to bring these thoughts together. And then I think we want to work on the things that especially the things that the people close to the work believe will make this better.

- Well, I was just going to say I think that it's helpful because if you think like, okay, you have to embrace complexity and you have to learn about the context and understand all this, I think for many of us, the natural inclination is to want to, "Okay, now I understand all the things." Of course, it'll never be all the things, but now I understand many, many more things and the inclination is to want to fix it all.

Which can be a little intimidating too, I imagine.

- Well, yeah, it's terrifying, actually. Yeah. It's not possible. So as a matter of fact, because it is complex as we're working to make things better for the conditions now, things are shifting, things are changing. And I've seen this, once again, I didn't study deep theoretical stuff. I'm just a practitioner and stuff.

I just do it all the time, and I started realizing, you know, we would say, "Hey, they didn't sustain what we put in place." Which is sometimes true, Mary. Sometimes we put a pretty good thing in place and people don't sustain it. But then my question now is, is it really a pretty good thing if it's hard to sustain? And then second of all, I started realizing that in some cases it's not a sustainability problem.

It's sort of a failure of safeguard evolution. Like the things we put in place, we put them in like, "I fixed it," but the work's still changing and shifting, and something else is more key now. We're working differently now. We're still expecting this solution from six months ago to still work now when in fact it needed to actually change and evolve as the work has. And so that's another reason why I don't think we have to fix everything.

We certainly want to make things better, but I think you overwhelm yourself if you try to fix everything, and I don't even know if that's necessary. I definitely want to fix the things that the operators believe will make the work better, and of course any compliance things; we always do those. I don't want to be out there going,

"What about compliance?" If there are compliance things that aren't being met, we need to straighten that out too. But we've got to make sure that the work is actually better, right? We're more likely to be successful with the conditions that we have out there.

- When you talk about sustainability, I was thinking that, yeah, no given procedure will ever be fully sustainable because it will always change. But what might be sustainable is getting an organization to operate in a way where operational learning is the norm, right?

So they're moving as things shift. To me, that's sustainability.

- Yeah, I think it's a really good way to sustain something: sustain your ability to learn. Listen and learn. And Andy, my work partner, says it's the speed with which we learn and improve based on what's happening. So you're right. That's something we can work on.

Let's get very good at it. And you know what? It actually needs some attention to make that more sustainable, because we've seen this happen where we've got a leader and a handful of the staff and people in operations all doing really good at this, and then leadership changes out and you get somebody, maybe well-degreed, pedigreed, all these certifications, who comes in and rules with an iron fist and just destroys this good work that has happened, and people just shut right back down again.

And, so yeah. I think what makes sense to me is that you try to work on a sustainable process of learning, as opposed to trying to say that we've got to sustain all these defenses because they're perfect, because they're generally not.

- I want to actually go through and pull out some examples. We can't get to all of them because you listed 21, but you talked about what the common conditions in operational learning are. So first of all, I guess, let people know what you mean by that, and then I want to pull out and talk about some of the types of common conditions that you see.

- So I'm not sure what you want me to explain. What part of this?

- Well, when you say what are common conditions in operational learning? If someone were to just ask you that, what do you mean? What are common conditions?

- Okay. To me it's normal work, right? What are common conditions? There's going to be challenges around procedures. There's going to be problems with tools or a lack thereof. There's going to be resource issues. There are going to be... These things aren't going away.

They're actually getting kind of worse these days. There's going to be systems degrading and a lack of knowledge because people are leaving at an alarming rate and we're not able to backfill them. These are just all…it's the context, right? It's all the conditions that we deal with out there, and I think we have to become really good at hearing it, even if it hurts, and then trying to sort of assimilate, what does all this mean? If we have big turnover in people, which companies are struggling with, and we have pretty complex work?

Okay, there's an equation for a disaster right there. Just those two things. Now, add in the fact that we have a shortage of parts because of the supply chain issues, and we have, you know, people still getting sick and still going home, or people that just quit and don't come back. And so we have training issues where we've tried to speed up the training because we need to get you up to speed faster, which then shortcuts a lot of things.

All this is just the conditions we're dealing with out there, and we have to be willing to listen to people when they tell us about this. And some of it, as David Payne from Chevron, he's retired now, used to say, some of this stuff may lead back to our office, which may be hard to hear as well. Maybe things that we put in place that we need to rethink. So in order to do good operational learning, you've got to have the capacity for candor, and I just forgot his name.

He's a ranger. It's really interesting, but he uses that statement, the capacity for candor. Not an army ranger, but like a forest ranger.

- Yeah. No, it occurs to me there's really no room for ego.

- There's no what?

- There's no room for ego if you genuinely want to learn. I want to pull out a couple of them. So weak and unclear signals, for example, when you're talking about that, can you give some examples? Just explain what you mean by that.

- So they're only weak before the event happens. After the event, they're like, "Oh, my goodness, somebody was screaming that, weren't they?" A weak signal, here's a quick example: "That thing normally starts up fine." That's a weak signal. The old me as a manager, you know what I would say? "Good. Glad it normally starts up fine."

But as a HOP person, I say, "Tell me about the times it doesn't." "Well, here's how we brought that [inaudible] off. It's called a Molotov cocktail," right? So now we're getting into the interesting stuff. So there are things you'll hear from people just in conversations.

"Oh, man, somebody could kill that intersection." "Hey, how about those balls this past weekend, man?" You're like, "Wait a minute. What? Somebody's going to get killed out there?" So all these things that just kind of pass through our conversations, you know, some little weird rattle that a machine is making or something that's like, you know, and a person close to the work, they pick this stuff up.

That something's off. "No, it's fine. The numbers look great." We can look at the dashboard, but that operator out there is like, "Yeah, something's off." And in the HOP world, we'd say, "Tell me what you think's off, or talk to me," because they literally can. To someone who is really familiar, those weak signals maybe aren't even weak. They're actually pretty strong signals.

They may be weak in the system because we're not really picking up on them, but maybe a vibration or a noise, or something in the control that's not quite right. Those are all weak signals. They're out there. And if we ask, if we're curious, we'll hear them in conversations. People will unlock some pretty crazy stuff behind these weak signals.

An unclear signal is a similar thing, right? I think I understand what you're saying. There's something... There's a command with mortar teams: "Fire." There used to be another command: "Misfire." Okay, that's not very clear.

That's almost the same word, right? It's almost the same phrase. So there can be that confusion easily in a moment of stress where I think you said fire when you said misfire. So any sort of communication issues or signaling issues that could be easily misinterpreted. And the scary thing is that they're out there all over the place.

But the good news is the people close to the work can tell you about it and come up with some good ideas to make it less confusing.

- Yeah. And so which segues nicely into the next one, which you mentioned before a little bit, is adaptation. Is that when, like, "How come this is duct taped?" "Oh, well, this wasn't working and so we just...we adapted." Or is that, I mean, maybe not the best example?

- Yes. Absolutely, humans adapt. Take even just driving, for example. You may have your destination and your GPS to tell you which turns to make, but you are adapting split second by split second without even thinking about it, for the most part. And so the adaptive nature of humans, right, is we're trying to reach our goal, but as things come at us, we will maneuver around, over, through, under, whatever it takes.

And in many cases, especially if people are really good at the work, it hardly even feels like they're adapting. It just feels like they're doing the work. But from a HOP perspective, we're interested in how much adaptation does it take? You've seen the black line, blue line: the black line is the plan, the blue line is reality. How much of that is just straight-up humans adapting to manage all that variability out there?

- I think actually driving is an excellent example, right? Like, we don't even think about it, but you notice that that other car is driving kind of funny, so you're going to hold back from them. Or you notice that the weather is coming in or anyway, there's a lot of… - Yeah, you're adapting literally in the moment, right?

All the stuff we have, never mind self-driving cars, that's the thing that's coming at us for sure. But just in regular driving, all the things we have there to help us, like GPS, lane departure, auto braking, all those things are great. But those things won't get you there. It'll still take that adaptive human to get you there, right? They'll help you get there safer, and you're more likely to get there, but you still need that adaptive human in there.

And the adaptive nature of humans is, I really do believe, a very, very large part of why we're successful. And then when a human messes up, it kind of hurts my soul when I hear, "Why would so-and-so do that? He's been here forever. I can't believe he did that." In the HOP world we're like, if it could happen to him or her with 20 years of experience, right, we should really be in the learning mode now, because if it could happen to them, with all they know, it could certainly happen to somebody with less experience.

It kind of gets rid of some of our judgmental… Like you said, you have to be a little less arrogant and a little more humble and a little more sort of teachable.

- So what about latent conditions?

- Yeah, I don't really use that term much. We put it in there just because it's a term you hear more, I think, in the human performance realm. But for me, it's all the stuff that's out there. Conditions are out there, and I think we used to look at it more like, yeah, they're just out there ready to get you. But I don't even know if they're that. I think they're the conditions that are latent in the organization.

Maybe it's something in the procedures, in the equipment, in the lack of maintenance. I don't know what the conditions could be. They could be all sorts of things, but they're just out there kind of all the time, and, you know, enough of those things all kind of connect together and it can become more than the human can manage. I heard somebody saying this once recently, actually.

They said it's almost like a bunch of those conditions all underperform at the same time, sometimes in a matter of seconds, and it just gets away from us. We just can't adapt fast enough to counter. So latent conditions, the work conditions, the variability, all that stuff to me, it's just all that stuff that's out there, for the most part, all the time.

- You mentioned the complex coupling of normal variability.

- Yeah.

- Yeah. Okay.

- I would never say that in a regular conversation. "So that seemed like a complex coupling of normal variability." But you're right, that's what it is. The way I would say it as an old maintenance manager is, yeah, it's just like a bunch of normal stuff out there and it's connected, interconnected, and if enough of it goes south on us, we've got a bad day. But that's what it is. It's a complex coupling of normal variability, which is also how success happens, right?

A bunch of variables overperform at the same time.

- I'm laughing because that's literally the next thing I was going to ask about was... One thing that you mentioned is that we're not just investigating or learning about failure. We're accepting that it's in there, but we're also learning from success.

- Which I think is a better thing to do, actually, but we have to do both. We have to learn when a failure happens. We can't have a failure and not learn. That's a cost to us, as opposed to investing our brain cells into learning and trying to build a better process or whatever. But learning from successful work is one of the most powerful things we can do, because all those conditions are still out there, but nothing bad has happened.

Some big companies that have been doing this for a while, they don't set a metric on operational learning teams, for example, but they do monitor which ones are reactive, which ones are near misses, which ones are proactive, and they've seen over the past few years that the scales are actually tipping more and more. They're actually spending more time learning about successful work, and they're pretty excited about it because they're finding that variability.

They're finding those latent conditions, if you will. They're finding all that complexity of that mess out there when nothing bad has happened. Because I say this all the time. I don't think humans are very good at predicting. I suck at predicting. The reason I don't go to Vegas is because I suck at predicting. And I'm fairly good at math, so I know the odds are against me.

So I just do this sometimes for the fun of it. I push the button on an elevator, there's like three elevators there, like, I bet it's going to be that one. And I don't even get it right one in three. Like, it's rare that I ever get it correct, right? So at 1 in 10 million, how am I going to stand a chance?

So we're not very good at predicting, but I do think we're pretty good at learning. So I think we should focus more on learning from success. Those same elements that are out there when a failure happens are out there now, so let's learn about them before the failure. Maybe we'll get ahead of some of this stuff. No, I will restate that. We are getting ahead of some of this stuff. Story after story of operational learning on a successful day that unlocks some pretty significant brittleness in the system.

And that was a good day. Metrics said we were good.

- A, B, and C went well. But let's talk about… So what if B hadn't gone quite so well? Or had, you know.

- Or maybe the only reason it went well is because four of us were propping it up, right?

- Yeah.

- So, talking about that, I did operational learning at a site that had really kind of a trophy-awarded success story, and their project was actually well done. But when we did operational learning on it, there were a couple of areas where they were like one layer away from a bad day. One defense dropped out and it would have been a bad day. But that was really valuable, because of what they found. They could have just said, "Aren't we amazing? Look at all we've done. We've got a trophy to prove it."

But they didn't. They said, "Let's learn. Were we that good, or were we kind of lucky, or..." It turned out they were quite good, but there were a couple of areas in there that were quite brittle, and that was a trophy-awarded success.

- There's another quote or another concept in there that I have heard before, but I always made an assumption about it, which you point out isn't what you're talking about. So the quote is, "We can blame and punish or learn and improve, but not both." So I always took that as a blanket call to learn and to avoid blame and punishment.

And for the most part, sure. I think some critics of HOP or this kind of thinking think that there's maybe a lack of accountability that operational learning precludes responsibility. But this brings us back to context, because in the book you say that there can actually be a time for blame and punishment.

So can you talk about that a little bit?

- Yeah, yeah, because there are some bad people. These are people we should not have hired. This would be like, "Dang it, why did we hire them?" "I don't know. We've got them, though. We have got to get them out of here." That's Todd's quote, right? You can blame and punish or you can learn and improve, but you can't do both. And Todd's brilliant at poking you a bit with a stick, right?

To make sure you're awake and paying attention to what he's saying. And it really has stirred up a lot of conversation. But what we've found is that, in fact, good operational learning leads to true accountability. You know, true accountability, minus the legal definition, like I want to hold you accountable in a court of law, is a person's willingness to give an account for their actions and to tell the story of their actions, right?

I'm only going to do that if it's a safe enough place for me to tell you the truth, because I know you know I didn't mean for a bad day to happen. And so with this notion of accountability, actually, in the HOP world we get more accountability, but we don't have the old-school "I'll hold you accountable," which actually wasn't accountability. We say this all the time. Accountability is intrinsic. It's a person's willingness to give an account.

So it's intrinsic. I can't make you... It's almost like ownership. It is actually kind of like ownership. I can't make you, Mary, have ownership. Ownership is intrinsic. True accountability is intrinsic.

But we so desperately need control, and in many cases, power and control. Some people are literally addicted to it. We feel like we're going to lose our power and control, and the inmates are going to run the prison. Last time a guy said that to me, I said, "It's a factory, not a prison. And they're not inmates. They can go work down the road and probably will if you don't straighten up."

And so this notion of accountability is really, really important. And we need to talk about it more, not less. And so HR disciplinary action is and always will be needed for people problems, right? If a person intends harm, or they don't fit in with the social norms, they don't get to work on time, they don't get the work done.

That's a person problem. And the way we put it, we kind of maybe oversimplify it a bit, but basically, if you could remove that person and that problem goes away, yeah, that's where that fits in. Otherwise, it's probably more of a systems problem. So we need HR disciplinary action to give people a fair chance. This is from Andy, my work partner.

Give them a fair chance to sort of align themselves with the social norms of our organization, or we're going to ask them to exit. And it also is there to help us solve hiring mistakes, the ones we shouldn't have hired, right? And sometimes you don't know until you've got them. But I'll tell you this, I want you to do that sooner rather than later. I don't want you to wait years upon years.

Mary, I've been in sites where they are like, "Man, we've been trying to get rid of him for 14 years. Finally, he missed a step in lockout tagout." Like, "Well, shame on you. This guy has been a bad performer for 14 years, and you're waiting for a lockout-tagout violation to get him? Now you've just weaponized safety."

- What if he thought he was a good performer all that time? How sad.

- Yeah, exactly, right? Yeah. I'll tell you this. I'll keep it completely redacted. So a good friend of mine is in HR, and he said he was headed to go talk to a guy. They're terminating him, and the guy is expecting to get a promotion. That's tragic, right?

So that's a leadership problem. That guy has gone all the way to this point where he thinks he's about to get a promotion, and they're getting ready to terminate him and let him go. So this whole performance management thing that we're supposed to be doing, sometimes I think we're not doing it very well. And people don't even know how poorly they're performing in some cases. So we do have poor performers, and we do need them out of our organization.

But I bet it's way beyond safety, and I'm way beyond…not way beyond. I mean, I'm not a safety guy. I've worked in manufacturing most of my life, designing and building stuff. I worked in safety for a few years, long enough to know I really respect safety people. They have to know a lot of stuff. I thought you just went around and yelled at people and said, "Hey, work safe." I didn't know it was more than that.

There is a lot. But I can tell you this: I think it's wrong to wait until a safety event happens to get somebody. I think they've probably been not performing well. They've not been coming to work on time. They've not been doing their job for a long time. I don't know if that helps or not, but I think accountability is… If we want and need accountability, just say that straight up.

We want it at levels we've never had it before. And true and sort of real and raw accountability is me saying to you as my boss, "I don't know how to tell you this, but I think I left the welding rod inside that pump that we just closed up." And you not freaking out and saying, "Well, you're fired," but saying, "All right, let's get it back open and see." Turned out it was in there. But had he been afraid, this is a true story, right?

So had he been afraid of his manager, he would have said nothing, and eventually that thing would have punched a hole in that diaphragm pump, and later they'd have taken it down and nobody would have known how it happened, right? So true accountability was him coming forward and saying, "I don't know how to tell you this." And his boss said, "Just talk to me. What's going on?" He said, "I can't account for one of my welding rods. I think I left it inside the pump."

Big [inaudible] pump, you don't care about the details. But the point is, that's true accountability. If somebody says, "For two years now we've been breaking that scraper thing loose and nobody's locking it out because there's no maintenance here at night and we're not authorized to lock out. If somebody turns that on while we're in there, we're going to lose our head," that is true accountability.

That's people being real and giving an account. If they're afraid they're going to get fired, they're not telling you anything.

- Yeah. Leadership has a role in that.

- Huge role. Yeah. We can't have good accountability in a site if a leader is not willing to hear the truth.

- Yeah. So I wanted to zoom out again to the big picture and ask you what is the role of hope and optimism in operational learning?

- What is the role of hope and optimism?

- Where do they fit in?

- Well, we humans made this mess, didn't we? We humans made this mess. We call it work, right? We put all this crap together. I think the fact that there's a possibility, and not just a possibility, we're seeing it happen, that we could actually sort of change some of this, straighten it out and make it better and make work maybe a little more something I enjoy coming to do, that brings some hope to me, and I have a part in it. Too many people feel like they're just a cog in the wheel or a number on a spreadsheet because we treated them like that, right?

You don't like working here? Get a job down the road. I had an HR manager say to me one time at a site, "I take a glass of water, put my finger in that water, and pull it out and say, see how fast that fills back in? That's how fast we'll fill your job." And I said, "You're the HR manager? Well, that's terrible." Right? But when people start to feel valued, when we break down that sort of adult-child relationship, that paternalistic thing that has been around forever, when we start to treat each other with more respect instead of butting heads with each other or treating each other like they're not as smart as me because I have more degrees or a bigger job title.

I think one site, Mary, they spent, I don't know, several years just focusing on how do we respect each other better. I mean, well, that's brilliant. Now they're deep into this HOP stuff. It's actually my dear site in Trinidad. I love these guys, and they spent several years really focusing on respect all through the organization. And I thought that was pretty powerful.

That gives me hope, right? I mean, without hope, the people perish, right? That's an old proverb. And so with some hope, we're like, "Oh, wait a minute, we can make this better. Let's do it." For lack of a better term, turn off the dang news and let's do something, right? We can watch and hear all the bad, or we can just do something different to make this stuff better. And then actually, it gives me some encouragement when you see some change. And we'll slip back.

Nobody's perfect. Nobody's got a hundred percent staying power. We'll slide back, but we just apologize and get back up and go at it again.

- Yeah. Learning. There's something inherently optimistic about learning, right? Knowing, like you said, knowing that things can be better basically.

- That's really good... So, I've never had that question asked to me before. That's a really good question. And I think that hope and optimism is important in life and this particular thing has a lot to do with work, but also has a lot to do with life. So, yeah, if I have some hope that things will get better. So I was a firefighter and a first responder for years.

I can tell you this: we've seen some people that were, like, not doing well, and we'd say, "Hey," on the radio, and we'd get the message back, "Hey, the ambulance is only five minutes away." You could see a change, because it's like they now know that advanced care is almost here, and you could actually see a change in their response. If we believe that there's even a chance that we're going to survive this, we're going to do better.

- So I would love to have this conversation go on a lot longer, but we're getting close to time. So I'm going to go into some questions that I ask all the guests. So, for this one: if you were developing training for tomorrow's safety professionals and, setting aside all technical training, you were to focus on human relationships or those kinds of core skills, what skill do you think you would focus on the most?

I know they're kind of intertwined, but.

- Yeah, I'd probably focus on what we call industrial empathy. Actually, I think maybe Andy was the first one that came up with that term, but it's seeking to understand, from another person's perspective, what their world looks like. So, no matter who you are, if you're a design engineer, if you're a safety professional, if you're a manager, the very first thing I would say is, first and foremost, can you focus on seeking to understand from their perspective what their world looks like?

We will make better decisions as safety professionals. We need safety professionals with all their understanding of compliance and all that stuff. But compliance doesn't make you safe by itself, right? It's just part of it. So a big, huge part of it is, I think, industrial empathy would be my primary focus. It is one of my primary focuses in the work that we do.

Then I have to do what you said earlier. I have to listen to learn and really seek to understand. I can't just say, "Oh, yeah, no, no, I got it. I got it." No, I mean, actually, let me take the time to…which, you know, sometimes we don't feel like we have the time to do.

- Yeah. When you're talking about industrial empathy, it makes me think of the goal of operational learning: to learn enough, to the point where you have the empathy, that you could genuinely say, "Okay, I can see how this decision was made. I would also have made that decision."

- Yeah. And it's not sympathy. Sympathy is for a different place. There's a place for sympathy, but it's not here. This is hardcore industrial work, but empathy is powerful.

- So if you could go back in time to the beginning of your career, what's one piece of advice that you might give to young Bob?

- Wow. So something that I wish I would have known when I first started in the industry is just how powerful the machine itself is. Because I think I maybe naively believed things could happen and change faster than they do. And so maybe I'd tell myself, "Be patient, Bob."

Because you know how it is: when you see something different, something new, you think we should all just do that. I've been on this HOP journey for almost 10 years, and it's ever-growing and ever-changing, and I'm always thinking and learning new stuff. And so, yeah, I think I would tell myself, be more patient with yourself and with others, because with this stuff we're asking ourselves to think differently than maybe how we've been raised, or how generations have been raised.

And so, yeah, I think that would probably help me. But of course, I was young, so I wouldn't have listened to myself anyway.

- I've had a few people say that. "It doesn't matter what I would have said. I wouldn't have listened to myself."

- But since it would be me coming back from the future, maybe I'd listen to old Bob then. And another thing I'd say: "Man, Bob, use more sunscreen, because you'll get a lot of sun damage."

- Okay, so how could our listeners learn more about some of the topics in our discussion today? Obviously, there's the book, which I'm going to hold up now, "Bob's Guide to Operational Learning." But are there other books or projects or concepts or websites that you think are really worth checking out if people want to understand complexity and context and some of the things we've been talking about?

- So I'd read anything you can find around complexity that you find interesting. Some of it is super, super dry, and I haven't found a lot that's super easy reading, but we have the website, hophub.org. There are free resources there and some suggested readings and stuff.

Kaufman's got stuff out. Dekker, Erik Hollnagel. Edgar Schein just passed away, and it broke my heart. The guy was amazing, ninety-something years old, and he worked to the day that he passed. But he wrote so much good stuff about really listening, humble inquiry, and humble leadership.

Once you start doing this stuff, then you've got podcasts. There are different people putting podcasts out. And I would say this: read everything and listen to everything you can, and use that frontal lobe a lot, right?

I mean, critically think about it. With some of this stuff you may be like, "I don't know about that." Well, then okay, don't do it if it doesn't make sense to you. Not everything written is perfect, not even everything I wrote. A lot of people are getting into this space now. So keep that frontal lobe highly engaged. Don't just take it at face value.

Dig in and learn.

- So where can our listeners find you on the web? You mentioned the HopHub, but if they want to reach out to you, how should they do that?

- Well, so that's the easiest way, because actually, just today, I took my… I had an individual website, hopcoach.net, and Andy's done such a nice job on our sort of consortium website that I just had the guy who does my website route it straight to that, because I don't want to confuse people. So there it is. Even if you type in hopcoach.net, it goes straight to the HopHub.

So I'm easy to find. My email's right there on the website, and if somebody reaches out and wants to chat or whatever, I'm always happy to set up a phone call or a video chat whenever. My phone number is out there too, so I get a lot of texts and stuff, but it's all good.

So, yeah, just reach out if you want to chat some more. And then stay really curious and realize this: if you're on this journey, or, as I think Erik Hollnagel refers to it, this voyage, we don't know exactly where it's going. It's not like we're going to get to that destination. We're headed in this direction together.

Safety Differently, Safety-II, HOP, all this sort of complexity stuff is kind of coming together. Realize you're part of it. With your podcast, you're a part of it. You're helping to spread this knowledge and asking great questions that get thought-provoking conversations going. And if you join in on this, it's like Hotel California: you check in, but you never really leave.

So join in. It's sort of an industrial reformation, if you will. It's not an industrial revolution, but it's reforming a lot of how we think about work, way beyond safety, though safety is very much included. Safety is very important, but most of us aren't in the business of manufacturing safety. We're making stuff, and we want to do it safely, but we also want to make money and do it with high quality and efficiency, all that stuff.

- Well, that is, sadly, all the time we have for today. Thanks so much for the book and for all your thoughts today, Bob.

- Oh, the pleasure is mine.

- I'd like to thank our listeners and also ask a wee favor. If you enjoy Safety Labs, please give us a review and a rating and tell your safety colleagues about the podcast. Thank you also to the Safety Labs by Slice team, who have been learning together since day one. Bye for now. ♪

[music] ♪ Safety Labs is created by Slice, the only safety knife on the market with a finger-friendly blade. Find us at sliceproducts.com. Until next time, stay safe. ♪ [music] ♪

Bob Edwards

Bob Edwards is a Human & Organizational Performance advocate and practitioner. His experience includes 6 years in the military, over 10 years as a design engineer, ownership of a design firm, and leadership of maintenance and technical support teams, and he has worked as a safety leader for the past 8 years. Bob takes his life and work experiences and combines them with the Human and Organizational Performance (HOP) philosophies to help organizations learn how to better respond to failure and how to use HOP Learning Teams to gain better operational intelligence and to develop more effective, thorough, and sustainable solutions to tough operational problems.