Dr Ivan Pupulidy
EP 32

The Learning Review: How to Improve Safety Investigations

This week on Safety Labs by Slice: Dr Ivan Pupulidy. Ivan tells the story of how he developed the concept of “learning reviews” after recognizing the limitations of traditional accident investigations. He helps safety professionals move from narrow, secretive, blame-focused investigations to open, honest, learning-focused safety reviews.

In This Episode

In this episode, Mary Conquest speaks with Dr Ivan Pupulidy, an international speaker and consultant focused on Risk, Safety, Aviation, Wildland Firefighting and Human and Organizational Performance (HOP). He spent over two decades in the US Forest Service, culminating in a promotion to Director of the Office of Innovation and Organizational Learning.

Accident investigation techniques have remained static for many decades, yet complexity is increasing and workplaces are not necessarily safer.

Therefore, Ivan introduced an alternative approach: The Learning Review.

Unlike other accident protocols, it’s designed to understand human actions in complex adaptive systems. Ivan explains how to conduct a learning review in the workplace and why it helps HSE professionals improve safety cultures.

He describes the transition from a frame of unhelpful judgment to positive understanding of the accident, where the system - not individuals - is the key focus of the review.

Ivan explains the importance of humble inquiry, “complex narrative”, focus groups, inclusivity and engagement, new safety review terminology and “learning products”.

It’s a fascinating journey of cultural change, innovation and organizational learning - that ultimately enhanced safety in the US Forest Service.

Transcript

- [Mary] Hi there. Welcome to "Safety Labs by Slice."

Our guest today developed and implemented Learning Reviews which replaced Serious Accident Investigations in the United States Forest Service in 2013. What is a learning review? How does it differ from a traditional investigation? And most importantly, did it make a difference? I'll ask him all that today as we discuss Learning In Complex Systems.

Dr. Ivan Pupulidy is an international speaker and consultant focused on risk, safety, aviation, wildland firefighting, and human and organizational performance. He spent 22 years in the U.S. Forest Service, culminating in a promotion to director of the Office of Innovation and Organizational Learning.

Dr. Pupulidy represented the U.S. Forest Service to the Obama White House at a forum on Wildland Firefighter Safety, where he presented the learning review process he developed as well as HOP concepts. Ivan holds an airline transport pilot license and has accrued over 10,000 flight hours as a tactical aircraft pilot for the U.S. Coast Guard, U.S. Air Force Reserve, and the U.S. Forest Service.

Currently a professor at the University of Alabama and a guest lecturer at Yale, Ivan studied under Sidney Dekker for his master's degree in human factors and systems safety. He also holds a Ph.D. in social psychology and organizational culture from Tilburg University in the Netherlands.

Ivan's unique mix of academic credentials and real-world experience makes him a sought-after speaker and consultant. He joins me today from Santa Fe, New Mexico. Welcome.

- [Dr. Pupulidy] Thank you.

- So, you served for 22 years in the U.S. Forest Service. Can you briefly set up our conversation by explaining the safety problems that the service was facing up to the moment when the organization decided that they needed to do something differently?

- There were a bunch of different problems facing us. Like, from 1994 to 2017, we experienced over 400 line of duty deaths in wildland firefighter operations in the United States. That's a lot of people. We lost a lot of people. We got a lot of experience with accident investigation, but the problem was that the experience was fundamentally the same experience over and over and over.

And that experience was to blame the dead for their own death very frequently by saying that there was human error involved and citing human error as causal. Well, we, kind of, wondered about that, because that really wasn't true to form. When we started thinking about it, people make mistakes all the time. I'm in aviation.

I know pilots that make mistakes all the time, yet we don't have crashes all the time. There has to be more to the story than that. And the field saw this very clearly. They recognized right away that there was more to the story. And so what happened was the field started to distrust the system and, to some degree, the leadership of the organization. And I call this a crisis of trust that existed in the organization.

And I'll give you an example. We had a tanker that went down shortly after takeoff, flying out of Reno-Stead in Nevada. And what happened to the tanker is the left outboard jet engine blew up. And when it blew up, it peppered holes in the underside of the wing of the aircraft. And when that happened, fuel started streaming out of the wing because that's the fuel tank.

And there's plenty of ignition sources out there with the jet engine and a reciprocating engine. So, the fuel caught on fire, and the wing began to melt. We know this because there's pools of molten material leading from the runway where the explosion took place to the crash, so we know that the wing was melting off the aircraft. Now, the organization set out with a Serious Accident Investigation Guide.

With this guidance, the causes of most accidents and incidents are the result of failures to observe established policies, procedures, and controls. That's the guidance that the investigators had. So, what did the investigators come up with? They simply said the failure of the flight crew to maintain airspeed above the in-flight minimum control speed after losing power in the left jet engine during initial climb after takeoff.

Contributing to the accident was the crew's inadequate cockpit resource management procedures. The failure of the captain to assume command of the airplane during the emergency. The flight crew's failure to carry out the jet engine fire emergency procedure, and the failure of the crew to jettison the retardant load. Well, with all these failures, the crew didn't stand a chance, but the reality of the situation is, with the wing melting off the aircraft, the crew didn't stand a chance.

And that's nowhere in the narrative. The only way that we understand this is by going back and looking at the physical evidence and understanding what happened, instead of trying to place blame on the crew, to develop a sense of understanding, to have empathy for the crew for what they were going through, and the realization that the entire incident took place in 26.9 seconds.

So, how were they supposed to do all these things that they failed to do? This was typical. This was typical of what we were doing with our accident investigations. The result of this was the firefighters stopped taking leadership assignments. They wouldn't put themselves in the line of fire like that. They stopped talking to the agency and even to each other. Fear began to dominate the culture, and learning shifted to how best to defend oneself from the agency.

So, this is our crisis of trust. Something had to change. Simultaneously, the leadership of the Forest Service was going through its own thought-provoking, identity-finding mission. And they hired a company called Dialogos. And Dialogos came and did a diagnostic memo of the Forest Service leadership, kind of, holding up a mirror to leadership.

Leadership didn't like what they saw, and so they started making, kind of, sweeping changes to how they organized at the headquarters level, how they organized out in the field, and how field leadership would be distributed inside the organization. And they began a journey to try and understand how safety works, and they pressed my group, before it was even a formal group in the service, to set up a learning journey to go out to different locations and see how other organizations were doing safety.

We took them to the Coast Guard. We took them to Los Alamos National Lab. We took them to Con Edison. We took them to UPS. And we showed how these organizations invested a tremendous amount of capital in safety. And so that, kind of, shifted their mindset a little bit. The next thing that we did was we started national dialogues.

And national dialogues are really interesting, but I may circle back around to that a little bit later. The national dialogues fit into a larger network of things that was happening. The other thing that was happening was a research branch was developed. We called it Innovation and Organizational Learning, and this is the group that I started and ended up becoming the director of.

And that group was in charge of looking at accident investigation models, looking at human error and causality, trying to understand system safety, trying to bring complexity theory into the vernacular of the organization, trying to learn what it is to be a Forest Service firefighter to develop that empathy at multiple levels of the organization, and trying to identify who are the learners in the organization because it's not just field personnel.

As it turns out, if we're a learning organization, everybody is a learner. So, that's what my organization focused its research on, and we touched learning theory, social psychology, psychological safety, human cognition, organizational change models. We looked at all those different things in some detail. Simultaneously, looking at the investigation process, we noticed that there was a big problem with the Serious Accident Investigation Guide, and I couldn't do it anymore.

I literally went to an accident scene, saw what was going on, saw what was happening to the people that were involved with the incident, saw the potential for doing harm to those people using the process, and I took the Serious Accident Investigation Guide and I threw it in the garbage can. And at that point, the only thing that I had to replace it with was Sidney Dekker's "Field Guide to Understanding 'Human Error.'" And so I picked through that book and I selected sections of it that I read to the group, so that we could all begin to set a frame of understanding around the accident instead of a frame of judgment.

And this incident started the learning review process.

- So, it sounds like the traditional method of investigation not only was not helping the organization learn but was causing active harm to those who had already been harmed in the situation. So, in fact...

- And their family.

- ...you have said when we spoke earlier that you can punish or you can learn but you can't do both simultaneously.

- Correct. And that ties into my favorite saying, which is the currency of safety is information. And anything that you do to disrupt that flow of information disrupts your ability to learn. And that's why I say you can either learn or you can punish but you can't do both. And these things are inextricably tied to one another. So, the ability to learn requires some real strength in the organization because to learn, you've got to be in doubt, you've got to be in the unknowing space.

If you're in the knowing space, you won't learn because you already know. So, you have to move the society, move the culture, if you will, to a place where they willingly doubt. They ask, "Is that so?" And once you move them into that space, once you move them from asking questions that they believe they have the answer to, to asking questions in what Edgar Schein calls humble inquiry, once you make that shift, learning becomes a natural course of events in the organization, because now people are interested and honestly asking questions of one another to honestly get answers.

Not to get honest answers, that's something else altogether, but to honestly get answers, right? So, we leave space for this idea through the learning leadership journey, and we literally set out to challenge the assumptions that our leaders had around safety. So, for example, when I started in the journey, the leadership asked, "Why don't they just simply follow the rules? We have the 10 Standard Firefighting Orders, the 18 Watch-Out Situations. If they followed those, they'd be safe."

And so what we had to do is pose the question to them in such a way that we drew them to inquiry around the 10 and 18, which is what we call the 10 Standard Firefighting Orders and 18 Watch-Out Situations. This became not only part of the change and shift in investigation, but also a fundamental role for the research arm of the organization: to start to understand how assumptions are created and sustained in an organization.

- Okay, cultural change in an organization is hard. We've all experienced that. And so, do you think that, in this case, you know, moving into that humble inquiry, I mean, that's a big shift for people more so for an organization. Do you think it had to get as bad as it got? Was that the main impetus or, I mean, what do you think was the environment that allowed this kind of a shift to happen?

- So, I think that's a great question, because what we started to see with our leadership was a genuine desire on their part to improve the system. What they had to do was move to introspection to see what their role was, and we literally redefined the role of leadership. For a long time in the Forest Service, leadership was there to provide answers. Well, they're not the answer people anymore; they're now the people that create an environment wherein people can be successful.

And when they recognize that shift, then they naturally move to a different place because they had a genuine desire to make things better. Four hundred line of duty fatalities, that's a lot of people. And they got tired of going to funerals, and they got tired of beating the same drum and getting the same results. So, they literally sought different results. And it's in that moment that we had that transformation.

Now, I'm not going to say that culture is only steered from above, because simultaneously in the field, the field was pushing back and saying, "Investigations aren't giving us results that we can use. They're telling us that we're messed up. They're not telling us how we can be better." And so the field themselves wanted to be held to account. And so the chief of the Forest Service literally sat back and started to think about what accountability is, and he shifted the definition of accountability from the traditional one we all have to this: following an incident, we are accountable to learn all we can from the event.

Now that's a huge shift. And when the leader of the organization, the chief of the Forest Service says that, you've got a pretty good impetus for change. Aside from that, we actually had leaders who desired change.

- Have you ever faced any pushback or anyone who's asked essentially, you know, if you throw out traditional forms of investigation, are we essentially letting people get away with incompetence if they do make an error? I know you don't agree with that statement. That's why I'm bringing it up.

- Of course not. If you take a person who's entrusted by the organization to do something as significant as fight a wildland fire, you placed trust in that individual. And then if everything goes well, you reward them for doing what they do. But if they do the same thing and the outcome is terrible, then you blame them for it. If you did this with a dog, you'd never housebreak the dog, and the dog would likely cower in the corner and bite you.

You can't treat people that way. You can't treat people as though they're trusted members of the family until something bad happens. And then once something bad happens, now they're outcasts. When you do that, you're sending a message to everybody in the field that they're not valued. They're not valued for their innovation. They're not valued for their minds.

They're not valued for their abilities. Are you saying that we're just machines and we should just process? Because that's not what firefighters do and that's not what people want.

- And their expertise, right? I mean, there's a lot of learning that goes into having the skill to do that kind of firefighting.

- And that's another thing. That's actually a good point that you say about expertise because one of the places that we studied was how experts performed, and what is our expectation, what is our reliance on the expert, right? And what we came to understand was that we expect our experts to know how to improvise. We expect them to apply rules to situations and adapt them as needed.

We expect them to use complex adaptive problem-solving or critical thinking skills to achieve results. We expect them to use their intuition and to know when to depart from the plan. And we also expect them to apply intuitive knowledge based on their experience, their practice, and everything else that they bring to the table so they have a competitive advantage through innovation.

And then the last thing that we expect of them is we expect to understand that experts no longer rely on rules, guidelines, or maxims. Instead, they have an intuitive grasp of situations. And they only go to an analytic approach in novel situations. So, when we recognize that, what we said was we've got to improve our ability in the field to recognize when the system is delivering the unexpected.

That's what we've got to focus on, not chastising them when they stay within the process and flow we tell them to follow, but instead helping them to recognize when the system is delivering the unexpected, that novel situation, because they will then naturally go to the analytic approach.

- You learned some things during this process from cognitive scientists, I believe, that have to do with routine novelty, critical thinking, like, the types of skills that need to be encouraged. Can you tell us a little about that?

- Yeah, sure. So, cognitive science told us a lot of things, but the main thing that it told us was that people don't always act deliberatively. They're not always making choices. Many times they're acting intuitively. When they're in that intuitive space, they're not deliberating it. And so that made us question immediately something that you see in accident investigations, in traditional investigations.

They chose to do this. Had they chosen to do something else... Counterfactual argument. Had they chosen to do something else, the accident never would have happened. The idea that they chose to do something is in itself a judgment. And one of the principles of the learning review is to suspend judgment, so immediately we had to take on choice. So, we turned to cognitive scientists to say, "When is a choice a choice and when is it not a choice? And is everything that has a bad outcome, the result of a bad choice or is it something more seriously insidious? Is it they didn't make a bad choice but the only thing that they had left were bad choices?"

And so once we started into that dialogue, we started asking questions very differently. We didn't ask our people questions that dealt with choice. We developed questions that helped us to understand the context around decisions and actions. And so we literally stopped differentiating between decision and action. We recognized what the cognitive scientists told us: that everything a human being does is someplace on a scale between a decision and an action.

Sometimes it's a little bit more toward decision, and sometimes it's a little bit more toward action or intuition, but it's always on that scale someplace. But the point is, why argue about it? Because it's not the act itself that's the most important thing. It's what led to the act. Why did the act make sense to the individual in the moment? That's what we had to understand.

And so our whole line of inquiry shifted from the individual literally to the conditions that surround decisions and actions, and that's the major theme of the learning review.

- And that's where we bring in just that recognition of complexity, right? Every context is different. You know, a pilot that's taken off a hundred times in the same plane, you know, the wind conditions are different, or I mean, there's all kinds of things. They didn't get enough sleep last night. So, let's talk about complexity.

One of the tools that you like to use is called a Network of Influences Map. So, can you tell us what that is, how it's useful to you, and how it may be useful to say a client that you're consulting for?

- Absolutely. So, a Network of Influences Map starts with some decision/action that occurred normally, and we ask ourselves, what are the conditions that supported that decision/action? And we look for things like you mentioned, things like tired. That's up there on the list. Are they fatigued?

We also look at other things like goal conflicts. Is the organization saying that they have to do one thing like be safe and at the same time demanding production? We want to identify that. Is the organization properly providing tools for the individual? Are they deconflicting messages or are they establishing goal conflicts and holding them up as the most important thing in the organization?

Where does the organization place a priority? What's being measured, rewarded, and punished inside the organization? How is the organization promoting? How is the organization recognizing? All these are the kinds of things that we want to put onto a Network of Influences Map. And we find that even in small, simple incidents, where there might only be a population of 10 things on the Network of Influences Map, there's a really interesting relationship between those items that we never would have seen had we not depicted them the way we did.

And so we start out by looking at the conditions themselves, and we literally take a Post-it notepad, and we put a condition on a Post-it notepad and throw it up on the wall. Then somebody else will notice another condition, put it up on the wall. And we keep doing that until we've got the wall fairly populated with Post-it notes. Then we step back and we say, "Are any of these things related to one another?" And so we allow the data to self-organize.

So, in this way, it's the data that's driving the system, not some sort of pre-arranged hierarchy. The net result of that is we end up with headings that become really important to us. Some of those headings might be things like principles of operation, or rules and regulations that didn't make sense. Some of the headings that might come out of it might be procedural confusion.

I can pull up a whole bunch of these. I should have one right in front of me, but I don't. I can pull it up. Let me just pull it up real quick so I can talk to it more intelligently.

- And while you're doing that, I think that just the act of mapping those relationships does something in our brains to help us understand them a little bit better rather than, should we start writing, you know, just a list? The spatial...you know, you can play with that. Things that are more closely related can move more closely together, and that sort of thing.

- So, here's a great example. So, one of the things that we recognized was there were conditions that affected the margin, and you can say margin of safety or margin of maneuver. The British like to say margin of maneuver. You can say there are conditions that affect the margin. Some of those things are like trade-offs between efficiency and thoroughness, the operational capabilities of the organization, communication breakdowns, time pressures, physical and mental conditions, the environment itself, expected or unexpected hazards, social interactions, organizational expectations, political constraints and pressures.

And how about the behavior of the fire? Those are the kinds of things that will emerge as we're starting to map those conditions. And that was actually right off the list of things that affected margin. What we see is that by letting the data lead us though, what we don't do is we don't go out and look for things to populate. Like, we don't go out looking for organizational pressures.

Instead, we allow the organizational pressures to emerge. Other things that will come out of it are things like emotions, tactical pressures, assumptions that workers have about the system, or a sense of customer service. All those kinds of things emerge as we do these Network of Influences Maps.

- And just to describe for our listeners, because I've seen one of these. Imagine a chart or whether it's physical Post-its but some kind of chart with little bubbles with information and, oh, a hundred different connections going in all directions at the same time. Like, you could say a spider web, but spider webs are organized. But, yeah, it self-organizes through the mapping of the relationships, that's what you're saying.

- And that's the way we end up developing an understanding of the kinds of pressures that existed at the time. What we want to do then, once we've got the initial Network of Influences Map, is we want to move away from the accident and into normal work as quickly as we can, because we don't want to spend a lot of time on things that don't exist in normal work. So, we hold a focus group, and we show them the initial map, and we say, "Do these conditions exist in normal work? Are these things we should spend time on?"

And when we have that first focus group... And we've done this with as few as three people and as many as 45 people, depending on the complexity of the event. So, sometimes it's cumbersome, and sometimes it's really streamlined and really easy. It depends not on the severity of the accident but on the amount of learning that we can bring out of the situation.

So, we look at learning potential as opposed to severity to determine how big a response we want to create, which is also something very different about the learning review. In most accident investigations, the size of the investigation is based on the size of the loss, whether of structure or of life. We don't think about it that way, because you could have a catastrophic failure of a large system that was completely mechanical in its origin, and very little learning can come out of it.

We don't want to spend a lot of money on that. We want to spend money where we can really learn something. And then we want to think about what kind of learning products will emerge out of this. And remember, when we think about learning, we think about the organizational learning. So, we'll think about learning products for leadership, learning products for middle leadership, sometimes logistics, often for the frontline operator. Everybody will get a different learning product.

And that emerges largely from the second iteration of our Network of Influences Map coupled with what we call a complex narrative or a storyboard. So, the complex narrative is something for which I really think storyboard is a better name, because when we create a learning product, we create a learning product that is tangible for people, something that they can relate to, something that they can have empathy around.

Instead of judging, maybe move to the space that was occupied by the people in the moment so they can understand it from that perspective. So, we don't resolve perspectives, and we don't attempt to tell a singular truth based on factual information. That's not what we are. We're telling a story, and we're telling the story of those people who were involved with the incident from their perspective.

Because of that, we may have perspectives in the story that differ from one another. And that's the richness of the storyboard because that shows that people didn't really understand what was happening, or they didn't have the same understanding of what was happening. The follow-on question for leadership and for field personnel would be, was there a way for them to recognize that in the moment and address it?

Was there a mechanism for them to share that information, that they had different perceptions during the event, or was that only something that could be uncovered once they sat down and talked about it after the event? And that generated another learning process, which is after-action reviews. And so after-action reviews started to move toward a story of multiple perspectives rather than an accounting of what occurred from the perspective of some factual recognition of what we believe to be true.

- I would think that the storyboard idea also is a way of almost mapping blind spots, right, and assumptions.

- Yes, yes, exactly.

- That would be missed in the traditional thing. I want to ask a little bit about...I want to stay on the methodology of narrative but also ask about terminology. So, words like investigation and witness statement as opposed to the narrative and the learning, changing these words, is it kind of window dressing, or do you think it has an impact on the outcome of the activity?

- It has a profound impact on not only the outcome of the activity, but also on the people who are conducting the review. When they're investigators, imbued with that title of investigator, there's a certain amount of ego that gets stroked in that moment that we don't want to stroke, right? So, instead of doing that, instead of propping the investigator up as the investigator, we put them into a mode of facilitation, and their goal is to facilitate learning.

So, for that reason, they're learning team members. That's what they're called. And there may be a learning team leader because sometimes we need that. We need the person that is going to liaise with law enforcement and liaise with the community and with the families. So, we ascribe that level of hierarchy, but when the doors are closed, it's a very planar relationship.

All rank comes off and everybody is free to talk to one another about what's going on. So, we try and establish that through language and through an understanding of what our purpose is. And the language has to support the purpose. If the purpose is judgment, stick with investigation. You're going down the right track for that. But if the purpose is learning, let's start to change the language around that so that people start to see what their role is, start to believe it, hear it, see it, every place that they look, they're invested in learning and facilitating that learning.

That is very profoundly important. It also makes a big difference for the family. So, one of the things that you asked at the beginning is, what's been the outcome? Before the learning review, we were pretty well-guaranteed that a family would sue us; the Forest Service's organizational response to incidents and accidents, the Serious Accident Investigation process, resulted in innumerable lawsuits. Once we put the learning review in place, that changed, because the families of the fallen firefighters, the families of the survivors, and the survivors themselves all became part of the process.

They weren't held at arm's length as though this was some secret thing that was happening in the shadows. No, we're right up out front, open, here's the process, here's what we're doing, and I literally... I can remember this very vividly because I broke the rules this day. It was my birthday, and I make a habit of never working on my birthday. And this was the first and only time that I actually worked on my birthday.

But the family had gotten together, and we were about to release the report, the Learning Review Report on this one fatality, and the family had gotten together in Spokane, Washington for a firefighter gathering, and they invited me to come. And I spoke the day before about the learning review process to the entire body of firefighters and family members.

And the next day, I sat down with the report, and I gave the family the report. They were the first ones to see it. They got the first crack at it, not leadership of the organization, not some approval process, but the family, and lawsuits went away. Understanding emerged. And what did we end up with? A "thank you" instead of a lawsuit. And that's been the story of the learning review since we put it in place.

- It really shows a shift in priorities when the family is, you know, not just included but prioritized.

- Yes.

- And I think really families, I mean, they don't want to sue. They don't want to lose their loved ones, but if it happens, they don't want to be in this adversarial relationship, which I could just see festering and feeling toxic and that sort of thing. Whereas, here, the organization is showing that, hey, things went bad, and we know this, and we recognize it, but we are committed to learning from it.

So, I think...

- And the families truly saw that. And I told everybody who I taught this process to in the Forest Service, I told them and I very rarely tell...usually I speak with, not to. And I told them, I said, "I want you to be able to go to wherever the accident happened a year after the accident, and I want you to be welcomed when you walk in the door."

You should not be adversarial with these people, you should be acting from a place of empathy. These were good people the morning of the accident, they were good people after the accident as well. That didn't change. Don't treat them as though they're not. Don't hold them at arm's length. Don't make things secret. Be upfront, be open.

And as a result, we worked with HR, human resources, to develop the process as well, and we were able to start putting out reports that were completely unredacted. People really liked that. Up until this point, if you go on the Wildland Fire Lessons Learned website, and you look back at accident investigations, you'll see all these black lines. And the very first thing you see or think when you see that is, what are they hiding? I said, "I don't want any of that.

We want this to be out there, and we want to show it to people and get them on board with the learning," because the last thing you want to do is put out a narrative at the end of an accident and have the very first person that reads it, who was close to the incident, say, "Well, that's crap. That's not the way it went down." That's not what you want because that undermines the entire value of the report, and that's what we were doing.

Fundamentally, the accident investigation process was resolving the circumstances and actions during the event into a plausible story. It was a fictional account that made sense only to the investigators. And when it went back to the field, most field people said, "Nah, it didn't happen like that." That's not the way you want it to go down. And the same with recommendations, right? The very last thing you want to do is put out a new recommendation and have the field go, "That'll never work."

So, we again hold focus groups. We bring people in who are going to be involved, who are going to be affected by any kind of recommendation that we're going to make. We bring them in and we say, "What kind of recommendations can we make to systemically improve this entire work environment? What can we do?" And we go to focus groups for that. We don't believe any longer that the investigator's hands are blessed somehow miraculously by some entity, and they're suddenly imbued with all this knowledge.

Instead, we recognize that we're again facilitators, and to facilitate, you've got to go and seek expertise to find out what the best interventions are going to be. And that's kind of my most recent work. My most recent work is now in how can we build a system of interventions that fits both simple and complicated systems as well as complex systems and the difference in interventions. That's where my research is going right now.

It's fascinating. And I actually have a couple of students that are working on this right now in their master's program, and they've turned pretty good capstone projects into workable items for field application, in the power generation and power transmission industries in particular. But the idea again is that they're not finding the answers. They're seeking them from the people who do know the answers, going to the field and honoring expertise.

And that's another big difference about the learning review.

- I was going to say how... When you're setting these up, you were talking about a learning review team. What factors go into the decision of how many people are on the team? You said sometimes that team has a leader, sometimes not. How do you decide how to structure a team? And is it different for each incident?

- Yeah, it is. So, we had a very simple...a very complex. I should never have said that. We had a seemingly very simple incident occur when a laptop was stolen that contained personally identifiable information for everybody in the Forest Service. And the reaction on HR's part was, "We have to fire the individual because they broke a rule. They left the laptop unattended, and it was stolen."

So, being the context guy, I, kind of, wanted to understand what the context was. And I also wanted to see if there were any rules that were in conflict with that rule, because there is a rule that you can never leave a laptop unattended. By the time this came up, I was up in the senior leadership of the organization, and I was sitting with two other leaders, and I noted that neither one of them had their laptop with them.

And so I said, "Where is your laptop?" "Well, it's on a desk downstairs." "So is that desk attended? Well, then you're in violation of the rule. Where is your laptop?" "It's in my hotel room?" "Well, that's really unattended, right?" In order for you to comply with this rule, you just about have to have the laptop surgically inserted someplace in your body because there's no way that you'd be able to have your laptop with you at all times.

It just doesn't make sense. And so that immediately put a little doubt in the minds of the leaders that were there. So, then we started looking at what were the conditions that supported this decision/action to bring this laptop home because HR was ready to fire this guy, right? So, I go to this facility, and I see the guidelines for transporting laptops carrying personally identifiable information is placarded in the elevator, not in just one elevator but every elevator in the building.

And there it is. If you're going to transport a laptop with PII, this is what you're supposed to do. And guess what. He was in complete compliance with that regulation or that guidance. So, he did that. And here's this terrible employee that HR wanted to fire. The guy's going home for knee surgery.

He is the only person in the Forest Service that can approve other people to approve pay. He feels like his job is pretty important. It needs to go on. But he's going home for knee surgery and he says, "I want to bring my work home with me, so I can continue to do my job while I'm convalescing." That's a pretty dedicated employee. He's not getting paid any extra to do that, right?

He's bringing this home for a darn good reason. And as we come to find out, technically, he's thwarted because there's no place that he can remotely access the database. So, he had to download the entire database of Forest Service personnel to this laptop computer, which he did, but it's an encrypted hard drive, and it's protected by several layers of protection, including passwords and a personnel card with a microchip in it that he had to insert into the computer.

So, all these layers of defense are in place, and he feels like it's a pretty safe operation, and he's doing a good thing for the organization. So, I explained that to everybody, and I show them the Network of Influences Map, and everybody in the room goes, "We don't have to fire this guy. Let's try and figure out a way that somebody can do their job remotely safely. IT get on that. Let's fix that problem instead of fixing a broken worker."

- Yeah, that's another benefit is finding the real problem.

- And that team was three people, myself and two supervisors. That was the entire team.

- Yeah, that was seemingly simple. So, what about something that is quite complex, like a series of perhaps unlikely events in, like, an aviation situation where there are a lot of factors to take into account? Do we know off the bat that we're going to need x number of people, or how does that work?

- We established a response team protocol for serious accidents that would bring large media or public attention. And that included public information officers, that included law enforcement officers, so that they could liaise with the local law enforcement folks. In some cases, we would actually bring in a flight surgeon, because flight surgeons have a lot of skills, not just the obvious ones that you would think of, but flight surgeons are also really good at asking questions.

And we also would bring in, at that point, a team leader, an assistant team leader, and then somebody who's senior executive service in their agency to talk with other senior executive service in the agency. So, we established a hierarchical system around that. We call that the Coordinated Response Protocol where we establish all of those levels. And then in that is the learning team nestled within that larger structure as the learning team.

And the learning team responds directly to that senior executive service individual. That's for something that's going to have a lot of public scrutiny, or where there's a great deal of community liaison that has to take place. A lot of times, accidents in the Forest Service happen in small communities of firefighters, and we have to navigate that space.

We have actually engaged psychologists to help us craft messages to those families, so that we don't put the families in a terrible situation, and so that we can be assisted in recognizing when the situation is getting to be too much for the family and how we can bring in other people to help us with that. One of the key ones was a learning review we did for the Bureau of Indian Affairs. And I was running the learning review because they actually asked for a learning review.

They never did before this, this was their first. And I had to go to an Apache Indian Reservation, and I'm Native American by birth, and so I have a little affinity for where they are. But we also brought a senior executive service person who was from the local pueblo here in Albuquerque. So, we had Native American presence, and we could speak with them from that perspective.

That becomes really important, to meet people where they are socially, and racially in this case, so that we can actually engage the community to help learn. And we knew that this was successful when one of the Apache hotshot crew members was sitting in the back and he wasn't speaking and his head was down, which I know is not unusual, and he's doodling.

I think he's doodling. That's my image of what he's doing. And I finished speaking, and I think John Waconda started speaking right after that. And while I was sitting down on the side, he came up to me with what he was doodling. And he had drawn a hogan, a traditional dwelling. And he said, "This is what you're creating, you've created a hogan and now it feels like I can enter."

And that's what broke the ice for the conversation. And it was all just in being forthright with them in showing that we're not veiling this and showing them that we're not blaming the person who died. We're not going to try and look for something criminal first, which very frequently accident investigations do is they try and rule out criminality. We're not going to do that. So, if something criminal emerges, we'll handle that, but we're not looking for that first. So, once we explained all that and he showed me the drawing of the hogan, everything really opened up in terms of communication.

And that's the breakthrough kind of moment that you want to have. You want to develop that trust. And there's a lot of different ways to do it. A lot of it is just listening and engaging with empathy. Here's another example. Fires are broken into different hierarchies, and this was an emerging Type 3 fire, which means it's becoming much more complex. And the guy that was in charge of it was a trainee, and he was a very thoughtful individual.

Really had a lot of concern for the families of the two fallen firefighters, had a lot of concern for how the community would relate to this because, again, small community. And I started to figure out who he was before I met him. And just as a happenstance, I found out that the morning of the fire he woke up, and a rat had come and eaten the shoelaces out of his boots, literally eaten the laces out of his boots.

So, the first thing that he wakes up to is not just the complexity of the fire, it's also that he's got to find new laces for his boots. And that's how he started his day. And so my first question to him was, "So, how did you start your day?" And he just unfolded with a story about his laces. And what it meant to him, the small, simple little thing, and what it meant to him in that moment to face something like that in the looming cloud of smoke that was over the top of him, it was just an amazing little moment for him.

And that allowed him, again, to stop and relate. And that's what you want to look for. You want to be able to relate to people and allow them to share their perceptions. And don't change them. Don't change their perceptions. Those perceptions belong to them. That's their reality.

You can't create another reality and superimpose it over the top of that based on some factual information that you gathered that conflicts with their reality, because most of that's only available after the incident. You have to appreciate the only information they had was the information they had at that time, and that's what they were making sense of. And that's what you want to understand. And don't change it to some counterfactual argument about, "Oh, well, if they turned left instead of turning right, so from now on, all trucks will turn left."

You don't want to turn it into something like that. It's ludicrous. What you want to do instead is understand where they were, what they were dealing with, and what their perception is, what their reality was, and don't change it. After we've taken our notes on this and we've written it up, we'll go back to the individual who provided us with that information, and we'll ask them, "Did we get it right?" No process has ever done that before.

The reason that we do that is because we don't want to change their perception. And we know that they're going to think about it, their perception is going to change, but that's not the important part, right? The important part is that by the time we're dealing with their perceptions, we've already captured the conditions, the conditions that influenced them at the time of the incident. That's where the real learning takes place.

And then if we can capture their perceptions and learn from that, all the better. That's the storyboard.

- You mentioned earlier learning products. Can you give me a couple examples of what those might look like and if there are specific situations where you would use a particular format or anything like that?

- Yeah, sure. Anything's open to us, right? So, you've probably seen these RSA Animates, so, animated lectures. We created one of those around the concept of margin. That was a learning product. It's available at the Lessons Learned Center if you Google margin. It's a great little video, and it's available on YouTube.

And it's again designed to help the firefighter look at the world from a different perspective. That's what we're really trying to do, challenge assumptions and look at things through a different lens. Those are learning products for field personnel. There are also learning products for the senior leaders of the organization. We want to change their perceptions often too. Often the crack that we get at leadership, senior leadership, is a short period of time, so it has to be high-impact.

And what we like to use for them are videos that are high-impact videos that usually end with a question, not with an answer. So, what they're accustomed to is closure and getting an answer. The answer is they failed to do something. Well, that's not an answer, right, or it's not an answer that's very satisfying I should say. It is an answer, but it's not one that's very satisfying. So, what we do for leadership is we try and put them into position again where they begin to inquire, and they move to inquiry through a very different way of looking at it.

And, you know, a 30-second commercial can bring people to tears. Look at Coca-Cola, they're excellent at this, right? We can do the same thing. We can craft a message that helps people meet situations with empathy and gets them to ask why did it make sense for the people to do what they did and how can they make that better. That's the kind of learning product we want for leadership.

Often what we'll also do is we'll do that complex narrative or that storyboard and we'll embed questions in the storyboard. And this is a remarkably effective tool. The questions, again, we don't make them up. We go to people in the field, and we go through the storyboard and we say, "Where's a good place to stop and ask a question?" And these middle-level leaders, very frequently middle-level leaders will say, "Stop here because here's the question the field should be talking about."

And again, these questions aren't designed to be yes or no questions. They're designed to elicit a dialogue because the learning takes place in that dialogue. People will defend the positions of firefighters, sometimes friends, and that's fine because as they start to defend those positions, they start to understand some of the things that individual was going through.

And that's when we start seeing learning breakthroughs.

- So these questions, would that be something like, you know, at this point in the narrative, why did so and so assume this? Like, not in a blame way, but what were the conditions that led to this assumption that in hindsight turned out not to be accurate? Is that the type of question?

- We would likely not start with why. We'd probably start with something like what assumptions exist at this point. Very openly, what leadership mode is this individual in? What kinds of things are impeding the ability of that person to send a message? Those kinds of questions. So, you see a firefighter on the side of the road, and he's directing traffic up this one way in, one way out road.

Why did it make sense for him to let the vehicles continue to go up that road, one way in, one way out? What made sense about that? And then what we start to do is we start to look and allow the audience to build their own network map. Well, he sees that people are trying to get back to get their loved ones out of their houses, or they're trying to get their dogs out of their houses, or they want to make sure their house is safe. How many of those questions can we answer without people going up the canyon?

- These questions basically lead to potential systemic changes.

- Yes.

- Right, like, that's where they take you to as opposed to blame.

- Exactly, that's exactly right. And that's what we're looking for. Is there a way we can position people to, in the moment, understand that what they're facing is an anomaly, is something that's unexpected, and move to that deliberative space in the moment, not after the fact but in the moment? Can we get them into deliberation in the moment? And a lot of times we can do that by pointing to examples of other instances.

And it's kind of like scenario-based training, right, at that point. So, here's the scenario, how could you react? What could you do differently? That might be a question that we'd say. Even though that's a counterfactual question, set the stage for how you could be more successful in the future.

What questions would you want to ask of your leadership if you were in this position?

- So, that answers my next question about how do you teach critical thinking, right? Critical thinking is improvising in changing contexts, right? Sometimes you have to say, okay, the mental model I have had certain assumptions, and those are no longer true, so I need to adapt.

- Exactly. But the first thing you've got to do is challenge your own assumptions, which can be a very hard thing for people to do, especially in high-risk, high-stress environments. So, we've got all kinds of labels for this. But the one that probably resonates most with your listeners is plan continuation. So, why did they remain in plan continuation? Well, they didn't recognize that the system was delivering an anomaly, something that was outside the norm. So, how can we sensitize people?

How can we develop a capacity to recognize when the system is delivering the unexpected? Well, what we recognized pretty quickly through the learning reviews that we did was that when people fell into routine, they became more vulnerable as the system was delivering unexpected results. So, you can't meet anomaly with routine. It just doesn't work.

So, we found probably the smartest person in the world about this, and we literally sat at his feet to try and learn everything that we could. His name was Reuben McDaniel. And sadly, he's passed away. But Reuben taught us that the capacities that we want to look for, and the type of things that we want to cultivate in learning products for field personnel is the capacity to make sense of conflicting information.

To learn in the moment, which means sharing information, and to innovate solutions, which is improvisation. So, Reuben taught us that, and we started thinking about ways that we could create learning products in the field to develop those capacities in field personnel. Well, when you start to think about that, those things are inextricably linked to things like Amy Edmondson's concepts around psychological safety.

They're inextricably linked to tangible things like communication networks, the physical ability to communicate. They're also related to trust relationships. So, if the leadership on a fire, for example, isn't trusted by field personnel, there's not going to be an exchange of information. And then that led us to, okay, let's say we've had a successful outcome.

Does that mean that we don't have anything to learn? And we started changing that dynamic as well to say, how can we start to craft learning reviews into after-action reviews? How can we start asking those kinds of questions post-incident? And so we've got a couple of different processes that we developed around after-action reviews as opposed to copying something from the military, which is what we've done in the past.

Different ways to ask questions to challenge those assumptions or to recognize when we were in assumption.

- So in a way, you were redefining success as well, right?

- Really, yeah.

- Success of an investigation is essentially, well, this is what went wrong. All done. Whereas, your success is learning from...

- I had a talk with a bunch of hot shots, and that's probably our best line firefighters. Although the smokejumpers would argue that they're the best hotshots putting [crosstalk 00:50:04].

- Of course.

- And I was talking with them and I said, "So what is your number one metric for success?" And they said, "The line that we put in line, the amount of line we create, so fire line." And I said, "Not the amount of fire line that holds? Now let's think about that. There's a challenge of an assumption." Now, a lot of times when you challenge a rooted assumption like that, you're going to be met with anger.

And I was. I was met with anger in this room, "That's the way we've always done it. We compete for line construction, the length of line, not the line that holds, the amount of line that you scratch." So, when you're scratching line, you're exposing your troops to all these different hazards, and your thinking is that the amount of line that you produce is the proper metric, not the amount of line that holds? I think there might be another way to look at this problem.

And so a lot of it is really in shaping better questions for people or helping them to shape better questions. And that's what we found was, again, the job of facilitator. You go there and you start listening to things. And it's kind of fun being the novice, being the person who doesn't know anything because you can ask some really stupid questions, right? Why are we doing this?

Tell me, why does this make sense, you know? And asking that question in an air-conditioned room before or after an event, that's a pretty good question. During an event, that may not be the question that you want to lead off with, right, during an adverse outcome event. But in a positive outcome event, you might lead off with a question that helps to shift that paradigm of thinking.

And so sometimes being the dumb one in the room, the less experienced person, allows you to get away with all kinds of really stupid questions. And those stupid questions often lift the veil on these cultural things that we don't even touch because we accept them so readily. "Well, we've always done it that way." I don't want to hear we've always done it that way when that way is contributing to 400 firefighter deaths between this date and that date, right?

We don't want that. We want something different. If we want something different, we've got to do something different. So, what does that look like? I'm not going to tell you what it looks like, but I'm going to ask you some questions around it. And that's what we try and think of in terms of learning products. And we're not the only ones to ask those questions. We get senior firefighters who are respected in the community, we bring them into the fold, and we allow them to be the emissaries as well. We let them go out and do the teaching.

And so we provide them with teaching products that they can take out to the field with them to inspire dialogue. That's a pretty good example of some of the learning products.

- It's very difficult, especially in a complex system, to draw a very clean line of causality. But you made all these changes, and you talked to us at the beginning of our conversation about the status quo, what the problem was. All these changes were made. You did mention some success, like fewer lawsuits, but how do you feel...you know, the question that I asked, did it make a difference?

- So I'll answer this in a couple of different ways because there's no simple way to answer it. There is one, but I don't think any process can claim that it has reduced accidents and incidents significantly on its own. I will say that all of the things, the leadership, the Office of Innovation and Organizational Learning, the learning review, all these different pieces working in concert with one another, did contribute to a reduction in accidents and incidents.

That reduction did not come because we simply did something like the learning review. It came because we introduced new language so that the firefighters could talk about things differently. We opened up a space for a different type of dialogue and allowed learning to happen. So, it's not that people don't want to learn. Sometimes they're afraid to learn.

We faced this early on because when we first came out with a learning review, there were a lot of people who said, "Well, we want to find somebody to blame." And I'm like, "Wait a minute, this is counter to my understanding of the entire system. Please help me understand this because I must be a little stupid." And what I came to realize is that some of these folks had to justify their existence on the fireline. And if they could say, "So and so was stupid, and I'm not stupid," then they're saying it can't happen to them.

And so what we had to start thinking about then was, what is the dynamic that's taking place there? And I actually coined a phrase around this called Normalization of Risk. Now we got normalization of deviance from Diane Vaughan after Challenger, and I got to say that I think...I talked with NASA people about this. I think she gave NASA an easy way to wriggle out of responsibility of Challenger by saying there was a normalization of deviance.

What really happened with Challenger and with Columbia was that there was a normalization of risk. They came to believe that they had reduced risk to Ten-to-the-Minus-Sixth, and nothing could upset the maths. The maths were sacrosanct. And so now what we had was a system that was fundamentally safe. And what we like to say is "it's safe until it's not." And that's the mark of a complex system. There's one thing that's not recognized by probability and severity mathematical equations like FMECAs, PHAs, all these other risk mathematics that we do: we get to Ten-to-the-Minus-Sixth, but we don't recognize that what's left is residual risk inside the system, and that residual risk is the uncertainty that's inherent in the system itself, because the system is complex.

And complex systems by their very nature deliver uncertainty. So, that always goes unnoticed because the only thing that we can do with probability and severity is really map the things that we know and the things that we can predict. Well, guess where uncertainty is? It's outside prediction. And so what we end up doing is we say that the system is safe to Ten-to-the-Minus-Sixth, we put that out there that risk has been mitigated to this acceptable level, the field believes it.

Now you've got to try to train them to do something completely different because their sense of probability is very, very low. Sense of severity may not have changed, but the sense of the likelihood of that happening is very low. Can't happen to me, that's the simple math of it, right?

- And that changes the risk. As you're saying this, I'm just thinking Chernobyl.

- Right? Yeah.

- That, you know, there was a faulty assumption that the system was failsafe. And so during a safety test, all these improbable things happen and extra risks were taken by the operators because they thought and were told that, you know, they had a failsafe, and they didn't.

- And the same thing with Three Mile Island. We see this in a lot of different endeavors. Let's even make it even simpler than that. I don't have a child, but I've helped a lot of children learn how to drive because I've had eight exchange students at that age. And what I find is when they get behind the wheel, they follow every rule, regulation, policy, and procedure, every one of them, and then slowly the rules and regulations start to fall by the wayside.

And what I noticed is what they're doing is they're normalizing the risks associated with operating that vehicle. This is a normal human endeavor. We all strive to become more efficient, and the only way to do that is to normalize the risks associated with things. So, if we start to question what we're normalizing, if we start to think about it differently, then we start to ask questions very differently. So, NASA started out with a presumption that they wanted to prove that the space vehicles were safe to launch.

Once they developed risk assessments that got them to Ten-to-the-Minus-Sixth, the burden shifted: now you had to prove that it isn't safe. So, when McDonald stands up and says the O-Rings are burning through, he doesn't have enough data to upset their data. He doesn't have enough physical evidence to prove that what he says is a viable outcome. And so because he doesn't have the math to support it, they discount it and they launch and [inaudible 00:57:43.228].

And then you get books like the one McDonald wrote, "Truth, Lies, and O-Rings," which is a really interesting account. The reason I know McDonald is because I flew with his son. His son was a lead plane pilot in the Forest Service, and I got the chance to meet McDonald and talk with him about exactly what happened before Challenger was launched. And it's an amazing story, and it doesn't have anything to do with normalizing deviance, which is, by the way, a terrible phrase.

Deviance, really? It doesn't conjure up the right images for me for some reason, but normalizing risk, that's really what they did. And then he had to prove that their belief, their assumption about the safety of the system, wasn't valid. And people really don't like having their assumptions challenged. And then you start looking at the network of influences around those guys.

- We're winding down towards the end of our time, but there are a few more questions I want to ask you. So, for a lot of our guests, I ask them, you know, imagine that you're a university professor. Well, you don't need to imagine. So, I'll just ask you straight up: what soft skill or, you know, human relationship skill do you think is the most important for tomorrow's safety professionals to be taught, I suppose, or discussed?

- So, discussed is probably better because soft skills, you don't really teach soft skills. Soft skills are learned somewhat empirically. I think that perhaps what you're really asking is qualities. What qualities do we have to have in the next group of health and safety specialists?

- And more so what is... I'm forcing you to choose what is the most important quality. I realize it's kind of a...

- Well, it's kind of an easy one to say because I'm going to use a small word that means a tremendous amount. I mean, it ties in all of the things that we've talked about to this point, and that's empathy. If an EHS specialist, a safety professional, doesn't have empathy, if they can sit in judgment, they will never be effective. They will always be looking for linear causality, and they will never be trying to understand the context that surrounded actions and decisions.

And the learning takes place in that context. So, I would say empathy is probably the most important thing. And that ties in Amy Edmondson's work, and it ties in Edgar Schein's work, it ties in Reuben McDaniel's work. Everybody who really has had their hand in safety. Sidney Dekker, Erik Hollnagel, all of them are tied into the safety network through empathy.

And it's in that caring that cultural change takes place. It's in that caring and that empathy that understanding is built, and then understanding is shared throughout the organization, because we're not going to get there through rote knowledge. That's not going to do it. It has to be through understanding. Rote knowledge has gotten us to where we are so far, but if we're going to make the next big lift, we've got to do something different.

- If you could go back in time to the beginning of the safety portion of your career, what one piece of advice might you give yourself?

- Don't do it. No, just kidding. Wow.

- I end with these impossible questions.

- Yeah, that's a good one. Maybe I should have had you give me the questions ahead of time because I could have thought of something really snarky for it. Go back in time and give myself advice. I would say probably the thing that I would say to myself is: be patient. This is a cultural shift that I'm asking for, and just as I expect empathy to be extended to the people in the field, empathy has to be extended to the leadership, and the organization, and to the people who you're trying to bring over to a different way of thinking as well.

So, I would say probably patience. And I don't think I had that in the beginning. I think I could have used that little piece of advice in the beginning, to be a little bit more patient. I wanted it to change right away.

- I think those are things that we... Patience is something we learn as we get older but also it makes sense. You're fired up about new ideas and ways to change and you're trying to drag people along with you and it just doesn't work that way usually.

- Yeah, and I think at that stage, it's more of a push, when if you're really going to create a cultural change, it has to be more of a pull. You have to draw people to where you want them to be; you can't push them to be there. And I think that's what I did early on. So, I guess my advice would be patience and don't push. Pull, don't push.

- How can our listeners learn more about some of the topics in our discussion today? Are there books, websites?

- There's a university program where I teach. They're welcome to join the Advanced Safety Engineering and Management program at the University of Alabama at Birmingham. It's a fifteen-month program to get you a Master of Engineering, and five of the courses deal with this subject specifically. In addition to that, my wife and I have a consulting company, Dynamic Inquiry, so we can be hired to help your company navigate this.

We often work in spaces around leadership, but of late, we've been asked to present the learning review to a spectrum of different companies. So, we've got a major airline that's using it, another major airline that's interested in it, a major rail network that's using it, and an organization that develops explosives that is having us come out and teach the learning review to them.

So, that's another easy way to reach us: Dynamic Inquiry, LLC. You can always just hit me with an email, and I'll be happy to answer questions. And I'm on LinkedIn, and I talk frequently on LinkedIn. Although right now, I haven't been on LinkedIn as much because we're writing a book about the learning review and cultural change. And that book should be out by December.

- Do you have a title yet or is that still waiting?

- No, I don't write the title till the last. It's the last thing I do [inaudible 01:03:14.364].

- That makes sense, that's...

- Otherwise, the logo would be with me.

- All right, well, that is all the time we have for today. Thanks to our listeners now in 50 countries. And thanks so much for lending us your time and your ideas, Ivan.

- Thank you so much. It's been a real pleasure, Mary. I really enjoyed it.

- My thanks as always to the "Safety Labs by Slice" team, we're always learning together and I love it. Bye for now.

Dr Ivan Pupulidy

Professor at the University of Alabama at Birmingham and HOP/New View Coach at DYNAMIC INQUIRY LLC

The university program where Ivan teaches: School of Engineering – Advanced Safety Engineering and Management | UAB

Ivan and his wife’s consultancy business: What is Dynamic Inquiry?