Gareth Lock
EP 43

Subaquatic Safety Lessons

This week on Safety Labs by Slice: Gareth Lock. Gareth shares his extensive diving safety knowledge and expertise to help EHS professionals enhance workplace safety management. Scuba diving faces similar pressures to high-risk industries, so Gareth’s innovative and meticulous guidance on risk, human error and accident investigations is highly transferable to workplace safety.

In This Episode

In this episode, Mary Conquest speaks with Gareth Lock, the owner of The Human Diver, an organization that improves the safety and performance of divers through an understanding of human error. Gareth’s safety career spans over 20 years, and he uses his experience in managing high-risk operations to help HSE professionals improve workplace safety.

After sharing his fascinating journey from flying in the sky to diving underwater, Gareth explains why attributing accidents to human error actually hinders learning. He also recommends managing uncertainty rather than risk (numerical probabilities that can’t account for human reliability) and gives practical guidance on how to make this shift.

A key focus of this fascinating interview is the importance of safety pre-briefs, debriefs and validating plans through training: “It’s much better to sort out problems on the surface before you go underwater”.

Gareth introduces the concept of inclusive ‘learning reviews’ that uncover a safety event’s rich context and ‘second stories’. They help EHS professionals move past the immediate cause and blame to understand and address the complex combination of converging factors.

An underlying theme to Gareth’s interview is the need for organizations to shift from a blame culture to a ‘just culture’ - where the workforce feels psychologically safe to speak out, and leadership recognizes that human error is inevitable.

Gareth is as passionate about safety as he is about diving, and his insight applies to any environment where people are involved.

Transcript

♪ [music] ♪ - [Mary] My name is Mary Conquest. I'm your host for "Safety Labs by Slice," the podcast where we explore the human side of safety to support safety professionals. We move past regulations and reportables to talk about the core skills of safety leadership: empathy, influence, trust, rapport. In other words, the soft skills that help you do the hard stuff.

♪ [music] ♪ Hi, there. Welcome to "Safety Labs by Slice." Today, we're going to have a bit of a wide-ranging conversation about risk, human error, and investigations in the world of scuba diving, where there's really very little margin for error.

While diving may sound a bit niche and it isn't entirely under the jurisdiction of regulatory bodies for workplace safety, it faces a lot of the same pressures and tensions as more prevalent high-risk industries such as oil and gas. My guest today is Gareth Lock, the owner and director of The Human Diver, an organization that improves the safety and performance of divers through an understanding of human error.

The Human Diver also delivers training programs focused on developing non-technical skills and a just culture within the diving domains, for example, sport, scientific, public safety, military and commercial diving. Mr. Lock's safety career spans over 20 years. He's the author of "Under Pressure: Diving Deeper with Human Factors," and he also produced the documentary "If Only," which looks at a fatal diving event through the lens of human factors and a just culture to better understand how the event occurred in the way it did.

Gareth joins us from Malmesbury, England. Welcome.

- [Gareth] Excellent. Thank you very much, Mary. Nice introduction. Thank you. Yeah. I think it pretty much covers everything. Thank you.

- Okay. So I wanted to start off saying while you're firmly associated with underwater diving, your career started actually up in the air with the Royal Air Force. Both flying and diving are high-risk activities with little margin for error, so I'm curious, what attracted you to these pursuits? Adrenaline, exploration?

- So, it's quite boring actually. I mean, the reason for joining the Air Force was probably the best recruitment tool the military ever had, which was Top Gun, the original one.

- Yes.

- And so, you know, I joined the Air Force, the Royal Air Force, in 1989 at 18 years old. I went straight from school, joined, and wanted to fly. And got involved, really enjoyed it. Found it a challenge, went through, and had a number of different jobs, which gave me quite a broad view.

So I did eight years of flying on Hercules transport aircraft. I did a couple of staff jobs. I did a master's degree in aerospace systems, so a crash course in systems thinking. Then went to do some flight trials work, then went to do some research and development work like DARPA does, but basically for the UK. And then my last five years was as a requirements management systems engineer.

So very broad professional career. But in '99, I was on holiday in Greece and I can't sit on the beach and do nothing. So I was like, "Right, what can I do?" Went to go and find a little sailing yacht to hire, but they wouldn't sell it...they wouldn't hire it out. So I walked through the harbor and went, "Oh, diving. That sounds interesting."

So I did my PADI open-water dive and loved it. Really good. But looking back, I realized that I didn't know what I didn't know. So on that very first course that I did, we broke the standards straight away. We were limited to 18 meters, so about 60 feet, but actually, we went down to 24 meters. So it was about 80 feet.

And I went to write in my logbook that I'd been to this depth of 24 meters. And the instructor said, "No, you can't do that. We're only allowed to 18 meters." So I was like, okay, signed 18 meters. So then I didn't dive for about five years, and when I was on a trip to South Africa with work on the master's degree, we went diving.

And nobody checked my card. We didn't really have a checkout dive. It was a bit of winging it. And that's what happens in a lot of cases. And it was like, "Yeah, I want to get back into diving." So months later, we go to San Diego and on the Saturday I dived. We did three great dives with a guide all to limits, all to standards.

And it was like, oh, okay. And then on the Sunday, I went diving to a depth that I shouldn't have been at, and I had a near miss. And I got back to the UK and, you know, human factors and crew resource management, incident reporting, just culture, they're the norm for aviation. And it was like, "So how do I report this near miss that I've had? Where's the culture for doing this?"

And there isn't any. Or there certainly wasn't then; it's improved since. And it was like, "Oh, I'm interested." So I then spent the next five years learning more about human factors in diving, and I was progressing my diving career as well. And it was really about doing underwater photography. That's what floated my boat, because I did a lot of topside photography, and I started to develop my in-water skills such that I could go off and do a 250-foot, 75-meter wreck dive with cameras, with helium-based mixes.

And you might be in the water for an hour and a half, two hours. So you might only spend 40 minutes on the bottom, and then you spend another hour and a bit decompressing on the way up to prevent the bends. So what I really liked was the skill that was needed, the precision that was needed, and, you know, the constant working harder, taking that mentality from military aviation of train hard and fight easy.

And it's train hard and dive easy, because then you can deal with these things that are going on. And that progressed through my whole diving career. And then in 2012, I started a part-time self-funded Ph.D. I did that for about five or six years, then knocked it on the head because it wasn't going anywhere. But in the meantime, I was teaching and getting the knowledge out through the book and also the documentary.

So there was a lot of self-validation there when people were looking at this going, this is really important stuff, really interesting stuff, but because it's unknown, it's really difficult to get across. And it's no different than trying to bring crew resource management (well operations crew resource management) into the oil and gas industry. People know it's a good idea, but because it's not mandated, it requires innovative and insightful leadership to say, we're going to put a program like this in place, because it's discretionary, not regulatory, in how it's deployed.

- So you touched a little bit about why you decided to explore human error specifically. Is there anything else that you want to say about that or...?

- So actually, the journey of understanding human error is really quite interesting. So when I first started this journey of learning, there was this perception that 80% of accidents are down to human error. And if we can fix the humans, then everything would be okay. And then actually, the more time I spent researching this, the more you realize that human error is just a bucket that things get thrown into that actually doesn't help us learn at all.

It's the same as saying that gravity's the cause of an accident or, you know, of falling over. It doesn't help us at all. So, you know, the piece is that if we end up with human error as an outcome, you know, that's written in a report that says the cause of the accident was human error, that report's worth very little.

And so it's looking at the conditions that surround that and really digging into what does error really mean? It's not a cause, it's a consequence. Go dig in deeper.

- When we chatted before, you said we don't manage risk in diving, we manage uncertainties. Can you tell me more about the distinction and why it's important?

- Yeah. So, a lot of activities in diving and elsewhere, and I would say the majority of those at the frontline are not managing risks per se, they're managing uncertainties. And to me, the difference is how we look at it. So, often risk is managed using numerical data, quantitative data.

We use risk matrices that we plot the risks on and we've got a magnitude and we've got a likelihood that that's going on. And often when I go into industries and when I go into the diving space as well, so how do you work out the probability of that event? And this is to somebody at the frontline who's trying to work out a risk assessment or a diving instructor at a dive site, how do you know what that is?

That's how most people manage the likelihood of an event occurring.

- For the podcast listener, he's licking his finger and putting it up to test the wind there.

- Totally. Oh, yeah, forgot this is video and audio. So that piece there is often that the probability or the likelihood is based on previous experiences. Now, in established spaces, process safety, what they will do is they'll look at past data, they'll go through a whole bunch of engineering designs and say, right, these mechanical systems have got these probabilities of failure, and we can work through a tree of AND and OR gates.

And we've basically come out with the probability that says it's 1 in 10,000. And you say, that's great for a mechanical system. Can you tell me what the reliability of a human is who's now in that system? And the answer is we can't, not with any sort of sense of usefulness. Even when we look at the error rates of skill-based, rules-based, and knowledge-based performance, they are very much wet finger in the air again.
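
To make the AND/OR gate arithmetic concrete, here is a minimal sketch of how independent component failure probabilities compose through a fault tree, and why the human in the loop has no comparable constant to plug in. All component names and numbers below are invented for illustration; they are not from the interview.

```python
# Minimal fault-tree sketch: composing failure probabilities of
# independent components through AND and OR gates. All names and
# numbers are invented for illustration.

def and_gate(*probs: float) -> float:
    """All inputs must fail together: multiply independent probabilities."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def or_gate(*probs: float) -> float:
    """Any single input failing fails the gate: 1 minus the chance all survive."""
    survive = 1.0
    for p in probs:
        survive *= (1.0 - p)
    return 1.0 - survive

pump_fails = 1e-4    # hardware like this has historical failure data
valve_fails = 5e-5
backup_fails = 1e-3

# Top event: (pump OR valve) failing AND the backup failing too.
top = and_gate(or_gate(pump_fails, valve_fails), backup_fails)
print(f"top event probability: about 1 in {1 / top:,.0f}")

# There is no equivalent constant for the diver or operator in the tree:
# error rates for skill-, rule- and knowledge-based performance vary so
# much with fatigue, time pressure and team dynamics that any fixed
# number here is the "wet finger in the air" Gareth describes.
```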

So if we are doing things that are biased or using heuristics, what we're now doing is using emotions, and we're trying to manage that uncertainty. So, any time we are managing that sort of unknown, what we're trying to do is increase the certainty that the outcome we want is the one we actually get.

So it could be...and we look at both sides, and I'm sure your listenership will know about bow tie diagrams. I use a very simple diagram where we have an event in the middle, and then we have stuff that we can do beforehand to control the uncertainty, to make it more likely that things do what we want them to do and less likely that we get an adverse event.

And then we've got stuff on the other side where the probability of the event is one, now what do we do? So we reduce the likelihood we end up with a catastrophic failure or a fatality. So in terms of managing those certainties, what we're trying to do is drive that probability of event as close to zero as possible through both technical and non-technical means.

And that's where I spend a lot of my time in non-technical skills. So decision-making, situational awareness, communication, teamwork. These, I'm going to say, are called soft skills, but they're really hard, and "soft skills" really undersells how hard they are to develop. So we've got that in place and then say, "What do we do to mitigate the effects afterwards?"

And in the diving space, that could be about first aid training. It could be having oxygen cylinders on board. It could be about doing an emergency evacuation. And then I say to people, "Okay, so when was the last time you validated your emergency action plan?" You've got this stuff written down, but when was the last time you actually tried lifting somebody from depth, giving them a surface tow, getting them on a boat, giving them CPR, and getting the boat onto the shore?

Well, I've never done it. All right. So, your plan is just a plan. There is no validation. So what we talk about with uncertainty management is how to reduce the unknowns that are in there. And that's about going to practice and train and look at those things. Does that help?
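
One rough way to picture the bow tie and the validation point together is as two lists of barriers around the central event, each flagged by whether it has actually been rehearsed rather than just written down. The structure and barrier names below are a hypothetical sketch, not a tool Gareth names:

```python
# Hypothetical bow-tie sketch: preventive barriers sit left of the event
# (reduce likelihood), mitigating barriers sit right (reduce consequence).
# Each barrier records whether it has been validated in practice.
from dataclasses import dataclass, field

@dataclass
class Barrier:
    name: str
    validated: bool  # rehearsed for real, not just documented

@dataclass
class BowTie:
    event: str
    preventive: list = field(default_factory=list)
    mitigating: list = field(default_factory=list)

    def unvalidated(self) -> list:
        """Barriers that exist only on paper: 'your plan is just a plan'."""
        return [b.name for b in self.preventive + self.mitigating
                if not b.validated]

plan = BowTie(
    event="unresponsive diver at depth",
    preventive=[Barrier("pre-dive brief / shared mental model", True),
                Barrier("gas planning to agreed limits", True)],
    mitigating=[Barrier("lift from depth and surface tow", False),
                Barrier("oxygen and CPR on the boat", False)],
)
print(plan.unvalidated())  # what still needs practising, not paperwork
```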

- It does. Yeah. I like the idea that you're no longer focusing on trying to control the uncontrollable, such as the human portion of it, right? Like, we don't know how tired someone will be that day, or how tall they are, whether that will affect what they're doing physically, or, you know, all kinds of things like that. And pushing towards creating more certainty that they can do, or will do, the "right thing."

- Yeah. And it's also teaching people about how to run effective pre-dive briefs or pre-job briefs. And the same thing for afterwards of running a post-dive debrief. And I've put a structured debrief framework together, which is about defining the aims and goals, creating a psychologically safe environment, looking at the planning, and then looking at the individual elements of learning.

So what did I do well and why? What do I need to improve on and how? What did the team do well and why? And what does the team need to improve on and how are they going to do it? And then talk about creating fixes afterwards. And out of those eight questions, the "why did it go well" and "how do I improve it" parts are the most important, because the observations are easy.

Oh, that didn't work out. Okay. Or it did. Why? How do we fix these things? That's a skill, and what you're doing now is increasing the certainty for the next operation. So the brief is about setting people up to understand what's going on and the assumptions.

So, it could be, we're going down to a wreck. We're going to descend the anchor line, we're going to go to the wreck, and we expect to see this. We're going to go to the bow. We're going to go round. We're going to enter this doorway. We're going to go down these corridors. You might see this, don't take that turn.

This is what our gas limits are. And coming up with this bit so that people have got a mental brief. And then rehearse any of the skills, critical skills, that people might not have done before, so drills on the surface. And then when people get in the water, they've got a visualization of what might happen. And that helps in terms of individual and team decision-making, because somebody might forget to do an activity and the rest of the team might flash the lights and go, "Question, are we going that way?"

"Oh yeah." So by having a brief that creates a shared mental model before we get in the water because once we're in the water, the ability to communicate is really limited. You've got a piece...you know, you've got a regularity in your mouth [vocalization]. And now you're just limited to hand signals and you might have a wet note, you know, a waterproof notebook, but now your vocabulary set has reduced from this big to not a lot.

So, you know, it's much better to sort out the problem on the surface before you go underwater. Then you go and do the dive and then you do the debrief to learn, which then sets you up for the next activity.
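
Gareth's debrief framework is easy to capture as a checklist. The sketch below encodes the four paired questions as he states them (the "why" and "how" halves make up his eight); the framing steps and field names are assumptions for illustration:

```python
# Sketch of the structured debrief framework. The questions are taken
# from the interview; the surrounding structure is illustrative.

DEBRIEF_FRAMING = [
    "Define the aims and goals of the debrief",
    "Create a psychologically safe environment",
    "Review the plan against what actually happened",
]

# Four observations, each paired with the follow-up Gareth calls the
# most important part, because the observation alone is easy.
DEBRIEF_QUESTIONS = {
    "What did I do well?": "Why did it go well?",
    "What do I need to improve on?": "How will I improve it?",
    "What did the team do well?": "Why did it go well?",
    "What does the team need to improve on?": "How will they do it?",
}

def open_follow_ups(answers: dict) -> list:
    """Return the why/how follow-ups still unanswered after a debrief."""
    return [follow_up for obs, follow_up in DEBRIEF_QUESTIONS.items()
            if obs in answers and follow_up not in answers]
```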

- So, one thing that you've told me before that you see as problematic in the diving industry is that accidents and near misses, of course, happen but they're not discussed enough. Why do you think that is?

- I think there's a number of reasons for that. One is that divers are not taught, and agencies don't teach, how adverse events occur. So there's no investigation training, there are no organizational learning frameworks in place. So the first bit is it's easy just to blame the individual. If you don't understand systems and how they interact, and how people make, I'm going to say, stupid decisions, how it makes sense for them to do what they do at the time, then it's really easy to jump to the natural conclusion, you know, the fundamental attribution bias: you are stupid because you didn't pay attention to what was going on.

All you need to do is pay more attention or communicate more clearly, or very simple things like that. So the first bit is there. The other bit is that the way the industry is set up, it is not designed for organizational learning. So the training agencies create training materials and training programs, and they train up their train-the-trainers.

We might have instructor trainers or course directors. And their role is to train instructors to deliver a program that goes out to divers, who then qualify from a class and then leave the training system. And so the training agencies have no real interest in what happens to those divers when they go out there.

There is an enforced air gap that isolates the organizations from what happens at the dive centers and then what happens outside. So the other thing is there isn't any pull, you know, to gather this. There are a couple of organizations out there, like the British Sub-Aqua Club or Divers Alert Network, but they are voluntary organizations which receive submissions.

But if people don't know what an incident is, they don't know what error is, they don't know what error-producing conditions are, and there's a huge amount of stigma associated with making a mistake. And so we end up going onto social media, because that's really the only place. And it's a really immature environment where people put their hand up and say, "You know what? I made this mistake."

And everybody throws metaphorical rocks at them for being stupid, "Oh, it's so obvious that that was going to happen." And it's where I have my most heated exchanges online with people. And some people open their eyes and go, "Aha, that's what local rationality is about. That's how it made sense for that individual to do what they did." Others are completely closed and go, "No, that was obvious. They should have spotted this coming. And therefore, they deserve to be punished either socially, or they get punished through the litigation process."

And I think the other piece is that when adverse events occur, there's a fear that litigation will happen. And so if you don't talk about things, they don't exist. And that exists at the organizational level. And I've heard a number of stories from different agencies where, if an adverse event occurs, it might be a serious injury or it might be a fatality, the conversations happen by telephone, because then it's not written down, and then it's not available for discovery, because there's a fear of litigation.

So, when I've challenged these individuals and said, well, how does organizational learning happen? By word of mouth, which is great until somebody leaves, and you don't actually share that word-of-mouth context, because people's memories fade. And so, you know, it's not set up as an environment to learn, because it's about pushing the risk to the lowest level possible.

And if you failed, it's because you didn't follow the standards. But there are a number of standards out there that can't be followed in the context of a commercial entity. So it's for you as a dive instructor to manage that risk, even though the organizations know that it is not manageable; it's manageable as long as nothing goes wrong. If it goes wrong, then you're on your own, because you broke the rules.

It's like, but we can't follow the rules the way they're written, so the system has to give you the freedom to do what you need. So, I mean, it's a rich area for investigation and research. And that's where I'm doing a lot of my stuff.

- Well, and you have said too that when you go on speaking engagements or to different events, you often have sort of the hallway confessional. People come up to you and they tell you about near misses because they trust that you will not call them stupid, that you will sort of not judge.

- Yeah, totally. And so, yeah, I mean, and so you get these confessionals. You get these stories. And they go, "Well, I hope that helps, but you can't tell anybody about it." Oh, man. So, you know, the way that we learn is through stories and it's understanding that rich context. And that's really what I put the documentary "If Only" together about.

And it was a tragic circumstance of a diver who died on a training course, and his wife, now widow, was five months pregnant. And, you know, just a really tragic situation. And she was the first person I've seen that has shared what's [inaudible 00:21:13] about what happened on this dive.

And after probably about six months or so, through an intermediary, I made contact with Ashley. And we sat...I basically explained what I was doing and how I was motivated to produce something like "Just a Routine Operation." If any of your listeners are from the medical space, they'll probably recognize this documentary that was produced by Martin Bromiley about the death of his wife in a routine operation.

And so I sat and, you know, basically talked with Ashley, and she wanted to share the story, as did the rest of the dive team. So I flew out to Hawaii and, basically, over three or four days, filmed a small documentary. We did a lot of face-to-camera work.

We did a simulation, a reenactment of the dive, and then we put the video out there. And it's being used by an energy company as their safety week video. We topped and tailed some of the bits, but it is a story that is applicable in any space, because it's about error-producing conditions, time pressures, lack of psychological safety, unclear guidance, lack of leadership, conflicting goals between doing one thing and another, the inability to speak up even when the team are watching and going, "This isn't right."

But you know what? One of the driving factors was that they were all either ex-military or police. So there was this bit about you don't criticize in public; you praise in public and criticize in private. And there was this piece of, we're going to speak to the instructor after this, and it was too late. And it's a really powerful documentary.

And I'd love to be able to do more of that, that explores an event through local rationality. And I use it in my training courses. I provide a very simple narrative, the sort of thing you would get in social media, three or four sentences, and then I say to people, "What do you think happened?" And there are a number of responses back.

They go through the course, and the final modules of the course are about watching the video, the documentary. And then I go back to the question and say, "What do you think happened?" And it's like, "Oh, there's a lot more to this." It's not the simple he was last to touch it, therefore, it's his fault. You can see multiple factors coming together as this sort of convergence of bad things, each of which on its own wouldn't have been an issue.

And therein lies one of the challenges we have as safety professionals. What do you report? Well, it's that. Yeah, but that's not really... But if you add that, that, that, that, that, then we get a failure. But nobody knows what that order is going to be. But if we can raise the bar and we say, look, we're looking for excellence, then actually we can close that down and we can make that good.

We can close that down, we're more likely to have a positive outcome rather than negative one.

- So, talking about that rich context of understanding, we've had Ivan Pupulidy on the show talking about his learning review guide. So first of all, for listeners who missed that episode, can you briefly summarize the kind of approach that he talks about?

- Yeah. So Pup's piece is actually about telling stories. It's about getting people into an environment, and, you know, it works at the fatality level as well. This is not just minor things; these are very serious events. And they moved from what you could consider quite an accusatory approach under the Serious Accident Investigation process into this learning review approach, where you would get people at all levels within the organization talking about how did it make sense for them to make the decisions they did.

How were they managing the risks? What were the trade-offs they were having to make? Because there are always trade-offs in place. You know, in hindsight, you go, well, you traded that wrong. So we have the efficiency-thoroughness trade-off, ETTO. And in hindsight, we could look at that and go, "Uh-oh, that was stupid." But if you don't get their story, then you have no idea how to improve it, because they're the people making those decisions in real-time with incomplete information, with conflicting goals, and a target that's out there that they hope they can get to.

I mean, it's a great thing...and it's a free resource as well.

- So, learning reviews, you've said before, are a great tool. What kinds of pushback from management or leadership do you see, or foresee there may be, and how would you suggest overcoming any resistance?

- I'm going to say the biggest resistance is this piece that says, I have to have somebody to blame. I have to have somebody who's going to be responsible for this or accountable for this, because that's just human nature.

It's also a societal and business piece. And often insurance and lawyers in the legal system want to have an individual to hold to account to say, if we punish that individual, they won't do it again and therefore, we've fixed the system. But in complex systems, it doesn't work like that. We have multiple factors together.

So in terms of approaching the problem and solving it, I'll take an example that I got from Todd Conklin, another, you know, human and organizational performance leader in this space: the Jenga game that he plays. And I've done a slight adaptation to that. So Jenga is 18...sorry, 54 bricks.

You lay three bricks at a time and you build this tower that's 18 layers high. Now I've bought a colored version of that, so I've got six different colors, three layers per color. And what I did, with leaders in an organization, was give them some instructions on the board that basically said, you've got to make a mixture of these layers.

Some of them are solid and some of them are mixed-up colors. And so we had four tables, and all of the towers were completely different because they put them together differently, even though they'd got the same instructions. So then we said, right, what we're going to run now is an exercise. Every two minutes, I'm going to roll a dice that's got colored sides, and we're going to take a brick out of the tower.

You're going to take that colored brick and put it on the top. And those colors were things like HR, legal, work instructions, you know, so they were just categories of the sorts of paperwork and processes that go into an organization. So, funny thing, you roll the dice and people start pulling bricks out of different parts of the tower. So you get to the end of the 12 minutes and people's towers are all different.

They've got different holes in them, even though they've been given the same instructions. So we've now got, you know, a complex setup. So we start building this, and there's a competition for who can get the highest tower. Some of the teams are quite reserved, and they're like, no, because if this tower collapses, we've lost everything. So sometimes there's this game where basically they wait for others to fail and they don't take the risk themselves.

So the winner isn't necessarily the highest performer; they're the ones who can stay in the game the longest. And so when you pull that final brick out, that specific brick, and the tower collapses, then you say to them, "So was it your brick that caused this to collapse?"

"Well, yeah, because I pulled that one. But it was a pretty unstable tower, wasn't it?" There's a whole bunch of holes depending on the different processes you have. So you are just the unfortunate individual left holding that brick and this collapse pile. And what I've found with any of the work that I do is get the point across in a non-confrontational manner, something that they can relate to that doesn't trigger their professionalism.

I use computer-based simulations. I've got a game called Planks. I've got some stuff with Lego. And it's all about getting them to understand the cognitive and social aspects of an operation. And then you explain some of the theory, ask them to reflect and draw the parallels for their own operation, and now they've discovered the answer, as opposed to going in and saying, "Well, you know what? This is the problem with your organization and I have the solution for it." And I don't.

And, you know, it's just like, well, that's a pretty rubbish consultant then because you're not able to sell anything. But they've got the knowledge. All I can do is unlock it and I can teach the leaders how to unlock it. And that's also a difficult piece because the leadership in the organizations often aren't given the time to go and lead.

They're managing paperwork and it's easy to count stuff. It's much harder to go out there and have a conversation and you come back and go, so what did I get from that? What's my metrics? What's my product from that conversation? And go, it's knowledge. It's about shaping stuff going forward. It's about joining different teams, but that's not countable.

It's like, yeah, you're right. It's not. Tough. You know, when we're talking in a complex space, counting doesn't help us. It doesn't. You know, unfortunately, some people want those metrics. And going back to 2015, I gave a presentation on some work that I'd done in the oil and gas sector.

And I stood up on the second day, so this was the inaugural IADC Human Factors Conference, and I said, yesterday we talked about the fallacy of using metrics like total recordable incident rate and those sorts of, you know, lagging indicators. And I said, you as the prime contractors are putting those requirements on the people bidding, because they were talking about how do we move away from this.

So you have to stop putting those metrics on, because the only reason they're coming in with those is that they have to win a contract, so they come up with some fallacious number that says it's 0.9 or it's 2.7. Whoopee. You know, that's as much about luck as it is about the performance and, you know, what's going on. There's the soapbox.

Sorry.

- No, that's exactly what this is for. I mean, I don't know if you can narrow it down, but is there something specifically you would like to see discussed more openly or better understood in the diving industry? All of the above?

- Well, actually, one of the things...so I'm doing a master's at Lund University in Sweden, a master's in human factors and system safety, and I've just come back from the thesis preparation week, so we've got to come up with a research question. And actually, what I'm going to look at is second stories. And it's that ability to create an environment and maybe use something like AcciMap, which is a representation of an adverse event or a successful event, but looking at relationships that exist at, sort of, the work level, you know, the diver level, the instructor level, the trainer level, the agency level, what happens nationally in terms of regulations and societal stuff.

So, you can see these things interacting, and it's a very powerful way to show not causality, but relationships. And that's probably what I'm going to do for my thesis: using AcciMap to explore second stories. And second stories, for those who don't know: the first story is where something happens, an accident happens, an incident happens, and you get the narrative straight away, the time, the people, the immediate cause of what happened.

There's often a lot of blame. There's a lot of focus on individuals. The second story comes out slower. It takes curious questions to understand what's going on. It requires a just culture so that people can talk about, you know what? These are the trade-offs that I have to make all the time to get this job done. But why haven't we had the accidents before?

Because I'm managing that capacity, I'm managing those uncertainties and getting it done. And for whatever reason, something tipped the balance, and I lost control of the capacity, and we had an accident. It's about understanding how busy management are, because often it just gets written off as poor safety culture, poor leadership, that's it. Whereas you can actually find out how busy the leadership is, how busy the management are, and the goals that they've got to manage.

Instead of going through, you know, this expectation: here's the document set, you need to review all of this as part of the validation process, make sure it's okay, the auditing process, before it's sent out there. And then you find out that actually they only had 20 minutes to review it, because they had a whole bunch of other big KPIs which their bosses said were the most important thing. And they know that that document set has to go out because there's a project coming in and they've got to execute it, because there's money associated with it.

So it's going up and out, not down and in, to find out what's going on. So that would be my big thing: how does it make sense? Explore the local rationality behind what's going on.

- Yeah, it reminds me of, and I don't remember who originated this, but the sort of the five whys. Why did this happen? Because of this. Okay, but why was that the condition? Well, because of this. And the second stories, you know, get pulled out in that way.

- Well, they do. And one of the challenges of five whys is that it came from the engineering space. So why did a mechanical system fail? Well, because of this and because of this and because of this. But as soon as you put people in the system, then actually you've got that variability. So five whys is often described as a way of finding the root cause.

The problem is the first rabbit hole you go down sets the decision-making in train. So now you go, why was that the case? And invariably, it ends up as poor management or more training needed because that's how you end up closing these things down.

Whereas with something like AcciMap, or anything that's systems-based, what you're now looking at is multiple elements and their relationships. And with five whys, you know, a lot of the root cause analysis type tools that are out there have this why, why, why, why? And then they start drawing lines between those whys, and it becomes really difficult to try and work out how to address the system.

We can't fix the whole system, but what we can understand is that maybe a bunch of those things go into one node, and you go, you know what? That's a critical node: if we could address that, then actually there's a whole bunch of things around it that we can probably influence. So I would caution people about five whys, because if you go in as an engineer, you'll go in with an engineering five whys.

If you go in with a social perspective, there's... And I gave a presentation earlier this year, or the tail end of last year, called "The Root Cause of an Accident Is Your Imagination," because we create causality based on how we look at things. And people will look for the evidence that fits their view. So actually bringing human factors into a systems approach means you've got to have an open mind.

And the same thing applies in the learning reviews, joining this back to Pup's stuff: you need a diverse audience. You can't just have one or two people, which is often what happens in a safety investigation. You get the safety officer leading it, or you might have one or two people, and then they go round interviewing people individually.

The power of the learning review is you have people together and they will bounce ideas off and somebody will say something, you go, "Oh, hang on a minute. I remember something like that six months ago." "Oh, why didn't we pick that up?" So you end up with these really rich stories by having cognitive diversity, and that's what psychological safety supports.
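
One way to see the difference from a linear five-whys chain is to treat the event as a small graph: nodes at different system levels, edges as relationships rather than causes, and convergence (several threads meeting at one node) as a hint about leverage. The levels, nodes, and links below are invented for illustration, in the spirit of an AcciMap rather than a faithful one:

```python
# Toy AcciMap-style relationship graph. Every node name and edge below
# is invented for illustration; edges mean "shaped", not "caused".
from collections import Counter

edges = [
    ("regulator: no reporting mandate",      "centre: no incident sharing"),
    ("agency: litigation fear / air gap",    "centre: no incident sharing"),
    ("agency: commercial schedule pressure", "instructor: skipped checkout dive"),
    ("centre: no incident sharing",          "instructor: unaware of past events"),
    ("instructor: skipped checkout dive",    "diver: exceeded depth limit"),
    ("instructor: unaware of past events",   "diver: exceeded depth limit"),
]

# Unlike a single five-whys chain, several threads can converge on one
# node. Leaving the outcome itself aside, count inbound relationships:
outcome = "diver: exceeded depth limit"
in_degree = Counter(dst for _, dst in edges if dst != outcome)
node, n = in_degree.most_common(1)[0]
print(f"most-converged node ({n} inbound): {node}")
# Addressing a high-convergence node can influence several paths at
# once: the "critical node" idea Gareth describes.
```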

- Yeah. And I think last time we spoke too, you mentioned diversity in terms of perspective, like not just the frontline, not just the safety person, but leadership because they've got a different context for why certain things were set up, procedures were set up in that way, right? So, you can get different contexts, whereas the person on the sharp end of the stick has a very specific context.

- Yeah. And it's interesting, you know, you go and ask frontline workers, "Why do management do what they do?" Because they're trying to make money. They're trying to, you know, save time, make money. And it's like, you don't think they're interested in safety? No, no, no. Because otherwise, you know, they'd do these things.

And you go, "Have you actually spoken to these individuals?" And that, again, is part of the problem is we have this sort of management silo. We have these work silos. And that's where the learning teams and the learning reviews add massive benefit because you can now hear their story and then you see the reason why we got this set of gloves is because it might have been a contract that was already in place.

And actually, you know what? If we got those gloves, they were about a tenth of the price because they came from a supplier that allowed them to be thrown in with something else. There were commercial imperatives, or whatever they are. It might be that there's a big contract there. You know, why did we pick that work? Well, because actually it's a huge piece, and if we get that, we get some...you know, we can then get revenue, and then we can improve things.

But unless there is a feedback mechanism to understand how this work is done, then those further up the chain have got no idea what goes on at the frontline. And that's, I'm going to say, a relatively easy thing to do technically, physically. It's hard to do socially, to go out there and say, "Right, tell me about your job."

And if you only do it infrequently, the world will smell of fresh paint, because you will go around and everything will be as it's supposed to be. Whereas actually if it becomes a normal activity, it's "All right, John, come and hear what's going on," as opposed to, "Oh, John's here, why? What have we done?"

- Yeah, what have I done? Am I in trouble?

- And, you know, it takes time to build those relationships. And, you know, there are two simple questions that I use when I go into places: what works around here, and what sucks? And you can get so much. And people are like, yeah, but when you say what sucks, you get lots of gripes and bitches and moans about stuff. You go, you're right, but what you've got to do is filter out the little nuggets and go, "Oh, hang on a minute. I didn't know that."

Because I know that pay's never enough and that the conditions aren't there. Yes, we're trying to address those, but it also gives you an opportunity to talk about the rationality from your side. And even getting frontline workers up into the management space to shadow a manager for a day and realize what's actually involved: the drivers and the pressures that the management are under are often unknown to the workers.

- Yeah. So, speaking of different tensions, different factors, something that diving has in common with other high-risk industries is the kinds of pressure it faces and the tensions it needs to balance. And you mentioned Jens Rasmussen's, I'm not sure if I pronounced that correctly, model of risk management in a dynamic society.

So, tell me a little bit about that and how it applies.

- So, the space that Rasmussen put together was that we have three key tensions that are in place for a commercial organization. You've got a financial aspect, which is a boundary. You cross that boundary, you go bust. You've got a resource boundary, and that's about how much workforce...you know, how big's your workforce, the workload you can deal with.

And if you cross that, you can't get the work done. The work will collapse for whatever reason. And then the third dimension is safety. So we've got finance, resource, and then safety. Now, this unacceptable performance line is unknown in location. So if you go bust, you know, you can look at your bank balance and you can see where things are going wrong.

In terms of your manpower and your resource, you can see where you are getting close to that boundary, and you're going to do something about it. The problem with the unacceptable performance line, the accident line, is we don't know where it is. So what we do is we provide a little bit of a buffer, and we call that sort of a margin for error. And we'll say that's what we perceive to be unsafe, this inside line, and we'll operate in this space here.

Now, the thing is that every time management cut costs or, you know, reduce the financial viability, what they do is they push the operating point towards the unsafety line. And when you mess around with the resources and you take that off, you take things out towards the unsafety line.

And then you get the operations people and the safety people, they're trying to push it back this way which costs you more and it takes more people. And so within this three-part model, we have this dynamic place where the system is operating. Now from a purist point of view, the most cost-effective and resource-efficient point is on the accident line or just inside the accident line because...

And the thing is that you don't know where that line is, and every time you take a bite out of safety, you are reducing that margin. Now there's a double problem here: A, you don't know how big a bite you took, and B, you don't know where the failure line is.

So you've got this tension that exists, where you're trying to push safety this way, which costs you more and isn't as efficient. So organizations can navigate that space, but they need to have feedback, and they need to turn around and listen, you know, to the operators, because they're the people dealing with that risk, the frontline risk, all the time.

Management are often unsighted on how close they are if they haven't got any feedback. And if all you've got in terms of your feedback mechanism is a lagging indicator: oh, we had an accident; oh, we had an incident.

- Guess we crossed the line.

- Yeah, exactly. We crossed the line. And you're like, oh, man.

- Oh, we found it.

- So, it's this bit that's...and it's a really simple piece to understand and conceptually explain. When you at the management level make a decision and you change something, you push the margin that way. I don't know where the line is, you don't know where it is until you cross over it, and therein lies part of the problem.

But if you don't understand that you are pushing towards it because of your behaviors at the management level, then actually you are dealing with, you know, what you think is something a little bit more certain than it really is.
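
Rasmussen's invisible boundary can be shown with a toy simulation: cost and resource pressure nudge an operating point toward an accident line whose position is only revealed when it is crossed. Every number below, including the boundary itself, is invented for illustration:

```python
# Toy simulation of Rasmussen's dynamic safety space. All numbers are
# invented; the point is that the accident boundary is unknown until hit.
import random

random.seed(1)
boundary = random.uniform(0.6, 0.9)  # true accident line, invisible to management
perceived_margin = 0.5               # where the organization *believes* unsafe begins
operating_point = 0.3

for quarter in range(1, 21):
    cost_cut = random.uniform(0.0, 0.05)  # efficiency pressure pushes toward boundary
    pushback = random.uniform(0.0, 0.03)  # ops/safety effort pushes back, at a cost
    operating_point += cost_cut - pushback
    if operating_point > perceived_margin:
        # Inside the organization's own buffer: the only boundary it can
        # actually see. The true accident line may be closer or farther.
        pass
    if operating_point > boundary:
        print(f"Q{quarter}: accident. Boundary was at {boundary:.2f}, "
              "found only in hindsight.")
        break
else:
    print("No accident observed, which says little about the margin left.")
```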

- And you also mentioned that there's tensions between learning and judicial systems. So, what are some of those?

- Yeah. So, in the diving space, I sort of touched on why doesn't learning happen. And it's because there's a fear of litigation. And having spoken to some lawyers about this, they think that fear is unfounded. What we see when things go wrong are big cases, but what we don't see is the number of cases that are actually thrown out.

So you might go to sue somebody for something, and they go, "No, there's no case." So we don't see all of those actions that get thrown out. And of those that hit the judicial system, as opposed to the civil action system, there are very few cases that end in judicial punishment, a crime rather than a civil action.

And one of my co-students at Lund is doing some work in the healthcare space about just culture and how it's managed. And he said the reason why people go to civil litigation is because it's a grown-up way of stopping yourself going round there and beating the person up to get your vengeance. Whereas often what is needed is, you know what, Mary?

I'm really sorry that that happened. This is what we're going to do to fix it, and what do you need to resolve it? And Sidney Dekker has put a checklist together based around restorative just culture: who's been hurt, what do they need, and who's responsible for meeting that need? And that's a simple thing that can be put in place, but often there's this piece of, "No, I'm right, you are wrong."

And the only people that win are the lawyers, and especially the no-win, no-fee lawyers.

- In Canada, I would say that if you are wanting to learn about restorative justice, there's a lot of information and exploration in indigenous communities because it's a more traditional way that societal harm has been dealt with.

- Yeah. Oh, totally. And my wife's doing some studying in forensic linguistics. And one of her colleagues shared with her a story from Norway. So in the UK, there was a tragic, horrific case of a small boy, James Bulger, who was taken by two young boys; they basically tortured him and he died a horrific death. And there's a parallel with a case in Norway from 2010, where a little girl was killed by two boys while they were out in the snow, and they've no idea what the trigger point was.

And in the UK, it was real sort of [inaudible 00:48:12] about trying to get revenge and the punishment for these two boys. Whereas in Norway, the town came together and said, "How did we fail? How did we let this situation develop in the way it did?" And that included the little girl's mum.

And you're thinking, wow, the ability for a community to come together and say, how do we fail and what can we do to improve things is fantastic.

- I'm going to shift gears here a little bit because there's something that I want to get to before we finish. You've also talked about drift in training, which is something that we've never really covered here, but is applicable certainly beyond the bounds of diving. So, how do you define drift?

- Ooh, outcome. As a simple bit... So drift can be positive and negative, so that's why I said about outcome. So drift could be innovation or adaptation to the situation you are in. Often drift is seen as a negative connotation because people have deviated from the standards and we've had a bad accident, and they turn around and go, "Hey, look, there's the standard. They've drifted from this."

Therefore, there is drift that occurs. But you could also say, what's changed in this situation? Why is it that they've had to adapt their behaviors to deal with that situation? So, what I would say is that drift is a deviation from a baseline that already existed. Why you've drifted from that is the second story. It's actually understanding, you know, the adaptations that are there.

- That's the difference between, well, this was a stupid thing you did and it wasn't part of the plan versus, oh, something unexpected happened and you're a genius for having improvised in this way.

- Oh, massively. And, you know, not relating directly to drift, but if you look at Sully and the ditching in the Hudson, you know, hero. What would've happened if he'd crashed into the skyscrapers trying to get into LaGuardia or Teterboro? He would've been the villain. And there was a significant amount of luck involved in that situation.

- And, in fact, there was a big investigation that didn't see him as heroic at all, seeing that he had taken unacceptable risks and gotten away with it. So that's a whole discussion.

- So, it's, you know, the outcomes we have. Yeah. So going back to the drift for instructors, there isn't really any, I'm going to say, quality assurance or quality management process within the training agencies. So the instructors get qualified, and then, with the exception of a few agencies out there, for the majority of them, that's them qualified for life, as long as they pay their annual fees to the training agency and attend some webinars and things like that. But nobody really looks at the performance of that instructor delivering a class or what they're like in the water.

So, there's a lot of I'm going to say self-policing that goes on. And the problem is that we know we're rubbish at self-review. And so we have to have some form of critique, and that could be co-teaching a class, but if you're going to co-teach a class, you've got to have high levels of psychological safety to be able to critique.

We've also got to have a defined bar of what good looks like. In the majority of cases in the diving industry, it's about ticking boxes that say certain things have been said and certain activities have been done, but there aren't performance metrics. There is a vague term called mastery. So, somebody can pass a class once, and they've achieved mastery. Okay.

So what does that mean?

- Yeah. And how do you judge that? Well, you've almost mastered it, but not quite. Now you have mastery.

- So did you get it right two times in a row, or three times in a row? Or did you not get it wrong when you were placed under a lot of pressure, so you could replicate it in a realistic environment?

So if you look at those centers and instructors who are quantity-biased, you know, by how many certifications they can issue, it's more likely to be the "two or three times in a row," because we can get people through classes. Whereas at the other end of the scale, where you have people who are quality-biased, they're going to be looking at whether you made a mistake when you were under pressure.

But there's no real enforcement; there's no standard. There's written stuff, general stuff, but in terms of performance standards at the delivery level, it is often quite vague.

- I wanted to ask you too about HOP, human and organizational performance, which I think is gaining traction in the safety world. What do you think is the, maybe not ideal, but preferable relationship between safety management and operations when an organization is trying to implement HOP?

- No. I mean, like a lot of these things, the pure answer would be operations runs HOP. The real answer is safety runs HOP, because it's seen as a safety thing. But HOP applies to HR, it applies to logistics, it applies to the accounting systems, everything. HOP applies to all of that.

And so I think it should be the operations teams that run with HOP. Now that means that the organization has to give them some capacity to learn about HOP because you don't just get this transfer of information. It's like a lot of human factor stuff. It is general in nature and specific in application.

You can teach somebody the general concepts, but they have to go out there and learn the specific application, the stories that apply in their space. They've got to be able to create an environment where people will talk about things. And probably another side advantage is that operations probably has more clout, more power than safety does. And power is a huge thing when it comes to change and trying to get good outcomes.

- Yeah. And I think maybe one of the reasons for that is when operations go well, you can see a financial difference. When safety goes well, it's...

- Invisible.

- There's absence. There's nothing.

- Oh. Yeah. Totally.

- In the way it's traditionally judged.

- Oh, it is. And, you know, the sort of...oh, it was probably...was it '97? So, nearly 30 years ago, we talked about safety or reliability as a dynamic non-event. And that then became safety as a dynamic non-event. And it's like, yeah, nothing happens, when things are safe, nothing happens. But what you don't realize, in the same way as that dynamic risk model, there are constant adjustments and tensions, adaptations that are going on.

So actually, your workers, your leaders, your managers are creating safety dynamically all the time. And safety is often measured by its absence. So, you have a whole bunch of people who are trying to stop bad things from happening, and their only metric is when bad things happen.

And the paradox of safety is that the safer you make something, the less evidence there is of what you're dealing with being unsafe. So if you've got a failure rate of 1 in 10,000 or 1 in 100,000, whatever the number is, and you want to improve that metric, how many times do you have to do something to make the next incremental jump?

And I'm sure you've got a whole bunch of statisticians who listen to this, who'll go, "Right, if you wanted to move from this order of magnitude to this, this is what it is." What I do know is it's a lot. And there's no way that you are going to get that many identical scenarios given the variability of individuals, you know, is your safety performance based on luck, or is it based on the activities that are going on?

Wet finger in the air [inaudible].
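
The statisticians' answer Gareth gestures at can be sketched with the "rule of three": with zero failures in n independent trials, the 95% upper bound on the true failure rate is roughly 3/n, so each order of magnitude of demonstrated safety needs roughly ten times the failure-free evidence. A worked sketch, with illustrative targets:

```python
# 'Rule of three' sketch: how many failure-free, independent trials you
# need before you can claim (at ~95% confidence) a failure rate below a
# target. Exact form: n = ln(1 - confidence) / ln(1 - target_rate).
import math

def trials_needed(target_rate: float, confidence: float = 0.95) -> int:
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - target_rate))

for rate in (1e-4, 1e-5, 1e-6):
    print(f"rate < 1 in {round(1 / rate):>9,}: "
          f"~{trials_needed(rate):,} failure-free trials")
# Each order of magnitude multiplies the evidence needed by ten, and real
# operations never provide that many identical, independent trials. Hence
# the point that the metric ends up measuring luck as much as performance.
```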

- It's a lot of people working very hard for an outcome of nothing.

- Yes. Yes. And so you're in this ethical, moral dilemma. Right, what we'll do is we'll just stop doing safety, and, you know, people can get on and do what they need to do. It's not quite as simple as that. We have to have something. But what is that something? And that's the constant adaptation that's going on.

Find out what people think is the issue and listen to them.

- So I have...we're past the main part of the interview, but I do have a few questions that I like to ask all my guests. So, let's go to the University of Gareth, where you can teach whatever non-technical skill you like, but it must prepare tomorrow's safety professionals for the workplace.

What would that skill be, or what would one of them be? Let's narrow it down a little.

- Yeah. Psychological safety as a general term, because I'm really quite passionate about it. It sits there. It can be a bit of an airy-fairy term. What I would say then is the ability to create trust within your teams up and down and across so that you understand the stories that are going on out there.

And people can come to you and say, "You know what, Gareth? That's a rubbish idea." And I sit there and go, "Okay, thank you. Why? Give me the examples." And I don't go, no, it's not because of this. So humility is linked with that psychological safety as well. Recognize you don't know everything and you'll never know everything.

You're on a learning journey. So I've probably given you about three or four things there.

- That's fine. I hope people have their pencils and their notepads out. So if you could go back in time to the beginning of your career, this could be your entire career or your safety part of your career, what's one piece of advice that you might give your younger self?

- That bit about listening. That I don't know everything. So I would say I'd go back to 2011, when I wrote a white paper which was about trying to bring HFACS, the human factors analysis and classification system, into the sport diving industry, and looking at incident reporting. And I didn't understand the industry, I didn't understand the stakeholders that were in it. And I wrote quite a blunt paper that upset quite a few people.

And some of those people, 11 years later, still don't talk to me because of the criticism in that white paper. And I look back at some of that stuff, and some of it is changing, you know, stuff has been incorporated since then, but it took me quite a while to rebuild the bridges that I had torched massively at that stage.

I'd gone in there and gone, I know the answers, here they are. And basically charged straight in and burnt stuff big time. So that would be my guidance: understand the system and how it operates and the interactions that exist within it, because I was certainly quite naive as to how the system worked. Every system operates perfectly in the way it's designed.

The diving industry is about publishing training materials and isolating the training agencies from the risk, the litigation, the liability for what happens at a dive center and a dive instructor.

And I hadn't realized that there was this air gap. Why would the organizations get involved with what happens down there? Because they don't want to be. And as soon as they start getting involved and something goes wrong, you know, they attract liability, and it just won't work that way. So yeah, understand the system in a bit more detail.

- All right. Well, in fairness to young Gareth, you don't know what you don't know. And now you know.

- Exactly. Exactly. On a learning journey. And actually, it's been a useful piece for me to then explain how the system operates to lots of divers out there who don't know, who are like, "Yeah, well, this is what it's about." And it's like, "No, it's not." And the reason why we don't have organizational learning, or we don't have, you know, spread learning, distributed learning, is because the system is not set up for that, and the system's not interested in that, because it's a different part of what goes on.

- Okay. Well, now I want to ask you about resources. So, obviously, there's your book, there's your documentary. Are there any books or websites or projects that you think would be useful to listeners who want to learn more about really anything that we've talked about today?

- October last year, I ran the first Human Factors in Diving Conference. It was an online conference. So within The Human Diver YouTube channel, there's a playlist which has got 32 videos on there. The Human Diver website has got about 150 blogs. And when I share them on LinkedIn, you know, I'll just say this is not just diving stuff.

That's the beauty of HOP and human factors. You can just change the stories. People are people; the environment's the same in how we interact socially and culturally. You just change the topic, you change the targets and the rewards, but people are broadly the same.

So, anything on thehumandiver.com is out there. The "If Only" documentary is applicable in all walks of life. As I said, I've had people in an energy company ask if we could modify it, and we did. And it went out to about 5,500 people, and it was really useful.

- And where can our listeners find you on the web if they want to reach out?

- www.thehumandiver.com, or on LinkedIn. I'm quite a prolific poster on there, so you'll find loads of material. And I try and post stuff on a fairly regular basis, stuff that I've picked up from my research, or good ideas.

I just love sharing stuff. So yeah, I would say pretty much every day, there's something on LinkedIn that will grab somebody's attention.

- Well, that's all the time we have for today. Thank you to our listeners for tuning in, and thanks so much for sharing your unique perspective, Gareth.

- Thank you very much for the invite. Really enjoyed it. As you can tell, I'm quite passionate about this stuff, so thank you for giving me the opportunity to share that passion and knowledge.

- And finally, I'd like to do a deep dive of gratitude (and yes, that was a dad joke) to the Safety Labs team for all their organization, their production, their management, and more. That's all for this episode. Bye for now. Safety Labs is created by Slice, the only safety knife on the market with a finger-friendly blade.

Find us at sliceproducts.com. Until next time, stay safe.

Gareth Lock

Founder of The Human Diver | counter-errorism in diving (TM) | Author of 'Under Pressure: Diving Deeper with Human Factors' | Connector | Public Speaker

Find out more about Gareth’s company, The Human Diver.

The Human Diver’s YouTube channel with over 30 video resources: The Human Diver - YouTube

Watch “If Only”, Gareth’s documentary about a tragic and avoidable diving accident: If Only... (thehumandiver.com)