Spring Parker Accelerating Health Care Performance

Purdue University School of Aviation and Transportation Technology

Featured Guests: Dr. Debra Henneberry, EdD, and Professor Abner Flores

What they do: 

Dr. Debra Henneberry, EdD, is an Assistant Professor in the School of Aviation and Transportation Technology at Purdue University.  She's an experienced commercial pilot, flight instructor, and aeronautics professor.  Prior to her current role she served as an Assistant Professor in the Aviation Department at Vaughn College of Aeronautics and Technology in New York City.  She also served as a government administrator for several years. Dr. Henneberry has worked as a first responder and emergency medical technician for over ten years.  Her research interests focus on human factors, and she has spoken about pilot training at a number of international aviation psychology conferences.

Professor Abner Flores is a Senior Lecturer in the School of Aviation and Transportation Technology at Purdue University.  He is an experienced ex-military pilot, navigator, and aircraft technician who has served in both the U.S. military and the Honduran Air Force. Over the last 11 years he has traveled around the world, teaching professional pilot courses in several countries throughout Europe and the Middle East and lecturing at colleges and universities across the U.S. Prior to coming to Purdue University, Professor Flores was a lecturer in the aviation program at the University of Nebraska-Kearney. At Purdue, his research interests focus on Human-Interactive Pilot Enhancing Performance Technologies (Simulation), data science for business process modeling in aviation, and neuroergonomics. He currently teaches courses in Aerospace Vehicle Systems Design, Analysis and Operations, and Human Factors for Flight Crews.

On risk:  "Risk is something that we look into from every possible perspective associated to whatever is going on, with the understanding that when we think risk, it's a constant … When we focus on what it is that we are trying to accomplish, and in this case in aviation it is simply that we minimize risk, we become safer. And guess what, when we reduce those levels of risk, we become safer, and automatically at the same time we are becoming higher performers. And that's what we want. We want to perform at our best. And so risk gets to be where it needs to be, down there, and we can fly above it … Fatigue is culprit number one, or threat number one, to what we call in aviation situational awareness. It means that you're not anymore connected with the environment, you're not anymore, within our concepts of aviation, connected with the airplane, you are not anymore connected with any one of the systems; you're going to be seeing without seeing … People in emergency situations may revert back to their native language when under stress … Everybody must be involved in risk because risk affects us all … If the human element is not up to speed, and that typically is unfortunately the case, then there are going to be issues needing to be resolved."

Transcript

Scott Nelson  0:01  
Welcome to The Risky Health Care Business Podcast, where we help you prepare for the future by sharing stories, insights, and skills from expert voices in and around the United States health care world, with a mission to inform, educate, and help with risk across health care organizations and individuals, from one-doctor practices to large integrated systems and organizations throughout the dental, medical, and veterinary health care industries, while hopefully having some fun along the way. I'm your host, Scott Nelson, a guy who grew up in Ohio and has been working all over the United States during my 20-plus-year and counting career in the health care industry, with a commitment to accelerating health care performance through creativity, not just productivity. Let's dive in.

Medical errors have historically been a health care issue in the United States, sometimes a contributing factor in sentinel events.  Forbes published an article in 2020 noting that prescription and pharmacy errors; errors in judgment, skill, coordination, and diagnostics; and system or automation defects can also be contributing factors.  That article - "What Can Healthcare Learn From Aviation Safety" - is the topic we are discussing in this episode.  Health care has been, is, and will always be compared to other industries with high levels of risk.  It should be.  And health care should listen and learn from other industries, adapting and implementing those lessons into its systems, teams, and individuals.  But that isn't a widely accepted and practiced opinion.  Becker's published a piece in October of 2023 - "Healthcare wants to fly as high as the aviation industry. Can it?" - that points to both sides.  On one side it quotes a doctor who says "If the airline industry made as many mistakes as healthcare does, there would be about 460 plane crashes every day," and on the other side another doctor states "There's a finite number of different things that can go awry when you're flying a plane and there's an infinite number of things that can go awry in medicine."

There are many things that can introduce, influence, increase - and decrease - risk vulnerabilities in health care beyond medical errors and the other contributing factors in that Forbes article: finance, operations, culture, people, technology - almost anything and everything that touches a health care organization.  Aviation has long been argued to be a place for health care to look and learn.  Dr. Atul Gawande has been a long-time proponent of this idea, including in his popular article in The New Yorker highlighting why hospitals should adopt the aviation industry's heavy reliance on safety checklists, as well as in his book "The Checklist Manifesto." 

Today I'm speaking with Dr. Debra Henneberry and Professor Abner Flores about risk in aviation.

Dr. Henneberry is an Assistant Professor in the School of Aviation and Transportation Technology at Purdue University.  She's an experienced commercial pilot, flight instructor, and aeronautics professor.  Prior to her current role she served as an Assistant Professor in the Aviation Department at Vaughn College of Aeronautics and Technology in New York City.  She also served as a government administrator for several years. Dr. Henneberry has worked as a first responder and emergency medical technician for over ten years.  Her research interests focus on human factors, and she has spoken about pilot training at a number of international aviation psychology conferences.

Professor Flores is a Senior Lecturer in the School of Aviation and Transportation Technology at Purdue University.  He is an experienced ex-military pilot, navigator, and aircraft technician who has served in both the U.S. military and the Honduran Air Force. Over the last 11 years he has traveled around the world, teaching professional pilot courses in several countries throughout Europe and the Middle East and lecturing at colleges and universities across the U.S. Prior to coming to Purdue University, Professor Flores was a lecturer in the aviation program at the University of Nebraska-Kearney. At Purdue, his research interests focus on Human-Interactive Pilot Enhancing Performance Technologies (Simulation), data science for business process modeling in aviation, and neuroergonomics. He currently teaches courses in Aerospace Vehicle Systems Design, Analysis and Operations, and Human Factors for Flight Crews.

Let's talk with Dr. Henneberry and Professor Flores about risk in aviation and how it can inform and help health care.

Dr. Henneberry and Professor Flores, welcome to the show.

Dr. Debra Henneberry  4:09  
Thank you.

Professor Abner Flores  4:11  
Thank you for having us.

Scott Nelson  4:12  
Before we begin our conversation about risk in aviation and health care, I'd like to take a moment to talk about your backgrounds and work. What is your work in the field of aviation? Dr. Henneberry, would you please start, and then Professor Flores?

Dr. Debra Henneberry  4:24  
Absolutely. And thank you for having me today. I'm Dr. Deb Henneberry. I am an assistant professor in the professional flight program at Purdue University, and I joined Purdue in January of 2023. For 11 years prior to that I was a professor at Vaughn College of Aeronautics in New York City. Prior to then I worked as a pilot for people who own their own airplanes. So being a New Yorker, I was flying a bunch of Wall Street people and professional athletes, things like that. And what is of interest to me, particularly in regard to the field of human factors and aviation, is that in the past, and even currently, I still hold the certificate, I was an EMT; I've had my EMT since 2010. So I have a little bit of a background in the medical field, not nearly as much as many of your viewers, but I do like to use that to inform a lot of my teaching regarding human factors.

Scott Nelson  5:22  
Oh, absolutely. I think that's a great correlation and tie-in. Because when you think about just all the different things that are happening in those scenarios, just on a rapid response rate, I think that fits in perfectly with risk. So thank you very much. Professor Flores?

Professor Abner Flores  5:36  
Yes, thank you for having me, Scott, it really is a pleasure to be here with you today. My background is mainly from a military perspective. I grew up in Honduras, I joined the Air Force in 1984/'85, and so I have been in the aviation field for close to 40 years now. In the military I went through different stages of that career: I was a helicopter pilot, I was also a C-130 navigator, and I flew different fixed-wing type airplanes as well. I was also involved in one of the things that perhaps marked my career forever. I was involved in the search and rescue of a jetliner crash in October 1989, seeing death firsthand, 131 people right there in front of you. I was the first one to respond to that, and the one who found the crash site. That has been with me for almost 40 years, as I have said, from the time it happened in 1989. And so my perspective and everything that I do is permeated deeply with the thought of having to be careful, every step along the way. Now, to bring us up to where we are right now, I have taught Human Factors at Purdue, in the SATT school, and I have a variety of different backgrounds, military first and then an MBA in business. I also do a lot of things associated to how we actually get to understand, from a psychological standpoint, the human brain. So in fact, I find myself right now doing research regarding neurology, and how it is that perception affects us. Because in the end perception is everything in terms of how we get to perceive things around us. And I am also involved in artificial intelligence; I have just recently published, with a few other colleagues, a book entitled "Implementation Guide for Artificial Intelligence in Aviation Training." 
And so my main focus has always been on understanding the human factor involved, so I can tell you that my approach from that standpoint is going to be thinking in terms of who the human element involved is, so that we can then talk risk.

Scott Nelson  8:17  
Transitioning to risk, how do you view and think about risk in aviation?  Professor Flores, would you like to start us off?

Professor Abner Flores  8:23  
Yes. In the aviation industry, risk is typically defined as the potential for events, any kind of events, or circumstances that may have adverse effects on operations, safety, and overall performance. And performance is typically to be understood from different angles, because again, it involves the human factor element, as well as everything else that goes on with the operation, the environment itself, and the systems associated to it. So risk is something that we look into from every possible perspective associated to whatever is going on, with the understanding that when we think risk, it's a constant. It is something that we can never, ever eliminate altogether. We can simply find ways, constantly, to try to minimize or mitigate associated risks. And the latency of risks is another burden in this highly dynamic environment that is the aviation industry.

Scott Nelson  9:42  
Dr. Henneberry, how do you view and think about risk in aviation?

Dr. Debra Henneberry  9:46  
I completely concur with what Professor Flores says. So we always have risk. There's really nothing we do that doesn't involve some amount of risk, although sometimes we don't even realize it. But we do want to mitigate risk as much as possible. So for example, before every flight, we'll do some assessments, not just of ourselves and how we're feeling today, because that's certainly different before every flight, but we look at the flight itself. So in addition to the pilot, we would look at the airplane, we'd look at the environment that we're going to be flying through. And what is really most important to us, and actually the hardest to mitigate, is the external pressures that are put on us. And I think that's true for both the medical and the aviation field. And we put pressure on ourselves, we want to do right by our passengers or our pilots, we want to live up to our expectations in terms of the training we went through. But at the same time, we have to look at the end result. And sometimes we can't go with Plan A. And it is difficult, you know, for example, airplanes don't make money when they're sitting on the ground. And we do see situations where, for financial reasons, pilots might be pressured into doing a flight that really they know they have no business doing. And that's where it gets very difficult, where you have to really look at your situation objectively. And even though you might really need the money from doing the additional flights, that's, in the grand scheme of things, not the best idea. So like I said, it's something we do before every flight. And it's also something that we do on a regular basis, just kind of assessing where our skill set is at, because not every flight, not every mission, is the same. But like Professor Flores says, while there is always going to be some amount of risk, we mitigate as much as possible. 
And if we think that there are some issues that are going to be popping up during a flight, we look for those alternatives as to how we can minimize the risk as much as possible.

Scott Nelson  11:44  
To expand on that idea, there are many strategies and resources in aviation, tools like checklists, to standardize flights and missions to make them as similar as possible.  Does that factor in all potential scenarios - or as many as possible - and generally apply to all flights and missions, or is there also an individual, case-by-case approach?

Dr. Debra Henneberry  12:02  
I think it's both. So there's certain acronyms that we use. A very basic one is IMSAFE. So when looking at ourselves: I for illness, M is for medications, S is for stress, A for alcohol, F for fatigue, which is enemy number one for pilots, and I know that creeps its way into the medical field as well. And then the E would be for both eating and for emotion. How are we doing today? Do we belong in an airplane? Do we belong in a hospital or an operating room or whatever? That's going to be different every day. So some of the strategies we use, and that's just one of them, there's actually many different models that we can use, some of it is actually dependent on where you are. So in the United States, we fly under the FAA, and they have their suggestions. But if you are in Europe, EASA, the European version of the FAA, has some differences there. And it's important to take all these things into account. But as you're going through these standardized lists of things, you apply them to the situation. So for example, doing a flight during the day is very different than doing a flight during the night, because our eyes work differently. And it's not that our eyes necessarily adjust; there's certain things that the human eye can't do. But what we do is we get more aware of our limitations as humans. So we look at daytime versus nighttime and the unique challenges there. If it's a beautiful day out, can we see where we're going, versus if we can't see where we're going? And that would be an instrument flight, with a tremendous number of challenges there, even from a physiological sense. And then the mission itself, which is always going to be different, because if this is a path that you have flown many different times, you know that routing off the top of your head, but the airplane could be different, the weather could be different, the airspace could be more congested that day. 
So I really think, in terms of situational awareness, the more broadly we think about these things, and realize that all of the parameters are dynamic, and they're changing throughout the flight too, we can try and project forward as to what we think is going to happen, but there is a bias in us: we think we know what's going to happen, and then that turns out to be not what happens. So the challenge is to keep an open mind as you run into the unexpected and what you planned for does not turn out to be the case.

Scott Nelson  14:20  
Health care is a system of models and structures, processes and flows, individuals and teams, a great many items. In my mind this is another similarity with aviation when I think of operations centers, air traffic control, flight crews, ground crews, again, a great many items. Where is the risk in aviation and how is it identified by the system or teams or individuals? Professor Flores, if you wouldn't mind.

Professor Abner Flores  14:43  
Do you mind, Scott, if I go back a little bit to talk about something Dr. Henneberry mentioned that I think is key, and then we can also build on correlations, associations, inferences, and whatnot to the health industry. It has to do with fatigue, and it has to do also with how the brain works: the amount of time that we spend, in our case scenario in aviation, typically for a pilot, on specific tasks. If it goes to about 30 minutes, then there is such a thing as the vigilance threshold that gets to be reduced completely, to the point that, as Dr. Henneberry was mentioning, we start behaving in ways that really are not us anymore. So the human factor, the human element involved: in an environment associated to health care, I think in terms of, for instance, an ICU, or an emergency room. If the doctor has been in that location, say it is already the third or the fourth day and it is the 11th hour on that specific shift, that individual is not going to be at what we in aviation look at with very, very different types of magnifying glasses, which is the fatigue stage of the human element involved in the operation. Because we know that over time an individual becomes fatigued, and fatigue can be understood as being chronic or non-chronic. And so looking into that, and then bringing it into the different instances and the correlations there: the human element continues to be exactly the same, we all have the exact same load capacity for being able to cope with whatever environment it is that we are involved in. So the overall operation, whatever that might be, goes down. And that's when risk, or any kind of risk mitigation guidelines and/or programs, models, etc., need to be already in place. And that's when we need to be more observant. 
Because in aviation, for instance, we know already that most accidents happen within only about six minutes of the overall hour that a pilot gets to fly, and that typically happens while taking off and landing. So that's already been recorded, and the statistics show that. And there are also the roughly 22% of pilots out there who may be performing poorly or marginally, and we get to see that these individuals typically have some associations with fatigue, etc. And by the way, that 22% right there is also the culprit of pretty much the main bulk, 80%, of all those accidents and incidents that are happening, where they are actually missing things along the way, even when every other possible tool associated to mitigating risk has been utilized. So this is really an important aspect in what we do, and in every other industry.

Scott Nelson  18:21  
If I'm a doctor's office, or if I'm a hospital, looking to try to identify the areas where I may be at risk, aviation has spent decades identifying and addressing vulnerabilities and is a great model to turn to for ideas. Fatigue is a risk factor in aviation that is monitored and reported. How did that come into practice? Were there metrics or events that identified fatigue and can be pointed to as a simple method to develop a risk management tool, or is that more of a complex process?

Professor Abner Flores  18:47  
This is a much more complex issue. And therefore, it needs to be approached both from the individual, specific case scenarios, according to the specific type of operation that is ongoing, and also under the overall umbrella of the holistic approach that we have in aviation in terms of being able to mitigate risk. And so, to be more specific in terms of fatigue associated with the amount of hours that pilots get to be exercising their duties, just to give an example, there is a limit of about 18 and a half, perhaps up to about 20, hours; it varies from one regulatory body to another, as Dr. Henneberry was mentioning. We are under the FAA regulatory body, and then there is also the EASA regulatory body, and then there's ICAO, so different entities, but they pretty much come to accord, since we're talking about the human element, and the human element involved in the operation doesn't really change at all significantly. Although there are some specific differences, even when we talk about different cultures, from ethnic backgrounds, even in the way in which we process information; and then there are other aspects that we would need two weeks of this podcast to talk about. But in terms of how these hours affect us, there are already thresholds in place, based on statistical research over many years, that now allow companies to really say, okay, the human element can withstand this many hours, we're gonna give them this much. And so now it's reduced to perhaps, again, somewhere between 18 and 20. And if it is a transatlantic type flight, for instance, what if that flight has some weather-associated type delays, right? And they are already about to get to their destination after perhaps having been flying, say, a flight from Hong Kong to Sydney, where you're going to be looking at 16, 17 hours, and so on. 
And now those pilots are at their limits, and by the way, they are about to engage in that specific area of the flight which involves a high cognitive load on their behalf, because it is the approach time and the time when they get to land, and there's a lot of things happening. There's a lot of aviating having to happen, that means controlling, keeping under control constantly, that airplane. And there is also the navigating aspect of it, and that's a very important one, because when you get to destination, that's when you need to be confirming that whatever you have placed in that computer, which is known as the FMS, is exactly the way you wanted it for navigational purposes. And then the communication aspect of it gets to be involved at that stage. So all of that increases. And now the scenario here is that those guys are already at the top of the hour where they become squashed, because of the time limitation in being able to cope in that highly dynamic environment that is up in that cockpit, in that airplane. And now their limits, cognitively speaking, and even the way in which they perceive things, are changing dramatically. And so if we now bring that exact same type of approach into an emergency room: now we have this top-of-the-line, highly respected neurosurgeon, and now all of a sudden the hospital is getting sued because someone died, and they don't know how that was possible when they had employed their best doctor to take care of X or Y specific type of individual. Well, if someone was to look back at the time that that individual may have been on duty, and under a high cognitive load, well, there you have perhaps the answer, the culprit. Because when we discuss fatigue, fatigue is culprit number one, or threat number one, to what we call in aviation situational awareness. 
It means that you're not anymore connected with the environment, you're not anymore, within our concepts of aviation, connected with the airplane, you are not anymore connected with any one of the systems; you're going to be seeing without seeing. And so, obviously, for as long as we're humans, the exact same kinds of issues associated to risk are going to be involved in other industries, and perhaps more so within the health industry, in those emergency rooms, or when there are emergencies where the crew is already fatigued, not only because of that one specific week when they were on duty, but maybe because they have been doing it for three months. We got to see this during the pandemic. We don't know right now how many people actually died because of the cognitive load that those crews had to undergo during COVID-19. We don't know. Perhaps we'll never know. And that's a very, very seminal type issue that we need to pay attention to across the board.

Scott Nelson  24:58  
Professor Flores, with fatigue and maybe across other areas when considering situational awareness, is that an individual responsibility or a team or system responsibility? Who raises their hand or triggers an alert that something might be at risk?

Professor Abner Flores  25:13  
Everybody. So when we talk about who the stakeholders are: all the way from regulatory bodies, who want to look at it from the top down, to passengers and the public at large, because the public can throw new things into the operation, into a specific case scenario, that had not really been contemplated. And those involved in aviation, let's say the crew, for instance, may get startled at this new item that they had not really planned for. Say that everything is going normal and now we have a passenger who is getting drunk back there, right? If that crew has never, ever really had to deal with that, that's something new for that crew, and that can change everything. Because the way our brain works is by means of utilizing this, like a photographic memory of experiences that we can recall, and then accordingly we can just apply that specific type of solution. But if we have never dealt with it, if you have never really been fired at with a rocket, like I got fired at with a rocket, though fortunately it didn't hit me, how do you deal with that? You don't know unless you go through that. So experiencing new things in that environment can really bring someone, according to how that individual actually perceives, to an area where it may take more time to realize that we don't need to be paying attention to that, and we now become unfocused from what needs to be prioritized. And in aviation, for pilots, that can happen a lot, when we have moments in which we are perhaps paying attention to some kind of noise, and all of a sudden we're not flying the plane. And before you know it, the two pilots are now thinking, scratching their heads, what was that noise, and now the plane is just descending. And it's not going to stop for them because they're thinking about a noise. It's just going to crash. 
And there have been many, many of those instances where people get distracted. And again, going back: were those pilots somewhat fatigued? Had they been flying for a while now? Maybe they perhaps were tired. What was their level of situational awareness? Because that gets to be reduced. And again, fatigue is the main threat to how we respond to different things, even to the most normal of all things.

Scott Nelson  28:07  
Dr. Henneberry, we had a previous conversation that I think is interesting when thinking about planned and unplanned, or expected and unexpected, risk.  We were talking about those ideas and you mentioned an example where a plane's destination had experienced an earthquake during the flight, which immediately made me think of a personal experience one time when I was flying from California to Japan.  When we were boarding the plane they told us that there was a typhoon that was expected to reach Tokyo - our destination - when we were expected to land.  Around the halfway point - which was over the ocean - they were going to tell us whether or not we'd have to turn around or divert to another airport in another country.  I think about those scenarios and about doctors when they go into a room with one plan for that patient, or it's a surgery and they go into the operating room with a plan, but in both cases something could happen in the moment to alter the course and plan.  Would you talk about that earthquake example?  Because I think it is really interesting to think about in scenario planning.

Dr. Debra Henneberry  29:02  
Absolutely.  And it does tie into so many crucial points that Professor Flores just mentioned. So when we're planning our flight from point A to point B, there's certain things that we know we need to be aware of, so low clouds, low visibility, that's very common. Something like a thunderstorm is an absolute no-go; that's extremely dangerous to go through, it can just destroy an airplane. Even something like severe icing, because as much as we have very good deicing systems, and we've come a long way technologically, for various reasons we can get icing to the point where we can't get rid of it, and that can bring down an airplane. So those are some common kinds of things to look out for. But the example I had shared was, as much as we know to look out for these common things, especially based on where you're going and the time of year, I was mentioning a flight, I feel like it was in Mexico, but it was a long time ago. The pilots had planned point A to point B, and they had taken all these normal things into account, and en route there was a massive earthquake, and it affected the area where the destination airport was located. So the question became, can we even go to our destination airport? And if we can't, what is a usable alternative? Because in the immediate aftermath of an earthquake, you don't know how extensive that damage is. And then here you go picking your way from airport to airport to see if there's a usable runway; and now what's your option? Do you just turn around and go back to where you were? Because you might be at the point of no return, where you don't have the fuel to go back. So that goes back to keeping as open a mind as possible in terms of situational awareness. And, you know, Professor Flores raised so many great points, because when we look at fatigue and how it affects human performance, fatigue certainly impacts our situational awareness. 
And fatigue is very complicated, because we can have acute versus chronic fatigue. So for example, last night here in Indiana there was a massive thunderstorm that was shaking my house, so I had very little sleep. So right now I'm under acute fatigue, because it was a short-term thing. In theory, if I were to get enough sleep tonight and pay back my sleep debt, I should be okay; we'll see if there's another thunderstorm tonight. But then you've got more of the chronic fatigue, which is burning the candle at both ends on an ongoing basis. That is much more difficult to identify, because I think that we all fall into just that kind of behavior. And then you have to realize, if you're always feeling this way, we need to take more drastic measures to reconcile that. And in terms of aviation, there are challenges to the human body. So we know noise impacts fatigue greatly. Vibrations, we're not even sure how much those affect fatigue. But you know, emergency rooms, I've spent a lot of time there, those aren't the most calming places in the world, so even if you show up fresh and ready to go, we still have these challenges, and we're not going to perform as well as we would in other environments. And going back to situational awareness and how it is affected by fatigue, one of those landmark accidents that changed aviation was back in the 70s. I'm sure many of us remember when Eastern Airlines ended up in the Everglades. And it was because, when they went to put the landing gear down, we get three green lights that indicate that the nosewheel and then the two mains are locked into position. Well, one of the lights didn't come on. And so now they think they only have two of the three wheels down. And so they become obsessed with, you know, what's up with the landing gear. 
And like Professor Flores described, you get so involved in this one distraction that the plane went into a descent, and they descended right into a swamp. And what was wrong with the airplane was not that the landing gear was broken; it was that one light bulb was burned out. And so that really changed things. You can't be crashing airplanes because of a burnt-out light bulb. So from a cultural standpoint, that brings us to where we are today. When we talk about things like crew resource management and proper communication among teams, we have something called just culture. So when you asked who's responsible, we're all responsible. And we should create an environment where people do not have to worry about any kind of repercussions or punishment if they speak up when they see there's a problem. And in the same way, there should not be punishment if we know that we're not feeling well. So for example, the FAA puts it on me that if I had a flight today, I absolutely could not do it, because if I follow the IMSAFE checklist, I'm not safe. That "F" in IMSAFE is fatigue, and I didn't sleep as well as I should have. So it's always incumbent on us to self-disclose. And in a similar kind of way, if I were a medical doctor doing a surgery today, well, we've all had those scenarios where the unexpected happens with our patients or our passengers. And I know that because of my fatigued condition, if something unexpected happens, and especially if it happens in a subtle way, if it's not jumping out at me, I'm in no position to pick up on it right now. So that's where you self-ground and someone else has to take over for you.

Scott Nelson  34:19  
Culture has been mentioned during our conversation. When I think about the top one, two, or three big-ticket items that can be proactively addressed to have a positive risk impact, it seems culture is in that grouping in aviation. It sets the tone where everyone is involved and has a role and responsibility. However, culture is also seen as an area that is not a quick and easy fix, because it takes time.

Dr. Debra Henneberry  34:41  
Absolutely. When that CRM, crew resource management, concept was first introduced here, it was not widely embraced. And now it's become the norm. In other cultures, as Professor Flores mentioned, you do get some cultural issues. I'm not saying one is better than the other, or that there's necessarily a right or wrong, but civilian and military are not the same; the very missions are different in serious ways. And so sometimes you saw a military mindset creep its way into a civilian flight deck in an inappropriate way. So we do need strong leaders in all situations. But going back to that just culture, if the more junior of the pilots raises an issue, sometimes in the past there's been some hesitation about raising it, and a strong leader should be willing to be questioned and to justify why they're doing what they're doing. So when you're establishing a just culture, it's about getting everybody involved, with no retribution. In addition to just culture, there's a book, I think it's a New York Times bestseller, called The Checklist Manifesto, which draws on the medical industry. And it's about getting everybody involved and speaking up when we see there's a problem. This has been a strategy that has worked quite a bit. Again, we do need leaders, and there always is that pilot in command, and that is an authority figure. But it has to be implemented in the correct way. So even if a flight attendant mentions an issue they may see, the captain, the pilot in command, absolutely has to take that into account. And even if it turns out to be nothing, we'd still want to know about it and address it in the right way, rather than a more junior member of the crew seeing something, not saying anything, and then something far worse happening.

Scott Nelson  36:34  
Professor Flores, with what Dr. Henneberry just said about culture, are resources and tools such as checklists, along with standard operating procedures, resources that can help as a redundancy and a backstop, so that if a junior officer, a junior pilot, or a crew member does speak up and raise a hand or an alert, they have the support of the checklist, the process, the workflow?

Professor Abner Flores  36:56  
Yes, definitely, yes. And just going back to a previous question you had asked about how the aviation industry has been able to come along up to today, I think Dr. Henneberry touched on two very important points. On one hand is that of leadership and leadership styles, and how we have actually evolved from the time when the pilot in command, the commander of that airplane, was "God." And so perhaps we can look at some emergency rooms where there is this one senior surgeon who's "God" in that room. I don't know if you have ever heard how Japanese fishermen put sharks in the tank to keep the fish fresh as they come ashore after being out in the waters for a few days; that's supposed to keep the fish fresh. But having an individual, within the context of an organization and for organizational culture purposes, who is not really doing their part as a leader, who is not following suit with every directive or guideline that is in place or following the overall vision, and therefore not having clarity of goals as to what the organization as a whole is supposed to be doing, then there are going to be problems. And so in aviation we have, for the most part, eliminated the "God" in the cockpit, so to speak. What we see now is that those who are senior officers, flying commanders, flying those airplanes, have the training today so that the power gradient in the cockpit is not this big drop from me to you. It's more level. And it's more level simply because everyone is involved in the exact same type of operation, which is to keep risk at bay. So when we focus on what we are trying to accomplish, and in aviation that is simply to minimize risk, we become safer. And guess what, when we reduce those levels of risk and become safer, automatically at the same time we are becoming higher performers.
And that's what we want. We want to perform at our best. And so risk gets to be where it needs to be, down there, and we can fly above it. So going back to organizational culture: being able to pair up teams that have different types of personalities, if that is possible. And I think there needs to be a more aggressive approach to pairing up people who can get along in teams, because that's what we do, we work in teams. I can tell you from past experience, from another life, where I used to be a pastor for about 14 years, and I was basically putting together different types of teams for different aspects of the church I was leading. I was constantly trying to investigate, and I would do it through different types of tests I would give them, just to see what type of personality they have, and to see whether, if I paired up Jimmy with Joe here, I would have problems. So even when it comes to that, like in an emergency room or a hospital, the relationship between, say, even the front desk personnel and the people in the back has to be one where people feel they are going to be treated correctly. And so there are a lot of different areas within the health industry for minimizing risk that have to do with how that organization is handling its own internal culture, from an organizational culture perspective. Because those organizations that are able to move in sync with the different directives in place, with the vision and the mission of the organization, have clarity of goals, and they constantly perform at their highest levels. And in aviation it is exactly the same, for the same reasons we've been discussing here: wherever the human element is involved, that's where the weak link is going to be. So we may have the most, what we call, airworthy airplane out there.
We may have the best systems in place, the most ergonomic, AI-based cockpits, top-of-the-line avionics, different systems for communications, and so on. But if the human element is not up to speed, and unfortunately that is typically the case, then there are going to be issues needing to be resolved.

Scott Nelson  42:31  
Dr. Henneberry, with strategy and resource development, and tools such as IMSAFE routinely used in aviation, if I'm a doctor or a hospital or a health care organization, are there simple approaches or models, simple things that can be done, to develop a strategy or a resource? For example, when the IMSAFE tool was developed, aviation had experiences and identified issues that built the IMSAFE tool. Is there an approach aviation normally takes when those opportunities are identified?

Dr. Debra Henneberry  42:58  
Yeah. So certainly another thing aviation and the medical industry have in common is our love of acronyms. So IMSAFE is how we look at ourselves. But that does impact the whole environment: if everybody has some kind of threat under one of those six letters, and that's going to vary on a day-to-day basis, then yes, that is something we have to look at. Also, it's one thing to look at yourself at the start of the mission or the start of the workday, at the start of a surgery or the start of a shift. That's not how you're going to feel halfway through, or in those last few hours, particularly if it's a long shift. So you have to project forward as best you can based on where you're at right now, and then everything else is a bit of an unknown. Another model, which I actually mentioned before but not by name, is called PAVE. The P is for pilot, and that would be the IMSAFE checklist. The A is for the airplane itself, which could also be something like the setting of the hospital or the medical office or what have you. So it could be something about the feasibility, the layout, something regarding patient care and privacy, things like that. And that's going to change on a day-to-day basis too; if something like utilities gets affected, that's always worth looking at. The V is for the environment; the V is cheating a little, because it stands for enVironment, since everything has to be an acronym. There are a lot of things that play into environment, in terms of patients and those kinds of extraneous variables we can't control. And then the E is for external pressures and external factors. Those are the really soft things that are very hard to identify.
They play on our psyche, that need to complete the mission or the need to do something, and it's that toss-up of what's the right thing to do here. Hopefully we are on the side of caution as much as possible; I think the more experience we have, the more that will help. But external factors and external pressures are where things tend to go wrong the most, because they're not as easily defined, and there's the whole expect-the-unexpected. When something happens during that surgery, that's where we earn our money, when we have to make those very difficult decisions. And so we have many other models too, but right off the top of our heads, IMSAFE and PAVE, at a minimum, are going to be used for every single mission we do. Another thing to keep in mind is that as we get fatigued, it's really our cognitive skill sets that degrade faster than our motor skill sets. For pilots, we need both. When our motor skills start to degrade, that will bang up an airplane, and it's not a good thing; there will be dings and bumps and bruises and all those kinds of things. But when the cognitive skills start to degrade, that's where we kill people, because that's where our decision making and our situational awareness start to fail. As Professor Flores mentioned, that's where we become the most vulnerable. And the real challenge with fatigue is that we don't realize we're going through it. So trying to be objective in these situations is very difficult, but that's also where our team becomes so crucial. Those frontline people can sometimes see the deficiencies that we can't see. So taking a very holistic approach to these sorts of challenges is, I feel, the best way to address them.
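[Editor's note: the IMSAFE and PAVE self-assessments described above amount to a simple go/no-go rule, where any single flagged item grounds the mission. The sketch below is purely illustrative; the function and structure are hypothetical and not part of any FAA tool, though the checklist item names follow the standard mnemonics.]

```python
# Hypothetical sketch of the IMSAFE / PAVE go/no-go logic discussed above.
# Any single flagged item means the mission does not proceed.

IMSAFE = ["Illness", "Medication", "Stress", "Alcohol", "Fatigue", "Emotion"]
PAVE = ["Pilot", "Aircraft", "enVironment", "External pressures"]

def go_no_go(flagged_items):
    """Return 'NO-GO' if any checklist item is flagged as a threat, else 'GO'."""
    return "NO-GO" if flagged_items else "GO"

# A pilot (or clinician) self-assesses before the mission or shift:
print(go_no_go([]))            # nothing flagged -> GO
print(go_no_go(["Fatigue"]))   # one flagged item grounds you -> NO-GO
```

The point Dr. Henneberry makes holds in the model too: the assessment is only as good as the honesty of the self-disclosure, and it has to be repeated as conditions change mid-shift.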

Professor Abner Flores  46:38  
And, Scott, if I may tag along a little on what Dr. Henneberry just mentioned: she's already cited a couple of the models we utilize heavily in our operations, and there's an important key aspect to all of these models that, from a strategic perspective, we need to understand. That is that every single model we have belongs to a specific category. For example, we have models that deal with things that happen in an emergency situation, and others that deal with things that happen under normal operations. We have different models that deal with incidents, models that deal with accidents, ways of reporting incidents, and ways of reporting accidents. So every single model belongs to a specific category, and in very many instances these models also belong to a hierarchy-type approach, where we know where to go because it belongs to this specific area right here. By having that kind of mind-map in our heads, we can easily think in terms of utilizing an A- or B-type model, such as threat and error management, for instance, which is to be utilized within the context of normal operations. And when we talk about threat and error management, we then need to think in terms of how we manage errors within that environment, very different from when we are in emergency-situation mode, where, as Dr. Henneberry said, with our love of acronyms, we may have to use a different acronym, a different model, for that purpose.

Scott Nelson  48:57  
When an individual goes on a mission or a flight and the situation changes over time, such as when they get tired and the scenario or model is different, how much reviewing and assessing is done after a flight or mission is completed? Does that happen every time, on a regular basis? And is the team involved in reviewing and assessing, looking back to see what can be learned, applied, and moved forward?

Dr. Debra Henneberry  49:19  
Absolutely. Particularly in a training environment, and I'm sure Professor Flores would agree, every mission is a learning opportunity. So I always feel that in terms of our training, the debriefing at the end is probably the most valuable part, especially if it is done immediately after the mission, when everything is still fresh in our heads, to look at what went right and also what went wrong, and the circumstances behind those things. And even on your day-to-day flight, if you're just flying the line, it's still worth doing at the end, once everything is shut down: have a quick debriefing and clear up any misunderstanding or any challenge that may have come up, especially the unanticipated ones. Because that's how we get better, by looking at these things. It's our experience that makes us great pilots, and acknowledging that we operate, and I think the medical industry does too, under a culture of continuous improvement, learning how to do better the next time we face a similar kind of challenge.

Professor Abner Flores  50:22  
Absolutely. I just want to reinforce, or perhaps just add to, what Dr. Henneberry has mentioned, because she's already stated it very well. It is the fact that we in aviation constantly find ourselves applying this to every iteration of the many different things we do. In some cases we even apply it outside the realm where we're supposed to be using it, simply because it's so ingrained in us: doing the before, during, and after of things, and closing the loop through feedback. So continuous improvement is already involved in every step-by-step procedure we utilize, as we constantly look in that mirror and ask whether we did it up to par or under par. To that effect, for training purposes, we have also been evolving through different models, to the point that today we can talk about CBTA, for instance, which is competency-based training and assessment. Airlines are, for the most part, going through that specific type of training, although geographically speaking, some countries may not yet have adopted that specific model. But for the most part, if we take a look here in the U.S., they are either on EBT-type training or utilizing CBTA nowadays. And those key terms, like being competent, for instance: that is a very specific term we need to have in our organizations in aviation today, everywhere we are. Because for the most part, when we look at what the regulatory body, in our case the FAA, has, we're looking at being proficient at something. But being proficient at something doesn't mean we are the best performers; we're simply meeting a specific standard when we are proficient.
But when we are competent, that is a step beyond being proficient, and so we are constantly striving to get to that place where, in each and every iteration of what we do, we are competent, not just proficient, really performing at our highest level.

Scott Nelson  52:52  
What are potential obstacles, and how can those be overcome? We've talked about strategy and models as well as resources and tools. Doctors aren't necessarily trained to be standardized, where individuals and teams in aviation are trained and oriented in a standardized way. It's not necessarily as simple as putting a square peg into a round hole. How can obstacles be overcome when you're looking at it from a risk perspective? Professor Flores, would you share your thoughts, and then Dr. Henneberry?

Professor Abner Flores  53:18  
I think we're looking at the exact same central piece, because the main piece involved in these operations is simply the human element. So from that perspective, the majority of challenges are going to come from the human element involved in any kind of operation. The way individuals are trained, regardless of the industry, has to be one where everybody looks at it under the same magnifying glass. There's got to be clarity in every aspect of the training that goes on, clarity in the specific types of resources and/or guidelines, and in how we go about conducting business day to day. I can think, for instance, of a time when I was teaching the transition of military pilots from the French Air Force into the civilian world, in France. And I remember looking at some of the checklists they were utilizing. I don't speak French, so I was trying to figure out some of the words I was seeing, because it caught my eye that on the same checklist, on different pages, and then on another checklist someone else had, there was a different word in the same location. Well, it turned out that they had about five different ways to call a door handle a door handle. I was really blown away when I saw that, but within their culture everybody was at ease with it, everybody understood it. And then when we observe some of the statistics for pilots around the world, obviously English is the language of aviation, and everybody has to be where they need to be in terms of English, yet we see that some people from different cultures, in emergency situations, may revert back to their own native language when they are under stress.
So just to give you an example of how the brain works: when it is at its best, it is at its best; it is a fantastic, almost unbelievable thing to contemplate about our own brains. But when it is outside the thresholds of what's normal, we see that it fails us. So we need to pay a lot of attention to the human element involved in everything that we do, and then find appropriate solutions so that everybody becomes standardized, no matter what. And that's from top to bottom, from the person conducting a brain surgery to the person at the front desk picking up the phone. Everybody must be involved in risk, because risk affects us all. And it is just amazing to see the amount of money that basically goes to waste because risk is not being properly handled. But that's a different subject for another time, on the macroeconomics of what risk actually is.

Scott Nelson  56:58  
Dr. Henneberry, what do you see as potential obstacles and risks, and how can those be overcome?

Dr. Debra Henneberry  57:02  
Absolutely, building on everything Professor Flores said, a lot of it comes down to culture and standardization: from that leadership perspective, setting norms and expectations, setting that just culture so that everybody is invested in the organization or the mission. That's essential. The whole "if you see something, say something" should be laid out from the outset. And also building on what Professor Flores said previously, Scott, you and I had discussed a psychologist named James Reason. He looked at the medical industry, he looked at pilots, he looked at nuclear energy, and at the things we had done right and wrong over the years, and why those things happened, why Chernobyl and Three Mile Island happened. But one thing that Dr. Reason did say in the end is that the human mind is capable of the heroic recovery. So while we can get ourselves in trouble, we are actually very adept at problem solving, and under the right circumstances, the opportunity for that heroic recovery is absolutely there. That's what's so wonderful about the human mind. Sometimes it is just trial and error, but that's how we problem solve, and we should set a culture where people are able to do that, to uniquely solve a problem that has not been encountered before.

Scott Nelson  58:28  
Well that's a great point to conclude our conversation today.  I would love to keep this going and talk more about these two industries and how they can work together and learn from each other.  We will have to do this again at another time. Dr. Henneberry and Professor Flores, thank you very much for your time and sharing your thoughts and experiences today. I really appreciate it.

Dr. Debra Henneberry  58:45  
Thank you. 

Professor Abner Flores  58:46  
Thank you.

Scott Nelson  58:49  
Thank you for listening to The Risky Health Care Business Podcast. You can listen to all episodes from the resource center page of the SpringParker website, springparker.com, or click the Listen link in the show notes to listen and subscribe for free on your platform of choice. And remember, accelerating health care performance is achieved through creativity, not just productivity.
