Saturday, October 28, 2023

DON'T TICKLE THE DRAGON!


(This article originally appeared as a safety column submission in Vertical Valor/911. I reshare it here in the hope that it may offer some benefit to the HEMS industry. DCF)


image courtesy historyskills.com

I wish I could describe what real terror feels like. Words don't suffice. Either you have been there, or it's hyperbole.

Recently, my neighbor lost track of his 3-year-old. He ran through his house, gradually becoming frantic. No kid. He ran over to my house, opened the door and yelled: "is he here?"

"No."

We both went out front, and he yelled for his son at the top of his voice. The anguish and despair apparent in that call were very powerful. I witnessed full-blown terror in another human being. I have felt it in myself. 

My moments of terror took place in a Chinook in 1987. I was a young co-pilot, and my pilot-in-command and I had flown from Fort Bragg, North Carolina, to an airshow in Virginia. The hot, humid summer day wore on, and clouds began to build. As we prepared to depart in the gathering dusk, he told me that we would return to base IFR.

Cool! Cloud time!

We planned and filed and got ourselves into the clouds headed home. Unfortunately, we didn't have airborne weather radar and data-linked radar didn’t exist. I learned about embedded thunderstorms that day. We were in a large, powerful aircraft - we weighed around 32,000 pounds against a max-gross weight of 50,000. We had lots of reserve power. We had heading-select and altitude-hold. We had each other. 

We got our butts handed to us. 

When you stumble into a worsening storm two things come to mind. How bad will this god-awful hammering get? And how strong is this helicopter?

Thank God Boeing builds a mighty strong machine.  I have never been so scared - before or since - as I was that evening. I was so scared I started giggling. And then I started flying away from the course line on the horizontal situation indicator (HSI). My brain quit. If we had been in a light helicopter, the kind we use for HEMS, I would be dead.

We turned off altitude hold and gave up on maintaining an altitude. We got slammed up, down, and sideways. 

The wind gusts put the blades out of phase and the helicopter shook mightily. The rain on the windshield and forward pylon made a roaring sound that mixed with the blood roaring in my head. It was all I could do to maintain aircraft control. My PIC started shaking his head, the expression on his face was one of pure dread.

You only have to learn that lesson once. Don't mess with a thunderstorm. When I remember that flight, it invokes imagery, "visually descriptive or figurative language..."  

Here's some imagery... Don't tickle the dragon!

I came into HEMS fresh from duty as a flight-lead in the 160th SOAR, with an attitude that was spring-loaded to the go-position. I was a real "can-do" guy. I used to run to the helicopter, and I went inadvertent-IMC twice in my first year of HEMS flying. I was going to get it done. I felt that being a can-do guy made me a better pilot than many of my peers. I was willing to do things that others weren't. There are still people like that in our industry today, and I fear for their safety - and the safety of the teams who climb in with them. 

In Ernest K. Gann’s aviation classic "Fate is the Hunter," he posits that fate pulls the strings of our destiny. I don't believe this premise, that it's all about luck, but I can tell you from experience that bad choices improve fate's aim. Back in the day, I made several really bad choices - so bad that more than once my fate did come down to dumb luck. Another pilot from that era with a similarly fatalistic bent said this: 

“Death is the handmaiden of the pilot. Sometimes it comes by accident, sometimes by an act of God.” 

I believe that most accidents aren’t. And I think it’s most often us doing the bad acting, not God.

Consider his case, a pilot with more than 9000 hours of experience. A pilot who was experienced with flying many different aircraft. "he was formerly an aeronautical research pilot with the National Advisory Committee for Aeronautics (NACA) High-Speed Flight Station at Edwards Air Force Base, California... On November 20, 1953, he became the first human to fly faster than twice the speed of sound in the Douglas D-558-II Skyrocket.  From 1955 to 1960, he was employed by North American Aviation as the chief engineering test pilot during the development and testing of the X-15 rocketplane." (NTSB report)

It's interesting that this pilot was involved in the beginnings of supersonic flight because such flights also invoked imagery in the minds of writers at the time.

"There was a demon that lived in the air. They said whoever challenged him would die. His controls would freeze up, his plane would buffet wildly, and he would disintegrate. The demon lived at Mach 1 on the meter, seven hundred and fifty miles an hour, where the air could no longer move out of the way. He lived behind a barrier through which they said no man would ever pass. They called it the sound barrier." (From the 1983 movie "The Right Stuff.")

Well, as it turns out, there is no demon lurking behind the sound barrier. Indeed, there is no "barrier" at all. Just some aerodynamic phenomena that engineers and designers were able to overcome long ago. 

Now let's talk about thunderstorms. Tangle with a thunderstorm and your aircraft may indeed buffet wildly and disintegrate. The Aeronautical Information Manual covers the topic in detail, but it's hard to get the full effect of what they are saying when you are reading the text safe and warm in an easy chair. They write, "avoid by at least 20 miles any thunderstorm identified as severe or giving an intense radar echo. This is especially true under the anvil of a large cumulonimbus." and also "above all, remember this: never regard any thunderstorm 'lightly' even when radar observers report the echoes are of light intensity. Avoiding thunderstorms is the best policy." 

I cannot over-emphasize the importance of taking this to heart, but not everyone does.

A HEMS pilot decided that he was going to get home after dropping a patient at a distant hospital. A peer warned him about storms approaching the destination, and he ignored the warning. He might have killed only himself, but as he was preparing to depart, the crew popped out of a door and waved to come aboard.

"Although the pilot encountered an area of deteriorating weather, this did not have to occur as the pilot could have chosen to stay at the hospital helipad. The pilot, however, decided to enter the area of weather, despite the availability of a safer option. Based on the pilot’s statement to the oncoming pilot about the need to “beat the storm” and his intention to leave the flight nurses behind and bring the helicopter back... he was aware of the storm and still chose to fly into it. The pilot made a risky decision to attempt to outrun the storm in night conditions, which would enable him to return the helicopter to its home base and end his shift there, rather than choosing a safer alternative of parking the helicopter in a secure area and exploring alternate transportation arrangements or waiting for the storm to pass and returning to base after sunrise when conditions improved. This decision-making error played an important causal role in this accident." (NTSB)

The initial mistake was in leaving the distant hospital. He had been offered a van ride and turned the offer down. As he proceeded he was probably evaluating conditions - thinking he could stop if it got too bad.  As he crept closer and closer to the dragon, he might have been thinking, "well, we made it this far and now we are almost there." He could have decided to land anywhere along the way, but as he was in for a penny, he was in for a pound. "The helicopter crashed in an open wheat field about 2.5 miles east of the home base." 

Any pilot should study decision-making in naturalistic environments. The real world isn't rational, neat, or orderly, and HEMS puts us in high-consequence, time-critical situations. There isn't enough time to think things through, and the penalty for choosing wrong is that someone suffers or dies. 

One of the key points in the study of naturalistic decision-making is that experience doesn’t always equal good judgment. Very experienced people - even pilots with 9000 hours - make bad choices; partly because they recognize situations as similar to past experiences and use heuristics. Rather than making the best choice, they pick a choice that works, a choice that "satisfices." Too often, the results don't suffice or satisfy.

An experienced person is also liable to think "Well, I did this before and got away with it." Such an attitude might manifest like this, "The pilot also discussed the weather with an acquaintance, mentioning that he might need to work his way around some weather." The urge to keep pressing on and the lack of concern about being in the vicinity of convective weather prompted a tragic choice. "The airplane entered the severe convective weather; the pilot then requested and received clearance from the air traffic controller to initiate a turn to escape the weather. The airplane was lost from radar about 30 seconds after the pilot initiated the turn."

A toxic soup of hazardous attitudes combined to kill this good man - an aviation hero and icon. It wasn't a demon that did him in, it was a thunderstorm - a dragon! 


The last of these events I will dredge up involves an almost identical scenario. From reading the text, I imagine that it's the same NTSB investigator in two of these investigations. I bet after picking through rubble and body parts and then having to write about it, he feels sick about these outcomes and wishes us pilots would behave differently.

In this case, the pilot was a friend of mine. After he was hired, I briefed him on being a base manager, helped him start his base, and witnessed him making bad choices. I guess in the end I wasn't a very good friend because if I had been, I would have talked to him before the fact instead of talking about him after. The problem is, when you raise your voice before a crash, people think you are crazy or bad for business. There isn't much pleasure in saying "I told you so."

"Although the pilot encountered an area of deteriorating weather and IMC, this did not have to occur as the pilot did not have to enter the weather and could have returned to (a safe) airport or landed at an alternate location. The pilot, however, chose to enter the area of weather, despite the availability of safer options.” (NTSB)

Now here's the crazy part of this event. This pilot knew the weather was bad in the area. He had just flown through it. Indeed he warned another pilot not to go there...

"...the pilot of the accident helicopter contacted (another pilot) by radio and advised him to double check the weather before returning to (the area) The accident pilot stated that “bad thunderstorms” were in the area and that he did not know if he would be able to return to his base that night."(NTSB)

None of the people I am discussing here were bad. None of them were dumb. These were good, smart souls who fell prey to a bad choice. If you fly, you owe it to yourself to try and understand what it was that led to these choices. You should understand that these folks were just like us. And if they could make a bad choice, so might we. After all, the last thing we want is for the NTSB to write our epitaph,

"Based on the pilot’s statement ... regarding bad thunderstorms in the area, he was aware of the weather and still chose to fly into it." 

These pilots tickled the dragon and it ate them. 

Don't tickle the dragon.


First, Do No Harm


(This article originally appeared as a safety column submission in Vertical Valor/911. I reshare it here in the hope that it may offer some benefit to the HEMS industry. DCF)

The Duke Life Flight crash has sat squarely in the middle of my mental desktop since it happened. That aircraft was sophisticated and equipped with redundant power and a well-trained pilot. Duke did everything right. They spent the money. Why? What? How? 

Image courtesy LifeNet 4 in SC. A super crew!

When I first looked at the headline about the accident report, I was disappointed that the NTSB had--apparently--faulted the pilot for "shutting down the wrong engine." It’s not that simple. The National Transportation Safety Board (NTSB) opined that “the pilot of the Duke Life Flight helicopter that crashed in 2017, killing all four people on board, likely received confusing cockpit indications that led to him shutting down the wrong engine during an in-flight emergency.”  

It’s worth discussing the idea that maybe, just maybe, when faced with confusing or conflicting information about an aircraft malfunction, a pilot might consider doing nothing initially beyond landing if that option is available, especially when flying single-pilot.

“Stress-induced limitations on human performance capabilities are often overlooked when considering how crews respond to emergency and abnormal situations. These limitations have important implications (emphasis added) for the most effective designs of non-normal procedures and checklists and the design of training for non-normal situations. Berman, Dismukes, and Loukopoulos (2005) conducted an in-depth analysis of human cognitive performance issues in airline accidents. Their analysis demonstrates that normal cognitive limitations experienced by all humans when dealing with stress, concurrent task demands, and time pressure, underlie those errors made by crews when responding to emergency or abnormal situations.” (https://ntrs.nasa.gov/citations/20060023295)

While a pilot flying a large jet airplane several thousands of feet up in the atmosphere might be forced by circumstance to “do something” in order to get the aircraft safely on the ground, a helicopter pilot flying at one or two thousand feet above a suitable landing surface may have the option of landing and then diagnosing. Too often, while trying to simultaneously fly and diagnose a problem and take steps to mitigate said problem, pilots misdiagnose and make mistakes. 

Reading "I wasn’t sure what was happening so I decided to land and then sort it out," in an event report would be better than reading “the pilot was given confusing information by the aircraft systems displays and shut down the wrong engine which led to a fatal crash” in an accident report; don’t you think?

This is just another iteration of “the most conservative response rule,” and it is in line with a physician’s pledge to “first, do no harm.” We all have heard the pilot’s guidance to “aviate, navigate, communicate,” and maybe you have heard the phrase “put down the radio and fly the aircraft.” These axioms emphasize the importance of aircraft control, first and foremost.

While discussing this idea in a hangar full of team members in New York, the CEO of Mercy Flight Central reminded the class of what he was taught during Navy flight training. Only half-joking, he said that when a Navy pilot is faced with an emergency the first thing they are taught to do is “wind the clock.” I submit that if you can land the aircraft in the time it takes to wind up the clock you might be better off landing.

While reading the Duke docket, I learned about an Avera McKennan EC-145 that suffered a malfunction similar to the Duke aircraft's. In the pilot's statement, he writes that at one point "I was trying to figure out what was happening with the aircraft." He was confused by what he was seeing, hearing, and smelling. I consider this an indictment of the designers for a faulty design and of the trainers for a failure of imagination. Plan accordingly.

I was once flying a BK from Savannah to Atlanta, full of fuel and at max gross weight with an isolette and baby on board. Five minutes after takeoff and adjacent to Savannah International, I got a master caution, an engine-low light, and an engine-out audio alarm. Startle effect! The N1 gauge on the left engine was at zero. But the aircraft was still flying normally! What the hell? The team couldn't hear the alarm, but they could hear me thinking out loud. "Ok! What's going on? I have a caution light and an alarm. An N1 is at zero, but the rotor is normal and the torques are matched. The TOTs are matched and normal." 

I declared an emergency with Savannah, turned, and landed at the FBO. Only later did we learn that failure of an N1 gauge could provide confusing and alarming indications of an engine failure. I had to learn about this by living through it. No one had ever sat in a cockpit with me, looked me in the eye, and put their finger on that gauge while saying, "if this gauge fails, you will get indications of an engine failure, with lights and audio. But it may just be the gauge!" 

The Avera McKennan pilot reported that at one point he smelled smoke and heard the sound of an engine winding down. He was headed to an airport for an emergency landing. It all worked out for him and his team, but the question of whether he should have landed immediately--a normal response to a fire on board--will make for a lively discussion in classes. In any case, he was there, we weren't, and it's easy and pointless to armchair-analyze and criticize. They all lived. Well done, sir.

Finally, the idea that you might initially do nothing in response to confusing cockpit indications and simply land is no justification for avoiding the effort to fully understand your aircraft and its systems, and to be as familiar as possible with its limitations and published emergency procedures. Indeed, understanding that an aircraft’s indicators could confuse you is all the more reason to gather up every single bit of information you can about your machine. Study the mishaps. Read the reports. Learn from history and imagine it happening to you. Because it very well might, and you need, no--scratch that--you must be ready. You must not end up in an NTSB report. 

Tonight’s the night! 


THE RIGHT STUFF


(This article originally appeared as a safety column submission in Vertical Valor/911. I reshare it here in the hope that it may offer some benefit to the HEMS industry. DCF)

The thing that makes safety somewhat—well—boring, is that it is the absence of occurrence. We devote time, effort, and money in order to make something not happen. To keep us motivated and focused on this goal, it helps to remember the overwhelming sensations of disappointment and sorrow that follow a catastrophic mishap; one in which an aircraft is destroyed and people you know are killed. 

Use these memories to keep yourself mindful. Maintain a preoccupation with failure, because such a preoccupation is one of the hallmarks of a highly reliable company, team, or person. While sentinel events often involve death and destruction, occasionally, thank goodness, we are confronted with a story in which people rise above an exceedingly dire situation and pull victory from the jaws of defeat. They never give up trying to turn an in extremis situation into something survivable. These stories, and the events which spawn them, are the stuff of legend. When we read these stories we learn about people possessed of “The Right Stuff.” 

As I am officially old, and my cultural references are dated, I will point out that author Tom Wolfe wrote a book with that title in 1979. It was about the first astronauts that America put into space on rockets, and, as well, those who went to the edge of space in rocket-powered winged-aircraft. The story was made into a fantastic movie--if you haven’t seen it you have something to look forward to. 

There is a scene in this film in which Chuck Yeager, the first person to fly faster than the speed of sound and live to talk about it, is testing a high-performance jet aircraft which, in the dispassionate and clinical language of an accident investigator, “departs controlled flight.” I hope you never have to experience an aircraft departing controlled flight, because at that point you are basically along for the last ride of your life. 

Unless…

Unless you can somehow wrest control of the situation and regain authority over your aircraft. Often you can’t, which is why several fighter pilots over the years have taken advantage of the seats made by the company founded by James Martin and Valentine Baker and ejected from aircraft that have experienced a departure from controlled flight. Yeager does this in the book and the film. There’s a reason he’s a hero and it goes beyond breaking the sound barrier.

Helicopters don’t come with ejection seats. When a helicopter departs controlled flight, such as is described in the NTSB’s recent preliminary accident report about an EC-135 that crashed in Pennsylvania in January of this year, the only option available to pilot and team is to stay with it and keep trying.

By all accounts, that is what the pilot of this aircraft did. Thanks to eyewitness reports and doorbell cameras(?!?) we can, in our imagination, ride along on this flight which was unremarkable right up until the instant when the helicopter went “BANG” and rolled right out of control. According to the flight team in the back, it went inverted—maybe more than once. They were pinned to the ceiling. 

The pilot never gave up. And he got it right side up and in a nose-up decelerating attitude just before planting it in the one clear spot available to him that didn’t involve hurting anyone on the ground. He dodged wires and houses and a church and the people in it.  

He displayed “The Right Stuff.” 

After the crash, the flight team demonstrated that they too have the right stuff. The NTSB report states, “Following the accident, the flight nurse evacuated the patient then evacuated the pilot while the medic shut down both engines. The nurse travelled with the patient while the medic travelled with the pilot to area hospitals.” (Do the people you fly with know how to shut down the engines on your aircraft?) The photo of that evacuation—of that nurse climbing from the wreckage of that helicopter, baby in arms--will be iconic, like the picture of a child in the arms of Oklahoma City Firefighter Chris Fields after the bombing there, and other images of rescuers and the rescued from 9/11. These images restore our faith in humanity and remind us which way is up. 


Nice work, team. 

This legendary helicopter story will go down in the annals of aviation history, right up there with Sully Sullenberger and the Miracle on the Hudson, Al Haynes and United Flight 232 in Sioux City, Iowa, and Captain Richard Champion de Crespigny and Qantas Flight 32 in Singapore. In each of these cases the aircraft came apart and the crew kept it together. 

It’s good to read these stories and talk about them. In your imagination, put yourself into that crew’s position. Aircraft are extremely reliable, and you might go through an entire flying career without ever having an engine failure or other significant mechanical problem. And then again, they do happen and it might happen to you. The fact that you have thousands of accident and incident-free hours does not preclude a big bang on your next flight. As John Jordon says in the HAI video Autorotations: Reality Exposed, “every time I fly, I tell myself, today’s the day!” 

When your time comes, (and you should decide right now that it will) you must try to avoid being startled. The best way to do this is to expect trouble.  By studying events like the Duke Life Flight crash—which didn’t end well--and this more recent event which did (a destroyed helicopter notwithstanding), you can increase your own mindfulness. You can nurture your preoccupation with failure and work against the probability of that failure killing you. You too can possess The Right Stuff.


Anticipation!

 (This article originally appeared as a safety column submission in Vertical Valor/911. I reshare it here in the hope that it may offer some benefit to the HEMS industry. DCF)

The first two lines of the old Carly Simon song “Anticipation” go like this, 

“We can never know about the days to come.

But we think about them anyway”

Image courtesy Josh Henke

After writing a column in this space on highly reliable organizations, I was discussing high-reliability with another safety-advocate recently. We talked about the importance of “anticipation.” In any high-risk high-consequence endeavor, it pays to anticipate. As well, it’s good to focus on the attributes of high reliability, one of which is “a preoccupation with failure.” If we take it for granted that people will make mistakes; then our next step is to anticipate where and when those mistakes might occur, and devise “safety-nets” to trap the error and lessen or eliminate the ill-effects.

I recently reconsidered a purposeful behavior that I blogged about long ago, “the objective continuous risk assessment process.” (O-Crap).  You continuously and objectively consider the proximate threat to your safety while flying.  You discuss this threat and you formulate a tactic, technique, or procedure to deal with it. What’s going to kill me right now and what am I doing to prevent that from happening?

I am a proponent of helicopters having instrument-flight-rules capability. Even singles. I think we spend enough time flying in marginal weather and at low altitudes to warrant the extra expense that IFR entails. But IFR comes with its own pitfalls, which we should anticipate.  

Sit back and ask yourself, “when is an IFR pilot most likely to encounter problems?” Perhaps you, like me, come to the conclusion that the end of an IFR approach in O-Crap weather could be the time and place when things get woolly. 

At the bottom of the approach, the pilot is understandably interested in breaking out of the weather, seeing the surface, and landing the helicopter. She might continue for a few seconds if she can’t see the ground at the decision point, and she also might drop a few feet lower than the published decision or minimum descent altitude. She might slow the aircraft to expand the time available to react to an opening. Doing these things is human-nature, and this natural tendency has to be aggressively trained out of us IFR pilots. Anticipation and a preoccupation with failure will help justify a training budget that enables the training that prevents misfortune. We don’t train until we can do it right – we train until we can’t do it wrong. Anything less is disaster in the making.

The mind-set of an IFR pilot should be, “I am not going to break out, even though the reported weather at the beginning of the approach points to that happening. I am not going to break out and I am going to perform the missed approach as published. And it’s going to be the best missed approach ever! I will anticipate problems and have my hands and feet ready to take control of this aircraft if need be, because there isn’t much room for error at the bottom. If the aircraft has a tendency to get squirrelly at low speeds I will keep my speed up. I will maintain my scan and fly this aircraft on instruments, and if we do break out in the clear I will be pleasantly surprised.” As a young army instrument pilot I flew 20 hours a year in a UH-1 “Huey” simulator. It had no visuals and no stabilization. Every approach was followed by the missed approach procedure. That was very good training.

Here’s an excerpt from an accident report,

“During the instrument approach to the destination airport, the weather conditions deteriorated. The pilot was using the helicopter's autopilot to fly the GPS approach to the airport, and the pilot and the medical crew reported normal helicopter operations. Upon reaching the GPS approach minimum descent altitude, the pilot was unable to see the airport and executed a go-around. The pilot reported that, after initiating the go-around, he attempted to counteract, with right cyclic input, an un-commanded sharp left 45° bank. Recorded flight data revealed that the helicopter climbed and made a progressive right bank that reached 50°. The helicopter descended as the right bank continued, and the airspeed increased until the helicopter impacted treetops…” What we had here was a failure to anticipate.

During my travels to present Air Medical Resource Management training, I hear and tell stories. I tell on myself. Some of my stories are embarrassing; how could I have been so dumb? But I would rather be embarrassed and hopefully make a life-saving impression on some young man or woman than shelter my ego and perhaps read about how they died in a helicopter crash. Stories can save lives. 

The pilot was performing an instrument approach in dark night IMC conditions. At the missed approach point, he wanted the weather to continue. He wished it so – even though it was not. He did not initiate the missed approach procedure and continued toward the destination, partly on his instruments and partly by looking out the wind screen. He became disoriented and got lost in the goo. While struggling to maintain control of the helicopter and reorient himself he latched onto a patch of good visibility – a “sucker-hole” - that enabled him to get the aircraft down near the ground. 

The team members on board were understandably upset when they realized that they were at ground level right next to the multi-story hospital building, and that the helipad they were supposed to be approaching was on a rooftop several stories above them, in the clouds. This is a true story. 

So what do you think happened here? I think a good guy with good intentions – a normally safe and conscientious pilot – made the wrong snap decision at the decision point. 

“It’s almost good enough. Let’s keep going and hope it gets better.” 

It doesn’t matter what we want, it doesn’t matter how hard we wish, it’s straight-up no-kidding what you see is what you get. The training-imperative must be “fly according to the environment.”   Our response must be conditioned, and that conditioning takes time-in-training and anticipation.

Our simulator-training scenarios should be tricky, the way life is, to engender thought and discussion. You can learn almost as much sitting at a table and discussing a flight after the fact as you can while performing one. And it is during calm thoughtful discussions of what actually happened versus what should have happened that values and norms and ingrained behaviors are written into our psyche.

As Carly Simon sang, “I'm no prophet and I don't know nature's ways” but I do know that we should try our best to anticipate human nature, and train for it. 


All the King's Horses and All the King's Men…


(This article originally appeared as a safety column submission in Vertical Valor/911. I reshare it here in the hope that it may offer some benefit to the HEMS industry. DCF)

Humpty Dumpty sat on a wall,

Humpty Dumpty had a great fall.

All the king's horses and all the king's men

Couldn't put Humpty together again.

People just like us died here. Ask yourself, "what were they thinking?"

My friend Marcus called it. We sat at the NEMSPA booth at the Heli-Expo trade show, and as I discussed our near year-long stretch of fatality-free helicopter-emergency-medical services flying, he said, "you know Dan, there will be more fatal crashes, you know that, right?"

Sadly, he was to be proven right.

But why? Why must we take it for granted that some number of us will be killed each year? Why can't we emulate Delta, or American, or any other large air carrier? They haven't killed a passenger or crew in a long time. What is it that they do that we don't? 

Is it their equipment? Is it their training? Is it because they have two pilots and we in HEMS overwhelmingly have one? I don't think so. I think the reason the larger airlines are so safe is due to how their pilots think. They think and act like "airline pilots." They make decisions based upon sound operating principles, with an ever-present eye towards safety. The mindset of the pilots and the safety culture of an airline is understandable. When they crash they kill a lot of people. Not one, or two, or four. A whole lot.

When Colgan Air, a regional airline, crashed on February 12, 2009, all forty-nine people aboard Flight 3407 lost their lives, including the two pilots, who were determined to have been suffering from fatigue and a lack of proficiency; one person on the ground was also killed. That sentinel event rippled throughout the air travel industry and caused major changes to operating procedures, standards, and experience requirements. Ask any chief pilot or director of operations: the Colgan crash changed the paradigm.

When airline pilots are found to not be thinking and acting like "airline pilots," then, by golly or by government, something gets done. I suspect that most of our crashes in HEMS occur because too many of us fail to behave like our CFR Part 121 brethren. We helicopter pilots tend to be rugged individualists.

We have gotten where we are in life because of our own effort and determination. We resist anyone's attempt to change us. After all, what we have been doing has worked so far, right? Why would we ever need to change our behavior? When I start to think I have all the answers, I page back through my memories to all of my dumb mistakes and near misses, and I ask myself – what should I have done differently? 

A friend of mine, with whom I used to fly the line, left HEMS and began to fly in the utility sector. One day he was ferrying an aircraft in the southeast U.S. He landed in a field for weather, then made the decision to take off again. Soon afterward he flew into a river in bad weather and was killed. Another friend made a willful and conscious decision to fly into an area of storms after discussing those storms with another pilot and being offered a safe place to spend the night. He and his crew are dead. Another pilot elected to dispense with the sort of pre-takeoff challenge-and-response confirmation checks that help us make sure our aircraft is properly configured for flight. He had a switch set wrong on takeoff. He is dead too.

You can't undo dead.

I can't tell you how many times I have seen a pilot perform an aggressive "top gun" takeoff. And maybe I have done one or two of them myself. Delta doesn't do that. I was at the Caraway Hospital base in Alabama once, and I watched a pilot repositioning for fuel roll a 206 all the way over onto its side. Sure, he looked cool, but if that motor had quit during that showing-off stunt, he would be another Humpty Dumpty.

The common denominator with all of these pilots?  They weren't thinking like airline pilots.

They weren't making choices the way an airline pilot would. In HEMS, we pilots and crews operate far away from the flagpole, from headquarters, from oversight (operational control centers, or OCCs, notwithstanding). And we make lots of choices that have unbelievably severe repercussions. The altitude we fly at, the fuel reserve we operate with, the weather and winds we proceed into, the places we land at and depart from, and the manner in which we make those landings and departures: these decisions are super-important, and we should make them with one thing in mind.

"We will not crash and kill ourselves today."

That's how professional airline pilots think. And no matter whether we carry four or forty-nine, that's how we should think as well. Because some actions can't be undone. Some choices are forever. As we operate, perhaps we should ask ourselves, "Would a professional airline pilot make this choice?" If not, maybe we should take a more conservative path.

As we are out there on our own, we should never accept an undue risk because we think – or worry – that our peers would do it. And if we take pride in pressing on when "the other guys would have turned back," maybe we should remember that pride goes before destruction. When it's crunch time and you are wondering whether the other pilots would do something, remember: they are not sitting there in your seat. You are. The safety of your crew and passengers trumps every other consideration.

As we make choices, we should listen to that still, small voice inside. That voice is there for a reason. Our ancestors who ignored it became some carnivore's dinner; they no longer swim in the gene pool. We should make up our minds to be as safe as an airline pilot. If you find yourself wondering whether something is a bad idea, it probably is. At the very least, avail yourself of one of the benefits a two-pilot crew enjoys: the chance to bounce something off another aviation professional. Call another ship, or your OCC if you have one, or even your next level of aviation management. Don't be afraid to ask for input. It makes you look wise and professional.

Like an airline pilot.


Getting Less Bang for Your Buck


(This article originally appeared as a safety column submission in Vertical Valor/911. I reshare it here in the hope that it may offer some benefit to the HEMS industry. DCF)

It must be tough to be the director of operations or chief pilot of a helicopter company. My old friend and mentor Clark Kurschner, who was a long-time D.O. for Omniflight Helicopters and extremely Yoda-like, told me once that playing beach volleyball was his only escape from stress and worry. These positions within a certificate-holding aviation company are required by regulation – for good reason. The gravity-like pull toward lower costs and greater profits could easily lead to a culture of cutting corners; the D.O. and C.P. have to stand up and hold the line for safety. Providing the training required by federal, state, and even local rules is very expensive, and any company that can cut costs has a leg up in the competition for contracts and resources. One of the easiest places to cut corners is in the realm of training, and someone with a business background may not understand how important training is. They also may not fully understand this axiom of human behavior: experience drives attitude, and attitude drives action when no one is looking.

Image courtesy Mercy Flight Central, where the author 
has presented CRM training twice.

While preparing for a leadership job in 2004, I learned that South Carolina demanded an EMS helicopter pilot have 25 hours of pilot-in-command flight experience in the “make and model” of any helicopter they would be flying within the state. 

We were going to operate a Bell 230 and a Bell 206 and were looking at having to fly a tremendous number of "non-revenue" hours. I called the state's Department of Health and Environmental Control and asked where that rule came from. The state's guy had no idea, so I wrote a letter asking for relief from the rule so that we might more quickly offer life-saving services to the state's citizenry. I mean, after all, it would have been a shame to have some kid's mom bleed out on the side of I-26 while we were out chopping holes in the sky to satisfy a requirement that exceeded anything the FAA or any other state in our region required.

We got the required hours down to 15 per pilot per machine and saved the direct operating costs of a few hundred flight hours. None of the six of us pilots ever crashed because of the hour reduction South Carolina permitted us. In fact, in the majority of instances when a pilot crashes an aircraft, it has nothing to do with flight experience or technical proficiency. It most often comes down to how pilots "feel" about the situation they find themselves in. Even in the instances of death by lack of skill, the situation that demanded the skill that was lacking was most often created by "attitude." Attitude is a non-rational mix of behavioral inputs from the cognitive and emotional components of our personalities. And to be sure, "the way we do things around here" drives attitude as well.

As it turns out, many pilots (and business executives) are loath to delve into the touchy-feely world of feelings and attitudes. As a retired Delta pilot with whom I was recently sharing beers said, "That stuff don't matter!" Taking nothing away from our conversation, our shared flying-lies, and his cold beer: I disagree. I believe that to the extent we can affect attitudes and shape "hearts and minds," we can stop helicopter pilots from crashing helicopters. And right after a crash is when it must really suck to be a D.O. or C.P.

As you prepare a training plan and budget for the pilots you employ, consider that the technical skills you seek to instill, reinforce, or verify in your team – while extremely important – are very likely not what will keep you from having to explain why your helicopter crashed and left dead bodies on the ground. You must comply with what the "rules" require, and such compliance is very expensive, but the good news is that shaping attitudes carries zero direct operating cost, and you will never bang up a helicopter providing touchy-feely training – or, as the FAA calls it, "soft skills training."

So where do you start? Well, first of all, if you aren't familiar with crew resource management (CRM) training, open up your mind and get on Google. Learn the basics: what works and what doesn't. Seek to understand group dynamics, the power of social settings, and the influence of charismatic leadership. Know that the relationship between CRM "facilitation" and attitudinal change is elastic.

If you tell me that the way I feel about something is wrong – even if you show me evidence and examples and valid information – I may well refute and refuse your efforts. And then, over time, I may come to see things differently, and that's what you want! We don't want a pilot to simply recite safety and success; we want a pilot to live and breathe safety and success, in the interest of living and breathing. Make no mistake: safety and success are inextricably linked. Without one there will be less of the other.

CRM "facilitators" don't try to tell others how to feel; facilitators let others evolve their own feelings at their own pace, because that's the only way it happens. I use the terms facilitator and instructor interchangeably, because a good one will seamlessly switch from one role to the other during a session, as the situation, level of understanding, and personalities demand.

You probably have people in your company who would be great CRM instructors, and they may or may not be pilots from your flight-standards section. They may not even be pilots! My friend Randy Mains conducts CRM instructor training courses several times a year. His week-long preparation for the job is excellent – especially for someone unaccustomed to standing up and delivering to a group. Randy employs the crawl-walk-run style of learning, and it works magnificently.

I know this because, after being a CRM instructor myself for several years, I went and spent a week with him and several other students. I was part of the group dynamic. I was influenced by the social setting. And I experienced charismatic leadership as displayed by two young women just starting their careers as pilots. I influenced them, they influenced me, and we were all better for it. Sami and Grace will doubtless make our industry safer and more successful thanks to their dedication to CRM principles.

So could you. Step right up…