(This article originally appeared as a safety column submission in Vertical Valor/911. I reshare it here in the hope that it may offer some benefit to the HEMS industry. DCF)
The Duke Life Flight crash has sat squarely in the middle of my mental desktop since it happened. That aircraft was sophisticated, equipped with redundant power, and flown by a well-trained pilot. Duke did everything right. They spent the money. Why? What? How?
[Image courtesy LifeNet 4 in SC. A super crew!]
When I first looked at the headline about the accident report, I was disappointed that the NTSB had--apparently--faulted the pilot for "shutting down the wrong engine." It’s not that simple. The National Transportation Safety Board (NTSB) opined that “the pilot of the Duke Life Flight helicopter that crashed in 2017, killing all four people on board, likely received confusing cockpit indications that led to him shutting down the wrong engine during an in-flight emergency.”
It’s worth discussing the idea that maybe, just maybe, when faced with confusing or conflicting information about an aircraft malfunction, a pilot might consider initially doing nothing beyond landing, if that option is available, especially when flying single-pilot.
“Stress-induced limitations on human performance capabilities are often overlooked when considering how crews respond to emergency and abnormal situations. These limitations have important implications (emphasis added) for the most effective designs of non-normal procedures and checklists and the design of training for non-normal situations. Berman, Dismukes, and Loukopoulos (2005) conducted an in-depth analysis of human cognitive performance issues in airline accidents. Their analysis demonstrates that normal cognitive limitations experienced by all humans when dealing with stress, concurrent task demands, and time pressure, underlie those errors made by crews when responding to emergency or abnormal situations.” (https://ntrs.nasa.gov/citations/20060023295)
While a pilot flying a large jet airplane several thousand feet up in the atmosphere might be forced by circumstance to “do something” in order to get the aircraft safely on the ground, a helicopter pilot flying at one or two thousand feet above a suitable landing surface may have the option of landing first and diagnosing second. Too often, while trying to simultaneously fly the aircraft, diagnose a problem, and mitigate it, pilots misdiagnose and make mistakes.
Reading "I wasn’t sure what was happening so I decided to land and then sort it out," in an event report would be better than reading “the pilot was given confusing information by the aircraft systems displays and shut down the wrong engine which led to a fatal crash” in an accident report; don’t you think?
This is just another iteration of “the most conservative response rule,” and it is in line with a physician’s pledge to “first, do no harm.” We have all heard the pilot’s guidance to “aviate, navigate, communicate,” and maybe you have heard the phrase “put down the radio and fly the aircraft.” These axioms emphasize the importance of aircraft control, first and foremost.
While discussing this idea in a hangar full of team members in New York, the CEO of Mercy Flight Central reminded the class of what he was taught during Navy flight training. Only half-joking, he said that when a Navy pilot is faced with an emergency the first thing they are taught to do is “wind the clock.” I submit that if you can land the aircraft in the time it takes to wind up the clock you might be better off landing.
While reading the Duke docket, I learned about an Avera McKennan EC-145 that suffered a malfunction similar to the Duke aircraft's. In the pilot's statement, he writes that at one point, "I was trying to figure out what was happening with the aircraft." He was confused by what he was seeing, hearing, and smelling. I consider this an indictment of the designers for a faulty design and of the trainers for a failure of imagination. Plan accordingly.
I was once flying a BK from Savannah to Atlanta, full of fuel and at max gross weight with an isolette and baby on board. Five minutes after takeoff and adjacent to Savannah International, I got a master caution, an engine-low light, and an engine-out audio alarm. Startle effect! The N1 gauge on the left engine was at zero. But the aircraft was still flying normally! What the hell? The team couldn't hear the alarm, but they could hear me thinking out loud. "Ok! What's going on? I have a caution light and an alarm. An N1 is at zero, but the rotor is normal and the torques are matched. The TOTs are matched and normal."
I declared an emergency with Savannah, turned, and landed at the FBO. Only later did we learn that failure of an N1 gauge could produce confusing and alarming indications of an engine failure. I had to learn about this by living through it. No one had ever sat in a cockpit with me, looked me in the eye, put a finger on that gauge, and said, "If this gauge fails, you will get indications of an engine failure, with lights and audio. But it may just be the gauge!"
The Avera McKennan pilot reported that at one point he smelled smoke and heard the sound of an engine winding down. He was headed to an airport for an emergency landing. It all worked out for him and his team, but the question of whether he should have landed immediately--a normal response to a fire on board--will make for a lively discussion in classes. In any case, he was there, we weren't, and it's easy and pointless to armchair-analyze and criticize. They all lived. Well done, sir.
Finally, the idea that you might initially do nothing in response to confusing cockpit indications and simply land is no justification for avoiding the effort to fully understand your aircraft and its systems and to be as familiar as possible with its limitations and published emergency procedures. Indeed, understanding that an aircraft’s indicators could confuse you is all the more reason to gather up every single bit of information you can about your machine. Study the mishaps. Read the reports. Learn from history and imagine it happening to you. Because it very well might, and you need, no--scratch that--you must be ready. You must not end up in an NTSB report.
Tonight’s the night!