Saturday, October 28, 2023

Anticipation!

(This article originally appeared as a safety column submission in Vertical Valor/911. I reshare it here in the hope that it may offer some benefit to the HEMS industry. DCF)

The first two lines of the old Carly Simon song “Anticipation” go like this:

“We can never know about the days to come.

But we think about them anyway”

Image courtesy Josh Henke

After writing a column in this space on high-reliability organizations, I was discussing high reliability with another safety advocate recently. We talked about the importance of “anticipation.” In any high-risk, high-consequence endeavor, it pays to anticipate. It also pays to focus on the attributes of high reliability, one of which is “a preoccupation with failure.” If we take it for granted that people will make mistakes, then our next step is to anticipate where and when those mistakes might occur, and to devise “safety nets” to trap the error and lessen or eliminate its ill effects.

I recently reconsidered a purposeful behavior that I blogged about long ago: the “objective continuous risk assessment process” (O-Crap). You continuously and objectively consider the proximate threat to your safety while flying. You discuss this threat and you formulate a tactic, technique, or procedure to deal with it. What’s going to kill me right now, and what am I doing to prevent that from happening?

I am a proponent of helicopters having instrument flight rules (IFR) capability. Even singles. I think we spend enough time flying in marginal weather and at low altitudes to warrant the extra expense that IFR entails. But IFR comes with its own pitfalls, which we should anticipate.

Sit back and ask yourself, “When is an IFR pilot most likely to encounter problems?” Perhaps you, like me, come to the conclusion that the end of an IFR approach in O-Crap weather could be the time and place when things get woolly.

At the bottom of the approach, the pilot is understandably interested in breaking out of the weather, seeing the surface, and landing the helicopter. She might continue for a few seconds if she can’t see the ground at the decision point, and she might also drop a few feet below the published decision altitude or minimum descent altitude. She might slow the aircraft to expand the time available to react to an opening. Doing these things is human nature, and this natural tendency has to be aggressively trained out of us IFR pilots. Anticipation and a preoccupation with failure will help justify a training budget that enables the training that prevents misfortune. We don’t train until we can do it right – we train until we can’t do it wrong. Anything less is disaster in the making.

The mindset of an IFR pilot should be, “I am not going to break out, even though the reported weather at the beginning of the approach points to that happening. I am not going to break out, and I am going to perform the missed approach as published. And it’s going to be the best missed approach ever! I will anticipate problems and have my hands and feet ready to take control of this aircraft if need be, because there isn’t much room for error at the bottom. If the aircraft has a tendency to get squirrelly at low speeds, I will keep my speed up. I will maintain my scan and fly this aircraft on instruments, and if we do break out in the clear I will be pleasantly surprised.” As a young Army instrument pilot, I flew 20 hours a year in a UH-1 “Huey” simulator. It had no visuals and no stabilization. Every approach was followed by the missed approach procedure. That was very good training.

Here’s an excerpt from an accident report:

“During the instrument approach to the destination airport, the weather conditions deteriorated. The pilot was using the helicopter's autopilot to fly the GPS approach to the airport, and the pilot and the medical crew reported normal helicopter operations. Upon reaching the GPS approach minimum descent altitude, the pilot was unable to see the airport and executed a go-around. The pilot reported that, after initiating the go-around, he attempted to counteract, with right cyclic input, an un-commanded sharp left 45° bank. Recorded flight data revealed that the helicopter climbed and made a progressive right bank that reached 50°. The helicopter descended as the right bank continued, and the airspeed increased until the helicopter impacted treetops…” What we had here was a failure to anticipate.

During my travels to present Air Medical Resource Management training, I hear and tell stories. I tell on myself. Some of my stories are embarrassing; how could I have been so dumb? But I would rather be embarrassed and hopefully make a life-saving impression on some young man or woman than shelter my ego and perhaps read about how they died in a helicopter crash. Stories can save lives. 

The pilot was performing an instrument approach in dark-night IMC. At the missed approach point, he wanted the weather to be good enough to continue. He wished it so – even though it was not. He did not initiate the missed approach procedure and continued toward the destination, partly on his instruments and partly by looking out the windscreen. He became disoriented and got lost in the goo. While struggling to maintain control of the helicopter and reorient himself, he latched onto a patch of good visibility – a “sucker hole” – that enabled him to get the aircraft down near the ground.

The team members on board were understandably upset when they realized that they were at ground level right next to the multi-story hospital building, and that the helipad they were supposed to be approaching was on a rooftop several stories above them, in the clouds. This is a true story.

So what do you think happened here? I think a good guy with good intentions – a normally safe and conscientious pilot – made a snap decision at the decision point, and it was the wrong one.

“It’s almost good enough. Let’s keep going and hope it gets better.” 

It doesn’t matter what we want, and it doesn’t matter how hard we wish; it’s straight-up, no-kidding, what you see is what you get. The training imperative must be “fly according to the environment.” Our response must be conditioned, and that conditioning takes time in training and anticipation.

Our simulator-training scenarios should be tricky, the way life is, to engender thought and discussion. You can learn almost as much sitting at a table and discussing a flight after the fact as you can while performing one. And it is during calm, thoughtful discussions of what actually happened versus what should have happened that values and norms and ingrained behaviors are written into our psyche.

As Carly Simon sang, “I'm no prophet and I don't know nature's ways” but I do know that we should try our best to anticipate human nature, and train for it. 

