The guy or gal flying the aircraft you are riding in may not have the best handle on what is going on with the aircraft, the environment, and the situation at any instant.
Don't believe me? Start reading NTSB reports.
How could this be?
As we contemplate crashes, mishaps, and injuries, it is easy to fall into armchair mode and smugly ask ourselves, "How the heck did they do that?" Monday-morning quarterbacking ignores the fact that the people who participated in the adverse event were in the arena.
As safety expert, AMRM wizard, and former EMS pilot William Winn pointed out in his work on AMRM and situational awareness, we who fly operate in a naturalistic environment. So what does that mean, a "naturalistic environment"?
It means that flying an aircraft into uncertain weather at 2:00 AM with unlit towers in our path, birds everywhere, a patient whose condition is deteriorating, a headwind that wasn't forecast, and a fuel gauge racing to empty is much different than sitting on one's sun-porch and pontificating.
Here is a note on a famous "naturalistic environment."
On January 22nd, 1991, during the Gulf War, an eight-man S.A.S. (Special Air Service - Britain's version of - and the precursor of - our Delta Force) team known as Bravo Two Zero were sent on a mission behind enemy lines. Their mission was to remain concealed near the main supply route in western Iraq for 14 days. During this time they would be expected to sever Iraqi fibre-optic cables and report on the movement of Scud missile launchers. However, due to source limitations, they were given suboptimal equipment. They also received vague intelligence reports. This became apparent when they arrived at their drop-off point to find it only about 200 meters from an Iraqi anti-aircraft stronghold, which had not been reported by Intelligence. Because this information was crucial to the success of the mission, the group attempted to contact their base via radio and inform them of the new situation. It was at this time that they realised they had no contact with base. It was later discovered they had been given the wrong radio frequencies. The group then made the decision to sit out until a liaison arrived in 24 hours. Unfortunately, that was too long a wait. They were spotted by an Iraqi goat herder, and from then on a cascade of errors ran its course. The team were separated, confronted with intermittent enemy contact, and completely unsupported by the larger organisation. Over the next three days, three of the eight died and four were captured. Only one man made his way back across the Syrian border. This planning error resulted in one of the most costly patrols in SAS history.
(David, 1997)
This event was described in a riveting book and movie, and points out how the best-laid plans can go to hell in a handbasket.
This happens to us too, in our helicopters, as we fly sick people from one place to another...
If you want to become more informed about how naturalistic environments affect decision-making, read Dr. Taryn Elliot's paper on the subject.
In a nutshell, what we have to do at 2:00 AM (or at any time we are flying) is react to ever-changing conditions through an ongoing process of situation assessment, pattern recognition, situational awareness, and decision-making. Each choice we make shapes what comes next, and typically forces further choices as the situation keeps changing. All this occurs in a rapidly changing, dynamic environment under stressors like fatigue, distractions, and a shortage of resources such as time.
Consider this accident narrative, drawn from an NTSB report:

The pilot said he performed a "high recon" of ...the... helipad and called out his intentions to land. He performed the pre-landing checklists, and started the approach to the helipad from the northwest at an altitude of 700 feet above ground level (agl). Both of the hospital's lighted windsocks were "limp" but were positioned so they were pointing toward the northwest. The pilot, who had landed at this helipad on numerous occasions, said the approach was normal until he got closer to the helipad. He said he felt fast "about 12-15 knots" and a "little high," so he decided to abort the approach. At this point, with about ¼ to ½-inch of left anti-torque pedal applied, he added power, "tipped the nose over to get airspeed," and "pulled collective." The pilot said that as soon as he brought the collective up, the helicopter entered a rapid right turn. He described the turn as "violent" and that it was the fastest he had ever "spun" in a helicopter. The pilot told the crew to hold on and that he was "going to try and fly out of it." The pilot said he tried hard to get control of the helicopter by applying cyclic and initially "some" left anti-torque pedal "but nothing happened." The pilot said he added more, but not full, left anti-torque pedal as the helicopter continued to spin, and he was still unable to regain control. He also said the engine had plenty of power and was operating fine. The pilot recalled the helicopter spinning at least five times before impacting the ground. The pilot said the helicopter landed inverted and quickly filled up with smoke. He unbuckled his seatbelt assembly, took off his helmet, punched out the windshield and exited the burning helicopter.
On its surface, this event started with a decision to abort the approach because it didn't feel right. Although the official cause has not been determined, one possible scenario is a downwind approach, followed by vortices from the main rotor disk interfering with the tail rotor and creating a loss of tail rotor effectiveness. The downwind landing would have felt wrong because the speed across the ground would have been faster than normal, and faster than the speed through the air. Being pushed forward by a tailwind would also steepen the apparent approach angle and make one feel a "little high" on approach.
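To put rough numbers on that feeling, here is a minimal sketch using assumed values (a 60-knot approach speed, a 15-knot wind, and a 6-degree approach angle; none of these figures come from the report). It shows how a tailwind inflates groundspeed and raises the descent rate needed to hold the same angle to the pad, which reads in the seat as "fast and a little high."

```python
# Rough illustration only -- assumed numbers, not data from the NTSB report.
# Groundspeed grows with a tailwind, so holding the same approach angle to the
# pad demands a higher rate of descent; flown "normally," the ship feels fast
# and ends up a little high.
import math

def descent_rate_fpm(groundspeed_kts, approach_angle_deg):
    """Rate of descent (ft/min) needed to hold a constant angle over the ground."""
    groundspeed_fpm = groundspeed_kts * 6076.0 / 60.0   # knots -> feet per minute
    return groundspeed_fpm * math.tan(math.radians(approach_angle_deg))

airspeed_kts = 60.0   # assumed indicated approach speed
wind_kts = 15.0       # assumed wind strength
angle_deg = 6.0       # assumed "normal" approach angle

print(f"Into the wind: ~{descent_rate_fpm(airspeed_kts - wind_kts, angle_deg):.0f} ft/min down")
print(f"Downwind:      ~{descent_rate_fpm(airspeed_kts + wind_kts, angle_deg):.0f} ft/min down")
# Into the wind: ~479 ft/min down
# Downwind:      ~798 ft/min down
```

The exact numbers don't matter. What matters is that misreading a 15-knot wind changes the demanded descent rate by several hundred feet per minute while the aircraft crosses the ground 30 knots faster than expected, and the pilot may only sense that something feels off.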
At night, it is hard to determine wind direction in flight. And easy to get it wrong.
"It would be expected that the more experience a person had, the more successful they would be at decision-making. However this has been found to be incorrect. It seems that decision error can be attributed to any of:
individual, organisational, or social factors." (Dr. Elliot)
It's no wonder that things go wrong. Indeed, it's a wonder that things don't go wrong more often. The standard operating model in HEMS is a single-pilot, single-engine aircraft with modest capabilities, and "crew members" who are not really recognized as such by the FAA or, frequently, by the pilots flying the aircraft. Vernacular statements like "self-loading baggage," "climb in, strap in, and shut up," and "you take care of the sick people and I will fly the aircraft" reveal the state of things in our industry.
And this contributes to fatal crashes.
Although we don't usually have a copilot in our helicopters, we do have at least two smart people on board who can be developed into resources able to help us make choices that don't kill us.
We can discuss what we think is going on with our medical crew and ask them what they think. We can also ask them to point out things like flags blowing in the wind, smoke coming from smokestacks, or wind effects on bodies of water. As an instrument pilot, I used to brief crews on approaches, and if I said, "Now, what was that decision altitude again?" they would announce it.
In short, I treat my crew like pilots in training, and explain as much and as often as possible. They begin to think like pilots. I haven't crashed yet.
Sometimes a layman comes up with the answer that saves the day. We can't shut anyone out, or alienate anyone to the point that they sit back, shut up, and watch us make a mistake. Although my medical crew members are perhaps busy taking care of a patient, they are certainly NOT bogged down with flying the aircraft. They may see something I don't, or become aware of something I am not, like a new ticking sound, a new vibration, or a new smell...
Note to crew: It's very hard to crash only part of a helicopter. Never give up on the situation, even if your pilot doesn't play well with others.
I sat in the jump seat of a C-5 Galaxy once, on a flight from Japan to Korea. As we got ready to take off, the PIC stopped the aircraft (all 380,000 pounds of it), turned around, and stated to the entire body present, "Okay, so what we are going to do is..." After describing his understanding of the future, he made sure that's what we all had in mind too - even me, an Army warrant officer helicopter pilot. He wanted to develop a "shared mental model," and offer everyone a chance to detect a flaw in the plan.
BRILLIANT!
safe-flights...