Sunday, August 3, 2014

It still hurts when you get stuck... A reposting of Bill Winn's "Life At The Sharp End."

A pilot who might otherwise feel conflicted at the point of an important in-flight decision will be much more likely to opt for safety when his Chief Pilot, Director of Operations, and Program Director all stand shoulder-to-shoulder and tell him, “There is no excuse for flying in conditions that are below the minimums established for this program. You are expected to divert or abort when possible, or land and call for help if you cannot maintain minimums.” 

Editor's note: In 2010, when I was selected to be Omniflight's AMRM training coordinator, I did some research and quickly came across the work of William T. (Bill) Winn. I studied his seminal work, A Safe Ride to a Soft Bed: a safety primer for air medical crewmembers, as an introductory textbook on Air Medical Resource Management. I knew instinctively that if I ever got to meet him, I would like him and enjoy learning from him. I did meet Bill at the last Air Medical Transport Conference, after listening to him speak. I am fortunate now to communicate with him regularly as part of my volunteer support of the National EMS Pilots Association (NEMSPA). Bill works with NEMSPA on behalf of all of us in HEMS, every day, and we are all in his debt.

This article is four years old, but as the sad events of the last year have made clear, we still have lessons to learn, from the CEO to the brand-new crew member...

By William T. Winn - Anyone who has read Professor James Reason’s writings on human factors in accident causation is familiar with his well-known model of how causative factors can line up like the seemingly random holes in slices of Swiss cheese to result in a mishap or in a serious accident. Dr. Reason is professor of psychology at the University of Manchester, United Kingdom. He has published books on motion sickness, human factors in transportation accidents, absent-mindedness, human error, and on identifying and managing organizational risk factors.
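The layered-defenses logic of Reason's model can be made concrete with a toy calculation. The sketch below is purely illustrative (the layer count and per-layer failure probabilities are invented, not drawn from any accident data): a mishap occurs only when a "hole" happens to line up in every defensive layer at once, so each additional independent layer multiplies the odds down.

```python
import random

def accident_rate(hole_probs, trials=100_000, seed=42):
    """Monte Carlo sketch of Reason's Swiss cheese model.

    Each entry in hole_probs is the (hypothetical) chance that one
    defensive layer has a hole on a given flight. An accident occurs
    only when holes line up in every layer simultaneously.
    """
    rng = random.Random(seed)
    accidents = sum(
        1 for _ in range(trials)
        if all(rng.random() < p for p in hole_probs)
    )
    return accidents / trials

# Four independent layers, each failing 10% of the time (invented numbers):
# holes align on the order of 1 flight in 10,000.
print(accident_rate([0.1, 0.1, 0.1, 0.1]))

# Remove one layer and the alignment probability rises roughly tenfold.
print(accident_rate([0.1, 0.1, 0.1]))
```

The point of the sketch is the multiplication, not the numbers: weakening any single layer (culture, training, technology, or the individual at the sharp end) shifts the whole product.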
In an analysis of the chain of events leading to any accident, the sequence inevitably leads to the individual whose action or inaction most directly results in the occurrence of the event. Professor Reason refers to this unfortunate individual as the “sharp-ender”, since he or she stands at the tip of an expanding sequence of actions and circumstances which lead to the final mishap. Traditionally, the sharp-enders receive credit for causing the accident. In the history of aviation accidents they have typically been pilots, aircraft mechanics, or air traffic controllers. In air medical transport operations, the medical crew members on board are close enough to the sharp end of the accident sequence to be considered “affiliate” sharp-enders.
During the past two and a half decades in the United States and elsewhere, those associated with commercial aviation operations have come to acknowledge a broad range of systemic factors that may affect the quality of a pilot’s decision making processes during flight. In particular, an interest in identifying the root causes of EMS helicopter accidents has prompted research to carefully review and analyze the information gathered by the NTSB in their investigations of those accidents.
A careful review of the accident dockets shows that the NTSB investigative procedures do not typically look very deeply into the complex systems in which air medical helicopters operate. Due to the constraints imposed by very limited resources, and faced with the difficulties of processing any evidence other than that which is the easiest to gather and to interpret, the investigations tend to focus on the traditional sharp-enders and upon such tangibles as the weather and the physical evidence of the aircraft wreckage. Aircraft maintenance records, pilot logs, and training records are examined, and information from any available witnesses is carefully gathered and documented. But eyewitness accounts, even when they are accurate, can only shed more light on the terminal link in the accident chain. There remains a need to identify, classify, and mitigate each of the systemic factors that may have played a significant role in the preceding causal sequence. A formal classification of these factors is helpful because a carefully structured system of classification will facilitate the effective and economical development and implementation of a system of interventions to prevent future occurrences of similar accidents.
Review and analysis of NTSB accident investigation dockets by the Joint Helicopter Safety Advisory Team (JHSAT) and recommendations by the Joint Helicopter Safety Implementation Team (JHSIT) have laid an initial foundation for the identification and mitigation of the causes of helicopter accidents.
In addition, a research project titled Opportunities for Improvement in Helicopter Emergency Medical Services (OSI-HEMS) is focusing specifically on the air medical helicopter accidents that occurred in the United States from 1998 through 2009. This ongoing project began in January 2008 under the direction of Dr. Ira Blumen of the University of Chicago Air Medical Network. His research group consists of approximately 40 professionals from the air medical transport industry, including pilots, medical crewmembers, air medical communications specialists, and representatives from air medical operators and the FAA. Other interested parties from the nationwide air medical transport community have contributed their time and expertise on an occasional basis. The group uses a modified and expanded version of the JHSAT taxonomy. It is projected that a preliminary statistical analysis of the data generated will be presented at the Air Medical Transport Conference in Ft. Lauderdale, FL in October 2010. It is likely that the findings and recommendations from this research effort will influence changes and improvements to the helicopter EMS industry for years to come.
The 4 Dimensions of Safe Air Medical Operations
A review of the causes of accidents and the recommendations for prevention that have been identified reveal that virtually all of the identified causal factors, as well as their associated interventions, fall into one or more of the following four broad categories.
  • Organizational Culture
  • Individual Psychology (the Sharp-enders)
  • Technology
  • Training
The first two categories pertain to the sources of pressures that may be brought to bear on air medical crew members and which may in some measure compel them to accept or to continue a flight under conditions of elevated risk and in circumstances where they are not comfortable with the conditions.
The culture of an organization may be defined as the set of rarely articulated, largely unconscious beliefs, values, norms, and fundamental assumptions that the organization makes about itself, the nature of people in general, and its environment. In effect, culture is the set of “unwritten rules” that govern “acceptable behavior” within and outside of the organization.
An organization’s safety culture consists of the team’s collective attitudes, awareness, philosophy, and behavior with respect to the importance of safety in day-to-day operations. Ideally, it is manifested through a generally held commitment that team members will not compromise safe practices when there is a conflict between being safe and just ‘getting the job done’. All organizations have both formal and informal cultural conditioners that influence the safety of their operations.
Formal Conditioners: GOMs, SOPs, SMS
The formal conditioners of an organization’s culture are found in the program’s published mission statement, in their General Operation Manual and other written policies, and in the documentation of their formal safety management system. In terms of the old expression, ‘Walk your Talk’, the formal elements of our cultures are the ‘Talk’. Actual day-to-day behavior is the ‘Walking’ part of the expression. In many ways, this behavior may be influenced more by certain informal cultural conditioners than by the formal ones.
Informal Cultural Conditioners
One of the greatest informal influences on culture is the manner in which executives and managers communicate their priorities to members of the flight team. Notwithstanding a clear emphasis on safety in the formal elements of the program, if the messages from managers place undue emphasis on flight volumes, liftoff response times, or “meeting the competition head-on”, then team members may feel pressure to push themselves to satisfy those perceived priorities. Another informal cultural conditioner derives from an individual manager’s personal style of communicating and relating to other members of the organization. An excessively steep authority gradient between managers and staff may hinder the two-way communication that is essential to safe operations.
In some business models of air medical provider organizations, the mixed-messages that are received from managers are due to the fact that they are required to report to and receive direction from higher level managers who have a limited appreciation of the risks inherent in flight operations. Decisions and policies that have an influence on aviation safety must be made at a level that understands and supports operational safety. Corporate fiscal or HR officers may not always give proper consideration to safety in their analysis of, and demands on, the flight program’s operations.
Pressure to get the job done may also be generated at the level of the individual flight crewmembers.  Pilots and medical crewmembers alike tend to be Type-A personalities who possess a high level of personal desire to perform in an exceptional manner.
We must also recognize that time pressures and a pre-disposition to get the job done are built in to the fabric of the air medical transport industry. Patients of all ages and descriptions with a critical need for rapid transportation rely on these providers to get them to the facility that can best relieve their suffering, or save their lives. Although virtually all flight programs tell their crewmembers that the circumstances of the patient are not to be considered in making aeronautical decisions, it is impossible for members of the transport team to ignore what the outcome for the patient might be if they fail to complete the transport.
Implications for training and management
From the above, it is a given that a flight program should have clear and unambiguous written policies that place considerations of safety at the forefront of all operations. In addition, this emphasis on safety and prudent decision-making needs to be reflected frequently in routine communications from managers. To borrow a paragraph from a primer on air-medical safety, a pilot who might otherwise feel conflicted at the point of an important in-flight decision will be much more likely to opt for safety when...
…his Chief Pilot, Director of Operations, and Program Director all stand shoulder-to-shoulder and tell him, “There is no excuse for flying in conditions that are below the minimums established for this program. You are expected to divert or abort when possible, or land and call for help if you cannot maintain minimums.” And, of course, they’ve also told him, “The condition or situation of the patient has no bearing on aeronautical decisions.”
This kind of message should be part of initial training for all new hires, and should be reiterated routinely during each team member’s annual recurrent training.
Individual competency
A high degree of experience and technical competency is expected of all team members in air-medical transport programs. Highly experienced and qualified pilots and medical crew will enjoy a greater measure of ‘cognitive reserve’ while performing their duties. This reserve permits an increased awareness of factors in the flight environment that might be significant to their personal safety. We refer to this as situational awareness, and reduction or loss of situational awareness is a major contributing factor in most human error accidents. It should therefore be part of a flight program’s safety management plan to provide regular training in all areas necessary to ensure that crewmembers remain proficient at their technical skills. A review of NTSB accident investigation reports also reveals that pilot training should include emphasis on, and frequent practice of, the particular skills required by the specific flying environment where he operates. It is expected that the final findings of the OSI-HEMS research project will provide a more detailed discussion of the training deficiencies that have contributed to accidents in the past.
Team Competency
In addition to training to ensure individual proficiency, Air Medical Resource Management (AMRM) training is specifically designed to teach all members of the patient transport team how to work together to ensure safer operations. The patient transport team includes those who are directly involved with flight activities: the pilot and medical crewmembers, as well as those who are indirectly involved: communications specialists, operational controllers, and maintenance technicians, when needed.
Training program structure
A description of the substance of a complete AMRM training program for members of an air medical transport program is beyond the scope of this article. Ultimately, it is up to each program to develop and administer that training and to continuously assess and update its substance to ensure the competency of both individuals and teams. Each of the other three dimensions of safe operations (cultural, psychological, and technological) must be addressed in the substance of the training.
In their book Beyond Aviation Human Factors, Maurino, et al make plain the need to look above and beyond the sharp end of the accident sequence to identify and control the broader organizational or systemic factors which influence the safe conduct of air-medical operations. Even so, it is still important to attempt to understand what may have been going on between a pilot’s ears as he and all on board approached the point of impact with the terrain.
Tolerance of risk
One issue that has to be addressed is the individual level of risk tolerance possessed by both pilots and medical crewmembers. In the presence of time pressures, organizational pressures, and a high level of personal motivation, how much risk can, or should, a crew accept?
There are no risk-free air medical flights. That is a reality that all accept. A crewmember’s level of risk tolerance is conditioned largely by his individual personality and by his previous experiences. The majority of rotor-wing pilots in air-medical transport received their initial training and gained their early experience as pilots in the military. In that environment a pilot’s perceived worth, as well as his next promotion, depended to some degree on his readiness to accomplish the mission, or die trying. We don’t need to argue the appropriateness of that philosophy in the arena of national defense, but none would find it acceptable for the air medical transport industry.
There are probably very few pilots, if any, in the industry who would admit to regularly and intentionally ‘pushing the envelope’ to complete a patient transport. Still, 6 or 8 (or 28) years of accepting high levels of risk will have an effect on a person’s decision making, unless it is actively mitigated by more proximate influences. That is why clear policies and strong statements from management, like the one cited above, are so important for every flight program.
And, although some pilots may take offense at this statement, the medical crewmembers need to play an active role in acting as a damper in cases where a pilot is taking a flight beyond the limits established by policy, or regulations, or common sense. While the need for such interventions by crewmembers would be rare, crewmembers should be trained and enabled to speak up if a pilot should opt to disregard the rules and push the limits. With very few exceptions, air medical pilots are responsible and mature men and women who demonstrate exceptional aeronautical decision-making each time they fly. But the records show that even well trained and highly experienced pilots can get into trouble when too many causative factors are present and lined up in just the wrong way.
Nor are we pilots the only ones who may be too accepting of risk. In the 2003 fatal accident that cost my program the lives of a pilot and a paramedic, there has been some interesting speculation about the dynamics of that specific crew. The accident occurred during conditions of darkness and fog; conditions that have prevailed in too many fatal helicopter accidents. The pilot was ex-Army, trained for combat operations. The medic was a full-time professional firefighter, and the flight nurse, the only survivor, was an avid backpacker and back-country enthusiast. It’s impossible to know, but some have speculated that had the testosterone on board been diluted by even one of our female nurses, the outcome might have been different. The women in my program are all highly motivated professionals, but as a group they are more conservative with respect to risk taking and they are quicker to speak up when they feel that something is not as it should be. In short, they are more naturally disposed to act in accordance with the principles of Air Medical Resource Management.
Assessing risk tolerance
In an FAA study of pilots and risk it was determined that a pilot’s perception of risk during actual flight was as important as his individual tolerance of risk. In fact, these two dimensions, perception and tolerance, appear to be separate constructs that exist independently, notwithstanding the fact that there are correlations between them. Moreover, it was determined that of these two dimensions of risk, it is risk perception that lends itself more easily to mitigation through training. It seems that it is much easier to improve a pilot’s ability to recognize and appreciate significant in-flight hazards than to change the personality factors that affect willingness to accept risk.
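The study's distinction between perception and tolerance can be illustrated with a deliberately simple two-parameter sketch. Everything here is hypothetical (the names, the 0-10 scale, and the numbers are invented for illustration, not taken from the FAA study): a pilot declines a flight only when the risk he *perceives* exceeds his tolerance, so a pilot who underestimates the actual risk can accept a flight that an accurately calibrated pilot with the very same tolerance would refuse.

```python
from dataclasses import dataclass

@dataclass
class Pilot:
    # Both on an arbitrary 0-10 scale; hypothetical constructs for illustration.
    risk_tolerance: float   # personality-driven; hard to change through training
    perception_bias: float  # how much the pilot underestimates the actual risk

    def perceived_risk(self, actual_risk: float) -> float:
        # Perceived risk is actual risk minus the pilot's blind spot.
        return max(0.0, actual_risk - self.perception_bias)

    def accepts_flight(self, actual_risk: float) -> bool:
        return self.perceived_risk(actual_risk) <= self.risk_tolerance

# Two pilots with identical tolerance but different perception accuracy.
calibrated = Pilot(risk_tolerance=4.0, perception_bias=0.5)
miscalibrated = Pilot(risk_tolerance=4.0, perception_bias=3.0)

actual = 6.0  # a flight whose true risk exceeds both pilots' tolerance
print(calibrated.accepts_flight(actual))     # False: perceives 5.5, declines
print(miscalibrated.accepts_flight(actual))  # True: perceives only 3.0, accepts
```

This is why the finding matters for training design: shrinking `perception_bias` (hazard-recognition training) changes the decision without needing to change the personality variable at all.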
Implications for training and management
Just as for the other dimensions of safe operations, training can accomplish much to mitigate potential problems due to the influence of individual psychology. At the most basic level, this might be done by simply putting the topic on the table for clarification and discussion among all crewmembers. This would be one place to stress the importance of erring on the side of safety whenever there is serious doubt as to the safety of a course of action. It might also be the time to clarify the consequences of exceeding any statutory limits established for flight operations. It is a given in aviation that a pilot may temporarily act at variance with a rule in order to deal with an emergency. But the license to violate the rules is provided so that a pilot can extricate his aircraft and the occupants from an emergency, and not so that he can get them into one.
Everything discussed in this article to this point has been for the purpose of considering how to mitigate human error. We all need to recognize that it is not possible to eliminate errors. We also need to acknowledge that we make dozens of errors every day. We catch most of them immediately and make quick corrections. Many of the ones that we don’t catch may still go unnoticed because they are inconsequential. This is true whether the task at hand is cooking dinner, driving your car, or flying a helicopter. Our concern is for those errors that may lead to a tragic accident if they are not either avoided or immediately trapped and corrected.
A significant number of such errors are the result of inadequate situation awareness. Even under ideal conditions humans may misinterpret or simply fail to perceive critical situational cues in the surrounding environment. A helicopter in flight is not an ideal environment. Even with a healthy safety culture in place and a crew that is trained and committed to AMRM principles, a momentary distraction or lapse of attention can result in critical cues going unnoticed. At night, cues are even more difficult to discern due to the reduced ability of the pilot to see. A pilot with 20/20 daytime vision experiences 20/200 vision on a dark night. That’s legally blind by anybody’s definition.
The use of technology to fill in the gaps in a pilot’s situation awareness seems like a natural fit for some of the current problems in air-medical operations. The devices that best lend themselves to the needs of air-medical transport are night vision goggles, terrain alert and awareness warning systems, traffic alert and collision avoidance systems, and perhaps some kind of cockpit monitoring and cockpit voice recording system. The industry has resisted this technology in the past for three reasons: cost, weight, and effectiveness.
From a strictly business point of view, a capital expenditure is generally justified only when it results in increased revenue which exceeds the costs. While it might easily be argued that the costs of even a single serious accident far exceed the expense of new technology, some managers and financial officers are still having problems connecting those dots.
Nearly all air-medical helicopters frequently operate near the limits of their gross weight capabilities. In years past, the hardware we are considering here was designed for larger fixed wing aircraft. The weight and bulk of such systems could compromise our ability to add a large patient to the flight manifest. The newest generation of these devices promises to be a much better fit for medical transport helicopters.
In addition, most of the current devices were designed for aircraft that flew higher and faster than is typical of medical helicopters. If those devices were used in medical helicopters as initially designed, they would produce a large number of false traffic or terrain alerts. Any system which provides frequent false alerts will soon be ignored or simply switched off by the operator. In order to be effective, the technology has been redesigned to accommodate flight profiles characteristic of HEMS operations.
Implications for training
If new technology is not deployed properly, it has the potential of causing the very accidents that it is intended to prevent. In the first months of flying with night vision goggles, the U.S. Army experienced a series of accidents, major and minor, that were the result of pilots not fully understanding the limitations, as well as the capabilities, of the devices. The technology under consideration for use in the air-medical transport industry will also carry the cost of thorough initial and recurrent training for all operators.
There is no denying that this technology will be an added cost and training burden upon operators; but this burden should be viewed in its proper perspective. A backpack containing food, water, and first-aid supplies is certainly a burden to a hiker. But, no prudent back-packer would start up the trail without one.
These considerations of organizational culture, training, individual psychology, and technology are intended to serve as a general guide for self-examination and re-evaluation of how we train, how we manage, and how we conduct our operations in order to accomplish our goals in the air-medical industry.

Mitroff, I.I., Pauchand, T., Finney, M., and Pearson, C. (1990). Do some organizations cause their own crises? The cultural profiles of crisis-prone vs. crisis-prepared organizations. Industrial Crisis Quarterly, 3: 269-283. Quoted in Maurino et al., Beyond Aviation Human Factors.

Winn, William T. A Safe Ride to a Soft Bed: a safety primer for air medical crewmembers. Last accessed at www.williamwinn.com on Feb 28, 2010.

Hunter, David R. (September 2002). Risk Perception and Risk Tolerance in Aircraft Pilots. Office of Aerospace Medicine, Federal Aviation Administration, Washington, DC.
William Winn served as a helicopter pilot and instructor pilot in the US Army for 27 years and began flying as a HEMS pilot when he retired from the Army in 1996. After a health problem forced him out of the cockpit in 2005, and with bachelor's and master's degrees in Education to draw on, he turned his efforts to the development of materials to teach HEMS safety. He is currently the Safety Officer for Intermountain Life Flight in Salt Lake City, Utah. He is a member of the HEMS safety improvement research group working under the direction of Dr. Ira Blumen of the University of Chicago Air Medical Network. Bill also serves as the General Manager for the National EMS Pilots Association.



