Tuesday, June 4, 2013

The Unintended Consequences of Change....

As a new member of the HEMS community, you will undoubtedly spend some time observing how things work, and then, after you get some time behind you and experience how things are, you will likely find yourself in a position to effect changes. I would like you to think about this, and about the effects that changes can have on people, systems, and culture. One common result of change is unintended consequences. Things happen that were unforeseen, coordination that seemed thorough turns out not to be, and some piece of the "puzzle" affected by the change - one that wasn't considered in advance - ends up hurting someone.

Change is hard and inevitable. Every instance of change provides an opportunity for confusion, misunderstanding, and anxiety, and introduces an increased risk of catastrophe. I have too often seen change in organizations, in practices or procedures, and in policy made for its own sake (for some guy's resume). My caution for you is this: make sure you think it through, and consider all the consequences.

My best story about the unintended consequences of change involves one of my roommates from flight school, Pierre Desroches. There is a building with his name on it at Fort Campbell today. Pierre and I talked about our past and future lives after "lights out" at Fort Rucker in 1985; we each had separate but complementary skills at preparing our room and gear for inspections. He was a good roommate and a good friend. He had a background racing motorcycles and designing high-performance exhaust systems. He had a vicious wit, an easy laugh, and took great delight in the entire flight-school experience. Well, almost all of it...

After graduation, Pierre and I crossed paths a couple of times, and not long after I volunteered for duty with B Company, 3rd Battalion, 160th Special Operations Aviation Regiment in 1989, he did the same for duty with the 160th's Chinook company at Fort Campbell. He quickly established himself as a great aviator, and was designated as an instructor in short order.

During this time, the Chinook went through an upgrade from a modified and probed D-model to a special-ops-only "Echo Model." One of the plank holders in our Savannah-based band of brothers, Terry Ledford, was tasked to participate in a users group for this new machine in '89 or early '90 - even though we would never fly one at our station. He travelled to Fort Campbell, and reported back that it was a typical fudge-cluster, with everyone involved clamoring for the inclusion of their pet features.

This also happens when EMS helicopters are bought - and it is why no two EMS helicopters from the days of hospital-vendor relationships are configured the same.

The vendor for the Echo's cockpit avionics suite was, inexplicably, IBM. I don't know how much aviation business IBM has had over the years, and honestly I don't care. Anecdotally I can tell you that it was horrible, non-intuitive, and difficult to navigate through the pages of displayed data to obtain the information needed during a typical mission. This is not to say that the aviators at Fort Campbell didn't learn how to make the aircraft do its job - it was just made harder than it needed to be. In case you are curious, special-ops cockpits before and since have been provided by Collins, an old aviation company, with intuitive and simple (pilot-friendly) buttons, pages, and methods of displaying data. Collins was also selected for the new conventional-forces F-Model 'hook - the Army got it right!

When I was first learning to fly on instruments, I spent time in a simple but effective UH-1 Huey flight simulator. This device had no visual "out-the-window" features; it was a straight-up instrument trainer (and a good and humbling one at that). The first phase of training is called "B-I," or basic instruments, and teaches a pilot how to maintain control of the aircraft solely by reference to an assortment of - at the time - analog dials and gauges. The one nuggets lean on the most at first is the attitude indicator, or artificial horizon. This is because the little earth-and-sky depiction is closest to what the pilot would be looking at if he or she could see out the window. Push the stick forward and you are looking at brown (headed for the dirt), pull back and you see more blue (sky), keep the "wings" level and you avoid a failing-grade slip - or worse, an audience with your angels.

The tricky thing about the attitude indicator, or at least it was initially tricky to me, is that when you roll the aircraft into a turn, the "ball" or earth-sky representation appears to roll the opposite direction. This makes perfect sense of course, because when we roll an aircraft, the view out the windshield shows the earth maintaining its position and us changing attitude. The orange triangle or "bank index" in the picture below likewise moves in the opposite direction to the direction of roll. I screwed this up more than once when I was new: beginning a turn, looking at the index, seeing it go the "wrong" way, and jerking the cyclic stick back the other way. Time and training taught me how to correctly interpret the information being provided - as they have every other pilot before and since who has flown by reference to an attitude indicator.

[Photo: an analog attitude indicator, showing the orange bank index triangle.]
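If seeing the geometry written out helps, here is a toy sketch in Python - purely my own illustration, nothing like real avionics code - of the conventional display described above: roll right, and both the drawn horizon and the bank index swing left on the instrument face.

```python
# Toy model of a conventional "moving horizon" attitude indicator.
# Purely illustrative - not based on any actual avionics software.

def displayed_horizon_angle(bank_deg: float) -> float:
    """Angle at which the horizon line is drawn on the instrument face.

    The earth-sky "ball" stays referenced to the real horizon, so
    relative to the instrument case (which rolls with the aircraft)
    it appears to rotate opposite the direction of roll.
    """
    return -bank_deg

def displayed_bank_index_angle(bank_deg: float) -> float:
    """Conventional bank index: displaced opposite the direction of
    roll, matching what the pilot would see out the windshield."""
    return -bank_deg

if __name__ == "__main__":
    for bank in (0.0, 15.0, 30.0):  # positive = roll to the right
        print(f"bank {bank:+5.1f} deg -> "
              f"horizon drawn at {displayed_horizon_angle(bank):+5.1f} deg, "
              f"bank index at {displayed_bank_index_angle(bank):+5.1f} deg")
```

The sign flip is the whole point: the display stays faithful to the view out the windshield, and a new pilot has to be trained to read it.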
Still with me? Great! So as the guys from IBM, in coordination with the end-users advisory group, were designing the Echo Model Chinook's cockpit and deciding how to write the software, legend has it that one smart fellow decided, "Hey! We're talking digital here, and we can provide information any way we want!" They decided to have the horizon representation on the rectangular cathode-ray-tube (CRT) multi-function display (MFD) maintain its relationship to the actual horizon during a turn.

And then, in a poorly thought-out change, they decided that it made more sense if the bank index rolled the same way that the aircraft was banking, instead of moving opposite to the direction of roll - the way every attitude indicator ever made has worked. This simple little choice killed Pierre Desroches, Wally Fox, and an enlisted crew, as well as a young aviator who was being given a special-ops assessment-flight check ride one night near Fort Campbell.

Of course the Army and the 160th guys conducted an investigation into the crash. They involved the wizards from the regiment's systems-integration-management office (SIMO) and set about sorting things out. They officially decided that the cause of the crash was rainwater or snowmelt entering through leaking side-window seals and getting into the electrical bus bars (large-volume circuit breakers) located on the power distribution panels near the floor on the cockpit sidewalls; this shorted out the aircraft's entire electrical system, resulting in a complete electrical failure and total loss of all cockpit instrument displays.

Never mind the fact that the bus bars have a bus-tie-disconnect feature that effectively cuts the aircraft's electrical system in half (left versus right) if a problem such as a short circuit occurs on one side or the other - providing the required system redundancy for instrument flight. The SIMO guys, with straight faces, explained that the short circuit was hair-trigger fast and managed to fail both electrical systems before the bus-tie feature could snap open and leave one generator to keep some of the lights on. Either that, or water somehow entered both side panels at exactly the same instant, failing both in unison...
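To see why that explanation strains belief, here is a toy model of a split-bus system with a bus-tie disconnect - my own sketch, assuming only the textbook behavior described above, not the Chinook's actual power-distribution logic. A short on one side trips the tie open and kills that side; the other generator keeps its half of the loads alive.

```python
# Toy model of a split-bus electrical system with a bus-tie disconnect.
# Illustrative only - assumes the textbook behavior described above,
# not the actual Chinook power-distribution logic.

from dataclasses import dataclass

@dataclass
class Bus:
    name: str
    generator_ok: bool = True
    shorted: bool = False

def powered_buses(left: Bus, right: Bus) -> list[str]:
    """Return which buses still have power after faults are applied.

    A short on either side trips the bus-tie open, isolating the
    halves; the un-shorted side keeps running on its own generator.
    """
    tie_closed = not (left.shorted or right.shorted)
    out = []
    for bus, other in ((left, right), (right, left)):
        if bus.shorted:
            continue  # the faulted side is dead regardless
        if bus.generator_ok or (tie_closed and other.generator_ok):
            out.append(bus.name)
    return out

if __name__ == "__main__":
    # Water shorts the LEFT panel: the tie opens, the right side stays up.
    print(powered_buses(Bus("left", shorted=True), Bus("right")))   # ['right']
    # A total blackout needs BOTH sides shorted at the same instant.
    print(powered_buses(Bus("left", shorted=True),
                        Bus("right", shorted=True)))                # []
```

Run it, and the only way to get an empty list - a dark cockpit - is to fault both sides simultaneously. That is the coincidence the official finding asks you to accept.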

That dog doesn't hunt.


[Photo: a view looking down on the panel on the left side of the Chinook cockpit. The right-side panel looks the same.]
Occam's razor (also written as Ockham's razor, from William of Ockham; in Latin, lex parsimoniae) is a principle of parsimony, economy, or succinctness used in logic and problem-solving. It states that among competing hypotheses, the hypothesis with the fewest assumptions should be selected. (Wikipedia)

Chinook electrical bus bars and power-distribution panels have been located in the same place since the A-model. Chinooks have flown through many rain storms, and window seals have been leaking since SN 001 rolled out of Philly. I myself spent 3,500 hours flying the aircraft, through all kinds of weather, and had all kinds of leaks. I study, teach, and spend countless hours thinking about human-factors accidents, and this crash was a classic human-factors event related to the unintended consequences of change.

The pilot being assessed was familiar with flying a military version of the Bell JetRanger (an OH-58), with analog instruments, much less information to digest from the displays, and a tried-and-true mechanical attitude indicator. He was being radar-vectored for an instrument approach to Fort Campbell as his check ride was being completed, and was instructed to make a turn. The bewildering amount of data available to him overwhelmed him - to be fair, he was not familiar with the MFDs - and when he got confused, his eyes found the symbol on the CRT that represented the bank index. And it went the wrong way.
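The mechanism is easy to demonstrate. Below is a toy closed-loop sketch - again my own illustration, with invented gains, not a flight model - of a pilot applying the habit learned on conventional instruments ("the index moved left, so I am rolling right; correct to the left") to a display whose index moves the other way. With the conventional sign, the habit levels the wings; with the reversed sign, the very same habit drives the bank steadily away from level.

```python
# Toy closed-loop sketch of the sign error. Gains and dynamics are
# invented for illustration - this is not a flight model.

def fly(index_sign: float, steps: int = 10) -> list[float]:
    """Simulate a pilot leveling the wings using only the bank index.

    index_sign = -1: conventional display (index moves opposite the roll).
    index_sign = +1: the reversed convention described above.
    The pilot's learned rule assumes the conventional sign.
    """
    bank = 10.0          # start 10 degrees right wing down
    history = [bank]
    for _ in range(steps):
        index = index_sign * bank     # what the display shows
        correction = 0.5 * index      # learned rule: roll toward the index
        bank += correction            # conventional sign -> back toward level
        history.append(bank)
    return history

if __name__ == "__main__":
    print("conventional:", [round(b, 1) for b in fly(-1.0)])  # decays to zero
    print("reversed:    ", [round(b, 1) for b in fly(+1.0)])  # diverges
```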

The 160th stopped letting pilots being assessed fly the Echo Model after this crash. Even though rainwater and a short circuit caused it... That's what's in the accident report.


The crash happened because someone decided to change something, and because no one anticipated that a guy who wasn't a frequent instrument flyer, and who was unfamiliar with the cockpit layout and displays, would fall back on what he thought was the simplest and most reliable source of turn information: the bank index.

Considering change in your organization? Think it through. Get every smart person available in one room, and war-game the change to death - so that the unintended consequences surface in the room, and not in the aircraft.

