
Normalization of Deviance

In this article Peter Allen from the HGFA links the space shuttle disasters to your personal flying limits. How could these two things be related?

This article draws on two other online articles, which you may want to read first:

http://www.flightsafetyaustralia.com/2017/05/safety-in-mind-normalisation-of-deviance/
https://www.willswing.com/why-cant-we-get-a-handle-on-this-safety-thing/ 

Deviations can cause accidents

In General Aviation (GA), aircraft take off from a controlled environment (an airport), follow a flight plan to navigate airspace, and then land in another controlled environment (a second airport). Each phase of flight is executed by the pilot using documented standard procedures.

Climbing, turning, and descending each have a defined procedure the pilot follows. These procedures ensure the aircraft is in the correct configuration for that phase of flight and minimise the risk of an unforeseen event.

Analysis of GA accidents has found that a proportion of accidents occur when pilots deviate from defined procedures, typically to solve an immediate problem. If the problem re-occurs, they reuse the same solution they improvised before.

This is an example of Normalization of Deviance.

In HGFA flying, quite often we take off (launch) from an uncontrolled environment (e.g. launch site with hazards), fly with no set flight plan, and land in an uncontrolled environment (the LZ).

If HGFA flying doesn’t have controlled environments, set flight sequences, and a planned flight, does the concept of Normalization of Deviance apply?

The answer is “Yes”, in many ways.

In the Flight Safety Australia article on Normalisation of Deviance (linked above), the following example is given:

Your phone bleeps while you’re driving and you can’t resist the temptation to look—after all it could be important! You check your messages and continue driving without incident. Given the frequency and banality of such occurrences, you might even start to tell yourself it’s perfectly safe to regularly perform the behaviour. The increased practice leads to familiarity and ‘habit’ such that the actions become a normal part of your driving routine.

The lack of bad outcomes can reinforce the ‘rightness’ of trusting past practices instead of objectively assessing the risk, resulting in a cultural drift in which circumstances classified as ‘not okay’ slowly come to be reclassified as ‘okay’.

Before using a (non-hands-free) phone while driving was banned, we probably all snuck a look at our phone while driving, and we probably knew it was a distraction, but we had done it before without incident; so why not?

Why can’t we get a handle on this safety thing?

Wills Wing has an article written in 1998 on its website, “Why can’t we get a handle on this safety thing?” (linked above), that explains what Normalization of Deviance looks like in a hang gliding context, without ever using the term Diane Vaughan coined.

The author of the Wills Wing article says:

The overriding determinant of pilot safety in hang gliding is the quality of pilot decision making. Skill level, experience, quality of equipment; all those things are not determinants. What those things do is determine one’s upper limits. More skill gives you a higher limit, as does more experience or better equipment. But safety is not a function of how high your limits are, but rather of how well you stay within those limits. And that, is determined by one thing; the quality of the decisions you make. And how good do those decisions have to be? Simply put, they have to be just about perfect.

The Wills Wing article points out that if we base our evaluation of our decision making skills solely on successful outcomes, we fool ourselves into thinking our decisions are good ones.

In one example, a pilot decides to leave a thermal and fly to a goal, judging that this action leaves a 1000-foot safety buffer. In performing the manoeuvre, however, the pilot eats into that margin and arrives at the goal with only 400 feet. As there were no negative consequences, the pilot may be tempted to conclude the decision was a good one, instead of realizing the decision was actually a bad one.
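The margin arithmetic in that example can be made explicit. The following is an illustrative sketch only; the numbers and the 25% personal limit are hypothetical, not from the Wills Wing article or any HGFA procedure:

```python
# Illustrative sketch: judge a decision by the margin it consumed,
# not by whether the flight ended safely. All numbers are hypothetical.

def margin_consumed(planned_buffer_ft: float, actual_arrival_ft: float) -> float:
    """Fraction of the planned safety buffer used up by the manoeuvre."""
    return (planned_buffer_ft - actual_arrival_ft) / planned_buffer_ft

planned = 1000  # the pilot budgeted a 1000 ft buffer at goal
actual = 400    # but arrived with only 400 ft in hand

used = margin_consumed(planned, actual)
print(f"{used:.0%} of the safety buffer was consumed")  # prints "60% ..."

# A good outcome does not make this a good decision: the plan was out
# by 600 ft. A hypothetical personal limit might allow consuming at
# most a quarter of the buffer before the decision must be reviewed.
if used > 0.25:
    print("re-evaluate your criteria")
```

The point of the comparison is that the review trigger is the consumed margin, not the outcome: the flight "worked", but the decision still fails the (hypothetical) personal limit.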

A pilot who realizes the decision was bad can re-evaluate the criteria and personal limits they have set for themselves, and improve their judgement and decision-making skills.

A pilot who doesn’t do this normalizes the error and comes to believe the decisions they are making are okay.

So what can we learn from NASA and the Wills Wing article?

The Flight Safety Australia article states that NASA made these key recommendations after the two Space Shuttle disasters:

• Don’t use past success to redefine acceptable performance.
• Require systems to be proven safe to operate to an acceptable risk level rather than the opposite.
• Appoint people with opposing views or ask everyone to voice their opinion before discussion.
• Keep safety programs independent from those activities they evaluate.

CASA promotes Risk Management Analysis in its safety management systems. For example, when the HGFA applies to make a rule change or to obtain an exemption from an existing rule, CASA requires a Risk Management Analysis to be undertaken.

Example of Risk Management process


If we compare the NASA recommendations to the risk assessment model:

• Don’t use past success to redefine acceptable performance.

What NASA are talking about here is assessing the risk. Normalization of Deviance impairs our ability to properly assess risk.

• Require systems to be proven safe to operate to an acceptable risk level rather than the opposite.

Here, NASA are addressing risk control. In our flying, these are the personal limits you set for yourself.

• Appoint people with opposing views or ask everyone to voice their opinion before discussion.

We need to review the controls we have put in place to control risk. In our personal flying this is where we take a look at the personal limits we set ourselves. Are the limits correct?

• Keep safety programs independent from those activities they evaluate.

Finally, we need to be able to identify risk. The Wills Wing article shows us that sometimes we can’t see the assumptions (the lies?) we tell ourselves. In Microlight flying we have the Biennial Flight Review, which should be much more than just a skills check: it’s an opportunity to review all aspects of our flying, including the judgement decisions we make.

In PG/PPG/HG flying we have our peer group on the hill, our flying mates. If they are the right sort of flying mates, they will tell you when you are kidding yourself about the safety of some of the flight decisions you’ve made.

Safe flying.