Imagine you are in the cockpit of a commercial airplane. As the First Officer, you are seated next to the Captain, and it is your job to assist the Captain, who is in command of the aircraft. Because it is a long-haul flight, the two of you also have access to the Second Officer (also called the Flight Engineer), who ranks lower yet oversees some critical operations and acts as the relief pilot. In an ideal world, this three-member leadership team would work closely together and share responsibility for all flight-related decisions, ensuring the smooth operation of the aircraft.
However, conversations retrieved from the cockpit recorders of commercial airlines in America many years ago proved this was not the case. A Boeing study indicated that 65% of all airliner accidents between 1959 and 1986 could be attributed to human error. Specifically, there were several instances of mishaps caused by poor teamwork in the cockpit. Consider this: onboard an Eastern Air Lines flight in 1972, while all three pilots became deeply engrossed in investigating a malfunctioning landing gear indicator, nobody noticed that the autopilot had been inadvertently disengaged. Due to this glaring oversight, the aircraft began a gradual descent and eventually crashed. The accident could have been avoided had the Captain delegated flight control to one of his co-pilots.
In another instance in 1978, while the pilots of a United Airlines flight were busy investigating a landing-gear problem, the aircraft ran out of fuel after an hour of circling and crashed six miles from the airport. As it turns out, both the First and Second Officers were aware of the grim fuel situation but did not directly challenge the Captain, who continued his preoccupation with the landing gear. In subsequent pilot training sessions organized by United Airlines, conversations from the black box recorders were played back to the officers. A flight manager indicated that “Pilots would cry out in pain and disbelief when the captain asked the flight engineer how much fuel they had, and the flight engineer said several minutes, and the captain still didn’t comprehend the significance of the information.”
In the 1980s, on the back of a series of embarrassing incidents for Delta Air Lines (including pilots landing at the wrong airports), the Federal Aviation Administration (FAA) launched an investigation into its training and cockpit procedures, and reported that “…it was observed that crew members are frequently acting as individuals rather than as members of a smoothly functioning team.” In the book The Undoing Project, author Michael Lewis talks about Jack Maher, the executive at Delta Air Lines in charge of training pilots. With pilot errors on the rise, Maher approached cognitive psychologist Amos Tversky, an expert on decision making and cognitive biases. Tversky took an unconventional look at the problem and explained why pilots sometimes made bad decisions. He said, “You’re not going to change people’s decision making under duress. You aren’t going to stop pilots from making these mental errors. You aren’t going to train the decision-making weaknesses out of the pilots.”
Instead, Tversky suggested changing the environment in which the pilots make these decisions. As Lewis explains, “People had trouble seeing when their minds were misleading them; on the other hand, they could sometimes see when other people’s minds were misleading them.” In other words, Maher had to change the autocratic, command-and-control culture of the cockpit to a more collaborative setup in which the First and Second Officers could freely speak up to the Captain and question her judgement when needed. Maher did exactly that, by changing the way Delta trained its pilots. The result? According to Maher, “Those mistakes haven’t happened since.”
As we can see, bad decision making at the top can have critical implications, especially in a highly sensitive environment such as an aircraft cockpit. When it comes to mitigating such biases at the leadership level, executives can deploy two distinct types of interventions.
- Direct change involves changing leadership behaviour through training and coaching in areas such as delegation of authority, decision making, and situational leadership. However, altering leadership behaviour takes time, effort and energy.
- Indirect change involves changing the rules of the game by improving policies, processes, workflows, etc. in a way that facilitates collaborative behaviour.
Airlines in the 1980s likely used a combination of these methods to bring about positive change in the cockpit, thereby reducing the chances of human error.
Thinking back to your own organization, and particularly your team: how do you mitigate such risks and enable better collaboration among team members?