Unintended Consequences due to Automation

Automation is implemented to increase efficiency and reduce operator workload. However, the additional knowledge it demands and the complexity it introduces can sometimes cause confusion and error when operating a machine with automation features. Therefore, as Boeing's and Airbus's automation philosophies state, the pilots must be the final decision makers and safety must always be the top priority, because the pilots, not the automated machine, are ultimately responsible for the safety of the flight (Kwak et al., 2018). Pilots therefore need to understand the autopilot procedures and decrease their dependency on the automated system.

One recent aviation accident caused by human error involving automation is the crash of Asiana Flight 214, a Boeing 777 flying from Seoul to San Francisco, on July 6, 2013, a clear day with near-perfect weather. The pilot flying (PF) was a trainee pilot who decided to fly a visual approach. As the aircraft neared landing, it was still well above the optimal glide path even in autopilot mode. To bring the aircraft down to the desired altitude, the PF increased the descent rate from 1,000 feet per minute to 1,500 feet per minute. However, about six miles from the runway, the descent rate was set back to 1,000 feet per minute. About five miles from the runway, the aircraft was 400 feet higher than desired, but the crew began slowing the airplane. The PF called for flaps 30, but the pilot monitoring indicated the aircraft was too fast for that setting. The PF therefore switched to "flight level change" mode, and the autopilot initiated a climb toward the previously set altitude. To reverse the climb, the PF disengaged the autopilot and moved the thrust levers to idle. This put the autothrottle into hold mode, in which it no longer controls airspeed. The plane continued toward the runway, but its airspeed fell well below the target, and because the autothrottle was on hold, it did not add thrust to compensate for the rapid descent. Eventually the pilot monitoring took over and attempted a go-around, but the intervention came too late, and the plane's tail struck the seawall (AIRBOYD, 2014).

The Flight 214 crash was caused purely by human error: a lack of mode awareness of the automated system and a lack of situational awareness. According to Chow et al. (2014), the trainee PF later confessed that he was stressed about landing at an unfamiliar airport and believed the autothrottle was engaged. The PF also assumed that the autothrottle would still control airspeed while on hold. Such assumptions are not safe; pilots need to check with the other pilots or consult the manual when they are unsure. This assumption stemmed from a lack of understanding of the autopilot system and its limitations. Based on Kwak et al.'s (2018) analysis, the top three contributing factors among human errors are inadequate understanding of automation, over-reliance on automation, and overconfidence in automation. With the implementation of autopilot, the training that pilots receive will need to be adjusted.

Beyond giving pilots more autopilot training early in their careers, some systemic changes were also implemented. Both U.S. and European aviation regulators had warned Boeing before about the lack of cues indicating the loss of speed control when the autothrottle is on hold ("Asiana Airlines Crash," 2014). Aircraft makers were urged to add audible warnings alerting pilots whenever the throttle changes to a setting that no longer maintains speed.

Landing in manual mode is common practice in severe weather. In the case of Flight 214, because it was a sunny and windless day, the PF did not opt for manual mode, which could be attributed to either a lack of confidence or a lack of knowledge. Well before reaching the runway, both the PF and the pilot monitoring should have realized that the plane was not on the optimal glide path. At that point, one of the pilots should have taken over the autopilot or suggested that the PF land manually. That this option was not taken is likely due to over-reliance on, and overconfidence in, the autopilot system. As Boeing's and Airbus's automation philosophies (Kwak et al., 2018) and Chow et al. (2014) note, even though automation can make flying safer, even in automated aircraft the human must be the boss. The machine lacks situational awareness and does not know the best option for the current situation. Because humans are far more flexible and adaptable than automation, they should be in charge of responding to unforeseen events, maintain the skills needed to perform what the automated machine can perform, and be ready to take over whenever they deem it necessary.


References

AIRBOYD. (2014, June 24). Asiana Flight 214 Crash - NTSB Animation. YouTube. https://www.youtube.com/watch?v=QVaQYhd_Qy0

Asiana Airlines crash caused by pilot error and confusion, investigators say. (2014, June 24). The Guardian. https://www.theguardian.com/world/2014/jun/24/asiana-crash-san-francsico-controls-investigation-pilot

Chow, S., Yortsos, S., & Meshkati, N. (2014). Asiana Airlines Flight 214. Aviation Psychology and Applied Human Factors, 4(2), 113–121. https://doi.org/10.1027/2192-0923/a000066

Kwak, Y.-P., Choi, Y.-C., & Choi, J. (2018). Analysis between Aircraft Cockpit Automation and Human Error Related Accident Cases. International Journal of Control and Automation, 11(3), 179–192. https://doi.org/10.14257/ijca.2018.11.3.16