Human Factors as a Systems Discipline in Pilot and System Interactions
Human factors as a systems discipline considers how humans interact and relate within a nonlinear, dynamic, and complex system, and how to design such a system so that it incorporates human requirements. Liaghati et al. (2020) apply the Goals, Operators, Methods, and Selection rules (GOMS) model to integrate human factors into system design. Goals represent the objectives of the system and its users, capturing the users' voices. Operators are the elementary cognitive, perceptual, and motor actions available to users, whose characterization requires a deeper understanding of human psychology. Methods are sequences of operators that accomplish a goal, and selection rules dictate which method to use when several apply. Together, these elements serve as a basis for incorporating human factors into systems.
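The GOMS decomposition can be made concrete as a small data structure: a goal holding alternative methods, each method a sequence of operators, and a selection rule choosing among them. The sketch below is purely illustrative; the task, method names, and selection rule are hypothetical examples, not drawn from Liaghati et al. (2020).

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Method:
    name: str
    operators: List[str]  # elementary perceptual/motor/cognitive steps

@dataclass
class Goal:
    name: str
    methods: List[Method]
    # Selection rule: picks the appropriate method given the current context.
    select: Callable[[Dict, List[Method]], Method]

def plan(goal: Goal, context: Dict) -> List[str]:
    """Return the operator sequence prescribed by the selected method."""
    return goal.select(context, goal.methods).operators

# Hypothetical goal: acknowledge an ATC clearance by voice or by datalink.
voice = Method("voice_readback", ["listen", "press_ptt", "speak_readback"])
datalink = Method("cpdlc_accept", ["read_message", "press_accept"])

def select_rule(context, methods):
    # Illustrative selection rule: prefer datalink when the message arrived that way.
    return methods[1] if context.get("via_datalink") else methods[0]

ack = Goal("acknowledge_clearance", [voice, datalink], select_rule)
```

Encoding goals, operators, methods, and selection rules this way lets engineers enumerate and test the operator sequences a design demands of the pilot.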
Human cognition
To understand pilots' interactions with the aircraft, the tower, and the crew, an understanding of human cognition, behavior, limitations, and interpersonal relationships is crucial. Takeoffs and landings account for most aviation accidents (Federal Aviation Administration, n.d.). Because cognition represents how humans understand their surroundings, it also dictates how they interact with the environment and the system. Therefore, understanding how human cognitive states change over time will aid system design.
Wojcik et al. (2021) note that humans' feelings change with their knowledge of themselves and the environment, which in turn dictates behavior. Asiana Airlines Flight 214, for example, crashed during landing because of the pilot's misjudgment of the system state and the environment (Chow et al., 2014). With misaligned situational awareness, his judgments and actions did not fit the actual situation. Emotional factors such as nervousness and stress can also alter human actions, leading to different interactions with the system: the pilot flying was unfamiliar with the airport and nervous about the landing. Stokes and Kite (2016) comment that memory can be affected under stress, impairing knowledge retrieval and, ultimately, judgment and behavior. These internal influences map onto the goals and operators of the GOMS model, tailoring the communication system to the pilots' cognition, needs, and requirements. By understanding these intrinsic cognitive influences, systems engineers can design with them in mind and make adjustments or preventive provisions accordingly.
Complex, nonlinear, and dynamic systems
Humans, the aircraft, the environment (including weather and unexpected malfunctions), and other elements such as air traffic control and the crew form a complex, nonlinear, and dynamic system. All components are uncertain and change over time, so the system can exhibit chaotic behavior (Sharma, 2006). The methods used in system design therefore need to address and account for this nonlinearity and dynamism. Sharma (2006) reports promising results in detecting chaos in such a system by using approximate entropy (ApEn) to quantify its dynamics.
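ApEn quantifies the irregularity of a time series: it compares how often short patterns of length m recur within a tolerance r against patterns of length m + 1, with higher values indicating a less predictable signal. The following is a minimal NumPy sketch of the standard ApEn(m, r) statistic; the parameter defaults (m = 2, r = 0.2 times the standard deviation) are common conventions, not necessarily Sharma's settings.

```python
import numpy as np

def approximate_entropy(series, m=2, r=None):
    """Approximate entropy (ApEn) of a 1-D time series.

    Higher ApEn indicates a more irregular, less predictable signal.
    m is the embedding dimension; r is the tolerance (default 0.2 * std).
    """
    x = np.asarray(series, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)

    def phi(m):
        # Overlapping m-length template vectors embedded from the series.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev (max-norm) distance between every pair of templates.
        dists = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Fraction of templates within tolerance r of each one (self-matches included).
        counts = np.mean(dists <= r, axis=1)
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)
```

A regular signal such as a sampled sine wave yields a much lower ApEn than white noise of the same length, which is what makes the statistic usable as a chaos indicator.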
Sharma (2006) finds that nonlinear dynamic properties of time series data such as heart interbeat interval (IBI) and altitude tracking error can serve as indicators of pilot performance. Wojcik et al. (2021) likewise note variations in cardiovascular measurements under stress. These chaos indicators can therefore be used to judge human limitations and requirements, motivating the introduction of automation aid for the task. If future aircraft incorporate such monitoring, the system can alert the crew when automation aid is needed, significantly reducing the chance of human error due to cognitive overload. The use of ApEn and nonlinear dynamic properties aligns with the last two aspects of the GOMS model that Liaghati et al. (2020) suggest for incorporating human factors into system design. Because most systems, including pilots' interactions with their surroundings, are complex, nonlinear, and dynamic, it is necessary to observe how humans change as a function of time and adjust the system design accordingly. As Liaghati et al. (2020) state, parametric models that address humans allow these requirements to be satisfied as engineers build the system.
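The monitoring idea above can be sketched as a sliding-window check on an IBI stream that flags windows whose irregularity departs from a calibrated baseline. This is a hypothetical illustration, not Sharma's method: the function name, window size, and threshold are assumptions, and a simple coefficient of variation stands in for ApEn to keep the sketch self-contained.

```python
import numpy as np

def automation_alerts(ibi_series, window=60, step=10, threshold=1.5):
    """Flag sliding windows of interbeat-interval (IBI) data whose
    irregularity exceeds a baseline-relative threshold.

    All parameters are illustrative; the irregularity metric is a
    coefficient-of-variation stand-in for ApEn.
    """
    x = np.asarray(ibi_series, dtype=float)

    def irregularity(w):
        # Stand-in metric: coefficient of variation of the window.
        return np.std(w) / np.mean(w)

    baseline = irregularity(x[:window])  # calibrate on the first window
    alerts = []
    for start in range(0, len(x) - window + 1, step):
        if irregularity(x[start:start + window]) > threshold * baseline:
            alerts.append(start)  # candidate point to offer automation aid
    return alerts
```

Running this on a series whose second half is markedly more variable (a crude proxy for a stress response) produces alerts only once the irregular segment enters the window.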
Integration
Humans are already complex and dynamic. When they are joined with other components that add further complexity and uncertainty, the central task of human factors as a systems discipline is to understand how humans will interact with and respond to those changes. Human-system interaction can be modeled by the fuzzy rules that drive the system (Karwowski, 2012). According to Karwowski (2012), for human factors to succeed as a systems discipline, individual human performance, such as sensation and cognition, and technological aspects, such as automation, must be considered as one entity. This underscores the multidisciplinary and integrated nature of human factors in system design.
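A fuzzy rule base maps graded inputs (e.g., how "high" stress is) through IF-THEN rules to a graded output, rather than forcing crisp thresholds. The sketch below is a minimal illustration of that mechanism under assumed inputs and rules; the membership functions, rule set, and the very idea of mapping stress and workload to a degree of automation support are hypothetical, not Karwowski's actual model.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def automation_support(stress, workload):
    """Fuzzy-rule sketch: map pilot stress and task workload (both in [0, 1])
    to a degree of automation support in [0, 1]."""
    # Membership degrees in the fuzzy sets "low" and "high".
    stress_high = tri(stress, 0.3, 1.0, 1.7)
    workload_high = tri(workload, 0.3, 1.0, 1.7)
    stress_low = tri(stress, -0.7, 0.0, 0.7)
    workload_low = tri(workload, -0.7, 0.0, 0.7)

    # Rule 1: IF stress is high AND workload is high THEN support is high (1.0).
    r1 = min(stress_high, workload_high)
    # Rule 2: IF stress is low AND workload is low THEN support is low (0.0).
    r2 = min(stress_low, workload_low)
    # Rule 3: IF exactly one input is high THEN support is moderate (0.5).
    r3 = max(min(stress_high, workload_low), min(stress_low, workload_high))

    total = r1 + r2 + r3
    if total == 0:
        return 0.5  # no rule fires; default to moderate support
    # Weighted-average (Sugeno-style) defuzzification over the rule outputs.
    return (r1 * 1.0 + r2 * 0.0 + r3 * 0.5) / total
```

The graded output is the point of the approach: human state and technology are evaluated jointly in each rule, treating the human-machine pair as one entity rather than thresholding either side alone.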
Conclusion
As in many other models for system design, the emphasis is placed on human requirements, goals, and needs. Whether GOMS, soft and hard systems thinking, or complex adaptive systems engineering, and whether quantitative or qualitative, these models are part of the systems discipline's means of addressing human factors. Human factors work starts with understanding human psychology: how humans behave and make decisions under particular circumstances. It then considers the complexity and uncertainty of the system and integrates the two, predicting and modeling how humans will behave and using those findings to shape the system design.
References
Chow, S., Yortsos, S., & Meshkati, N. (2014). Asiana Airlines Flight 214. Aviation Psychology and Applied Human Factors, 4(2), 113–121. https://doi.org/10.1027/2192-0923/a000066
Federal Aviation Administration. (n.d.). Takeoffs and Departure Climbs. Retrieved May 20, 2021, from https://www.faa.gov/regulations_policies/handbooks_manuals/aviation/airplane_handbook/media/07_afh_ch5.pdf
Karwowski, W. (2012). A review of human factors challenges of complex adaptive systems. Human Factors: The Journal of the Human Factors and Ergonomics Society, 54(6), 983–995. https://doi.org/10.1177/0018720812467459
Liaghati, C., Mazzuchi, T., & Sarkani, S. (2020). A method for the inclusion of human factors in system design via use case definition. Human-Intelligent Systems Integration, 2(1–4), 45–56. https://doi.org/10.1007/s42454-020-00011-1
Sharma, S. (2006). An exploratory study of chaos in human-machine system dynamics. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 36(2), 319–326. https://doi.org/10.1109/tsmca.2005.851262
Stokes, A., & Kite, K. (2016). Flight stress: Stress, fatigue, and performance in aviation. Routledge.
Wojcik, D., Moulin, C., & Fernandez, A. (2021). Assessment of metacognition in aviation pilot students during simulated flight training of a demanding maneuver. Applied Ergonomics, 95, 103427. https://doi.org/10.1016/j.apergo.2021.103427