The Human-Centered Requirements Engineering (HUCRE) Model Applied to the Air Traffic Control (ATC) System


Both excessive and insufficient mental workload (MW) can pose severe consequences for decisions and performance (Gregoriades & Sutcliffe, 2006). Gregoriades and Sutcliffe (2006) propose the Human-Centered Requirements Engineering (HUCRE) model to assess the level of workload a system requires. As the number of aircraft has increased over the past decades, the number of accidents and runway incursions has also increased (Pape et al., 2001), and the ATC system has become a focus of aviation human factors studies. Pape et al. (2001) report that human errors remain the primary cause of ATC-related accidents; among those accidents, skill-based errors constitute the largest percentage (82%), followed by violations (33%) and decision errors (2.2%). Skill-based errors consist mainly of impeded memory retrieval and attention failures (Pape et al., 2001). Therefore, by understanding how much cognitive load the system demands, an organization can better change the system or identify possible solutions to mitigate human errors and accidents.

The HUCRE and human cognition

The HUCRE model organizes the decision process into five task stages. The pathologies involved fall into two general categories: decision pathologies and biases. The decision pathologies map onto what Pape et al. (2001) identified as the leading causes of skill-based errors: impeded memory retrieval and attention failures. For example, “attention lapses,” “recognition errors,” and “poor mental models” stem from memory issues, while “skills,” “situational awareness,” and “poor validations” are attention issues. The remaining pathologies are biases embedded in human cognition, including confirmation bias and representativeness anchoring, which are inevitable and always present (Doyle, 1997).

Pathologies

The pathologies present in the HUCRE mainly involve memory retrieval and attention, both of which relate to situational awareness. Situational awareness is defined by the perception of stimuli in the surroundings, the comprehension of their significance, and the prediction of future states; these three aspects of the definition also correspond to the three levels of situational awareness (Stanton et al., 2001). For this reason, the importance of situational awareness cannot be overstated. One study even identifies poor situational awareness as the primary cause of accidents (Stanton et al., 2001). The HUCRE model fits well with human factors applications because the pathologies suggested in the first three stages of the decision process (recognize the event, interpret the event, and analyze the situation) align with the three levels of situational awareness.

To address insufficient situational awareness among air traffic controllers, early training is important for building correct mental models and pattern-recognition skills. Through experiments on cross-task cue utilization, Falkland and Wiggins (2019) found that cognitive processes including cue recognition, memory retention, and pattern recognition enabled participants to make suitable decisions in ATC simulations; participants with higher cross-task cue utilization achieved higher task performance. Because a significant part of air traffic controllers’ work involves perceiving cues, learning to recognize patterns in these cues early in their careers can result in better performance.

Biases

The other category of pathology identified by the HUCRE is biases, including confirmation bias, representativeness anchoring, and the halo effect. Humans inevitably form biases to reduce unnecessary cognitive load (Doyle, 1997). However, as Murata et al. (2015) observe, these shortcuts introduce risk factors for human errors that lead to adverse and unforeseen events. Confirmation bias occurs when people select data that supports their existing thinking. For example, when controllers expect convective weather (CW), they place extra weight on forecasts indicating a high probability of CW and are likely to ignore data showing that CW will not occur (Gibbons et al., 2014). To address this, Gibbons et al. (2014) recommend training air traffic controllers to consider all possibilities and alternatives during decision making; the system should present all alternative possibilities. Representativeness anchoring occurs when decision makers use existing knowledge as an anchor and fine-tune from it until the answer seems acceptable. When air traffic controllers receive an initial CW forecast indicating a low chance of CW, later reports throughout the day are weighted less because the first report has already become the anchor (Gibbons et al., 2014). To avoid this bias, controllers must first acknowledge it so they can set it aside and make a more inclusive decision (Krockow, 2019).
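The effect of anchoring on forecast interpretation can be sketched numerically. The following toy simulation (illustrative only; the probabilities and the adjustment fraction are hypothetical assumptions, not data from Gibbons et al., 2014) contrasts an anchored updater, which moves its estimate only a fraction of the way toward each new report, with an equal-weight baseline:

```python
# Toy sketch of anchoring-and-adjustment; all numbers are hypothetical.
def anchored_update(anchor, reports, adjustment=0.3):
    """Each new report shifts the estimate only a fraction of the way
    toward the report (insufficient adjustment), so the first report
    continues to dominate the final estimate."""
    estimate = anchor
    for r in reports:
        estimate += adjustment * (r - estimate)
    return estimate

def equal_weight(anchor, reports):
    """Bias-free baseline: weight every report equally."""
    all_reports = [anchor] + list(reports)
    return sum(all_reports) / len(all_reports)

first_report = 0.2                # early forecast: low chance of convective weather
later_reports = [0.6, 0.7, 0.8]   # conditions deteriorate through the day

anchored = anchored_update(first_report, later_reports)
balanced = equal_weight(first_report, later_reports)
print(f"anchored estimate:     {anchored:.2f}")  # dragged down by the first report
print(f"equal-weight estimate: {balanced:.2f}")
```

Even after three reports indicating CW is likely, the anchored estimate remains well below the later reports and below the equal-weight baseline, mirroring how an early low-probability forecast can suppress a controller's response to deteriorating conditions.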

Conclusion

The HUCRE uses the decision process, the pathway from stimulus perception to action execution, as a framework to evaluate the cognitive workload a system places on operators, and it identifies cognitive pathologies along that pathway. Demand on ATC has been increasing, and controllers’ MW is gaining attention because excessive load leads to accidents. To address the pathologies, controllers should use system simulations to become familiar with the workload and cues and to build complete mental models for their tasks. To counteract biases, the system needs to provide all available information so the operator can make a complete and inclusive decision. Training is also necessary to make controllers aware that these biases exist so they can consciously avoid them when making decisions.

References

Doyle, J. K. (1997). The cognitive psychology of systems thinking. System Dynamics Review, 13(3), 253–265. https://doi.org/10.1002/(SICI)1099-1727(199723)13:3<253::AID-SDR129>3.0.CO;2-H

Falkland, E. C., & Wiggins, M. W. (2019). Cross-task cue utilisation and situational awareness in simulated air traffic control. Applied Ergonomics, 74, 24–30. https://doi.org/10.1016/j.apergo.2018.07.015

Gibbons, B., Jonsson, J., Abelman, S., & Bass, R. (2014). Observed heuristics and biases in air traffic management decision making using convective weather uncertainty. 6th AIAA Atmospheric and Space Environments Conference, 1–7. https://doi.org/10.2514/6.2014-2901

Gregoriades, A., & Sutcliffe, A. G. (2006). Automated assistance for human factors analysis in complex systems. Ergonomics, 49(12–13), 1265–1287. https://doi.org/10.1080/00140130600612721

Krockow, E. M. (2019, February 11). Outsmart the Anchoring Bias in Three Simple Steps. Psychology Today. https://www.psychologytoday.com/us/blog/stretching-theory/201902/outsmart-the-anchoring-bias-in-three-simple-steps

Murata, A., Nakamura, T., & Karwowski, W. (2015). Influence of cognitive biases in distorting decision making and leading to critical unfavorable incidents. Safety, 1(1), 44–58. https://doi.org/10.3390/safety1010044

Pape, A. M., Wiegmann, D. A., & Shappell, S. (2001). Air traffic control (ATC) related accidents and incidents: A human factors analysis. Federal Aviation Administration, 1–4. https://www.faa.gov/about/initiatives/maintenance_hf/library/documents/media/human_factors_maintenance/air_traffic_control_(atc)_related_accidents_and_incidents.a_human_factors_analysis.pdf

Stanton, N., Chambers, P., & Piggott, J. (2001). Situational awareness and safety. Safety Science, 39(3), 189–204. https://doi.org/10.1016/s0925-7535(01)00010-8