Assessing how visual search entropy and engagement predict performance in a multiple-objects tracking air traffic control task
Elsevier
https://doi.org/
Sara Lanini-Maggi, Ian T. Ruginski, Thomas F. Shipley, Christophe Hurter, Andrew T. Duchowski, Benny B. Briesemeister, Jihyun Lee, Sara I. Fabrikant
Abstract
Behavioral performance metrics used to assess the usability of visual displays are increasingly coupled with eye-tracking measures to provide additional insight into the decision-making processes those displays support. Eye-tracking metrics can in turn be coupled with users' neural data to investigate how human cognition interplays with emotions during visuo-spatial tasks. To contribute to these efforts, we present the results of a study in a realistic air traffic control (ATC) setting with animated ATC displays, in which ATC experts and novices performed an aircraft movement detection task. We find that higher stationary gaze entropy – which indicates a wider spatial distribution of gaze on the display – and expertise both lead to better response accuracy, and that stationary entropy positively predicts response time even after controlling for animation type and expertise. As a secondary contribution, we found that a single component comprising engagement (measured by EEG and self-reported judgments), spatial abilities, and gaze entropy predicts task accuracy, but not completion time. We also provide open-source MATLAB code for computing the EEG measures used in the study. Our findings suggest designing spatial information displays that adapt their content to users' affective and cognitive states, especially in emotionally laden usage contexts.
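Stationary gaze entropy, as commonly defined in the eye-tracking literature, is the Shannon entropy of the spatial distribution of gaze points over the display: gaze concentrated in one region yields low entropy, while gaze spread widely yields high entropy. The study's released code is in MATLAB; the following is only a minimal Python sketch of this general idea, with an illustrative 8×8 grid and normalized display coordinates that are assumptions, not details taken from the paper.

```python
import numpy as np

def stationary_gaze_entropy(x, y, bins=8):
    """Shannon entropy (bits) of the spatial distribution of gaze points.

    x, y : arrays of gaze coordinates, assumed normalized to [0, 1].
    bins : number of grid cells per axis (8x8 here is an illustrative
           choice, not the paper's binning).
    Higher values indicate gaze spread over more of the display.
    """
    counts, _, _ = np.histogram2d(x, y, bins=bins, range=[[0, 1], [0, 1]])
    p = counts.ravel() / counts.sum()   # empirical cell probabilities
    p = p[p > 0]                        # drop empty cells to avoid log(0)
    return float(-np.sum(p * np.log2(p)))
```

As a sanity check, gaze points drawn uniformly over the display should score near the maximum of log2(bins²) bits, while points clustered tightly around one location should score near zero.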