Fig 3 - uploaded by Mickael Causse
Overview of the ten different AOIs: (1) Attitude indicator, (2) Speed tape, (3) Vertical speed tape, (4) Flight mode annunciator, (5) Heading tape, (6) Navigation display, (7) ND zone (displays the distance to recall during the two landing scenarios with the supplementary task), (8) Flight control unit, (9) Electronic centralized aircraft monitoring, (10) Out of the window. https://doi.org/10.1371/journal.pone.0247061.g003


Source publication
Article
Full-text available
During a flight, pilots must rigorously monitor their flight instruments, since this is one of the critical activities that contributes to updating their situation awareness. Monitoring is cognitively demanding, but it is necessary for timely intervention in the event of a parameter deviation. Many studies have shown that a large part of commercial avia...

Contexts in source publication

Context 1
... purpose of this supplementary task was to increase the level of visuo-attentional effort: participants had to regularly check the ND Zone on the ND screen so as to say the value aloud at the right time. In the "easy dual task scenario", participants were asked to say aloud the distance between the aircraft and the airfield threshold every 0.5 Nm (information provided by a radio beacon located near the airfield and displayed in the ND Zone, see Fig 3). In the "difficult dual task scenario", they were asked to say this distance aloud every 0.2 Nm. ...
Context 2
... image edge information with 3-D models of the eye and eyelids. As presented in Fig 3, the system uses five cameras integrated into the cockpit. A major advantage of using several cameras is that eye and head tracking can be maintained despite significant head motions (translation and rotation) or occlusion of one of the cameras by the participant (e.g., by their hand). ...
Context 3
... model and area of interest. The cockpit was split into 10 AOIs, corresponding to the different flight instruments and displays that pilots can examine during a flight (see Fig 3). We chose to restrict our analysis to instruments that display information directly related to the flight parameters (altitude, speed, etc.) and to the external view (i.e., Out of the Window). ...
Context 4
... 4 shows the entire eye-tracking analysis pipeline. Each AOI was coded using a number from 1 to 10 corresponding to the flight instruments (see Fig 3). Only AOI-based data were extracted in this experiment and concatenated to obtain two chronological vectors containing the indices of the visited AOIs (from 1 to 10) and the time spent on them. ...
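The concatenation step described in this snippet can be sketched in a few lines. Assuming fixation data arrives as chronological (AOI index, dwell duration) pairs (a hypothetical input format, not taken from the paper), consecutive fixations on the same AOI merge into single dwells:

```python
def aoi_vectors(fixations):
    """Collapse consecutive fixations on the same AOI into dwells.

    `fixations`: chronological list of (aoi_index, duration_ms) pairs,
    with aoi_index in 1..10. This input format is an illustrative
    assumption, not the paper's actual data structure.
    Returns two parallel vectors: visited AOI indices and time spent on each.
    """
    aois, times = [], []
    for aoi, dur in fixations:
        if aois and aois[-1] == aoi:
            times[-1] += dur  # same AOI as previous fixation: extend the dwell
        else:
            aois.append(aoi)  # new AOI visit: start a new dwell
            times.append(dur)
    return aois, times
```

Downstream metrics (dwell statistics, transition entropy, and so on) can then operate on these two vectors directly.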

Similar publications

Article
Full-text available
Most accidents in the aviation, maritime, and construction industries are caused by human error, which can be traced back to impaired mental performance and attention failure. In 1596, Du Laurens, a French anatomist and medical scientist, said that the eyes are the windows of the mind. Eye tracking research dates back almost 150 years and it has be...

Citations

... Gaze transition entropy quantifies the complexity of the gaze trajectory in 'bits' (Lounis et al., 2021). GTE is defined as: ...
Preprint
Full-text available
We study an individual's propensity for rational thinking, i.e., the avoidance of cognitive biases (unconscious errors generated by our mental simplification methods), using a novel augmented reality (AR) platform. Specifically, we developed a game-like odd-one-out task in AR designed to induce and assess confirmatory biases. Forty students completed the AR task in the laboratory and the short form of the comprehensive assessment of rational thinking (CART) online via the Qualtrics platform. We used distance correlation approaches and stepwise regression to identify objective markers (based on eye, hand, and head movements) associated with the psychometric measures of propensity for rational thinking. We show that the proposed markers are associated with the short CART score: more rational thinkers have slower head and hand movements, faster gaze movements, and more consistent hand-eye-head coordination patterns across conditions.
... In the area of visual attention in flight, research results in 2020 revealed that the pilot's experience level had a significant impact on the fixation time ratio and dwell time [16]. Compared to novices, professional pilots had a higher perceptual efficiency (more numerous and shorter dwells), a better distribution of attention, an ambient mode of visual attention, and more complex and elaborate visual scanning patterns [17]. When the autopilot is off, pilots should allocate visual resources accordingly (e.g., to the out-of-the-window view) [16]. ...
Article
Full-text available
This paper introduces a novel visual landing system applicable to the accurate landing of commercial aircraft utilizing human depth perception algorithms, named a 3D Model Landing System (3DMLS). The 3DMLS uses a simulation environment for visual landing in the event of a failure of navigation aids/avionics, adverse weather conditions, and limited visibility. To simulate the approach path and surrounding area, the 3DMLS implements both the inertial measurement unit (IMU) and the digital elevation model (DEM). While the aircraft is in the instrument landing system (ILS) range, the 3DMLS simulates more details of the environment in addition to implementing the DOF depth perception algorithm to provide a clear visual landing path. This path is displayed on a multifunction display in the cockpit for pilots. As the pilot's eye concentrates mostly on the runway location and touchdown point, "the runway" becomes the center of focus in the environment simulation. To display and evaluate the performance of the 3DMLS and depth perception, a landing auto test is also designed and implemented to guide the aircraft along the runway. The flight path is derived simultaneously by comparison of the current aircraft and runway positions. The Unity and MATLAB software are adopted to model the 3DMLS. The accuracy and the quality of the simulated environment in terms of resolution, field of view, frames per second, and latency are confirmed based on the FSTD's visual requirements. Finally, the saliency map toolbox shows that the depth of field (DOF) implementation increases the pilot's concentration, resulting in safe landing guidance.
... According to specialists in the aviation field, the effective distribution of visual attention by scanning the flight instruments in a defined sequence and with optimal dwell durations is a specific technique that varies across different tasks. It is an important skill that ensures precise control of the aircraft, and it depends on professional training and on the acquired piloting expertise [5,8]. ...
Article
The contemporary eye-tracking technology is an innovative oculographic method successfully used in aviation to record and measure human eye movements. It allows non-invasive analysis of pilots' visual behaviour in a real or simulated flight environment without interrupting the flight process. This method is used for the objective measurement of parameters such as saccades, fixations, blink frequency, and pupil diameter, which provide valuable data regarding the operator's attention distribution across the different flight instruments and how flight information is processed under different flight conditions. In the current study, operators of an unmanned aerial vehicle (UAV) performed simulated flights. Eye-tracking data were collected, and graphical representations of the measured fixations were used to demonstrate the effective and ineffective visual attention distribution of operators with different levels of expertise during the training process.
... Finally, few advanced gaze metrics and analysis techniques such as K coefficient, Lempel-Ziv complexity or Gaze entropy (Lounis, Peysakhovich, and Causse 2021) have been found to be used in the selected studies. Such methods could likely help to provide a deeper understanding about the visual exploration behavioural trends emerging from the current literature. ...
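As an illustration of one of the metrics mentioned in this snippet, Lempel-Ziv complexity can be computed on an AOI sequence by counting the distinct phrases produced by an LZ78-style parse; more regular, repetitive scanning yields fewer phrases. This is a minimal sketch of the standard parsing scheme, not the exact implementation used in the cited studies:

```python
def lz_complexity(seq):
    """Count phrases in an LZ78-style parse of a symbol sequence.

    `seq` is any iterable of hashable symbols (e.g., AOI indices).
    Each phrase is the shortest prefix extension not yet in the dictionary,
    so highly repetitive sequences produce fewer phrases (lower complexity).
    """
    phrases = set()
    phrase = ()
    count = 0
    for symbol in seq:
        phrase += (symbol,)
        if phrase not in phrases:  # new phrase: record it and restart
            phrases.add(phrase)
            count += 1
            phrase = ()
    if phrase:  # unfinished trailing phrase still counts
        count += 1
    return count
```

For example, a strictly alternating scan such as `[1, 2, 1, 2, 1, 2, 1, 2]` parses into far fewer phrases than a random sequence of the same length, which is what makes the count usable as a regularity index.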
Article
Among highly automated driving research topics, transition of control, especially from automated to manual driving (i.e., takeover) have been the object of numerous studies. Drivers’ gaze behaviours are of particular interest as proper information processing by the driver is a key element to safe manual control resumption. In this study, 22 research articles were selected and reviewed to extract main gaze behaviours tendencies during takeover transitions. Despite a great variety in protocols and gaze metrics, categories of factors and gaze parameters have been established. The resulting data shows that at the onset of the takeover request gaze tend to be quickly reoriented towards the road ahead. Despite this, an increase in gaze dispersion on the road environment was found in automated driving. In addition, clear evidence for detrimental effects of visually engaging non-driving related activities on visual exploration strategies emerge (e.g., delay in reorienting the gaze towards the road ahead). Altogether, gaze data reveal the complexity of information processing during takeovers. A synthetical view of gaze parameters evolution during the takeover process has been established and associated with visual information gathering and processing. Finally, a broader view underlying the importance of combining technological approaches with human centred ones is proposed.
... For instance, aeronautical studies have shown that pilots' visual scanning strategies develop with level of expertise (Haslbeck et al., 2012; Haslbeck & Zhang, 2017; Yang et al., 2013; Ziv, 2016), and generally reflect a move towards shorter and more frequent fixations, indicative of more rapid scanning of the instruments. Lounis et al. (2021) found shorter dwell times and more complex transition sequences, as indexed by entropy and Lempel-Ziv complexity. Consequently, as well as looking at the relationship between real-world expertise and simulator performance (i.e., construct validity), we aimed to assess whether real-world experts showed different gaze behaviours in the simulation. ...
... Entropy, as a concept from information theory (Shannon, 1948), describes uncertainty about outcomes. When applied to eye tracking, it indexes the level of randomness or variability in eye movements; it has previously been used in assessing aviation performance (Allsop & Gray, 2014; Lounis et al., 2021; Vine et al., 2015) and can index whether a pilot is performing a structured and systematic scanning pattern or a highly variable and inefficient one (Lounis et al., 2021). We adopted a simple measure of entropy described by Shannon and Weaver (1949), sometimes referred to as Gaze Transition Entropy (Lounis et al., 2021), which quantifies the complexity of the scan in 'bits', according to the formula: ...
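The formula itself is elided in the snippet above, but the commonly used gaze transition entropy weights each AOI's first-order transition entropy by that AOI's stationary (visit) probability: H = -sum_i pi_i * sum_j p_ij * log2(p_ij). The sketch below assumes 0-based AOI indices and uses the empirical visit frequencies as pi; it illustrates the standard definition, not necessarily the exact variant each cited study computed:

```python
import numpy as np

def gaze_transition_entropy(aoi_seq, n_aoi):
    """Gaze transition entropy in bits for a sequence of 0-based AOI indices.

    Builds the first-order transition count matrix, normalizes each row
    into transition probabilities p_ij, and weights each row's Shannon
    entropy by the empirical stationary probability pi_i of leaving AOI i.
    """
    counts = np.zeros((n_aoi, n_aoi))
    for a, b in zip(aoi_seq[:-1], aoi_seq[1:]):
        counts[a, b] += 1                      # tally each observed transition
    row_sums = counts.sum(axis=1)
    pi = row_sums / row_sums.sum()             # empirical visit probabilities
    entropy = 0.0
    for i in range(n_aoi):
        if row_sums[i] == 0:                   # AOI never left: no contribution
            continue
        p = counts[i] / row_sums[i]            # transition probabilities from i
        p = p[p > 0]                           # 0 * log2(0) treated as 0
        entropy -= pi[i] * np.sum(p * np.log2(p))
    return entropy
```

A perfectly deterministic scan (e.g., strict alternation between two AOIs) yields 0 bits, while more unpredictable transition patterns push the value towards log2 of the number of AOIs.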
Preprint
Full-text available
Objective: The aim of this work was to examine the fidelity and validity of an aviation simulation using eye tracking. Background: Commercial head-mounted virtual reality (VR) systems offer a convenient and cost-effective alternative to existing aviation simulation (e.g., for refresher exercises). We performed pre-implementation testing of a novel aviation simulation, designed for head-mounted VR, to determine its fidelity and validity as a training device. Method: Eighteen airline pilots, with varying levels of flight experience, completed a sequence of training ‘flows’. Self-reported measures of presence and workload and users’ perceptions of fidelity were taken. Pilots’ eye movements and performance were recorded to determine whether more experienced pilots showed distinct performance and eye gaze profiles in the simulation, as they would in the real-world. Results: Real-world expertise correlated with eye gaze patterns characterised by fewer, but longer, fixations and a scan path that was more structured and less random. Multidimensional scaling analyses also indicated differential clustering of strategies in more versus less experienced pilots. Subjective ratings of performance, however, showed little relationship with real-world expertise or eye movements. Conclusion: We adopted an evidence-based approach to assessing the fidelity and validity of a VR flight training tool. Pilot reports indicated the simulation was realistic and potentially useful for training, while direct measurement of eye movements was useful for establishing construct validity and psychological fidelity of the simulation.
... Effective cockpit monitoring is a key element of manual flight control. It helps to keep pilots in the loop by continuously updating their situational awareness about the current state of the aircraft [17]. Cockpit monitoring can be described as the methodical and meaningful visual scanning of instruments, necessary to control the aircraft trajectory or to make appropriate changes in aircraft attitude. ...
... Visual scanning strategies (frequency and duration of fixations, visual scanning patterns, etc.) are greatly influenced by pilots' expertise, e.g., [17][18][19][20]. Gegenfurtner, Lehtinen, and Säljö [21] conducted a review of eye-tracking research in various fields such as radiology, driving, sport, and chess. ...
Article
Full-text available
Poor cockpit monitoring has been identified as an important contributor to aviation accidents. Improving pilots’ monitoring strategies could therefore help to enhance flight safety. During two different sessions, we analyzed the flight performance and eye movements of professional airline pilots in a full-flight simulator. In a pre-training session, 20 pilots performed a manual approach scenario as pilot flying (PFs) and were classified into three groups according to their flight performance: unstabilized, standard, and most accurate. The unstabilized pilots either under- or over-focused various instruments. Their number of visual scanning patterns was lower than those of pilots who managed to stabilize their approach. The most accurate pilots showed a higher perceptual efficiency with shorter fixation times and more fixations on important primary flight instruments. Approximately 10 months later, fourteen pilots returned for a post-training session. They received a short training program and performed a similar manual approach as during the pre-training session. Seven of them, the experimental group, received individual feedback on their own performance and visual behavior (i.e., during the pre-training session) and a variety of data obtained from the most accurate pilots, including an eye-tracking video showing efficient visual scanning strategies from one of the most accurate pilots. The other seven, the control group, received general guidelines on cockpit monitoring. During the post-training session, the experimental group had better flight performance (compared to the control group), and its visual scanning strategies became more similar to those of the most accurate pilots. In summary, our results suggest that cockpit monitoring underlies manual flight performance and that it can be improved using a training program based mainly on exposure to eye movement examples from highly accurate pilots.
Article
Situational awareness is the ability of pilots to master flight status, which is of great significance to aviation flight safety and flight effect. According to the information processing model, the pilot’s main steps of processing information are feeling, perception and execution. There are many problems in situation awareness analysis guided by visual gaze, such as large analysis deviation and high delay due to various influencing factors and complex characteristics. In order to solve this problem, this paper proposes a situation awareness assessment method based on artificial intelligence neural network and integrating visual gaze and flight control. First, this paper carries out simulated flight training experiments for flight cadets, and collects the data of eye movement, line of sight tracking, flight control and flight parameters of pilot cadets. Then, aiming at the flight subjects, a situation awareness analysis method based on events is established, and the situation awareness state in the experiment is evaluated and analyzed through the flight parameter data. Then, the visual gaze and flight control data are sliced in the unit of situational awareness events, and the data set is constructed. Finally, this paper designs a multi-channel sequence data classification and analysis model based on transformer, in which the situation awareness characteristics of visual gaze and operation behavior are analyzed through the attention mechanism. The experimental results show that the accuracy of situation awareness classification of the designed neural network model to the experimental data set is 96%, and can classify and evaluate the pilot’s situation awareness state in 5[Formula: see text]s.