
Gloria Calhoun - United States Air Force Research Laboratory
92 Publications · 10,936 Reads · 1,904 Citations
Publications (92)
A key challenge facing automation designers is how to achieve an ideal balance of system automation with human interaction for optimal operator decision making and system performance. A performance-based adaptive automation algorithm was evaluated with two versus six monitored task types. Results illustrate the importance of level of automation cho...
Human-autonomy teaming is essential for future military operations, but guidance on how to design controls and displays to support joint decision-making and task completion is scarce. Transportation planning experts, drawing from their design experience, recently compiled questions that they postulated would be useful to guide human-autonomy intera...
Autonomous capabilities are reaching a point where they can fulfill the role of a teammate for the command and control of unmanned vehicles. Individual characteristics of a human operator may influence how an autonomous teammate is utilized and the team’s performance. Twenty-four participants completed a questionnaire that included the Ten-Item Per...
Four color-coded overlays that augment a tactical map were evaluated as candidates for multi-unmanned vehicle (UV) sensor management by a single operator. Each overlay provided six levels of temporal information to indicate how long ago each map location had been viewed by a UV. Twelve participants completed 140 trials, each trial posing a question...
Future unmanned aerial systems (UAS) operations will require control of multiple vehicles. Operators are vulnerable to cognitive overload, despite support from system automation. This study tested whether attentional resource theory predicts impacts of cognitive demands on performance measures, including automation-dependence and stress. It also in...
Future human-autonomy teams will benefit from intelligent agents that can quickly deliberate across multiple parameters to generate candidate courses of action (COAs). This experiment evaluated the design of an interface to communicate agent-generated COAs to a human operator. Twelve participants completed 14 trials, each consisting of a series of...
Autonomous tools that can evaluate a course of action (COA) are being developed to assist military leaders. System designers must determine the most effective method of presenting these COAs to operators. To address this challenge, an experimental testbed was developed in which participants were required to achieve the highest score possible in a s...
Objective:
This simulation study investigated factors influencing sustained performance and fatigue during operation of multiple Unmanned Aerial Systems (UAS). The study tested effects of time-on-task and automation reliability on accuracy in surveillance tasks and dependence on automation. It also investigated the role of trait and state individu...
Increasing applications of automation in system designs raise issues on how to achieve an ideal balance of automation with human interaction for optimal operator situation awareness and performance. This experiment examined four automation control schemes applied to surveillance tasks performed in a multi-task simulation environment. Participants c...
Future applications are envisioned in which a single human operator manages multiple heterogeneous unmanned vehicles (UVs) by working together with an autonomy teammate that consists of several intelligent decision-aiding agents/services. This article describes recent advancements in developing a new interface paradigm that will support human-auton...
An extended play-based human-autonomy interface paradigm was designed to support rule-based and knowledge-based behaviors for human operators tasked with managing twelve unmanned vehicles (UVs). The present research evaluates how autonomy might provide additional decision support that helps the operator correlate new mission events with the require...
Advances in automation technology are leading to the development of operational concepts in which a single operator teams with multiple autonomous vehicles. This requires the design and evaluation of interfaces that support operator-autonomy collaborations. This paper describes interfaces designed to support a base defense mission performed by a hu...
The U.S. Air Force envisions future applications in which a single human operator manages multiple heterogeneous unmanned vehicles (UVs). To support this vision, a range of play-based interfaces were designed by which an operator can team with autonomy (consisting of several intelligent agents/services) to manage twelve air, ground, and sea surface...
Multi-unmanned air vehicle (UAV) operation requires a unique set of skills, and high demand for new operators requires selection from populations without previous flight training. To support developing criteria for multi-UAV operator selection, the present study investigated the role of multiple individual difference factors in performance under dif...
Reliability of automation is known to influence operator reliance on automation. What is less understood is how the influence of reliability and the effects of operator fatigue might interact. The present study investigated the impact of automation reliability on accuracy and reliance and how this impact changes with level of fatigue during simulat...
Video game interfaces featuring multiple distinct icons that enable a player to quickly select specific actions from a larger set of possible actions have the potential to inform the development of interfaces that enable a single operator to control multiple unmanned vehicles (UVs). The goal of this research was to examine the design of a video gam...
This chapter addresses human-autonomy collaboration and coordination in the RPAS domain. Specifically, this chapter discusses the human system integration needs that arise when multiple RPAs are required for routine on-mission interactions. Design challenges for enabling a collaborative mixed-initiative approach to coordinate the different human an...
In order to assist a single operator tasked with controlling multiple unmanned vehicles, intelligent autonomous capabilities are being developed that recommend plans to meet operator goals. The goal of this research was to develop an effective visualization for allowing an operator to compare across autonomy recommended plans. Three plan comparison...
To keep pace with increasing applications of Unmanned Aerial Vehicles (UAVs), recruitment of operators will need to be expanded to include groups not traditionally engaged in UAV pilot training. The present study may inform this process as it investigated the relationship between video game experience and gender on performance of imaging and weapon...
Advances in automation technology are leading to development of unmanned aerial vehicle (UAV) concepts in which a single pilot is responsible for multiple UAVs. With these highly automated UAVs, the operator’s role will become more supervisory in nature. In response, a new interface paradigm is required for effective supervisory control of multiple...
Unmanned aerial systems (UAS) are starting to access manned airspace today, and this trend will grow substantially as the number of UAS and their associated missions expand. A key challenge to safely integrating UAS into the National Airspace System (NAS) is providing a reliable means for UAS to sense and avoid (SAA) other aircraft. The US Air Force...
A key challenge to integrating unmanned aerial systems (UASs) into the National Airspace is providing a means for UASs to sense and avoid (SAA) other aircraft. Additionally, successful applications of a SAA system will depend on the degree to which the operator understands the rationale for its maneuvers/decision aids and can interact with the syst...
Adaptive automation, in which changes in automation are triggered by operator state, has been proposed to mitigate automation-induced problems such as complacency. This research sought to understand the effect of a weighted method, as compared to a method in which all tasks are weighted equally, when triggering changes in automation within a mul...
With advances in automation technologies, systems are now being considered wherein a single operator supervises multiple unmanned aerial vehicles. Supervisory control of highly autonomous systems will require a new interface design. The present effort extends a delegation control concept to enable a pilot to flexibly change the role of automation d...
This paper provides a detailed description of a concept demonstration that illustrates an adaptable operator-automation interface concept for applications involving multiple Unmanned Aerial Systems (UAS). Key hardware and software components are described, as well as the changes made on a previous prototype that focused on single UAS applications....
Operators of unmanned aerial vehicles (UAVs) may soon be controlling multiple sophisticated autonomous systems in complex, dynamic mission contexts. A vital element for successful collaboration between human and autonomous agents is communication, which could be improved through the use of formal methods such as model checking. In model checking, t...
A "delegation approach" to human interaction with automation should strive to achieve all of the flexibility that a human supervisor has in instructing, managing, redirecting and overriding well-trained human subordinates. But in the absence of human-like, natural-language understanding "androids", what would such interaction look like? This multi-...
Advances in automation technology are leading to development of operational concepts in which a single pilot is responsible for multiple remotely piloted aircraft (RPAs). This requires design and evaluation of pilot-RPA interfaces that support these new supervisory control requirements. This paper focuses on a method by which an RPA’s near-term fut...
Supervisory control of multiple autonomous vehicles raises many issues concerning the balance of system autonomy with human interaction for optimal operator situation awareness and system performance. An unmanned vehicle simulation designed to manipulate the application of automation was used to evaluate participants’ performance on image analysis...
Adaptive automation may help balance system autonomy with human interaction in supervisory control environments. The present experiment evaluated application of performance-based adaptive automation for an image analysis/decision task in a multiple autonomous vehicle simulation. An asymmetrical algorithm was employed in which the performance thresh...
Advances in automation technology are leading to development of unmanned aerial system (UAS) concepts in which a single pilot supervises multiple UASs. The present effort explores a delegation control concept whereby the pilot can flexibly change the automation's tasking and role during the course of a mission by specifying various constraints, sti...
Achieving an optimal balance of system autonomy with human interaction is critical for envisioned applications of multi-unmanned vehicle supervisory control. While research to date has helped inform control station interface design in terms of what to automate, when to automate, and how much to automate, more research is needed examining automation...
A prototype demonstration was created to illustrate a delegation control approach for future unmanned aerial systems (UAS) applications. With the goal of being able to demonstrate a flexible, natural, multi-level architecture for UAS control, the simulation illustrates four different control modes, ranging from manual (pilot controls the vehicle's...
Supervisory control of multiple unmanned vehicles raises many questions concerning the balance of system autonomy with human interaction for optimal operator situation awareness and system performance. In the research reported here, a previously used experimental protocol was extended such that both automation level and reliability of two tasks wer...
Supervisory control of multiple unmanned aerial vehicles (UAVs) raises many questions concerning the balance of system autonomy with human interaction for effective operator situation awareness and system performance. The reported experiment used a UAV simulation environment to evaluate two applications of autonomy levels across two primary control...
Control applications involving multiple Unmanned Aerial Systems (UASs) will often require the operator to switch attention between UASs and their respective camera views. An automated aid that transitions between camera views is under evaluation. Instead of discretely switching from the camera view for one UAS to the camera view for another, a tran...
Supervisory control of multiple unmanned aerial vehicles (UAVs) raises many issues concerning the balance of system autonomy with human interaction for optimal operator situation awareness and system performance. A UAV simulation environment designed to manipulate the application of automation was used to evaluate participants' performance on routi...
This chapter will trace how controls and displays used in fast jet research simulators changed from 1970 through the present to effectively evaluate new crew station technology for Air Force fighter aircraft. The early 1970s marked the dawn of the electro-optical (E-O) era in aviation simulators. Actually, there were investigations utilizing E-O in...
Control applications involving multiple Unmanned Aerial Vehicles (UAVs) will require the operator to switch attention between UAVs, each potentially involving very different scenario environments and task requirements. A transition display format that employs synthetic vision technology designed to enhance an operator's situation awareness when swi...
Single operator control of multiple unmanned aerial vehicles (UAVs) will often require attention switching between UAVs, each potentially involving distinct context and task requirements. This switching of attention between UAVs can disrupt operator performance and cause negative transfer of context. The present experiment evaluated a transition...
Research at the Air Force Research Laboratory has focused on determining the value of combining computer-generated information with live camera video presented on an unmanned air vehicle (UAV) station display. In a previous study, a picture-in-picture (PIP) concept (where video imagery is surrounded by synthetic-generated terrain imagery, increasin...
UAV video imagery quality can be compromised by narrow field-of-view, environmental conditions, bandwidth limitations, or a highly cluttered scene. Advanced display concepts (e.g., synthetic vision) can potentially ameliorate video characteristics and enhance UAV operations. This study evaluated three display concepts for improving UAV sensor opera...
As unmanned aerial vehicles (UAVs) increase in autonomy, operators will be increasing their span of control. Most UAV systems require two or more operators to fly and operate payloads, but systems are being developed with the concept of a single operator monitoring multiple UAVs. This supervisory control of multiple UAVs raises many issues concerni...
The Air Force Research Laboratory's Human Effectiveness Directorate supports research addressing human factors associated with Unmanned Aerial Vehicle (UAV) operator control stations. One research thrust focuses on determining the value of combining synthetic vision data with live camera video presented on a UAV control station display. Information...
Of all the information displays in an Unmanned Aerial Vehicle (UAV) control station, video imagery from UAV-mounted cameras is particularly valuable. Pilots use imagery from UAV cameras to verify a clear path for ground operations, scan for air traffic, and identify navigational landmarks and potential obstructions. Sensor operators use video ima...
Table 1 shows a summary of the multi-sensory interfaces evaluated by AFRL in a ROV task environment. For each interface, the candidate application is listed as well as the key results. An assessment of the maturity of the technology for application to ROV applications is also provided. This assessment is solely the viewpoint of the authors and base...
Tactile displays may alleviate visual workload in complex UAV control stations, cueing operators to high priority events via the haptic channel. Previous results suggest that tactile alerts (vibration on wrists) can substitute for aural alerts, as a redundant cue to visual alerts in relatively short test sessions. The present experiment investigate...
The Air Force Research Laboratory's Human Effectiveness Directorate (AFRL/HE) supports research addressing human factors associated with Unmanned Aerial Vehicle (UAV) operator control stations. Recent research, in collaboration with Rapid Imaging Software, Inc., has focused on determining the value of combining synthetic vision data with live camer...
While speech recognition technology has long held the potential for improving the effectiveness of military operations, it has only been within the last several years that speech systems have enabled the realization of that potential. Commercial speech recognition technology developments aimed at improving robustness for automotive and cellular pho...
In complex UAV control stations, it is important to alert operators to actionable information in a timely manner. Tactile displays may alleviate visual workload by transmitting information through the skin, cueing operators to high priority, unexpected events. The utility of tactile alerts (vibration on wrists) in substitution for aural alerts, as...
An evaluation was conducted on a generic UAV operator interface simulation testbed to explore the effects of levels-of-automation (LOAs) and automation reliability on the number of simulated UAVs that could be supervised by a single operator. LOAs included Management-by-Consent (operator consent required) and Management-by-Exception (action automat...
This paper presents a testbed built upon the Multi-Modal Immersive Intelligent Interface for Remote Operation (MIIIRO) to support UAV control. The testbed implements a client/server architecture in which UAV operations are simulated on a server that maintains the states of the UAVs. The testbed supports both the route planning an...
Unmanned aerial vehicle (UAV) control stations feature multiple menu pages with systems accessed by keyboard presses. Use of speech-based input may enable operators to navigate through menus and select options more quickly. This experiment examined the utility of conventional manual input versus speech input for tasks performed by operators of a UA...
In complex systems, it is difficult to efficiently provide operators with timely and meaningful information. Tactile displays may alleviate visual workload by transmitting information through the skin. This study examined the utility of active tactile alerts versus salient visual and/or auditory alerts in an unmanned aerial vehicle ground control s...
Future computers will be more mobile, which will require new interaction methods. Accordingly, one might harness electroencephalographic (EEG) activity for computer control. Such devices exist, but all have limitations. Therefore, a novel EEG-based control was tested, which monitors the Steady-State Visual Evoked Response (SSVER). Selections are at...
Tactile displays have been proposed as a multisensory interface technology that can relieve the typically overburdened visual channel of operators. This study compared the ability of operators, while simultaneously completing a tracking task, to detect and identify system faults in a monitoring task with three types of alert cues: tactile, visual,...
This demonstration features a novel control suite that enables hands-free command, click, text and spatial entries for computer control. The Voice/Head Input Controller is particularly suited for wearable computer operation while the operator's hands are simultaneously performing field operations.
The Air Force Research Laboratory has implemented and evaluated two brain-computer interfaces (BCIs) that translate the steady-state visual evoked response into a control signal for operating a physical device or computer program. In one approach, operators self-regulate the brain response; the other approach uses multiple evoked responses
Agile aircraft introduce new requirements and performance standards for the pilot-vehicle interface. This lecture will address these ergonomic issues as they pertain to agile aircraft. Specifically, controls and displays will be discussed, followed by design issues relevant to intelligent interfaces. The concepts and technologies proposed as candid...
Past research on wearable computers for maintenance applications has focused on developing displays and presentation formats. This study emphasized wearable computer control technologies. Alternative control technologies were compared with standard and voice controls. Twelve subjects performed a synthetic maintenance task using three control device...
Electroencephalographic (EEG)-based control devices are one of several emerging technologies that will provide operators with a variety of new hands-free control options. In general, EEG-based control translates brain electrical activity into a control signal. The system evaluated in this study uses the steady-state visual evoked response for syste...
A multimodal interface that combines voice recognition and eye-tracking to allow users to interact with visual displays is developed. The overall goal of the eye/voice aware cockpit interface (EVACI) program is to demonstrate how a pilot can use an eye/voice aware interface to control aviation displays, without the use of hands. This technology of...
The advent of wearable computers marks a potential revolution in human-machine interaction and necessitates an expansion of control and display capabilities. Several emerging technologies can provide operators with a variety of new channels for interacting with wearable computers. Enabling technologies use signals from the brain, muscles, voice, li...
An interface whereby brain responses can control machines has been developed by the Armstrong Laboratory. This EEG-based control uses the magnitude of the steady-state visual evoked response (SSVER) as a control signal. The SSVER is identified and monitored using non-invasive scalp electrodes and advanced signal processing technology. With biofeedb...
Pilots and operators of advanced military systems need better ways of interacting with their systems, including more efficient human-machine dialog and better physical interface devices and interaction techniques. The goal of the Eye/Voice Mission Planning Interface (EVMPI) research is to integrate voice recognition and eye-tracking technology with...
Gestural interfaces have the potential of enhancing control operations in numerous applications. For Air Force systems, machine-recognition of whole-hand gestures may be useful as an alternative controller, especially when conventional controls are less accessible. The objective of this effort was to explore the utility of a neural network-based ap...
Eye-controlled switching has been proposed as a biocybernetic control approach which may increase system effectiveness while reducing pilot workload. In this experiment, six subjects selected discrete switches on the front panel of a cockpit simulator while manually tracking a target. In two eye-controlled methods, the subjects directed their gaze...
This study examined whether eye and head responses can be used to evaluate attention cue effectiveness. The subjects' tasks were to complete a centrally-located tracking task while periodically responding to cues to identify targets at four peripheral locations. Five directional cues were evaluated: visual symbol, coded sound, speech cue, three dim...
This report summarizes three studies designed to measure and compare the ability of subjects to localize sounds in azimuth, via headphones, generated by two prototype auditory localization cue synthesizers. In the first study, performance differences were found between the two synthesizers in certain areas of the azimuth plane. Additionally, the de...
Past studies involving oculomotor responses have typically been limited to refixations along the horizontal plane, small sample sizes, and little data pertaining to head movement. The study reported herein addresses these data voids by collecting both eye and head latency data for refixations in the horizontal and vertical planes. The subjects' tas...
Natural aural directional cueing in the cockpit should relieve the demands placed on the visual modality, reduce display clutter and alleviate cognitive attention needed to process and extract meaning from coded formats. This experiment compared the effectiveness of three-dimensional (3-D) auditory cues to conventional visual and auditory methods o...
While the nature of eye and head displacements to target acquisitions in the horizontal plane has been frequently studied, such investigations in the vertical plane are somewhat scarce. In the experiment reported herein, the final displacements of the head, eye, and gaze were examined for target acquisitions in the vertical and horizontal planes. T...
A three-dimensional (3-D) auditory display can increase the pilot's situational awareness without requiring visual fixation. When visual acquisition is required, the directional sound can give the pilot a more rapid cue to aim the eyes or head. In order to determine the utility and performance of a 3-D auditory display for cockpit applications, a m...
Eight subjects were used to characterize eye and head movements in response to a cue to refixate. The subjects' tasks were to complete a centrally located task and identify vertically peripheral targets. Eye and head reaction time as well as conventional performance measures were recorded. The results showed that the eye and head movement pattern o...
Integration of eye and head position monitoring devices may enable practical control of systems using the operator's eye line-of-sight (LOS) under conditions of free head and eye movement. This paper describes the components of an eye-control system developed to examine the use of eye LOS as an alternate control interface for crew station design. T...
Four methods of presenting stores management information were evaluated in a flight simulation: 1) alphanumeric, 2) color pictorial, 3) black and white pictorial, and 4) alphanumeric/color pictorial. Results indicated that pilots performed equally well with the alphanumeric, color pictorial, and alphanumeric/color pictorial formats but that perform...
The introduction of color cathode ray tubes into aircraft crewstations has reemphasized the debate on the value of color codes as a performance aid. This paper discusses the value of color in electro-optical displays which use computer generated imagery formats. The conclusion is that color displays will be needed by airborne systems operators beca...
Three methods of presenting aircraft engine information were evaluated in a flight simulation: 1) conventional electro-mechanical instruments, 2) monochrome cathode-ray tube (CRT) format, and 3) color CRT format. Results indicated that pilots identified failed engine parameters faster and more accurately when engine information was integrated onto a s...
This narrated, color, 16 1/2 minute film describes the type of experiments being performed in the Air Force's Flight Dynamics Laboratory's Digital Synthesis Simulator. The purpose of these experiments is to develop new controls/displays concepts for the next generation fighter aircraft. The simulator is fixed-base and its cockpit is the size of an...
Aircraft which utilize computers will require the use of multifunction displays and keyboards (MFK). Eight arrangements of formats which present flight control, navigation, and status information were examined. All were found usable, but arrangements in which attitude information was placed above navigation information and those utilizing electro-o...
Since multifunction switching is a relatively new concept in controls technology, many design questions unique to this type of control need to be addressed. Research was conducted to determine pilot acceptability and usability of a panel of multifunction switches for aircraft applications. Many of the design criteria identified during the studies...
Two types of multifunction keyboards (MFKs), projection switches, and plasma panel were designed to consolidate many of the aircraft controls/displays into a single, easily-reachable control panel. Pilot performance while operating each type of MFK during simulated flight was examined. Also examined were four different arrangements of the task step...