Christiane Glatz’s research while affiliated with University of Tübingen and other places


Publications (11)


Figure 1. Interaction of Auditory Cue and CTOA. Cued reaction times are fastest for the CTOA level of 250 ms, relative to the simultaneous presentation (0 ms) of cue and target. This reaction time benefit decreases at 500 ms, particularly for static and receding cues. Error bars represent 95% confidence intervals (ref. 70).
Figure 2. Interaction of Auditory Cue and Intensity. Median RTs of looming cues do not vary with Intensity levels. In contrast, loud static cues induce faster RTs than soft static cues. The conditions indicated by larger icons were identical to the static and looming conditions in Experiment 1. Error bars represent 95% confidence intervals (ref. 70).
Figure 3. Experiment procedure and stimuli. (A) Four instances of trials of equal probability that could require participants to perform tilt-discrimination on a peripheral visual target (depicted larger than actual, for visibility). (B) Auditory cues used in Experiment 2, whereby the soft-static cue and loud-looming cue were the static and looming cue of Experiment 1. (C) Visual targets could appear at the onset of the auditory cue or after the onset.
The time course of auditory looming cues in redirecting visuo-spatial attention
  • Article
  • Full-text available

January 2019 · 126 Reads · 6 Citations

Christiane Glatz
By orienting attention, auditory cues can improve the discrimination of spatially congruent visual targets. Looming sounds that increase in intensity are processed preferentially by the brain. Thus, we investigated whether auditory looming cues can orient visuo-spatial attention more effectively than static and receding sounds. Specifically, different auditory cues could redirect attention away from a continuous central visuo-motor tracking task to peripheral visual targets that appeared occasionally. To investigate the time course of crossmodal cuing, Experiment 1 presented visual targets at different time-points across a 500 ms auditory cue’s presentation. No benefits were found for simultaneous audio-visual cue-target presentation. The largest crossmodal benefit occurred at an early cue-target onset asynchrony (CTOA = 250 ms), regardless of auditory cue type, and diminished at CTOA = 500 ms for static and receding cues. However, auditory looming cues showed a late crossmodal cuing benefit at CTOA = 500 ms. Experiment 2 showed that this late benefit was independent of the cue’s intensity at the moment the visual target appeared. Thus, we conclude that the late crossmodal benefit throughout an auditory looming cue’s presentation is due to its increasing intensity profile. The neural basis for this benefit and its ecological implications are discussed.


Figure 1. The experimental set-up: A participant is positioned in the simulated cockpit of an automated vehicle with headphones and EEG cap. Three horizontally-aligned displays present the simulation of the vehicle scenario.
Figure 4. Results of the DALI showing, for each warning condition, the perceived workload in all 6 workload sub-indices and the overall mean workload (right). Error bars indicate the standard error of the mean.
Looming Auditory Collision Warnings for Semi-Automated Driving: An ERP Study

September 2018 · 227 Reads · 14 Citations
Looming sounds can be an ideal warning notification for emergency braking. This agrees with studies that have consistently demonstrated preferential brain processing for looming stimuli. This study investigates and demonstrates that looming sounds can similarly benefit emergency braking when managing a vehicle with adaptive cruise control (ACC). Specifically, looming auditory notifications induced faster emergency braking times than a static auditory notification. Next, we compared the event-related potentials (ERPs) evoked by a looming notification with those evoked by its static equivalent. Looming notifications evoked a smaller fronto-central N2 amplitude than their static equivalents. Thus, we infer that looming sounds are consistent with the visual experience of an approaching collision and, hence, induced a corresponding performance benefit. Subjective ratings indicate no significant differences in perceived workload across the notification conditions. Overall, this work suggests that auditory warnings should have physical properties congruent with the visual events that they warn of.


Figure 1. Experiment 1's ERP responses (left) with scalp topography plots (right) of statistically significant differences across time and electrodes respectively. Left: ERP waveforms are averaged across the frontal (pink) and parietal (green) electrodes and deflections are labeled for N1, P2, P3a, and P3b. The shaded areas between the two waveforms indicate time-regions that are significantly different. Right: The scalp topographies show the EEG activity to verbal commands and auditory icons at time-ranges A and B. Electrodes that are significantly different are represented by white dots. 
Use the Right Sound for the Right Job: Verbal Commands and Auditory Icons for a Task-Management System Favor Different Information Processes in the Brain

April 2018 · 444 Reads · 24 Citations

Design recommendations for notifications are typically based on user performance and subjective feedback. In comparison, there has been surprisingly little research on how designed notifications might be processed by the brain for the information they convey. The current study uses EEG/ERP methods to evaluate auditory notifications that were designed to cue long-distance truck drivers for task-management and driving conditions, particularly for automated driving scenarios. Two experiments separately evaluated naive students and professional truck drivers for their behavioral and brain responses to auditory notifications, which were either auditory icons or verbal commands. Our EEG/ERP results suggest that verbal commands were more readily recognized by the brain as relevant targets, but that auditory icons were more likely to update contextual working memory. The two classes of notifications did not differ on behavioral measures. This suggests that auditory icons ought to be employed for communicating contextual information, and verbal commands for urgent requests.




Figure 1: Participant wearing a mobile EEG device whilst performing a spatial navigation task in a CAVE-like virtual environment. Photo: Lewis Chuang.
Reading the mobile brain: from laboratory to real-world electroencephalography

November 2017 · 66 Reads · 6 Citations

Christiane Glatz · Thomas Kosch · [...]
It is increasingly viable to measure the brain activity of mobile users as they go about their everyday business in their natural environment. This is due to: (i) modern signal processing methods, (ii) lightweight and cost-effective measurement devices, and (iii) a better, albeit incomplete, understanding of how measurable brain activity relates to mental processes. Here, we address how brain activity can be measured in mobile users and how this contrasts with measurements obtained under controlled laboratory conditions. In particular, we focus on electroencephalography (EEG) and cover: (i) hardware and software implementation, (ii) signal processing techniques, and (iii) the interpretation of EEG measurements. The tutorial consists of hands-on analyses of real EEG data and a basic theoretical introduction to how and why EEG works.
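The signal-processing portion of such a tutorial typically starts with spectral analysis. Below is a minimal, hypothetical sketch (not material from the tutorial itself) of estimating alpha-band power from a single EEG channel with Welch's method; the sampling rate, duration, and band limits are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

# Synthetic single-channel "EEG": a 10 Hz alpha rhythm plus noise,
# sampled at 250 Hz for 10 s (all values are illustrative).
fs = 250
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Power spectral density via Welch's method, a standard first step
# in EEG signal processing.
freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)

# Mean power in the alpha band (7.5-12.5 Hz).
alpha = (freqs >= 7.5) & (freqs <= 12.5)
alpha_power = psd[alpha].mean()
print(f"Alpha-band power: {alpha_power:.3f}")
```

A real pipeline would first re-reference, filter, and reject artifacts; this sketch only shows the spectral-estimation step.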


Figure 1. Left: A student participant in a psychophysical laboratory (Department for Human Perception, Cognition, and Action, MPI for Biological Cybernetics, Tübingen, Germany). Right: A professional commercial driver in a truck driving simulator (Styling and Vehicle Ergonomics, Scania CV AB, Södertälje, Sweden) 
Figure 2. Six examples of clusters of dipoles (blue) and their mean position (red), their projected scalp activity, and power spectral density (inset: left to right), derived from EEG recordings in the driving simulator. First row: Cortical dipoles that are likely to be associated with auditory processing (left) and motor response generation (right) respectively. Second row: Non-cortical dipoles that are associated with muscle activity (left) and eye-movements and-blinks (right). Third row: Non-cortical dipoles that are due to electrical line noise (left) and unresolved variance in EEG recording (right). 
Figure 3. LEFT: Stimulus ERPs are illustrated by and labelled in three difference waveforms that depict averaged EEG activity of electrode groups from anterior, central, and posterior regions. RIGHT: MUA results plot statistically significant t-values between the two participant groups for every electrode and time-point. The analysis reveals that there are no statistically significant electrode-time regions that precede the auditory notification.
Figure 4. LEFT: Stimulus ERPs are illustrated by three waveforms that depict averaged EEG activity of electrode groups from anterior, central, and posterior regions. The BP peaks are indicated by arrows. RIGHT: MUA results reveal statistically significant differences between driving simulator and laboratory recordings in two time periods (i.e., 600-430 and 220-0 msec before responses). 
Using EEG to Understand why Behavior to Auditory In-vehicle Notifications Differs Across Test Environments

September 2017 · 732 Reads · 13 Citations

In this study, we employ EEG methods to clarify why auditory notifications, which were designed for task management in highly automated trucks, resulted in different performance behavior when deployed in two different test settings: (a) student volunteers in a lab environment, and (b) professional truck drivers in a realistic vehicle simulator. Behavioral data showed that the professional drivers were slower and less sensitive in identifying notifications than their student counterparts. Such differences can be difficult to interpret and frustrate the transfer of implementations from the laboratory to more realistic settings. Our EEG recordings of brain activity reveal that these differences were not due to differences in the detection and recognition of the notifications. Instead, they were due to differences in EEG activity associated with response generation. Thus, we show how measuring brain activity can deliver insights into how notifications are processed, at a finer granularity than can be afforded by behavior alone.
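The mass-univariate analysis (MUA) shown in the figures above can be sketched roughly as follows. This is an illustrative reconstruction on synthetic data, not the authors' pipeline; the group sizes, electrode count, and significance threshold are assumptions.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
n_a, n_b = 20, 18          # participants per group (illustrative)
n_elec, n_time = 32, 200   # electrodes x time points (illustrative)

# Synthetic per-participant ERPs; group B gets an extra deflection
# late in the epoch to mimic a response-generation difference.
group_a = rng.standard_normal((n_a, n_elec, n_time))
group_b = rng.standard_normal((n_b, n_elec, n_time))
group_b[:, :, 150:] += 1.0

# Mass-univariate analysis: an independent-samples t-test at every
# electrode/time point.
t_vals, p_vals = ttest_ind(group_a, group_b, axis=0)

# Mark significant electrode/time regions (uncorrected threshold here;
# real pipelines control for multiple comparisons, e.g. cluster-based).
significant = p_vals < 0.05
print(f"Significant points: {significant.sum()} of {significant.size}")
```

The design choice to test every electrode/time point, rather than pre-selected windows, is what lets such an analysis localize where and when two recording contexts diverge.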


Towards Adaptive Ambient In-Vehicle Displays and Interactions: Insights and Design Guidelines from the 2015 AutomotiveUI Dedicated Workshop

February 2017 · 142 Reads · 15 Citations

Informing a driver of a vehicle’s changing state and environment is a major challenge that grows with the introduction of in-vehicle assistant and infotainment systems. Even in the age of automation, the human will need to be in the loop for monitoring, taking over control, or making decisions. In these cases, poorly designed systems could impose needless attentional demands on the driver, drawing attention away from the primary driving task. Existing systems offer simple and often unspecific alerts, leaving the human with the demanding task of identifying, localizing, and understanding the problem. Ideally, such systems should communicate information in a way that conveys its relevance and urgency. Specifically, information useful for promoting driver safety should be conveyed as effective calls for action, while information not pertaining to safety (and therefore less important) should be conveyed in ways that do not jeopardize driver attention. Adaptive ambient displays and peripheral interactions have the potential to provide superior solutions: they could serve to unobtrusively present information, shift the driver’s attention according to changing task demands, or enable a driver to react without losing focus on the primary task. In order to build a common understanding across researchers and practitioners from different fields, we held a “Workshop on Adaptive Ambient In-Vehicle Displays and Interactions” at the AutomotiveUI ’15 conference. In this chapter, we discuss the outcomes of this workshop, provide examples of possible applications now or in the future, and conclude with challenges in developing or using adaptive ambient interactions.


Tutorial on Design and Evaluation Methods for Attention Directing Cues

Managing drivers' distraction and directing their attention has been a challenge for automotive UI researchers in both industry and academia. The objective of this half-day tutorial is to provide an overview of methodologies for the design, development, and evaluation of in-vehicle attention-directing user interfaces. The tutorial will introduce the specifics and challenges of shifting drivers' attention and managing distractions in semi- and highly automated driving contexts. Participants will be familiarized with methods for requirement elicitation, participatory design, setting up experiments, and evaluating interaction concepts using tools such as eye tracking and EEG/ERP.


Warning Signals With Rising Profiles Increase Arousal

September 2015 · 13 Reads · 2 Citations

Proceedings of the Human Factors and Ergonomics Society Annual Meeting

Auditory warnings are often used to direct a user’s attention from a primary task to critical peripheral events. In the context of traffic, in-vehicle collision avoidance systems could, for example, employ spatially relevant sounds to alert the driver to the possible presence of a crossing pedestrian. This raises the question: What is an effective auditory alert in a steering environment? Ideally, such warning signals should not only arouse the driver but also result in deeper processing of the event that the driver is being alerted to. Warning signals can be designed to convey the time to contact with an approaching object (Gray, 2011). That is, sounds can rise in intensity in accordance with the physical velocity of an approaching threat. The current experiment was a manual steering task in which participants were occasionally required to recognize peripheral visual targets. These visual targets were sometimes preceded by a spatially congruent auditory warning signal, which was either a sound with constant intensity, linearly rising intensity, or non-linearly rising intensity that conveyed time-to-contact. To study the influence of warning cues on the arousal state, different features of the electroencephalogram (EEG) were measured. Alpha frequency, which ranges from 7.5 to 12.5 Hz, is believed to reflect different cognitive processes, in particular arousal (Klimesch, 1999); greater desynchronization in the alpha frequency reflects higher levels of attention as well as alertness. Our results showed a significant decrease in alpha power for sounds with rising intensity profiles, indicating increased alertness and expectancy for an event to occur. To analyze whether the increased arousal for rising sounds resulted in deeper processing of the visual target, we analyzed the event-related potential P3. This is a positive component that occurs approximately 300 ms after an event and is known to be associated with recognition performance of a stimulus (Parasuraman & Beatty, 1980); smaller P3 amplitudes indicate worse identification than larger amplitudes. Our results show that sounds with time-to-contact properties induced larger P3 responses to the targets that they cued, compared to targets cued by constant or linearly rising sounds. This suggests that rising sounds with time-to-contact intensity profiles evoke deeper processing of the visual target and therefore result in better identification than events cued by sounds with linearly rising or constant intensity.
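A P3 analysis of the kind described above reduces to averaging epoched trials and measuring voltage in a post-stimulus window. The following is a minimal illustration on synthetic epochs, not the study's actual pipeline; the epoch bounds, baseline period, and measurement window are assumptions.

```python
import numpy as np

fs = 250                                 # sampling rate in Hz (assumed)
times = np.arange(-0.2, 0.8, 1 / fs)     # epoch: -200 to 800 ms around the target

# Synthetic epochs (trials x time): noise plus a P3-like positive
# deflection peaking near 300 ms after the event.
rng = np.random.default_rng(1)
n_trials = 60
p3_shape = 5.0 * np.exp(-((times - 0.3) ** 2) / (2 * 0.05 ** 2))
epochs = p3_shape + rng.standard_normal((n_trials, times.size))

# ERP = average across trials; baseline-correct using the pre-stimulus window.
erp = epochs.mean(axis=0)
erp -= erp[times < 0].mean()

# P3 amplitude: mean voltage in a 250-400 ms window (window is an assumption).
p3_window = (times >= 0.25) & (times <= 0.40)
p3_amplitude = erp[p3_window].mean()
print(f"P3 amplitude: {p3_amplitude:.2f} (arbitrary units)")
```

Comparing `p3_amplitude` across cue conditions (constant, linearly rising, time-to-contact) is then a straightforward within-subject contrast.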


Citations (9)


... The organizing committee of the third IMI workshop consists of the following researchers and professional musicians. Each of them contributes long-term experiences in organizing workshops including Handling IoT in HCI (IoT '17), Reading the Mobile Brain (MUM '17) [5], Designing Assistive Environments for Manufacturing (PETRA '17 -'21) 4 , SmartObjects '18 (CHI '18) [21] and SmartObjects '22 (ISS '22) [25], a series of workshops and events about vulnerable road users [13,14,19,23,24,30] as well as several local workshops for bands and musicians. ...

Reference:

Intelligent Music Interfaces: When Interactive Assistance and Adaptive Augmentation Meet Musical Instruments
Reading the mobile brain: from laboratory to real-world electroencephalography

... Note that moving cues have been found to direct spatial attention in visual, auditory, and tactile modalities particularly effectively (Glatz & Chuang, 2019;Hillstrom & Yantis, 1994;Ho et al., 2014). Ho and colleagues (2014) presented moving tactile cues at the waist to direct the participant's attention in a driving scenario. ...

The time course of auditory looming cues in redirecting visuo-spatial attention

... The dynamic-pitch cue provides a more accurate mapping to the imminent visual stimuli. Glatz et al. (2019) found that looming auditory cues induced greater brain activity to visual targets and faster reaction times than constant auditory cues, which indicates that dynamic sounds are more referent to looming visual targets. Only proper dynamic urgency mapping can enhance driving performance. ...

Why Do Auditory Warnings During Steering Allow for Faster Visual Target Recognition?
  • Citing Chapter
  • January 2018

... Considering a previous study showing that the driving context affects preferences for AV decisions [10] and the present findings demonstrating that the driving context affects the driver's intention to take over, cars should come with personalizable intelligent systems and takeover decisions should be fostered by informing drivers that the emergency affects the safety of drivers. For instance, in future AVs, the warning sound for a takeover request could use a looming sound to enhance the perceived risk of drivers who showed effectiveness in emergency braking [60]; additionally, information about the AV's driving intention, when only the AV's decision influences the safety of the driver, could be provided to enhance their situational awareness and willingness to take over without notifying the driver of the AV's driving intention when safety is guaranteed to prevent drivers from leaving the decision to the AV. Additionally, the findings of the present study suggest that in the future, even in Level-4 autonomous driving, AVs should be designed to hand over control to the driver when decisions are directly related to their safety; if the driver does leave a decision to the AV, they should be asked to allow the car to decide on its own even in life-threatening situations. ...

Looming Auditory Collision Warnings for Semi-Automated Driving: An ERP Study

... This aligns with Yatani et al. [89], who found that handheld tactile maps combining tactile feedback with audio instructions offer superior spatial orientation compared to audio-only feedback. Additionally, the study revealed differences in the effectiveness of verbal audio vs. auditory icons, aligning with the findings of Glatz et al. [41], who found auditory icons to be more effective for conveying contextual information, while verbal audio was better for urgent requests. Further, by comparing the effectiveness of auditory, visual, and combined audio-visual feedback, the combination of audio and visual feedback improved participants' situation awareness more than visual feedback alone [66]. ...

Use the Right Sound for the Right Job: Verbal Commands and Auditory Icons for a Task-Management System Favor Different Information Processes in the Brain

... interaction between human and machines can be established via a microphone (based on speech) to provide a more natural, convenient, and efficient speech-based communication (Chuang et al. 2017;Munir et al. 2019). Since a smart home is made up of connected smart objects, a controller should be embedded for each object of the smart home such as light, sound, and door. ...

Using EEG to Understand why Behavior to Auditory In-vehicle Notifications Differs Across Test Environments

... Although the effect on driver reaction times is not as great as when using touch screens, it is worse than the impact of driving under the influence of alcohol or drugs [39]. Infotainment systems bring many desirable features to users, but they could take away attentional demand from the primary driving task if they are poorly designed [27]. According to Lentz et al. [24], the problems users might encounter with poorly designed UIs are difficulty locating the correct option they need, unintended invocation of actions, tedious sequences Marinissen & Bazilinskyy of interactions, error-prone repetitive actions or being overwhelmed by too many choices. ...

Towards Adaptive Ambient In-Vehicle Displays and Interactions: Insights and Design Guidelines from the 2015 AutomotiveUI Dedicated Workshop
  • Citing Chapter
  • February 2017

... • Auditory Alerts: These use sound, such as beeps or alarms, to warn drivers of hazards, with their effectiveness depending on factors like sound frequency, amplitude, and duration [10]. Properly designed auditory alerts improve safety by enhancing driver focus and response [18,21]. ...

Warning Signals With Rising Profiles Increase Arousal
  • Citing Article
  • September 2015

Proceedings of the Human Factors and Ergonomics Society Annual Meeting

... When listeners are asked to predict the arrival time of a looming sound, they exhibit a systematic anticipatory error and perceive that the source has arrived when it is still some distance away (Neuhoff, Hamilton, Gittleson, & Mejia, 2014; Neuhoff, Long, & Worthington, 2012; Neuhoff, Planisek, & Seifritz, 2009; Riskind, Kleiman, Seifritz, & Neuhoff, 2014; Rosenblum et al., 1987, 1993; Schiff & Oldak, 1990). This bias can provide a selective advantage by creating a temporal "margin of safety" that affords slightly more time than expected to initiate defensive behaviors in response to the looming object (Freiberg, Tually, & Crassini, 2001; Glatz, Bulthoff, & Chuang, 2014; Neuhoff, 1998, 1999, 2001). Studies on sex differences have shown that women tend to exhibit a larger looming bias than men (Grassi, 2010; Schiff & Oldak, 1990) and these findings are likely due to sex differences in the ability to deal with approaching threat (e.g. ...

Looming auditory warnings initiate earlier event-related potentials in a manual steering task
  • Citing Article
  • September 2014

Cognitive Processing