Bruce N. Walker

Georgia Institute of Technology, Atlanta, Georgia, United States

Publications (129)

  • Michael Nees, Bruce N. Walker
    ABSTRACT: An experiment examined performance with sonifications—a general term for nonspeech auditory displays—as a function of working memory encoding and the demands of three different types of interference tasks. Participants encoded the sonifications as verbal representations, visuospatial images, or auditory images. After encoding, participants engaged in brief verbal, visuospatial, or auditory interference tasks before responding to point estimation queries about the sonifications. Results were expected to show selective impact on sonification task performance when the interference task demands matched the working memory encoding strategy, but instead a pattern of general working memory interference emerged in addition to auditory modal interference. In practical applications, results suggested that performance with auditory displays will be impacted by any interference task, though auditory tasks likely will cause more interference than verbal or visuospatial tasks.
    Human Factors and Ergonomics Society Annual Meeting, Chicago, IL; 10/2014
  • ABSTRACT: Displaying multiple variables or data sets within a single sonification has been identified as a challenge for the field of auditory display research. We discuss our recent study that evaluates the usability of a sonification that contains multiple variables presented in a way that encouraged perception across multiple auditory streams. We measured listener comprehension of weather sonifications that include the variables of temperature, humidity, wind speed, wind direction, and cloud cover. Listeners could accurately identify trends in five concurrent variables presented together in a single sonification. This demonstrates that it is indeed possible to include multiple variables together within an auditory stream and thus a greater number of variables within a sonification.
    10/2014;
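To make the multi-stream mapping described in the entry above concrete, here is a minimal Python sketch of one possible way to render two weather variables as concurrent pitch-mapped tone streams in separate registers. The variable names, frequency bands, and data values are illustrative assumptions, not the study's actual synthesis design.

```python
import numpy as np

SR = 44100  # audio sample rate (Hz)

def pitch_stream(values, f_lo, f_hi, secs_per_point=0.5):
    """Render one data series as a tone stream, mapping value to pitch.

    Each variable gets its own frequency band (register) so the ear can
    segregate the concurrent streams; the bands here are arbitrary choices.
    """
    v = np.asarray(values, dtype=float)
    span = np.ptp(v) or 1.0                  # avoid divide-by-zero on flat data
    v = (v - v.min()) / span                 # normalize to [0, 1]
    chunks = []
    for x in v:
        f = f_lo + x * (f_hi - f_lo)         # linear value-to-frequency mapping
        t = np.arange(int(SR * secs_per_point)) / SR
        chunks.append(0.2 * np.sin(2 * np.pi * f * t))
    return np.concatenate(chunks)

# Hypothetical hourly readings; non-overlapping bands keep the streams separable.
temperature = [12, 14, 17, 21, 19]   # degrees C -> high register (440-880 Hz)
humidity = [80, 75, 60, 55, 65]      # percent   -> low register (110-220 Hz)
mix = pitch_stream(temperature, 440, 880) + pitch_stream(humidity, 110, 220)
```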
  • ABSTRACT: We describe work-in-progress prototypes of auditory displays for fuel efficiency driver interfaces (FEDIs). Although research has established that feedback from FEDIs can have a positive impact on driver behaviors associated with fuel economy, the impact of FEDIs on driver distraction has not been established. Visual displays may be problematic for providing this feedback; it is precisely during fuel-consuming behaviors that drivers should not divert attention away from the driving task. Auditory displays offer a viable alternative to visual displays for communicating information about fuel economy to the driver without introducing visual distraction.
    International Conference on Auditory Display, New York, New York; 06/2014
  • ABSTRACT: Auditory display research for driving has mainly examined a limited range of tasks (e.g., collision warnings, cell phone tasks). In contrast, the goal of this project was to evaluate the effectiveness of enhanced auditory menu cues in a simulated driving context. The advanced auditory cues of ‘spearcons’ (compressed speech cues) and ‘spindex’ (a speech-based index cue) were predicted to improve both menu navigation and driving. Two experiments used a dual task paradigm in which users selected songs on the vehicle’s infotainment system. In Experiment 1, 24 undergraduates played a simple, perceptual-motor ball-catching game (the primary task; a surrogate for driving), and navigated through an alphabetized list of 150 song titles—rendered as an auditory menu—as a secondary task. The menu was presented either in the typical visual-only manner, or enhanced with text-to-speech (TTS), or TTS plus one of three types of additional auditory cues. In Experiment 2, 34 undergraduates conducted the same secondary task while driving in a simulator. In both experiments, performance on both the primary task (success rate of the game or driving performance) and the secondary task (menu search time) was better with the auditory menus than with no sound. Perceived workload scores, as well as user preferences, favored the enhanced auditory cue types. These results show that adding audio, and enhanced auditory cues in particular, can allow a driver to operate the menus of in-vehicle technologies more efficiently while driving more safely. Results are discussed in terms of multiple resources theory.
    International Journal of Human-Computer Interaction 05/2014; · 1.13 Impact Factor
  • Masoud Gheisari, Javier Irizarry, Bruce N. Walker
    Construction Research Congress 2014; 05/2014
  • Myounghoon Jeon, Bruce N. Walker, Jung-Bin Yim
    ABSTRACT: The aim of this paper was to explore the effects of specific emotions on subjective judgment, driving performance, and perceived workload. Traditional driving behavior research has focused on cognitive aspects such as attention, judgment, and decision making. Psychological findings have indicated that affective states also play a critical role in a user’s rational, functional, and intelligent behaviors. Most applied emotion research has concentrated on the simple valence and arousal dimensions. However, recent findings have indicated that different emotions may have different impacts, even when they share the same valence or arousal. To identify more specific affective effects, seventy undergraduate participants drove in a vehicle simulator under three different road conditions, with one of the following induced affective states: anger, fear, happiness, or neutral. We measured their subjective judgment of driving confidence, risk perception, and safety level after affect induction; four types of driving errors (Lane Keeping, Traffic Rules, Aggressive Driving, and Collision) while driving; and the electronic NASA-TLX after driving. Induced anger clearly showed negative effects on subjective safety level and led to degraded driving performance compared to neutral and fear. Happiness also showed degraded driving performance compared to neutral and fear. Fear did not have any significant effect on subjective judgment, driving performance, or perceived workload. Results suggest that we may need to take emotions and affect into account to construct a naturalistic and generic driving behavior model. To this end, a specific-affect approach is needed, beyond the sheer valence and arousal dimensions. Given that workload results are similar across affective states, examining affective effects may also require a different approach than just the perceived workload framework. The present work is expected to guide emotion detection research and help develop an emotion regulation model and adaptive interfaces for drivers.
    Transportation Research Part F Traffic Psychology and Behaviour 05/2014; 24:197–209. · 1.99 Impact Factor
  • Myounghoon Jeon, Bruce N. Walker, Thomas M. Gable
    ABSTRACT: Research has suggested that emotional states have critical effects on various cognitive processes, which are important components of situation awareness (Endsley, 1995b). Evidence from driving studies has also emphasized the importance of driver situation awareness for performance and safety. However, to date, little research has investigated the relationship between emotional effects and driver situation awareness. In our experiment, 30 undergraduates drove in a simulator after induction of either anger or neutral affect. Results showed that an induced angry state can degrade driver situation awareness as well as driving performance as compared to a neutral state. However, the angry state did not have an impact on participants' subjective judgment or perceived workload, which might imply that the effects of anger occurred below the level of conscious awareness. One reason participants failed to compensate for their performance deficits might be that they were not aware of the severe impact of emotional states on driving performance.
    Presence: Teleoperators & Virtual Environments 02/2014; 23(1):71-89. · 0.91 Impact Factor
  • Conference Paper: Auditory weather reports
    The 9th Audio Mostly Conference; 01/2014
  • Journal of Blindness Innovation and Research. 01/2014; 4(2).
  • Michael A. Nees, Bruce N. Walker
    ABSTRACT: Dual-process accounts of working memory have suggested distinct encoding processes for verbal and visual information in working memory, but encoding for nonspeech sounds (e.g., tones) is not well understood. This experiment modified the sentence-picture verification task to include nonspeech sounds with a complete factorial examination of all possible stimulus pairings. Participants studied simple stimuli (pictures, sentences, or sounds) and encoded the stimuli verbally, as visual images, or as auditory images. Participants then compared their encoded representations to verification stimuli (again pictures, sentences, or sounds) in a two-choice reaction time task. With some caveats, the encoding strategy appeared to be as important or more important than the external format of the initial stimulus in determining the speed of verification decisions. Findings suggested that: (1) auditory imagery may be distinct from verbal and visuospatial processing in working memory; (2) visual perception but not visual imagery may automatically activate concurrent verbal codes; and (3) the effects of hearing a sound may linger for some time despite recoding in working memory. We discuss the role of auditory imagery in dual-process theories of working memory.
    Journal of Cognitive Psychology 11/2013; · 1.20 Impact Factor
  • ABSTRACT: In-vehicle technologies can create dangerous situations through driver distraction. In recent years, research has focused on driver distraction through communications technologies, but other tasks, such as scrolling through a list of songs or names, can also carry high attention demands. Research has revealed that the use of advanced auditory cues for in-vehicle technology interaction can decrease cognitive demand and improve driver performance when compared to a visual-only system. This paper discusses research investigating the effects of applying advanced auditory cues to a search task on a mobile device while driving, particularly focusing on visual fixation. Twenty-six undergraduates, wearing eye-tracking glasses, searched through a list of 150 songs on a cell phone while performing the lane change task. Eye-tracking data, performance, workload, and preferences were collected for six conditions. Compared to no sound, visual fixation time on driving and preferences were significantly higher for the advanced auditory cue of spindex. Results suggest greater visual availability for driving when the spindex cue is applied to the search task, and provide further evidence that these advanced auditory cues can lessen distraction from driving while using mobile devices to search for items in lists.
    Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications; 10/2013
  • ABSTRACT: Three novel interfaces for navigating a hierarchical menu while driving were experimentally evaluated. The prototypes utilized redundant visual and auditory feedback (multimodal), and were compared to a conventional direct touch interface. All three multimodal prototypes employed an external touchpad separate from the infotainment display in order to afford simple eyes-free gesturing. Participants performed a basic driving task while concurrently using these prototypes to perform menu selections. Mean lateral lane deviation, eye movements, secondary task speed, and self-reported workload were assessed for each condition. Of all conditions, swiping the touchpad to move one-by-one through menu items yielded significantly smaller lane deviations than direct touch. In addition, in the serial swipe condition, the same time spent looking at the prototype was distributed over a longer interaction time. The remaining multimodal conditions allowed users to feel around a pie or list menu to find touchpad zones corresponding to menu items, allowing for either exploratory browsing or shortcuts. This approach, called GRUV, was ineffective compared to serial swiping and direct touch, possibly due to its uninterruptible interaction pattern and overall novelty. The proposed explanation for the performance benefits of the serial swipe condition was that it afforded flexible subtasking and incremental progress, in addition to providing multimodal output.
    Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications; 10/2013
  • ABSTRACT: In this paper we address the lack of accessibility in fantasy sports for visually impaired users and discuss the accessible fantasy sports system that we have designed using auditory displays. Fantasy sports are a fun and social activity requiring users to make decisions about their fantasy teams, which use real athletes' weekly performance to gain points and compete against other users' fantasy teams. Fantasy players manage their teams by making informed decisions using statistics about real sports data. These statistics are usually presented online in a spreadsheet layout; however, online fantasy sports are usually inaccessible to screen readers because most sites use Flash. Our current system, described in this paper, utilizes auditory display techniques such as auditory alerts, earcons, spearcons, general text-to-speech, and auditory graphs to present sports statistics to visually impaired fantasy users. The current version of our system was designed based on feedback from current fantasy sports users during a series of think-aloud walkthroughs.
    Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility; 10/2013
  • Jonathan H. Schuett, Bruce N. Walker
    ABSTRACT: When the goal of an auditory display is to provide inference or intuition to a listener, it is important for researchers and sound designers to gauge users' comprehension of the display to determine if they are, in fact, receiving the correct message. This paper discusses an approach to measuring listener comprehension in sonifications that contain multiple concurrently presented data series. We draw from situation awareness research that has developed measures of comprehension within environments or scenarios, based on our view that an auditory scene is similar to a virtual or mental representation of the listener's environment.
    Proceedings of the 8th Audio Mostly Conference; 09/2013
  • ABSTRACT: We describe a computer application for relaxation that is based on music generation following users' actions with a simulated drop of mercury. We include the rationale for the approach as well as architectural, algorithmic, and technical details of the implementation. We also provide results of a user survey evaluating the quality of the application usage experience.
    Proceedings of the 8th Audio Mostly Conference; 09/2013
  • ABSTRACT: The Architecture, Engineering, Construction, and Owner/Operator (AECO) industry is constantly searching for new methods of increasing efficiency and productivity. Facility Managers (FMs), as part of the owner/operator role, work in complex and dynamic environments where critical decisions are constantly made. This decision-making process and its consequent performance can be improved by enhancing the Situation Awareness (SA) of FMs through new digital technologies. In this paper, InfoSPOT (Information Surveyed Point for Observation and Tracking) is recommended to FMs as a mobile Augmented Reality (AR) tool for accessing information about the facilities they maintain. AR has been considered a viable option for reducing the inefficiencies of data overload by providing FMs with an SA-based tool for visualizing their “real-world” environment with added interactive data. A prototype of the AR application was developed, and a user participation experiment and analysis were conducted to evaluate the features of InfoSPOT. This innovative application of AR has the potential to improve construction practices, and in this case, facility management.
    Automation in Construction 08/2013; 33:11–23. · 1.82 Impact Factor
    Myounghoon Jeon, Bruce N. Walker, Carrie M. Bruce
    The International Conference on Auditory Display (ICAD2013), Poland; 07/2013
  • Posters, Part I, HCII 2013, CCIS; 07/2013
  • ABSTRACT: The goal of this project is to evaluate a new auditory cue, which the authors call spearcons, in comparison to other auditory cues, with the aim of improving auditory menu navigation. With the shrinking displays of mobile devices and increasing technology use by visually impaired users, it becomes important to improve the usability of non-GUI (graphical user interface) interfaces such as auditory menus. Using nonspeech sounds called auditory icons (i.e., representative real sounds of objects or events) or earcons (i.e., brief musical melody patterns) has been proposed to enhance menu navigation. To compensate for the weaknesses of traditional nonspeech auditory cues, the authors developed spearcons by speeding up a spoken phrase, even to the point where it is no longer recognized as speech. The authors conducted five empirical experiments. In Experiments 1 and 2, they measured menu navigation efficiency and accuracy among cues. In Experiments 3 and 4, they evaluated the learning rate of the cues and of speech itself. In Experiment 5, they assessed spearcon enhancements compared to plain TTS (text-to-speech rendering of written menu items) in a two-dimensional auditory menu. Spearcons outperformed traditional and newer hybrid auditory cues in navigation efficiency, accuracy, and learning rate. Moreover, spearcons showed learnability comparable to normal speech and led to better performance than speech-only auditory cues in two-dimensional menu navigation. These results show that spearcons can be more effective than previous auditory cues in menu-based interfaces. Spearcons have broadened the taxonomy of nonspeech auditory cues, and users can benefit from their application in real devices.
    Human Factors The Journal of the Human Factors and Ergonomics Society 02/2013; 55(1):157-82. · 1.29 Impact Factor
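As a concrete illustration of the spearcon technique described in the entry above (time-compressing a spoken phrase while preserving its pitch), here is a minimal Python sketch. The file names and the 2.5x compression rate are assumptions for illustration, not values taken from the paper.

```python
import librosa
import soundfile as sf

def make_spearcon(tts_wav_path, out_path, rate=2.5):
    """Create a spearcon: a TTS phrase sped up (pitch preserved) until it is
    barely recognizable as speech, yet still unique per menu item."""
    y, sr = librosa.load(tts_wav_path, sr=None)          # load rendered TTS audio
    y_fast = librosa.effects.time_stretch(y, rate=rate)  # phase-vocoder time compression
    sf.write(out_path, y_fast, sr)

# Hypothetical usage: compress a spoken song title into a sub-second cue.
# make_spearcon("song_title_tts.wav", "song_title_spearcon.wav", rate=2.5)
```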
  • Jared M. Batterman, Bruce N. Walker
    ABSTRACT: Clear representation of uncertainty or error is crucial in graphs and other displays of data. Error bars are quite common in visual graphs, even though they are not necessarily well designed and are often misunderstood, even by those who use them frequently (e.g., scientists, engineers). There has been little study of how to represent uncertainty in auditory graphs, such as those used increasingly by students and scientists with vision impairment. This study used conceptual magnitude estimation to determine how well different auditory dimensions (frequency, tempo) can represent error and uncertainty. The results will lead to more effective auditory displays of quantitative information and data.
    Proceedings of the 14th international ACM SIGACCESS conference on Computers and accessibility; 10/2012
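A minimal Python sketch of the kind of mapping the entry above investigates: pitch encodes the data value while tempo, one of the two tested auditory dimensions, encodes the size of the error bar. The ranges and mapping polarity below are illustrative assumptions, not findings from the study.

```python
import numpy as np

SR = 44100  # audio sample rate (Hz)

def uncertain_point(value, error, duration=1.0):
    """Sonify one data point: pitch encodes the value, pulse rate the error.

    `value` and `error` are assumed pre-normalized to [0, 1]; larger error
    yields faster pulsing (an assumed polarity, not one from the paper).
    """
    freq = 220.0 + 440.0 * value             # value -> 220-660 Hz carrier
    pulses_per_sec = 2.0 + 10.0 * error      # error -> 2-12 Hz on/off tremolo
    t = np.arange(int(SR * duration)) / SR
    carrier = np.sin(2 * np.pi * freq * t)
    gate = (np.sin(2 * np.pi * pulses_per_sec * t) > 0).astype(float)
    return 0.2 * carrier * gate              # gating rate conveys uncertainty

# Hypothetical point: mid-range value with a wide error bar pulses rapidly.
tone = uncertain_point(value=0.5, error=0.8)
```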

Publication Stats

1k Citations
24.00 Total Impact Points

Institutions

  • 2–2014
    • Georgia Institute of Technology
      • School of Psychology
      • School of Building Construction
      Atlanta, Georgia, United States
  • 2–2008
    • Rice University
      • Department of Psychology
      Houston, Texas, United States
  • 2001
    • Alimetrics Ltd, Espoo, Finland
      Helsinki, Southern Finland Province, Finland