Preprint

Navigation Method Enhancing Music Listening Experience by Stimulating Both Neck Sides with Modulated Music Vibration


Abstract

We propose a method that stimulates music vibration (generated from and synchronized with musical signals), modulated by the direction and distance to the target, on both sides of a user's neck with Hapbeat, a necklace-type haptic device. We conducted three experiments to confirm that the proposed method can both achieve haptic navigation and enhance the music listening experience. Experiment 1 used a questionnaire survey to examine the effect of stimulating music vibrations. Experiment 2 evaluated the accuracy (in degrees) with which users could adjust their direction toward a target using the proposed method. Experiment 3 compared four navigation methods through navigation tasks in a virtual environment. The results showed that stimulating music vibration enhanced the music listening experience and that the proposed method provides sufficient information to guide users: accuracy in identifying directions was about 20°, participants reached the target in all navigation tasks, and in about 80% of all trials participants reached the target via the shortest route. Furthermore, the proposed method succeeded in conveying distance information, and Hapbeat can be combined with conventional navigation methods without interfering with music listening.
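The abstract's modulation idea can be sketched in code. The following is a hypothetical illustration only, not the authors' implementation: the function name, the linear pan, and the distance-to-intensity mapping are all assumptions.

```python
def modulate_neck_channels(angle_deg, distance_m, sample, max_distance_m=30.0):
    """Scale one music-vibration sample into left/right neck-channel amplitudes.

    angle_deg: bearing to the target relative to the user's facing direction
               (negative = target to the left, positive = to the right).
    distance_m: remaining distance to the target.
    sample: one sample of the vibration signal derived from the music, in [-1, 1].
    """
    # Direction cue: shift vibration energy toward the side facing the target.
    # A simple linear pan clamped at +/-90 degrees (assumed mapping).
    pan = max(-1.0, min(1.0, angle_deg / 90.0))
    left_gain = (1.0 - pan) / 2.0
    right_gain = (1.0 + pan) / 2.0

    # Distance cue: overall intensity grows as the user approaches (assumed
    # mapping), with a floor so the music vibration never vanishes entirely.
    proximity = 1.0 - min(distance_m, max_distance_m) / max_distance_m
    intensity = 0.3 + 0.7 * proximity

    return left_gain * intensity * sample, right_gain * intensity * sample

# Target 45 degrees to the right, 10 m away: the right channel is stronger.
left, right = modulate_neck_channels(45.0, 10.0, 1.0)
```

Because the carrier is the music-derived vibration itself rather than an abstract beacon signal, the navigation cue and the listening experience share one stimulus, which is the core of the proposed method.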


Article
Full-text available
Music often triggers a pleasurable urge in listeners to move their bodies in response to the rhythm. In music psychology, this experience is commonly referred to as groove. This study presents the Experience of Groove Questionnaire, a newly developed self-report questionnaire that enables respondents to subjectively assess how strongly they feel an urge to move and pleasure while listening to music. The development of the questionnaire was carried out in several stages: candidate questionnaire items were generated on the basis of the groove literature, and their suitability was judged by fifteen groove and rhythm research experts. Two listening experiments were carried out in order to reduce the number of items, to validate the instrument, and to estimate its reliability. The final questionnaire consists of two scales with three items each that reliably measure respondents' urge to move (Cronbach's α = .92) and their experience of pleasure (α = .97) while listening to music. The two scales are highly correlated (r = .80), which indicates a strong association between motor and emotional responses to music. The scales of the Experience of Groove Questionnaire can independently be applied in groove research and in a variety of other research contexts in which listeners' subjective experience of music-induced movement and enjoyment needs to be addressed: for example, the study of the interaction between music and motivation in sports, and research on therapeutic applications of music in people with neurological movement disorders.
Article
Full-text available
Music is both heard and felt; tactile sensation is especially pronounced for bass frequencies. Although bass frequencies have been associated with enhanced bodily movement, time perception, and groove (the musical quality that compels movement), the underlying mechanism remains unclear. In 2 experiments, we presented high-groove music to auditory and tactile senses and examined whether tactile sensation affected body movement and ratings of enjoyment and groove. In Experiment 1, participants (N = 22) sat in a parked car and listened to music clips over sound-isolating earphones (auditory-only condition), and over earphones plus a subwoofer that stimulated the body (auditory-tactile condition). Experiment 2 (N = 18) also presented music in auditory-only and auditory-tactile conditions, but used a vibrotactile backpack to stimulate the body and included 2 loudness levels. Participants tapped their finger with each clip and rated each clip, and in Experiment 1 we additionally video-recorded spontaneous body movement. Results showed that the auditory-tactile condition yielded more forceful tapping, more spontaneous body movement, and higher ratings of groove and enjoyment. Loudness had a small, but significant, effect on ratings. In sum, findings suggest that bass felt in the body produces a multimodal auditory-tactile percept that promotes movement through the close connection between tactile and motor systems. We discuss links to embodied aesthetics and applications of tactile stimulation to boost rhythmic movement and reduce hearing damage.
Article
Full-text available
In this article, we consider music and noise in terms of vibrational and transferable energy as well as from the evolutionary significance of the hearing system of Homo sapiens. Music and sound impinge upon our body and our mind and we can react to both either positively or negatively. Much depends, in this regard, on the frequency spectrum and the level of the sound stimuli, which may sometimes make it possible to set music apart from noise. There are, however, two levels of description: the physical-acoustic description of the sound and the subjective-psychological reactions by the listeners. Starting from a vibrational approach to sound and music, we first investigate how sound may activate the sense of touch and the vestibular system of the inner ear besides the sense of hearing. We then touch upon distinct issues such as the relation between low-frequency sounds and annoyance, the harmful effect of loud sound and noise, the direct effects of overstimulation with sound, the indirect effects of unwanted sounds as related to auditory neurology, and the widespread phenomenon of liking loud sound and music, both from the point of view of behavioral and psychological aspects.
Conference Paper
Full-text available
Haptic feedback is used in cars to reduce visual inattention. While tactile feedback like vibration can be influenced by the car's movement, thermal and cutaneous push feedback should be independent of such interference. This paper presents two driving simulator studies investigating novel tactile feedback on the steering wheel for navigation. First, devices on one side of the steering wheel were warmed, indicating the turning direction, while those on the other side were cooled. This thermal feedback was compared to audio. The thermal navigation led to 94.2% correct recognitions of warnings 200 m before the turn and to 91.7% correct turns. Speech had perfect recognition for both. In the second experiment, only the destination side was indicated thermally, and this design was compared to cutaneous push feedback. The simplified thermal feedback design did not increase recognition, but cutaneous push feedback had high recognition rates (100% for 200 m warnings, 98% for turns).
Conference Paper
Full-text available
Continuous advances in personal audio technology (e.g., headphones) have led to efficient noise cancellation and allowed users to build and influence their personal acoustic environment. Despite the high adoption and ubiquitous character of the technology, we do not fully understand which particular factors influence and form usage patterns. As a step towards understanding the usage of personal audio technology, we conducted two focus groups (n = 10) to investigate current headphone usage and users' wishes regarding current and future personal audio technology. Based on this data, we derive a model for what we call personal soundscape curation. This model was assessed with the data of a crowdsourced survey on Amazon Mechanical Turk (n = 194) on state-of-the-art practices. Personal soundscape curation makes it possible to describe usage strategies (curation, adaptation, renunciation), break down influencing factors of context and environment, and illustrate which consequences may arise from users' behavior.
Conference Paper
Full-text available
With the recent advances in computing technology, more and more environments are becoming interactive. Traditionally, 2D input and output elements are used for interacting with these environments. Recently, however, interaction spaces have expanded to 3D, which has enabled new possibilities but also led to challenges in assisting users with interacting in such a 3D space. Usually, this challenge of communicating 3D positions is solved visually. This paper explores a different approach: spatial guidance through vibrotactile instructions. To this end, we introduce TactileGlove, a smart glove equipped with vibrotactile actuators for providing spatial guidance in 3D space. We contribute a user study with 15 participants to explore how different numbers of actuators and different metaphors affect user performance. We found that a Pull metaphor for vibrotactile navigation instructions was preferred by our participants, and that using a higher number of actuators reduces target acquisition time compared with using a lower number.
Chapter
Full-text available
We listen to music not only with our ears. The whole body is present in a concert hall, during a rock event, or while enjoying music reproduction at home. This chapter discusses the influence of audio-induced vibrations at the skin on musical experience. To this end, sound and body vibrations were controlled separately in several psychophysical experiments. The multimodal perception of the resulting concert quality is evaluated, and the effect of frequency, intensity, and temporal variation of the vibration signal is discussed. It is shown that vibrations play a significant role in the perception of music. Amplifying certain vibrations in a concert venue or music reproduction system can improve the music experience. Knowledge about the psychophysical similarities and differences of the auditory and tactile modalities helps to develop perceptually optimized algorithms to generate music-related vibrations. These vibrations can be reproduced, e.g., using electrodynamic exciters mounted to the floor or seat. It is discussed that frequency shifting and intensity compression are important approaches for vibration generation.
Conference Paper
Full-text available
As humans, we have the natural capability of localizing the origin of sounds. Spatial audio rendering leverages this skill by applying special filters to recorded audio to create the impression that a sound emanates from a certain position in the physical space. A main application for spatial audio on mobile devices is to provide non-visual navigation cues. Current systems require users to either listen to artificial beacon sounds, or the entire audio source (e.g., a song) is re-positioned in space, which impacts the listening experience. We present NavigaTone, a system that takes advantage of multi-track recordings and provides directional cues by moving a single track in the auditory space. While minimizing the impact of the navigation component on the listening experience, a user study showed that participants could localize sources as well as with stereo panning, while the listening experience was rated to be closer to common music listening.
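NavigaTone's core idea, panning a single stem rather than the whole mix, can be approximated with a constant-power stereo pan. The sketch below is a simplification for illustration: the actual system uses full spatial-audio rendering, and the function names and the bearing-to-pan mapping here are assumptions.

```python
import math

def pan_track(track_samples, bearing_deg):
    """Constant-power stereo pan of a single mono stem toward a target bearing.

    bearing_deg in [-90, 90]; -90 = hard left, +90 = hard right.
    Returns a list of (left, right) sample pairs.
    """
    theta = (bearing_deg / 90.0 + 1.0) * math.pi / 4.0  # map bearing to [0, pi/2]
    gl, gr = math.cos(theta), math.sin(theta)           # gl^2 + gr^2 == 1
    return [(gl * s, gr * s) for s in track_samples]

def mix(stems_lr):
    """Sum several per-stem (left, right) sample streams into one stereo stream."""
    return [tuple(map(sum, zip(*frames))) for frames in zip(*stems_lr)]

# Pan only the vocals toward a target 30 degrees to the right; the other
# stems stay centered, so the overall mix is barely disturbed.
vocals = pan_track([0.5, 0.4], 30.0)
drums = [(0.2, 0.2), (0.1, 0.1)]  # already-centered stereo stem
stereo = mix([vocals, drums])
```

The constant-power law keeps perceived loudness stable as the stem moves, which is why it is the usual choice over a plain linear pan.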
Article
Full-text available
This work presents a cutaneous haptic device able to provide navigation cues to the forearm through lateral skin stretch haptic feedback. Four cylindrical rotating end effectors, placed on the forearm of the human user, can generate independent skin stretches at the palmar, dorsal, ulnar, and radial sides of the arm. When all the end effectors rotate in the same direction, the cutaneous device is able to provide cutaneous cues about a desired pronation/supination of the forearm. When two opposite end effectors rotate in different directions, the cutaneous device is able to provide cutaneous cues about a desired translation of the forearm. To evaluate its effectiveness in providing navigation information, we carried out two experiments of haptic navigation. In the first experiment, subjects were asked to translate and rotate the forearm toward a target position and orientation, respectively. In the second experiment, subjects were asked to control the motion and orientation of a 6-DoF robotic manipulator to grasp and lift a target object. Our haptic device improved the performance of both tasks compared with providing no haptic feedback. Moreover, it showed performance similar to sensory substitution via visual feedback, without overloading the visual channel.
Article
Full-text available
Movement detection for a virtual sound source was measured during the listener's horizontal head rotation. Listeners were instructed to rotate their heads at a given speed. A trial consisted of two intervals. During an interval, a virtual sound source was presented 60° to the right or left of the listener, who was instructed to rotate the head to face the sound image position. In one of the pair of intervals, the sound position was moved slightly in the middle of the rotation. Listeners were asked to judge in which interval of a trial the sound stimuli moved. Results suggest that detection thresholds are higher when listeners rotate their heads. Moreover, this effect was found to be independent of the rotation velocity.
Conference Paper
Full-text available
We propose a new vibroacoustic device that consists of a string and two motors, called a wearable tension-based vibroacoustic device (WTV). To demonstrate the superior performance of the WTV over conventional wearable devices, which contain vibrators, we conducted two experiments. First, we measured the amplitudes of vibration of the skin while subjects wore the WTV and Haptuators. We found that the WTV is better than Haptuators at transmitting low-frequency waves over a wide range throughout the body. Second, we examined subjective evaluations of acoustic vibration for both devices. Almost all participants considered the WTV to be a better option as a vibroacoustic device. We thus conclude that the WTV is a good option for applications requiring high-quality and strong stimuli, such as listening to music and virtual-reality gaming.
Article
Full-text available
Shape-changing interfaces are a category of device capable of altering their form in order to facilitate communication of information. In this work we present a shape-changing device that has been designed for navigation assistance. 'The Animotus' (previously 'The Haptic Sandwich') resembles a cube with an articulated upper half that is able to rotate and extend (translate) relative to the bottom half, which is fixed in the user's grasp. This rotation and extension, generally felt via the user's fingers, is used to represent heading and proximity to navigational targets. The device is intended to provide an alternative to screen- or audio-based interfaces for visually impaired, hearing impaired, deafblind, and sighted pedestrians. The motivation and design of the haptic device is presented, followed by the results of a navigation experiment that aimed to determine the role of each device DOF in terms of facilitating guidance. An additional device, 'The Haptic Taco', which modulated its volume in response to target proximity (negating directional feedback), was also compared. Results indicate that while the heading (rotational) DOF benefited motion efficiency, the proximity (translational) DOF benefited velocity. Combination of the two DOF improved overall performance. The volumetric Taco performed comparably to the Animotus' extension DOF.
Article
Full-text available
Music listening and navigation are both common tasks for mobile device users. In this study, we integrated music listening with a navigation service, allowing users to follow the perceived direction of the music to reach their destination. This navigation interface provided users with two different guidance methods: route guidance and beacon guidance. The user experience of the navigation service was evaluated with pedestrians in a city center and with cyclists in a suburban area. The results show that spatialized music can be used to guide pedestrians and cyclists toward a destination without any prior training, offering a pleasant navigation experience. Both route and beacon guidance were deemed good alternatives, but the preference between them varied from person to person and depended on the situation. Beacon guidance was generally considered to be suitable for familiar surroundings, while route guidance was seen as a better alternative for areas that are unfamiliar or more difficult to navigate.
Conference Paper
Full-text available
In this paper, psychophysical aspects of a vibrotactile feedback device were investigated and its potential for signal modulation was analyzed. We identified magnitude calibration factors for equal perception across the different stimulation locations of the device and determined the spatial acuity with which the user is able to correctly detect the stimulation's location. Furthermore, we investigated different approaches of vibrotactile stimulation for communicating direction and distance information to the human arm (motion guidance) and also explored different approaches of signal modulation for the transmission of additional information content. Knowledge of these vibrotactile perception aspects is a fundamental requirement for the design and application-oriented optimization of vibrotactile stimulation patterns.
Conference Paper
Full-text available
Audio guides are a common way to provide museum visitors with an opportunity for personalized, self-paced information retrieval. However, this personalization conflicts with some of the reasons many people go to museums, i.e., to socialize, to be with friends, and to discuss the exhibit as they experience it [1]. We developed an interactive museum experience based on audio augmented reality that lets the visitor interact with a virtual spatial audio soundscape. In this paper, we present some new interaction metaphors we use in the design of this audio space, as well as some techniques to generate a group experience within audio spaces.
Conference Paper
Full-text available
We describe a virtual “tether” for mobile devices that allows groups to have quick, simple and privacy-preserving meetups. Our design provides cues which allow dynamic coordination of rendezvous without revealing users’ positions. Using accelerometers and magnetometers, combined with GPS positioning and non-visual feedback, users can probe and sense a dynamic virtual object representing the nearest meeting point. The Social Gravity system makes social bonds tangible in a virtual world which is geographically grounded, using haptic feedback to help users rendezvous. We show dynamic navigation using this physical model-based system to be efficient and robust in significant field trials, even in the presence of low-quality positioning. The use of simulators to build models of mobile geolocated systems for pre-validation purposes is discussed, and results compared with those from our trials. Our results show interesting behaviours in the social coordination task, which lead to guidelines for geosocial interaction design. The Social Gravity system proved to be very successful in allowing groups to rendezvous efficiently and simply and can be implemented using only commercially available hardware.
Conference Paper
Full-text available
This paper describes the initial results from a study looking at a two-handed interaction paradigm for tactile navigation for blind and visually impaired users. Participants were set the task of navigating a virtual maze environment using their dominant hand to move the cursor, while receiving contextual information in the form of tactile cues presented to their non-dominant hand. Results suggest that most participants were comfortable with the two-handed style of interaction even with little training. Two sets of contextual cues were examined with information presented through static patterns or tactile flow of raised pins. The initial results of this study suggest that while both sets of cues were usable, participants performed significantly better and faster with the static cues.
Conference Paper
Full-text available
We combine the functionality of a mobile Global Positioning System (GPS) with that of an MP3 player, implemented on a PocketPC, to produce a handheld system capable of guiding a user to their desired target location via continuously adapted music feedback. We illustrate how the approach to presentation of the audio display can benefit from insights from control theory, such as predictive 'browsing' elements to the display, and the appropriate representation of uncertainty or ambiguity in the display. The probabilistic interpretation of the navigation task can be generalised to other context-dependent mobile applications. This is the first example of a completely handheld location-aware music player. We discuss scenarios for use of such systems.
Conference Paper
Full-text available
This paper describes an investigation into how haptic output can be used to deliver guidance to pedestrians, who do not have any particular disability, to find their way to a particular destination indoors, e.g., a room in a hospital. A prototype device called GentleGuide was designed iteratively, resolving several design issues for the use of haptic output. GentleGuide has been assessed experimentally. Our conclusion is that haptic output offers significant promise both in improving performance and in reducing the disruptiveness of technology. A negative aspect of exclusively relying on a device like GentleGuide is the reduced location and orientation awareness by some participants.
Conference Paper
Full-text available
This multiple-phase research examines the utility of the thigh as a placement for a vibrotactile display in the cockpit. The initial phase of this research is presented here. Vibrotactile displays designed to convey horizontal directional waypoints or warnings are commonly situated on the torso of the pilot. Here, an eight-tactor belt prototype fixed around the thigh of a seated operator was used to convey vertical directional waypoints. Localization accuracy was examined. Analysis revealed that vibrotactile cues embracing the thigh are discriminated in a similar manner to the torso, providing initial evidence that vibrotactile signaling on the thigh can provide directional cues in the vertical plane.
Article
Full-text available
A path-following experiment, using a global positioning system, was conducted with participants who were legally blind. On- and off-course confirmations were delivered by either a vibrotactile or an audio stimulus. These simple binary cues were sufficient for guidance and point to the need to offer output options for guidance systems for people who are visually impaired.
Article
Full-text available
In this paper we consider a prototype audio user interface for a Global Positioning System (GPS) that is designed to allow mobile computer users to carry out a location task while their eyes, hands and attention are often otherwise engaged. Audio user interfaces for GPS have typically been designed to meet the needs of visually handicapped users, and generally (though not exclusively) employ speech-audio. In this paper, we consider a prototype audio GPS user interface designed primarily for sighted mobile computer users who may have to attend simultaneously to other tasks, and who may be holding conversations at the same time. The system is considered in the context of being one component of a user interface for mobile computer users. The prototype system uses a simple form of spatial audio. Various candidate audio mappings of location and distance information are analysed. A variety of tasks, design considerations, technological opportunities and design trade-offs are considered. Preliminary findings are reported. Opportunities for improvements to the system, and future empirical testing are explored.
Article
Full-text available
Auditory and vibrotactile stimuli share similar temporal patterns. A psychophysical experiment was performed to test whether this similarity would lead into an intermodal bias in perception of sound intensity. Nine normal-hearing subjects performed a loudness-matching task of faint tones, adjusting the probe tone to sound equally loud as a reference tone. The task was performed both when the subjects were touching and when they were not touching a tube that vibrated simultaneously with the probe tone. The subjects chose on average 12% lower intensities (p < 0.01) for the probe tone when they touched the tube, suggesting facilitatory interaction between auditory and tactile senses in normal-hearing subjects.
Article
Virtual Reality (VR) has a great potential to improve skills of Deaf and Hard-of-Hearing (DHH) people. Most VR applications and devices are designed for persons without hearing problems. Therefore, DHH persons have many limitations when using VR. Adding special features in a VR environment, such as subtitles, or haptic devices will help them. Previously, it was necessary to design a special VR environment for DHH persons. We introduce and evaluate a new prototype called "EarVR" that can be mounted on any desktop or mobile VR Head-Mounted Display (HMD). EarVR analyzes 3D sounds in a VR environment and locates the direction of the sound source that is closest to a user. It notifies the user about the sound direction using two vibro-motors placed on the user's ears. EarVR helps DHH persons to complete sound-based VR tasks in any VR application with 3D audio and a mute option for background music. Therefore, DHH persons can use all VR applications with 3D audio, not only those applications designed for them. Our user study shows that DHH participants were able to complete a simple VR task significantly faster with EarVR than without. The completion time of DHH participants was very close to participants without hearing problems. Also, it shows that DHH participants were able to finish a complex VR task with EarVR, while without it, they could not finish the task even once. Finally, our qualitative and quantitative evaluation among DHH participants indicates that they preferred to use EarVR and it encouraged them to use VR technology more.
Conference Paper
In this poster, we propose a new haptic rendering algorithm that dynamically modulates wave parameters to convey distance, direction, and object type by utilizing neck perception and the Hapbeat-Duo, a haptic device composed of two actuators linked by a neck strap. This method is useful for various VR use cases because it provides feedback without disturbing users' movement. In our experiment, we presented haptic feedback of sine waves which were dynamically modulated according to direction and distance between a player and a target. These waves were presented to both sides of the users' necks independently. As a result, players could reach invisible targets and immediately know they had reached the targets. The proposed algorithm allows the neck to become as important a receptive part of body as eyes, ears, and hands.
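One plausible reading of this scheme can be sketched as follows. The poster does not specify its parameter mappings, so everything below is a hypothetical example: amplitude encodes direction by shifting energy between the two actuators, and frequency rises as the target gets closer.

```python
import math

def neck_sine_feedback(t, angle_deg, distance_m, base_freq_hz=80.0):
    """Return one (left, right) sample pair of a modulated sine vibration.

    Hypothetical sketch of a Hapbeat-Duo-style scheme: amplitude ratio between
    the two neck-side actuators encodes direction, and the sine frequency
    increases as distance shrinks (assumed mappings, not from the poster).
    """
    # Direction: linear amplitude pan between the two actuators.
    pan = max(-1.0, min(1.0, angle_deg / 90.0))
    left_amp = (1.0 - pan) / 2.0
    right_amp = (1.0 + pan) / 2.0

    # Distance: frequency rises smoothly as the target gets closer.
    freq = base_freq_hz * (1.0 + 1.0 / (1.0 + distance_m))

    s = math.sin(2.0 * math.pi * freq * t)
    return left_amp * s, right_amp * s
```

A distinct pattern (for example, a brief pulse train) could then replace the continuous sine on arrival, matching the poster's observation that players immediately knew when they had reached a target.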
Conference Paper
Resonance Audio is an open source project designed for creating and controlling dynamic spatial sound in Virtual & Augmented Reality (VR/AR), gaming or video experiences. It also provides integrations with popular game development platforms and digital audio workstations (as a preview plugin). Resonance Audio binaural decoder is used in YouTube to provide binaural rendering of 360/VR videos. This paper describes the core sound spatialization algorithms used in Resonance Audio and can be treated as a companion to the Resonance Audio C++ / MATLAB library source code.
Conference Paper
In this paper, we present a Haptic Collar prototype, a neck-worn band with vibrotactile actuators for eyes-free haptic navigation. We evaluate the system for guidance applications with 11 users, analyzing 4 different tactile patterns regarding comfort and ease of understanding, as well as the number of actuators (4, 6, and 8) used to encode 8 directions. Overall, users can recognize the directional signs well (up to 95% recognition rates over 528 triggers). We also present a use case applying our prototype in a haptic navigation walk.
Conference Paper
While a great deal of work has been done exploring non-visual navigation interfaces using audio and haptic cues, little is known about the combination of the two. We investigate combining different state-of-the-art interfaces for communicating direction and distance information using vibrotactile and audio music cues, limiting ourselves to interfaces that are possible with current off-the-shelf smartphones. We use experimental logs, subjective task load questionnaires, and user comments to see how users' perceived performance, objective performance, and acceptance of the system varied for different combinations. Users' perceived performance did not differ much between the unimodal and multimodal interfaces, but a few users commented that the multimodal interfaces added some cognitive load. Objective performance showed that some multimodal combinations resulted in significantly less direction or distance error over some of the unimodal ones, especially the purely haptic interface. Based on these findings we propose a few design considerations for multimodal haptic/audio navigation interfaces.
Article
Sound and vibrations are often perceived via the auditory and tactile senses simultaneously, e.g., in a car or train. During a rock concert, the body vibrates with the rhythm of the music. Even in a concert hall or a church, sound can excite vibrations in the ground or seats. These vibrations might not be perceived separately because they integrate with the other sensory modalities into one multi-modal perception. This paper discusses the relation between sound and vibration for frequencies up to 1 kHz in an opera house and a church. Therefore, the transfer function between sound pressure and acceleration was measured at different exemplary listening positions. A dodecahedron loudspeaker on the stage was used as a sound source. Accelerometers on the ground, seat and arm rest measured the resulting vibrations. It was found that vibrations were excited over a broad frequency range via airborne sound. The transfer function was measured using various sound pressure levels. Thereby, no dependence on level was found. The acceleration level at the seat corresponds approximately to the sound pressure level and is independent of the receiver position. Stronger differences were measured for vibrations on the ground.
Article
The body surface vibration induced by low-frequency noise (noise-induced vibration) was measured at the forehead, the anterior chest, and the anterior abdomen. At all measuring locations, the step increases in the vibration acceleration levels of the noise-induced vibrations were in good agreement with the step increases in the sound pressure levels of the noise stimuli. The vibration acceleration level measured at the forehead was found to increase suddenly at around 31.5-40 Hz, while the acceleration levels measured at the chest and abdomen increased with frequency at approximately constant rates in the 20-50 Hz range. Our results showed no clear evidence of an effect of posture or of bilateral asymmetry in the noise-induced vibration. We found that the noise-induced vibrations measured at the chest and abdomen were negatively correlated with body fat percentage.
Article
Normal subjects were given two auditory tests, one consisting of spoken digits presented dichotically, the other of melodies presented dichotically. On the Digits test, the score for the right ear was higher than for the left (as previously established), and on the Melodies test the score for the left ear was higher than for the right. These findings were related to the different roles of the right and left hemispheres of the brain in verbal and nonverbal perception.
Conference Paper
When visually impaired pedestrians walk from one place to another by themselves, they must update their orientation and position to find their way and avoid obstacles and hazards. We present the design of a new haptic direction indicator, whose purpose is to help blind pedestrians travel a path and avoid hazards intuitively and safely by means of haptic navigation. The haptic direction indicator uses a novel kinesthetic perception method called the "pseudo-attraction force" technique, which exploits the nonlinear relationship between perceived and physical acceleration to generate a force sensation. In an experiment performed to evaluate the haptic direction indicator, we found that visually impaired users could safely walk along a predefined route at their usual walking pace, independent of the existence of auditory information. These results demonstrate the utility and usability of the haptic direction indicator, although there is still room for improvement.
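The paper's exact waveform is not reproduced here, but the core idea behind an asymmetric acceleration profile — a brief strong pulse in the "pull" direction followed by a long weak return, so the device does not drift while the perceived force is directional — can be sketched as follows (illustrative parameters, not the authors' implementation):

```python
import numpy as np

def asymmetric_pulse(fs=1000, period=0.1, duty=0.2, amp=1.0):
    """One cycle of an asymmetric acceleration profile: a short, strong pulse
    one way and a long, weak pulse back, so the mean acceleration over the
    cycle is zero while the peak is much larger in the 'pull' direction."""
    n = int(fs * period)
    n_fast = int(n * duty)
    n_slow = n - n_fast
    fast = np.full(n_fast, amp)                      # brief strong phase
    slow = np.full(n_slow, -amp * n_fast / n_slow)   # sustained weak return
    return np.concatenate([fast, slow])

pulse = asymmetric_pulse()
```

Because perceived acceleration grows nonlinearly with physical acceleration, the strong brief phase dominates perception even though the cycle integrates to zero, which is what produces the illusory attraction force.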
Conference Paper
This study is based on a user scenario where augmented reality targets can be found by scanning the environment with a mobile device and receiving tactile feedback exactly in the direction of the target. To understand how accurately and quickly the targets can be found, we prepared an experimental setup in which a sensor-actuator device consisting of orientation-tracking hardware and a tactile actuator was used. Targets with widths of 5°, 10°, 15°, 20°, and 25° and various distances between each other were rendered successively in a 90°-wide space, and the task of the test participants was to find them as quickly as possible. The experiment consisted of two conditions: the first provided tactile feedback only when pointing was on the target, and the second added another cue indicating the proximity of the target. The average target-finding time was 1.8 seconds. The closest targets turned out not to be the easiest to find, which was attributed to the adapted scanning velocity causing participants to miss the closest targets. We also found that our data did not correlate well with Fitts' model, which may have been caused by the non-normal data distribution. After filtering out 30% of the least representative data items, the correlation reached up to 0.71. Overall, performance did not differ significantly between the conditions. The only significant improvement in performance offered by the close-to-target cue occurred in the tasks where the targets were the furthest from each other.
Conference Paper
In this study, we propose a method of presenting haptic navigation cues that users do not need to consciously attend to. People consult a map when they visit an unfamiliar place, but taking their eyes off the map diverts their attention and costs them their peace of mind. When we are relaxed and not worried about getting lost, we can discover the intrinsic beauty of an unfamiliar place. On the streets, the sense of touch already provides a feeling of security through supports such as walls and handrails. We therefore propose a haptic navigation system that frees the eyes from constantly checking a map, in order to enhance the experience of a daily walk or sightseeing.
Article
The implementation of haptic interfaces in vehicles has important safety and flexibility implications for lessening visual and auditory overload during driving. The present study aims to design and evaluate haptic interfaces with vehicle seats. We conducted three experiments by testing a haptic seat in a simulator with a total of 20 participants. The first experiment measured reaction time, subjective satisfaction, and subjective workloads of the haptic, visual, and auditory displays for the four signals primarily used by vehicle navigation systems. The second experiment measured reaction time, subjective satisfaction, and subjective workloads of the haptic, auditory, and multimodal (haptic + auditory) displays for the ringing signal used by in-vehicle Bluetooth hands-free systems. The third experiment measured drivers' subjective awareness, urgency, usefulness, and disturbance levels at various vibration intensities and positions for a haptic warning signal used by a driver drowsiness warning system. The results indicated that haptic seat interfaces performed better than visual and auditory interfaces, but the unfamiliarity of the haptic interface caused lower subjective satisfaction for some criteria. Generally, participants showed high subjective satisfaction levels and low subjective workloads toward haptic seat interfaces. This study provided guidance for implementing haptic seat interfaces and identified the possible benefits of their use. We expect to improve safety and the interaction between driver and vehicle through haptic seats implemented in vehicles.
Article
Presenting waypoint navigation on a visual display is not suited to all situations. The present experiments investigate whether it is feasible to present the navigation information on a tactile display. An important design issue for such a display is how direction and distance information should be coded; important usability issues are the resolution of the display and its usefulness in vibrating environments. In a pilot study with 12 pedestrians, different distance-coding schemes were compared. The schemes translated distance into vibration rhythm while direction was translated into vibration location. The display consisted of eight tactors around the user's waist. The results show that mapping waypoint direction onto the location of vibration is an effective coding scheme that requires no training, but that coding for distance does not improve performance compared with a control condition with no distance information. In Experiment 2, the usefulness of the tactile display was shown in two case studies with a helicopter and a fast boat.
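The direction-to-location and distance-to-rhythm mappings described above can be sketched roughly as follows. The tactor count matches the paper's eight-tactor belt; all other names and values are our own assumptions:

```python
N_TACTORS = 8  # evenly spaced around the waist; tactor 0 at the front

def direction_to_tactor(bearing_deg):
    """Map a waypoint bearing (degrees clockwise from straight ahead)
    to the index of the nearest tactor."""
    step = 360.0 / N_TACTORS
    return round((bearing_deg % 360.0) / step) % N_TACTORS

def distance_to_interval(distance_m, near_s=0.2, far_s=1.5, max_dist_m=100.0):
    """Map distance to a pulse interval in seconds: the closer the waypoint,
    the faster the vibration rhythm (one of many possible coding schemes)."""
    frac = min(distance_m, max_dist_m) / max_dist_m
    return near_s + frac * (far_s - near_s)
```

The study's finding is notable here: the location mapping needed no training, while rhythm-based distance coding along these lines did not measurably improve performance.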
Article
Listening to music on personal, digital devices whilst mobile is an enjoyable, everyday activity. We explore a scheme for exploiting this practice to immerse listeners in navigation cues. Our prototype, ONTRACK, continuously adapts audio, modifying the spatial balance and volume to lead listeners to their target destination. First we report on an initial lab-based evaluation that demonstrated the approach’s efficacy: users were able to complete tasks within a reasonable time and their subjective feedback was positive. Encouraged by these results we constructed a handheld prototype. Here, we discuss this implementation and the results of field-trials. These indicate that even with a low-fidelity realisation of the concept, users can quite effectively navigate complicated routes.
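An ONTRACK-style adaptation of spatial balance and volume from the target bearing might look like the following sketch. This is written under our own assumptions (equal-power panning, cosine volume falloff), not the authors' implementation:

```python
import math

def adapt_audio(bearing_deg):
    """Pan the music toward the target and lower the volume as the listener
    turns away from it. bearing_deg is the target bearing relative to the
    listener's heading, in degrees (positive = target to the right).
    Returns (left_gain, right_gain), each in [0, 1]."""
    clamped = max(-90.0, min(90.0, bearing_deg))   # pan only in the frontal plane
    pan = math.sin(math.radians(clamped))          # -1 = full left, +1 = full right
    volume = 0.5 + 0.5 * math.cos(math.radians(bearing_deg))  # quieter off-course
    left = volume * math.sqrt((1.0 - pan) / 2.0)   # equal-power panning law
    right = volume * math.sqrt((1.0 + pan) / 2.0)
    return left, right
```

Facing the target yields balanced, full-volume playback; turning away shifts the balance toward the target side and attenuates the music, so the listener is steered without any added audio cues.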
S. Yamano, T. Hamajo, S. Takahashi, and K. Higuchi, "Eyesound: single-modal mobile navigation using directionally annotated music," in Proceedings of the 3rd Augmented Human International Conference, 2012, pp. 1-4.
M. Daub and M. E. Altinsoy, "Audiotactile simultaneity perception of musical-produced whole-body vibrations," in Proceedings of the Joint Congress CFA/DAGA, 2004.
--, "Implementation of tension-based compact necklace-type haptic device achieving widespread transmission of low-frequency vibrations," IEEE Transactions on Haptics, 2022.
ITU-R, "Algorithms to measure audio programme loudness and true-peak audio level," Recommendation ITU-R BS.1770, 2011.
S. Merchel, A. Leppin, and E. Altinsoy, "Hearing with your body: the influence of whole-body vibrations on loudness perception," in Proceedings of the 16th International Congress on Sound and Vibration (ICSV), Kraków, Poland, vol. 4, 2009.