
Aleksander Väljamäe, Ph.D.
University of Tartu · Johan Skytte Institute of Political Studies
About
69 Publications
24,367 Reads
1,720 Citations (since 2017)
Additional affiliations
January 2016 - present
March 2015 - December 2015
February 2014 - February 2015
Publications (69)
Editorial to a Frontiers' Research Topic. The Research Topic comprises 11 accepted papers: seven original research articles, a perspective, a mini review, and two opinion pieces, covering various themes and perspectives. These contributions address the multi-faceted nature of non-clinical BCIs, ranging from ethical ramifications...
Technological innovations like physiological computing offer new possibilities for exploring audience-performer interaction. To avoid the technological solutionism that often accompanies biosensor applications in the performing arts, an artistic interventions approach was used. This paper describes a recent art-science residency consisting of three artisti...
Recent science funding initiatives have enabled participants from a diverse array of disciplines to engage in common spaces to develop solutions for new wearables. These initiatives include collaborations between the arts and sciences, fields which have traditionally contributed very different forms of knowledge, methodology, and results. Howev...
Technologies are rapidly changing our perception of reality, moving from augmented to virtual to magical. While e-textiles are a key component in exergame or space suits, the transformative potential of the internal side of garments to create embodied experiences remains largely unexplored. This paper is the result of an art-science collaborative...
People, through their bodily actions, engage in sensorimotor loops that connect them to the world and to their own bodies. People's brains integrate the incoming sensory information to form mental representations of their body appearance and capabilities. Technology provides exceptional opportunities to tweak sensorimotor loops and provide people w...
Magic Lining draws attention to the unused internal side of garments, treating textile as a space to alter people's self-perception toward more positive behavior. It builds on the existing characteristics of textiles, fashion, and garment design, and combines the design process with the scientific insights of the MAGICSHOES project. The research question gui...
D2.5: Report on the art-science collaboration experiences during the preparation of the BrainDance performance. Deliverable 2.5 of the BrainHack project, funded under the European Union's Horizon 2020 research and innovation programme (GA No. 686987).
Our body can be seen as an anchor that tightly connects us with the surrounding physical world. Concepts of body-centred interaction have been successfully applied in virtual and augmented reality, although mainly in the visual domain. Still, we interact with the environment through our body, and these interactions almost always produce sounds. The...
The main goal of the BrainHack project is to engage the international artistic community experimenting with Brain Neural Computer Interaction (BNCI) technologies and link it to the BNCI scientific community. BrainHack explores the hackathon format to enhance experimentation with non-clinical and artistic uses of BNCI. Here we briefly summarize the...
In this paper we evaluate b-Reactable, a digital music instrument that combines implicit physiology-based interaction through EEG and ECG, and explicit gestural interaction for sound generation and control. This multimodality is embodied in tangible objects named physiopucks, which are driven by biosignals. We hypothesize that multimodality increas...
Our previous research showed that vertical vection can modulate human mood. We further examined this possibility using a memory recognition task with positive, negative, and neutral emotional images of high and low arousal levels. The images were memorized incidentally while participants performed a visual dummy task, and were later presented together...
This paper presents a first step in the development of a methodology to compare the ability of different sonifications to convey the fine temporal detail of the electroencephalography (EEG) brainwave signal in real time. In EEG neurofeedback a person's EEG activity is monitored and presented back to them, to help them to learn how to modify their...
In order to survive in a complex environment, inhabited by potentially threatening and noxious objects or living beings, we need to constantly monitor our surrounding space, especially in the vicinity of our body. Such a space has been commonly referred to as one's 'peripersonal space' (PPS). In this study we investigated whether emotion-inducing a...
In the absence of other congruent multisensory motion cues, sound contribution to illusions of self-motion (vection) is relatively weak and often attributed to purely cognitive, top-down processes. The present study addressed the influence of cognitive and perceptual factors in the experience of circular, yaw auditorily-induced vection (AIV), focus...
In the present study, we investigated how the electrical activity in the sensorimotor cortex contributes to improved cognitive processing capabilities and how SMR (sensorimotor rhythm, 12-15 Hz) neurofeedback training modulates it. Previous evidence indicates that higher levels of SMR activity reduce sensorimotor interference and thereby promote cog...
The field of physiology-based interaction and monitoring is developing at a fast pace. Emerging applications like fatigue monitoring often use sound to convey complex dynamics of biological signals and to provide an alternative, non-visual information channel. Most Physiology-to-Sound mappings in such auditory displays do not allow customization by...
Over the last few decades there has been steady growth in research that addresses the real-time sonification of electroencephalographic (EEG) data. Diverse application areas include medical data screening, Brain Computer Interfaces (BCI), neurofeedback, affective computing and applications in the arts. The present paper provides an overview and cri...
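To make concrete the kind of parameter-mapping sonification this literature surveys, below is a minimal Python sketch, purely illustrative and not the method of any reviewed paper: the alpha-band (8-12 Hz) power of a synthetic EEG trace is mapped to the pitch of a sine oscillator. The sampling rates, band choice, and pitch range are all assumptions.

```python
# Toy parameter-mapping sonification: EEG alpha power -> oscillator pitch.
# All parameters are illustrative assumptions.
import numpy as np

FS_EEG = 256        # assumed EEG sampling rate (Hz)
FS_AUDIO = 22050    # audio sampling rate (Hz)
WIN = FS_EEG        # 1 s analysis window

# Synthetic stand-in for a recorded EEG channel: noise plus a 10 Hz burst.
t = np.arange(30 * FS_EEG) / FS_EEG
eeg = np.random.randn(t.size) + (t > 15) * 2 * np.sin(2 * np.pi * 10 * t)

audio, phase = [], 0.0
for start in range(0, eeg.size - WIN, WIN):
    seg = eeg[start:start + WIN]
    spec = np.abs(np.fft.rfft(seg * np.hanning(WIN))) ** 2
    freqs = np.fft.rfftfreq(WIN, 1 / FS_EEG)
    alpha = spec[(freqs >= 8) & (freqs <= 12)].mean()   # alpha-band power
    ratio = alpha / (10 * spec.mean() + 1e-12)          # crude normalization
    pitch = 220 + 660 * np.clip(ratio, 0, 1)            # map to 220-880 Hz
    ph = phase + 2 * np.pi * pitch * np.arange(FS_AUDIO) / FS_AUDIO
    audio.append(0.3 * np.sin(ph))                      # 1 s tone per window
    phase = ph[-1] % (2 * np.pi)                        # keep phase continuous

audio = np.concatenate(audio)   # write out with e.g. scipy.io.wavfile.write
```

A real-time system would replace the offline loop with a streaming buffer and typically use richer mappings (timbre, spatialization) than this single pitch parameter.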
In the present study we implemented a real-time feedback system based on multichannel near-infrared spectroscopy (NIRS). Prior studies indicated that NIRS-based neurofeedback can enhance motor imagery related cortical activation. To specify these prior results and to confirm the efficacy of NIRS based neurofeedback training, we examined changes in...
We outline general theoretical and practical implications of what we promote as enactive cinema for the neuroscientific study of online socio-emotional interaction. In a real-time functional magnetic resonance imaging (rt-fMRI) setting, participants are immersed in cinematic experiences that simulate social situations. While viewing, their physiolo...
Almost every bodily movement, from the most complex to the most mundane, such as walking, can generate impact sounds that contain 360° spatial information of high temporal resolution. Given the strong connection of auditory cues to our body actions, and the dependency of body-awareness on the interaction between peripheral sensory inputs and mental...
During the past decade, brain–computer interfaces (BCIs) have rapidly developed, both in technological and application domains. However, most of these interfaces rely on the visual modality. Only some research groups have been studying non-visual BCIs, primarily based on auditory and, sometimes, on somatosensory signals. These non-visual BCI approa...
Almost every bodily movement, from the most complex to the most mundane, such as walking, can generate impact sounds that contain spatial information of high temporal resolution. Despite the conclusive evidence about the role that the integration of vision, touch and proprioception plays in updating body-representations, hardly any study has looked...
Physiological Computing has been applied in different disciplines, and is becoming popular and widespread in Human-Computer Interaction, due to device miniaturization and improvements in real-time processing. However, most of the studies on physiology-based interfaces focus on single-user systems, while their use in Computer-Supported Collaborativ...
Hearing the blare of an ambulance siren often impels us to trace the location of the emergency vehicle with our gaze so we can quickly decide which way to pull the car over. In doing so, we must combine motion information from the somewhat imprecise but omnidirectional auditory system with the far more precise, albeit spatially bounded, visual syste...
Humans have the ability to use a complex code of non-verbal behavior to communicate their internal states to others. Conversely, the understanding of intentions and emotions of others is a fundamental aspect of human social interaction. In the study presented here we investigate how people perceive the expression of emotional states based on the ob...
With the advance of novel brain imaging technology more correlations between complex human properties and the neuronal substrate can be assessed. However, thus far, not many well-validated paradigms exist that would allow for a systematic and quantitative exploration of these phenomena. For instance, despite the rapid technological advances in the...
The eXperience Induction Machine (XIM) is one of the most advanced mixed-reality spaces available today. XIM is an immersive space that consists of physical sensors and effectors and which is conceptualized as a general-purpose infrastructure for research in the field of psychology and human–artifact interaction. In this chapter, we set out the epi...
Presence, the “perceptual illusion of non-mediation,” is often a central goal in mediated and mixed environments, and sound is believed to be crucial for inducing high-presence experiences. This chapter provides a review of the state of the art within presence research related to auditory environments. Various sound parameters such as externalizati...
The perceived location of events occurring in a mediated environment modulates the users' understanding and involvement in these events. Previous research has shown that when spatially discrepant information is available at various sensory channels, the perceived location of unisensory events might be altered. Tactile "capture" of audition has been...
When people hear a sound (a "sound object" or a "sound event") the perceived auditory space around them might modulate their emotional responses to it. Spaces can affect both the acoustic properties of the sound event itself and may also impose boundaries to the actions one can take with respect to this event. Virtual acoustic rooms of different si...
Research has shown the existence of perceptual and neural biases toward sounds perceived as approaching versus receding from a listener. It has been suggested that a greater biological salience of approaching auditory sources may account for these effects. In addition, these effects may hold only for those sources critical for our survival. In the...
The aim of this paper is to provide a first review of studies related to auditorily-induced self-motion (vection). These studies have been scarce and scattered over the years and over several research communities including clinical audiology, multisensory perception of self-motion and its neural correlates, ergonomics, and virtual reality. The revi...
Although the architecture of mixed reality spaces is becoming increasingly complex, our understanding of human behavior in such spaces is still limited. Despite the sophisticated methods deployed in ethology and behavioral biology to track and analyze the actions and movements of animals, we rarely find studies that focus on the under...
Virtual and mixed reality environments (VMRE) often imply full-body human-computer interaction scenarios. We used a public multimodal mixed reality installation, the Synthetic Oracle, and a between-groups design to study the effects of implicit (e.g., passively walking) or explicit (e.g., pointing) interaction modes on the users' emotional and enga...
The increasing number of signal processing tools for highly parallel neurophysiological recordings opens up new avenues for connecting technologies directly to neuronal processes. As this understanding takes better shape, much more work remains to be done. A simple brain-machine interface may be able to reestablish the broken loop of the p...
While rotating visual and auditory stimuli have long been known to elicit self-motion illusions (“circular vection”), audiovisual interactions have hardly been investigated. Here, two experiments investigated whether visually induced circular vection can be enhanced by concurrently rotating auditory cues that match visual landmarks (e.g., a fountai...
The design of motion simulators traditionally relies on low-level perceptual cues for inducing the illusion of self-motion (vection). The current study examined cognitive influences of purely auditory or auditory-vibrotactile-induced circular vection. The ecological categorization of sounds used for the synthesis of rotating acoustic fields (such...
Previous research has provided inconsistent results regarding the spatial modulation of auditory–somatosensory interactions. The present study reports three experiments designed to investigate the nature of these interactions in the space close to the head. Human participants made speeded detection responses to unimodal auditory, somatosensory, or...
Information about the motion of objects can be extracted by multiple sensory modalities, and, as a consequence, object motion perception typically involves the integration of multi-sensory information. Often, in naturalistic settings, the flow of such information can be rather discontinuous (e.g. a cat racing through the furniture in a cluttered ro...
Handheld multimedia devices could benefit from multisensory technologies. The authors discuss audio, visual, and tactile cues designed to maximize presence and the illusion of self-motion.
Music is well known for affecting human emotional states, yet the relationship between specific musical parameters and emotional responses is still not clear. With the advent of new human-computer interaction (HCI) technologies, it is now possible to derive emotion-related information from physiological data and use it as an input to interactive mu...
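As a hypothetical illustration of this kind of physiology-to-music mapping (the function names and the 50-120 bpm normalization below are assumptions, not the paper's system), one could derive a coarse arousal index from inter-beat intervals and use it to set the tempo of a generated click track:

```python
# Hypothetical sketch: inter-beat intervals (IBIs) -> arousal index -> tempo.
import numpy as np

FS = 22050  # audio sampling rate (Hz)

def ibi_to_tempo(ibis_ms, lo_bpm=60, hi_bpm=140):
    """Shorter inter-beat intervals (higher heart rate) -> faster tempo."""
    hr = 60000.0 / np.mean(ibis_ms)           # heart rate in beats/min
    arousal = np.clip((hr - 50) / 70, 0, 1)   # crude 50-120 bpm normalization
    return lo_bpm + arousal * (hi_bpm - lo_bpm)

def click_track(bpm, seconds=4):
    """Generate a click track of short 1 kHz blips at the given tempo."""
    out = np.zeros(int(FS * seconds))
    blip = np.hanning(200) * np.sin(2 * np.pi * 1000 * np.arange(200) / FS)
    for i in range(0, out.size - 200, int(FS * 60 / bpm)):
        out[i:i + 200] = blip
    return out

audio = click_track(ibi_to_tempo([850, 820, 870]))  # illustrative IBI values
```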
Sounds with rising and falling intensity are often perceived, respectively, as approaching or receding sound sources. Research has shown the existence of biases both at the perceptual and neural levels towards approaching versus receding sounds. It has been suggested that these effects might be attributed to a greater biological salience of approaching...
Virtual and augmented reality applications provide us with increasingly compelling multisensory worlds. Although spatial sound technologies are often used in such applications, headphone based sound reproduction may result in an undesired "mediation awareness" for an end-user. An alternative can be provided by bone-conducted sound technologies, tra...
In 1890, William James hypothesized that emotions are our perception of physiological changes. Many different theories of emotion have emerged since then, but it has been demonstrated that a specifically induced physiological state can influence an individual's emotional responses to stimuli. In the present study, auditory and/or vibrotactile heart...
Sound is an important, but often neglected, component for creating a self-motion illusion (vection) in Virtual Reality applications, for example, motion simulators. Apart from auditory motion cues, sound can provide contextual information representing self-motion in a virtual environment. In two experiments we investigated the benefits of hearing a...
A fundamental issue in presence research is how we can quantify "presence". A standard approach has been to use questionnaires and self-report measures. However, it is well established that humans' capability to access and externalize their internal states is limited. Hence, we have investigated whether more objective measures...
It is generally believed that the effectiveness of Virtual Environments (VEs) relies on their ability to faithfully reproduce the multisensory experience of the physical world. An important aspect of this experience is the perception of size and distance. In, e.g., an architectural application it is of course of great interest that the user gets th...
The entertainment industry frequently uses vibroacoustic stimulation, where chairs with embedded loudspeakers and shakers enhance the experience. Scientific investigations of the effect of such enhancers on illusory self-motion (vection) and spatial presence are largely missing. The current study examined whether auditory-induced vection (AIV) may...
Current auditory vision sensory substitution (AVSS) systems might be improved by the direct mapping of an image into a matrix of concurrently active sound sources in a virtual acoustic space. This mapping might be similar to the existing techniques for tactile substitution of vision where point arrays are successfully used. This paper gives an over...
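The direct-mapping idea can be sketched in a few lines; the toy below is assumption-laden and not the paper's system: a grayscale image is rendered as a matrix of concurrent sine tones, with row mapped to pitch, column to stereo pan, and pixel brightness to amplitude. A real AVSS system would place the sources in a virtual acoustic space rather than use plain amplitude panning.

```python
# Toy image-to-sound mapping for sensory substitution (illustrative only).
import numpy as np

FS = 22050   # audio sampling rate (Hz)
DUR = 1.0    # seconds of sound per image frame

def image_to_sound(img):
    """img: 2-D array in [0, 1]; returns an (n_samples, 2) stereo buffer."""
    rows, cols = img.shape
    t = np.arange(int(FS * DUR)) / FS
    out = np.zeros((t.size, 2))
    for r in range(rows):
        # Top image row -> highest pitch, spanning ~3 octaves above 200 Hz.
        freq = 200 * 2 ** ((rows - 1 - r) / rows * 3)
        tone = np.sin(2 * np.pi * freq * t)
        for c in range(cols):
            amp = img[r, c]                  # brightness -> loudness
            pan = c / max(cols - 1, 1)       # 0 = left, 1 = right
            out[:, 0] += amp * (1 - pan) * tone
            out[:, 1] += amp * pan * tone
    return out / (rows * cols)               # simple normalization

# A diagonal line becomes simultaneous tones from high/left to low/right.
stereo = image_to_sound(np.eye(8))
```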
At the present time, 5-channel surround sound has become the standard for high-quality audio reproduction. In the near future, new rendering systems with a higher number of audio channels will be introduced to the consumer market. One of the emerging audio rendering technologies is Wave Field Synthesis (WFS), which creates a perceptually correct spatial...
It is likely that experiences of presence and self-motion elicited by binaurally simulated and reproduced rotating sound fields can be degraded by the artifacts caused by the use of generic Head-Related Transfer Functions (HRTFs). In this paper, an HRTF measurement system which allows for fast data collection is discussed. Furthermore, effects of g...
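For context, the core of binaural simulation is convolution with a measured head-related impulse response (HRIR) pair; here is a minimal sketch (illustrative, not the measurement system discussed in the paper):

```python
# Minimal binaural rendering: convolve a mono source with the left/right
# HRIRs measured for one direction (both HRIRs assumed equal length).
import numpy as np

def binauralize(mono, hrir_l, hrir_r):
    return np.stack([np.convolve(mono, hrir_l),
                     np.convolve(mono, hrir_r)], axis=-1)

# A rotating sound field can then be approximated by crossfading between
# renderings made with HRIR pairs for successive directions.
```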
Emotional events may interrupt ongoing cognitive processes and automatically grab attention, modulating the subsequent perceptual processes. Hence, emotional eliciting stimuli might effectively be used in warning applications, where a fast and accurate response from users is required. In addition, conveying information through an optimum multisenso...
Creating a sense of illusory self-motion is crucial for many Virtual Reality applications and the auditory modality is an essential, but often neglected, component for such stimulations. In this paper, perceptual optimization of auditory-induced, translational self-motion (vection) simulation is studied using binaurally synthesized and reproduced s...
In 1890 William James hypothesized that emotions are our perception of physiological changes. Many different theories of emotion have emerged since then, but it has been demonstrated that a specifically induced physiological state can influence one's emotional responses to stimuli (e.g. Schachter and Singer (1962)). We tested how the presentation o...