Article

Tactile sensory substitution: Models for enaction in HCI


Abstract

Applying enactive principles within human–computer interaction poses interesting challenges for the way we design and evaluate interfaces, particularly those with a strong sensorimotor character. This article surveys the field of tactile sensory substitution, an area of science and engineering at the intersection of research domains such as neuroscience, haptics, and sensory prosthetics. It is argued that this area of research is highly relevant to the design and understanding of enactive interfaces that make use of touch, and that it is also a fertile arena for revealing fundamental issues at stake in the design and implementation of enactive interfaces, ranging from engineering to human sensory physiology and the function and plasticity of perception. A survey of these questions is provided, alongside a range of current and historical examples.


... Sensory substitution is the name given to a strategy whereby the user of an interactive device is offered information via a sensory modality different from the one through which that information is usually perceived, for example by converting visual data into haptic stimuli [145,241]. ...
... On the right, a later version with a pin array on the abdomen and the camera mounted on the head, from [241] (changes in the patterns of tactile stimulation in relation to the movement of the camera within the environment), although in practice the difference is not absolute. ...
... A more detailed description of the concept, mechanisms, and experiences of sensory substitution-augmentation can be found in the work of Lenay and Visell [145,241]. I simply emphasize here the general point that, in contrast with some of these examples, when an enactive interface (EI) is used, the device itself, like a blind person's cane, should become transparent with respect to the augmented perceptual experience of the world. One could therefore also say that the goal of designing an EI is to create a medium that facilitates new modes of perception, rather than an object that is the target of an existing mode of perception. ...
Thesis
Full-text available
The continuous development of interactive technologies and the growing understanding of the body's participation in cognitive processes have pushed interaction design, within HCI research, toward the need to resolve the user's relationship with a multitude of devices extending beyond the desktop. These design domains open new challenges in providing processes, methods, and tools for achieving adequate user experiences. To the extent that new devices and systems involve the bodily and social aspects of human beings, it becomes more relevant to consider paradigms, theories, and supporting models that go beyond selecting navigation nodes and the appropriate visual organization of widgets and screens. Interaction design must concern itself not only with building the product in the right way, but also with building the right product. This thesis sits at the crossing of three themes: the design of interactive systems that stand with one foot in the digital and one in the physical, theories of embodied and enactive cognition, and creative practices supported by sketching, in particular the processes of generating, evaluating, and communicating design ideas or proposals. This work includes contributions of different kinds. It provides an in-depth study of theories of embodied and enactive cognition, of interaction design with digital devices, and of sketching as a basic tool of creative design. On the basis of this analysis of the existing literature, and with a characterization of the practice of sketching enactive interactions based on ethnomethodological studies, it proposes a framework for conceptually organizing that practice, together with a tool supporting that activity, conceived as a creative composition. The contributions are discussed and possible lines of future work are outlined.
... Cutaneous feedback can take the form of vibration produced by mini vibration motors [12][13][14][15][16], skin stretch or tangential motion on the skin [17], and tactile displays that produce a tapping motion, creating vertical movements toward the skin and forming a matrix of tactile pixels (taxels) [18] that trigger tactile mechanoreceptors: Meissner's corpuscles and Merkel's cells, which are sensitive to edge pressure and flutter tap, as reported by Visell [19]. Tactile matrix displays are often used for vision substitution but can also serve as tactile augmentation of a vision system, as in VR and telerobotics applications. ...
... One of the goals of haptics research is to develop an effective and efficient tactile "display" for a human-machine interface that can reproduce as closely as possible the natural feel of an object. Tactile displays are not only for vision substitution systems [20,25], but they can also be used to enhance the immersive experience in telecommunications or teleoperations [36,37], biomedical engineering [6], telerobotics [38], material recognition [39], online shopping [7], human-computer interaction (HCI) [19,40,41], and VR environments [18,24]. ...
... The tactile pins of P20 Braille cells can be activated in a tapping manner and controlled using a pulse signal at different frequencies and durations. We tried vibrating the tactile pins at 5 Hz, 10 Hz, 15 Hz, and 20 Hz, frequencies within the range that triggers the fingertip's tactile mechanoreceptors: Meissner's corpuscles and Merkel's cells, which are sensitive to edge pressure and flutter tap, as reported by Visell [19]. Our prototype can produce point tapping by activating one pin, line tapping by activating a row or a column, and area tapping by activating all the pins in the 4 × 4 fingertip tactile matrix, as shown in Figure 12a(i-iv), respectively. ...
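The point, line, and area tapping patterns described in this excerpt can be sketched as pin-activation masks plus a pulse train. The function names and layout below are illustrative assumptions, not the actual P20 Braille cell driver interface.

```python
# Sketch of point/line/area activation masks for a 4x4 tactile pin
# matrix driven by a pulse signal (names and API are hypothetical).

def point(r, c, size=4):
    """Activate a single pin at row r, column c."""
    return [[1 if (i == r and j == c) else 0 for j in range(size)]
            for i in range(size)]

def line_row(r, size=4):
    """Activate every pin in row r (a column works symmetrically)."""
    return [[1 if i == r else 0 for j in range(size)] for i in range(size)]

def area(size=4):
    """Activate all pins in the matrix."""
    return [[1] * size for _ in range(size)]

def pulse_times(freq_hz, duration_s):
    """Tap onset times for a pulse train at freq_hz (e.g. 5-20 Hz)."""
    period = 1.0 / freq_hz
    return [k * period for k in range(int(duration_s * freq_hz))]
```

A 5 Hz pulse over one second, for instance, yields five tap onsets 200 ms apart, each applying one of the masks to the matrix.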
Article
Full-text available
During open surgery, a surgeon relies not only on a detailed view of the organ being operated upon and on being able to feel its fine details, but also heavily on the combination of these two senses. In laparoscopic surgery, haptic feedback provides surgeons with information on the interaction forces between instrument and tissue. There have been many studies to date that mimic haptic feedback in laparoscopy-related telerobotics. However, cutaneous feedback is mostly restricted or limited in haptic feedback-based minimally invasive studies. We argue that fine-grained information about the instrument's end is needed in laparoscopic surgeries and can be conveyed via cutaneous feedback. We propose an exoskeleton haptic hand wearable which consists of five 4 × 4 miniaturized fingertip actuators, 80 in total, to convey cutaneous feedback. The wearable is modular, lightweight, Bluetooth- and WiFi-enabled, and has a maximum power consumption of 830 mW. Software was developed to demonstrate rapid tactile actuation of edges, allowing the user to feel contours through cutaneous feedback. Initial tests were carried out in 2D, with objects displayed on a flat monitor. In the second phase, the wearable exoskeleton glove was further developed to let users feel 3D virtual objects in a virtual reality (VR) environment using a VR headset. Both 2D and 3D objects were tested with our novel untethered haptic hand wearable. Our results show that users recognize cutaneous actuation from a single tap with 92.22% accuracy. Our wearable has an average latency of 46.5 ms, well below the 600 ms delay tolerable by a surgeon in teleoperation. We therefore suggest our untethered hand wearable to enhance multimodal perception in minimally invasive surgeries, letting surgeons naturally feel the immediate environment of their instruments.
... The tactile modality offers many advantages for communicating information (Jones & Sarter, 2008; Visell, 2009). The tactile modality is also recognized as effective for conveying both spatial and temporal information (van Erp, 2000). The retina, with its very fine spatial resolution, makes vision a privileged modality for spatial judgments. ...
... Moreover, the tactile modality appears better suited than the auditory modality to substitute for vision in conveying spatial information (Geldard, 1960; van Erp, 2001). Furthermore, vibrotactile patterns remain identifiable by subjects even under high mental load (Chan, MacLean, & McGrenere, 2005) or when the subject is exposed to other ambient vibrations (van Erp, van Veen, Jansen, & Dobbins, 2005). As the central components of vibrotactile devices, tactors are generally of the inertial type (Figure 1.9, a), meaning that they produce vibrations by rotating a mass about its own axis (Jones & Sarter, 2008; Visell, 2009). The mass is usually encapsulated in a protective casing and may take a flat or cylindrical form (Figure 1.8, a). ...
... The use of the term "substitution" seems inadequate insofar as the device makes it possible to perform certain tasks that would be impossible without vision, without providing all the information perceived by the retina, such as color, luminosity, etc. According to Bach-y-Rita (2002), this questioning, embodied mainly in the work of Auvray (2006; 2009), does not concern only the use of the term "substitution". ...
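For the inertial (eccentric rotating mass, ERM) tactors described above, vibration amplitude and frequency are coupled, since both derive from the rotation speed: the peak radial force is F = m·r·(2πf)². A minimal sketch of this relation, with hypothetical motor parameters:

```python
import math

def erm_force(mass_kg, eccentricity_m, freq_hz):
    """Peak radial force of an eccentric rotating mass tactor:
    F = m * r * (2*pi*f)^2. Doubling the rotation frequency
    quadruples the force, so amplitude cannot be set
    independently of frequency in an ERM motor."""
    omega = 2 * math.pi * freq_hz  # angular velocity, rad/s
    return mass_kg * eccentricity_m * omega ** 2

# Hypothetical example: 0.5 g mass at 1 mm eccentricity, 150 Hz
force_n = erm_force(0.0005, 0.001, 150)
```

These parameter values are assumptions for illustration, not specifications of any tactor cited in the text.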
Thesis
This research studies the transmission of vibrotactile information as a navigation aid, and more specifically for improving the regulation of approach phases and the management of contacts with the environment. One of the major challenges of this research area is understanding how to convey information, sometimes complex, through a sensory modality not naturally used to process it. This doctoral work thus aimed to demonstrate the possibility of substituting for vision and to specify the characteristics of vibrotactile stimulation that influence access to approach information. The studies supporting this thesis were carried out with an experimental setup coupling a virtual environment to a tactile device comprising several vibrators placed on the surface of the skin. The first two experimental chapters relied on time-to-contact (TTC) estimation tasks classically used to study the visual processes involved in regulating approach situations. The first experimental chapter (experiments 1, 2, and 3) was a preliminary study which showed, notably, that judgments were more accurate when the tactile device conveyed information about approach distance (compared with information about angular size). The results of the second experimental chapter (experiments 4 and 5) showed that the tactile modality allowed TTC estimation, though less accurately than the visual modality. However, when the visual modality was occluded, transmitting tactile information during the occlusion period improved judgment accuracy.
The last experimental chapter (experiments 6 and 7) focused more specifically on the influence of vibrotactile information on the regulation of a ground approach in a simulated helicopter landing. The two experiments showed that the use of tactile information led to a significant reduction in ground contact speed when the visual environment was degraded, and that this reduction depended on the informational variable transmitted by the device. Finally, the results of this research are discussed in light of fundamental theories of perception and action. They show how approach information can be perceived through the tactile modality, thereby substituting for vision when it is degraded.
... In the absence of a given sensory modality, a person develops other sensory capacities or abilities; in the absence of vision, touch is an excellent mechanism of sensory substitution, as observed in communication methods such as Braille or Tadoma. Furthermore, touch can also be an important sensory mechanism for people without visual impairments, as pointed out by [6], which analyzes optical sensory substitution from the perspective of neuroscience and sensory prosthetics. In this respect, the author lists characteristics of great relevance to the design and implementation of tactile interfaces, attending to human sensory physiology and the plasticity of perception in order to highlight the fluidity of sensorimotor skills [6]. ...
... In this context, it is worth highlighting works related to this theme that use touch as a communication mechanism from a linguistic-code perspective (e.g., [6][7][8][9][10][11][12][13][14][15][16]). ...
Chapter
The paper reports an experimental study carried out in Manaus (Amazonas, Brazil) with the participation of eight visually impaired athletes in Paralympic 100 m sprint races. A trajectory correction system was used, based on an accelerometer and a gyroscope for motion detection, an algorithm to track the athletes' trajectories, and a haptic actuator for interaction with the athletes. The experimental results show the relevance of using this type of system in Paralympic 100 m races for visually impaired athletes, mainly with the purpose of increasing their autonomy by mimicking their guides.
... Electrotactile perception and applications are widespread, and related research is diverse and complex. Previous reviews have focused on haptic displays [2], perceptual substitution technologies [11], haptic interfaces for virtual reality augmentation [12], etc. Few summaries focus on electrotactile feedback, except for [13]–[15]. ...
... The four mechanoreceptors are located at specific depths in the skin tissue. Some scholars have collated the characteristics of the individual receptors and the forms of tactile perception in more detail, so they are not described here [11], [16]–[18]. ...
... Sensory substitution refers to the translation of sensory information that is normally available via one sense to another [11]. Sensory substitution can occur across sensory systems, such as touch-to-sight, or within a sensory system such as touch-to-touch [1]. ...
Article
Full-text available
With the increased demands of human-machine interaction, haptic feedback is becoming increasingly critical. However, the high cost, large size, and low efficiency of current haptic systems severely hinder further development. As a portable and efficient technology, cutaneous electrotactile stimulation has shown promising potential for these issues. This paper presents a review of and insight into cutaneous electrotactile perception and its applications. Research results on perceptual properties and evaluation methods are summarized and discussed to understand the effects of electrotactile stimulation on humans. Electrotactile applications are presented in categories to understand the methods and progress in fields such as prosthesis control, sensory substitution, sensory restoration, and sensorimotor restoration. The state of the art demonstrates the superiority, efficiency, and flexibility of electrotactile feedback. However, complex contributing factors and the limitations of evaluation methods make precise electrotactile control challenging. Groundbreaking innovation in electrotactile theory is expected to overcome challenges such as precise perception control, increasing information capacity, reducing comprehension burden, and lowering implementation costs.
... The lower and upper arm did not have significantly different sensitivity thresholds. The data presented in [22] shows that the two-point discrimination distance is greater on the upper arm than the lower arm. This suggests that there are more sensory nerve endings present on the forearm and therefore we would expect the sensory threshold for the forearm to be lower. ...
... The normal pressure distributed across the arm by the auxetic nature of the armband did not interfere with the participants' ability to distinguish between actuation sites. The two-point discrimination threshold for the forearm, as stated by [22], is approximately 38 mm. The distance between the SMA actuators on the B:Ionic armband is 32 mm, so increasing this distance may also improve users' ability to distinguish between different sites of stimulation. ...
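The spacing argument above reduces to a simple comparison: stimulation sites closer together than the two-point discrimination threshold are hard to tell apart. A minimal sketch, using the forearm figures quoted from [22]:

```python
FOREARM_TWO_POINT_MM = 38.0  # forearm two-point threshold quoted from [22]

def discriminable(spacing_mm, threshold_mm=FOREARM_TWO_POINT_MM):
    """True if two stimulation sites are farther apart than the
    two-point discrimination threshold for the body site."""
    return spacing_mm > threshold_mm

# The B:Ionic armband's SMA actuators sit 32 mm apart, below the
# 38 mm forearm threshold, which is why widening the spacing may
# help users distinguish actuation sites.
```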
Article
Full-text available
Upper limb robotic prosthetic devices currently lack adequate sensory feedback, contributing to a high rejection rate. Incorporating affective sensory feedback into these devices reduces phantom limb pain and increases control and acceptance. To address the lack of sensory feedback we present the B:Ionic glove, wearable over a robotic hand which contains sensing, computation and actuation on board. It uses shape memory alloy (SMA) actuators integrated into an armband to gently squeeze the user's arm when pressure is sensed in novel electro-fluidic fingertip sensors and decoded through soft matter logic. We found that a circular electro-fluidic sensor cavity generated the most sensitive fingertip sensor and considered a computational configuration to convey different information from robot to user. A user study was conducted to characterise the tactile interaction capabilities of the device. No significant difference was found between the skin sensitivity threshold of participants' lower and upper arm. They found it easier to distinguish stimulation locations than strengths. Finally, we demonstrate a proof-of-concept of the complete device, illustrating how it could be used to grip an object, solely from the affective tactile feedback provided by the B:Ionic glove. The B:Ionic glove is a step towards the integration of natural, soft sensory feedback into robotic prosthetic devices.
... Several substitutions for sensory perception have been developed, which do not require implantable interfaces (Lundborg and Rosen, 2001;Visell, 2009;Khasnobish et al., 2016): Sensory substitution is a technique to provide an alternative path for necessary sensory information to the body using other sensory passages that are different from those naturally used. For instance, Kaczmarek et al. (1991) revealed that hearing and vibration can serve as a substitute for touch and pressure, respectively. ...
... The use of voltage-regulated stimulation can minimize the chance of skin burns. Visell (2009) showed that changes in impedance and load at the interface site might not affect the current value in electrotactile current-regulated stimulation. Multiple features of this modality can be controlled to elicit sensory percepts. ...
Article
Full-text available
For individuals with upper-extremity amputation, normal daily living activities are no longer possible or require additional effort and time. With the aim of restoring their sensory and motor functions, theoretical and technological investigations have been carried out in the field of neuroprosthetic systems. For the transmission of sensory feedback, several interfacing modalities, including indirect (non-invasive), direct-to-peripheral-nerve (invasive), and cortical stimulation, have been applied. Peripheral nerve interfaces have an edge over cortical interfaces due to the sensitivity involved in attaining cortical brain signals. Peripheral nerve interfaces are highly dependent on interface design and are required to be biocompatible with the nerves to achieve prolonged stability and longevity. Another criterion is the selection of nerves that allows minimal invasiveness and damage as well as high selectivity for a large number of nerve fascicles. In this paper, we review the nerve-machine interface modalities noted above with a focus on peripheral nerve interfaces, which are responsible for the provision of sensory feedback. The invasive interfaces for recording and stimulation of electro-neurographic signals include intra-fascicular, regenerative-type interfaces that provide multiple contact channels to a group of axons inside the nerve, and extra-neural cuff-type interfaces that enable interaction with many axons around the periphery of the nerve. The section Current Prosthetic Technology summarizes the advancements made to date in the field of neuroprosthetics toward the achievement of a bidirectional nerve-machine interface, with a focus on sensory feedback. In the Discussion section, the authors propose a hybrid interface technique for achieving better selectivity and long-term stability using the available nerve interfacing techniques.
... The fusion of these two elements, dictated by theories of the development of enactive interfaces, will allow the analysis and use of the theory of sensorimotor contingencies as the basis for the development of the symbiosis process (Visell, 2009). Learning and experiencing the world through action, the theoretical grounding of enactive interfaces, is in fact a prerequisite for overcoming the limits of integration between individual and machine imposed by simple sensory substitution systems (Lenay, Canu & Villon, 1997). Motor affordances are indeed essential to close the perceptual interaction loop; without them, the user would lack the information necessary to create a relational scheme between himself, artificial perception, and the surrounding space. ...
... The use of sensorimotor contingencies will consequently modify the modes and subjects of perception of the artificial sense by closing, as expected, the perceptual interaction loop (O'Regan & Noë, 2001; Visell, 2009). This schematization can be considered an initial point of reference in the design of tools and interfaces for creating synaesthetic sensory spaces, both for exploring new ways of approaching interaction between the individual and the external world, and as a speculative stimulus for questioning the current hierarchization of the natural senses. ...
Thesis
Full-text available
The role that the human sensory system plays in the daily interaction of individuals with the external world has far more articulated ramifications than expected from a component of human nature that appears to be so linear and straightforward. Starting from the conditions of sensory impairment as a vehicle for the analysis of the field, the multidisciplinary study of the perceptual process highlights how sight plays a dominant role compared to the other senses, and how this dominance fits into a context of dichotomies inherent in the today’s social structure, with negative impacts on the personal, psychological and social context of a part of humanity. The technical-scientific developments of the last decades, and in particular the innovations aimed at the enhancement of the human being through technology, have proved not only to be an instrument of emancipation from these hierarchical dictates, but also a space of opportunity for a reconsideration of design and research constraints, constraints generated by a bounded consideration of the sensory spectrum. The objective of the thesis in question consists in the development of an intervention in the field of visual impairments capable of suggesting an alternative to these principles through the intersection between human and technological elements in the restructuring of the perceptual approach and the analysis of the potential of the discipline of design in the exploration of alternative intervention methods. This goal will be achieved through the development of a support device for individuals with visual impairments. This device will allow the creation of a sensorial space alternative to the vision-centric standard, through the synesthetic interaction between an artificial vision system and natural systems of visual and haptic perception. 
In the course of the research path, the design process will be positioned in a role complementary to its usual connotation, which sees it as a tool for developing solutions and products dedicated to the specific area of intervention. It will also play a role as a creative force for new exploratory spaces where the design itself can have a functional, social and political value; spaces created in the intersection of science, speculation and interaction between the natural components and the artificial ones of organisms.
... Most often, an amputee does not view the prosthetic device as part of his or her body (De Vignemont, 2011) and will be inclined to use their contralateral arm rather than the device itself to overcome their disability (Gouzien et al., 2017). However, several research investigations have shown that the sense of embodiment with prosthetic devices could be enhanced with the availability of sensory feedback (Berg, Tenore, Jessica, Vogelstein, & Bensmaia, 2014;Gonzalez, Soma, Sekine, & Yu, 2012;Visell, 2009). ...
... This feature is very useful, as most tactile sensory receptors are located in the skin of the fingers and palm. This sensory system is responsible for the sensation of touch and is far superior to the visual and auditory senses (Mano & Ohka, 1999; Pritchard & Alloway, 1999; Visell, 2009). The design of prosthetic devices normally focuses on the mechanics or movement of the system without considering the need to replicate tactile sensation. ...
Article
Full-text available
The absence of tactile sensory feedback in transradial prosthetic hands is one of the major contributing factors to these devices being rejected by users. This paper reports human psychophysical responses to vibrotactile sensations in discriminating surface textures, as a possible non-invasive method to supplement sensory feedback for prosthetic hand users. The vibrotactile sensations were supplied by a specially fabricated actuator that vibrates according to signals obtained by a prosthetic finger sliding across textured surfaces. Participants were presented with four different types of vibration patterns, randomly repeated five times, and were required to state which surface textures the vibration patterns represented. A Chi-square statistical procedure was designed to evaluate the relationship between these two categorical variables. The investigation, which comprises 300 samples, showed a statistically significant relationship between the vibration patterns and the surface textures (p < 0.001). The participants were able to discriminate surface textures and associate them with the vibration patterns provided by the vibrotactile actuator. The outcome of this work provides an optimistic possibility for the implementation of painless, non-invasive sensory feedback that will boost users' sense of embodiment and encourage them to fully utilize a well-designed prosthetic device. © 2018 The Author(s). This open access article is distributed under a Creative Commons Attribution (CC-BY) 4.0 license.
... It also provides inspiration for the design of biologically-informed artificial tactile sensors that has not been fully exploited to date. There are more than a half dozen types of sensory receptors of touch in humans and other mammals [16], capturing different aspects and components of the mechanical signals supporting touch sensing and interaction [17]. Sub populations of these receptors are most responsive to sustained mechanical stimulation (slowly adapting types, SA), and others to transient or vibratory signals (fast adapting, FA). ...
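The SA/FA distinction above can be illustrated by splitting a sampled mechanical signal into a sustained (SA-like) component and a transient (FA-like) residual. The moving-average filter and window length below are illustrative choices, not a model of any actual receptor.

```python
# Illustrative split of a 1-D mechanical signal into a sustained
# (slowly adapting, SA-like) component and a transient (fast
# adapting, FA-like) residual, using a simple moving-average
# low-pass filter. The window length is an arbitrary assumption.

def split_sa_fa(signal, window=5):
    half = window // 2
    sustained = []
    for i in range(len(signal)):
        # Centered moving average, clipped at the signal edges.
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        sustained.append(sum(signal[lo:hi]) / (hi - lo))
    # The transient channel is what the low-pass filter removes.
    transient = [s - m for s, m in zip(signal, sustained)]
    return sustained, transient
```

A constant pressure yields an all-zero transient channel, while a step (contact onset) produces a transient burst, mirroring how FA receptors respond mainly to change.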
Preprint
Tactile sensing is essential for skilled manipulation and object perception, but existing devices are unable to capture mechanical signals in the full gamut of regimes that are important for human touch sensing, and are unable to emulate the sensing abilities of the human hand. Recent research reveals that human touch sensing relies on the transmission of mechanical waves throughout the tissues of the hand. This provides the hand with remarkable abilities to remotely capture distributed vibration signatures of touch contact. Little engineering attention has been given to this important sensory system. Here, we present a wearable device inspired by the anatomy and function of the hand and by human sensory abilities. The device is based on a 126-channel sensor array capable of capturing high-resolution tactile signals during natural manual activities. It employs a network of miniature three-axis sensors mounted on a flexible circuit whose geometry and topology were designed to match the anatomy of the hand, permitting data capture during natural interactions while minimizing artifacts. Each sensor possesses a frequency bandwidth matching the human tactile frequency range. Data is acquired in real time via a custom FPGA and an I$^2$C network. We also present physiologically informed signal processing methods for reconstructing whole-hand tactile signals from data captured by this system. We report experiments that demonstrate the ability of this system to accurately capture remotely produced whole-hand tactile signals during manual interactions.
... The skin has been considered a conduit for information [1,2]: a vibrotactile display can be built from an array of vibration actuators, with resolutions varying from 2 × 2 to 64 × 64 [3], mostly applied to the skin of the back, abdomen, forehead, thigh, or fingers. The vibrotactile display has been extensively studied in the context of sensory substitution. ...
Article
Full-text available
The sense of touch can be used for acquiring visual/auditory information through tactile stimuli. Sensory augmentation and substitution often require the extensive training of subjects, leading to exhaustion and frustration over time. In this paper, we present a vibrotactile device for delivering customizable spatiotemporal tactile patterns to be used as a promising method in relaying information through the skin. The proposed system is able to generate stimuli where both vibration frequency and amplitude can be modulated. The results prove that this assistive technology solution can be beneficial and efficient not only for delivering complex feedback, but also for people with some level of sensory impairment or total disability e.g. blind, color-blind or deaf people.
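A device of the kind described above drives each actuator with a vibration whose frequency and amplitude can be modulated over time. The sketch below is a hypothetical illustration of such spatiotemporal pattern generation, not the paper's implementation; all names and parameter values are assumptions.

```python
import math

def vibro_waveform(freq_hz, amp, duration_s, sample_rate=8000):
    """Sample a sinusoidal drive signal for one actuator with the
    given vibration frequency and amplitude (values illustrative)."""
    n = int(duration_s * sample_rate)
    return [amp * math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n)]

def spatiotemporal_pattern(frames, frame_s=0.1):
    """frames: per-frame lists of (freq_hz, amp) tuples, one tuple
    per actuator. Returns one drive waveform per actuator,
    concatenated across frames, so both the vibration parameters
    and their spatial distribution vary over time."""
    n_act = len(frames[0])
    out = [[] for _ in range(n_act)]
    for frame in frames:
        for k, (f, a) in enumerate(frame):
            out[k].extend(vibro_waveform(f, a, frame_s))
    return out
```

For example, two 100 ms frames over two actuators, each with its own frequency/amplitude pair, produce two 200 ms waveforms that together encode one spatiotemporal tactile pattern.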
... Images and graphics are extensively used as teaching aids at schools at different education levels (Hersh & Johnson, 2008). Such a visual education medium, however, can only be accessed by visually impaired children either through the means of a verbal description (The Audio Description Project, 2018), by touch (Klatzky & Lederman, 2007;Visell, 2009), with the help of non-verbal representation, termed sonification (Hermann & Hunt, 2005) or through some form of a combined multimodal approach (Fernandes & Haley, 2013). These methods all require specific solutions that increase the effort of the teaching personnel and/or the cost of teaching due to, for instance, the necessity of purchasing additional technologies. ...
Article
The paper presents an application for interactive sonification of images intended for use on mobile devices in the education of blind children at elementary school level. The paper proposes novel sonification algorithms for converting colour and grayscale images into sound. The blind user can interactively explore image content through multi‐touch gestures and select image sub‐regions for sonification, with real‐time control of synthesized sound parameters such as amplitude, frequency and timbre. Additionally, images may contain text fields read by a text‐to‐speech synthesizer. The usability of one of the proposed sonification schemes is tested by collecting data on the tracking accuracy and recognition speed of basic shapes, such as lines and curves, as well as figures and simple functions. In order to facilitate the learning process for blind children, a number of games were proposed that use the devised sonification schemes. The first game—“Hear the Invisible”—is intended for two players: one child draws a shape and the task of the other is to guess the displayed shape by means of the available sonification methods. The second proposed game, “Follow the Rabbit”, is intended for a single player who tracks a colourful “Rabbit” that runs along a path representing a given geometric shape. The obtained results show that the proposed sonification methods allow users to reach previously unattainable levels of flexibility in the exploration of shapes and colours. The main application of the described interactive tool is to teach geometry and mathematics in schools for blind children. The developed games are meant to enhance the learning process and motivate the children.
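A minimal sketch of the image-to-sound idea: map the grayscale value under the user's finger to a tone's frequency and amplitude. The frequency range and amplitude floor below are hypothetical defaults, not the parameters used in the application above.

```python
def pixel_to_tone(gray, f_min=220.0, f_max=880.0):
    """Map a grayscale value (0..255) to a tone: brighter pixels
    sound higher and louder. Ranges are illustrative assumptions."""
    x = gray / 255.0
    freq = f_min + x * (f_max - f_min)   # linear pitch mapping
    amp = 0.2 + 0.8 * x                  # keep dark pixels faintly audible
    return freq, amp

f, a = pixel_to_tone(255)   # brightest pixel -> top of both ranges
```

A real sonification scheme would add timbre control and stereo panning for the horizontal position, as the paper describes.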
... Among the different perspectives included under the umbrella of 4E cognition, enactivism is the approach that has moved most from theoretical toy models to more tangible and applied psychological research. Noteworthy examples of experimental psychology performed from an enactive perspective can be found in the field of sensory substitution (Auvray, Hanneton, & O'Regan, 2007; Bermejo, Di Paolo, Hüg, & Arias, 2015; Froese, McGann, Bigge, Spiers, & Seth, 2012; Lenay, Gapenne, Hanneton, Marque, & Genouëlle, 2003; Visell, 2008; cf. Díaz, Barrientos, Jacobs, & Travieso, 2012). ...
Article
In 2017, Di Paolo, Buhrmann, and Barandiarán proposed a list of criteria that post-cognitivist theories of learning should fulfill. In this article, we review the ecological theory of direct learning. We argue that this theory fulfills most of the criteria put forward by Di Paolo et al. and that its tools and concepts can be useful to other post-cognitivist theories of learning. Direct learning holds that improvements with practice are driven by information for learning that can be found in the dynamic organism-environment interaction. The theory formally describes information for learning as a vector field that spans a space with all the perception-action couplings that may be used to perform an action. Being located at a point of such a space means using a specific perception-action coupling. Changes in perception-action couplings due to learning can be represented as paths across the space, and can be explained with the vector field of information for learning. Previous research on direct learning considered actions that were best understood with single perception-action couplings. To conclude the article, and inspired by the criteria of Di Paolo et al., we discuss an extension of the theory to actions that are best understood with multiple perception-action couplings.
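The vector-field picture of direct learning can be illustrated with a toy model: a two-dimensional space of perception-action couplings and a field of "information for learning" that points toward a hypothetical optimal coupling, which practice follows step by step. The field, the optimum, and the learning rate are all invented for illustration, not taken from the theory's formal treatment.

```python
import numpy as np

def information_for_learning(p, optimum=np.array([0.7, 0.3])):
    """Toy vector field over a 2-D coupling space: at each point it
    points from the current coupling toward a hypothetical optimum."""
    return optimum - p

def practice(p0, steps=50, rate=0.2):
    """Repeated practice follows the field, tracing a path across
    the space of perception-action couplings."""
    p = np.array(p0, dtype=float)
    path = [p.copy()]
    for _ in range(steps):
        p += rate * information_for_learning(p)   # one bout of practice
        path.append(p.copy())
    return np.array(path)

path = practice([0.0, 1.0])   # a learner starting at a poor coupling
```

The path converges on the optimal coupling, mirroring the theory's claim that changes in couplings with practice can be explained by the vector field.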
... The term "sensory substitution" refers to a process whereby an agent, by means of a removable specialized instrumentation, becomes capable of exploiting an available sensory modality in order to perceive properties of the environment which are normally accessible by means of a different modality which is temporarily or definitively unavailable (Bach-y-Rita, 2004; Bach-y-Rita et al., 1969; Lenay et al., 2003; Visell, 2009; Wall and Brewster, 2006). The key to this learning involves mastery of the relation between the actions performed by the agent and the determination of the associated sensory effects. ...
Article
Sensory substitution refers to a process whereby an agent, by means of a removable specialized instrumentation, becomes capable of exploiting an available sensory modality in order to perceive properties of the environment which are normally accessible by means of a different modality. We describe a situation of visual-auditory sensory substitution in the rat. Rats were placed in complete darkness, and trained to follow a virtual path whose position was signalled by a sound activated by a video-tracking device. Our hypothesis was that the rats would be able to succeed in this task of spatial navigation, following the sound contour by means of sensory-motor coupling based on seeking the sound (all-or-none)and mastering the relation between their own actions and the expected sensory feedback. Our results confirm this hypothesis and show the progressive structuring of meaningful exploratory activity, leading from the appearance of stopping behaviour when the sound is lost or acceleration when the sound is discovered, up to a veritable sensory-motor strategy which maximizes the possibilities for discovering and following the sound path. Thus, the animals seem to have developed a new form of perception which translates in particular into motor behaviour adapted to the search for sound.
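The all-or-none feedback rule in this experiment is easy to state in code: the sound plays whenever the tracked position lies within some radius of the virtual path. The radius and coordinates below are placeholders, not the study's actual tracking parameters.

```python
def sound_on(pos_xy, path_points, radius=0.05):
    """All-or-none auditory feedback, as in the rat experiment
    sketched above: True whenever the tracked position is within
    `radius` of any point on the virtual path (units arbitrary)."""
    x, y = pos_xy
    return any((x - px) ** 2 + (y - py) ** 2 <= radius ** 2
               for (px, py) in path_points)
```

The animal's task is then to act so as to keep this binary signal on, which is exactly the sensorimotor coupling the authors describe.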
... Sensory substitution devices (SSDs) are devices that substitute one perceptual modality (usually vision) with another one (usually hearing or touch) [1][2][3][4][5][6][7]. A substantial number of SSDs and experiments with SSDs have been oriented toward the recognition of objects and the perception of properties of objects such as their shape or size [8][9][10][11][12][13]. ...
Article
Full-text available
The theory of affordances states that perception is of environmental properties that are relevant to action-capabilities of perceivers. The present study illustrates how concepts and methodological tools from the theory of affordances may help to advance research in the field of sensory substitution. The sensory substitution device (SSD) that was used consisted of two horizontal rows of 12 coin motors that each vibrated as a function of the distance to the nearest object. Sixty blindfolded participants used the SSD to explore virtual horizontal apertures with different widths. They were asked to judge the passability of the apertures. Participants with narrow shoulders judged narrower apertures as passable than participants with wide shoulders. This difference disappeared when aperture width was scaled to shoulder width, demonstrating that perception was body scaled. The actual aperture width was closely related to aspects of the exploratory movements and to aspects of the vibrotactile stimulation that was obtained with the exploratory movements. This implies that the exploratory movements themselves and the vibrotactile stimulation were both informative about the aperture width, and hence that the perception of passability may have been based on either of them or on a global variable that spans vibrotactile as well as kinaesthetic stimulation. Similar performance was observed for participants who accomplished the 7-trial familiarization phase with or without vision, meaning that practice with vision is not indispensable to learn to use the SSD.
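The body-scaled judgment reported here can be sketched as a threshold on the aperture-to-shoulder ratio. The critical ratio of about 1.3 used as the default below is a value commonly cited in the aperture-passability literature (Warren & Whang), included only for illustration; it is not a parameter from this study.

```python
def judged_passable(aperture_m, shoulder_m, critical_ratio=1.3):
    """Body-scaled passability: an aperture is judged passable when
    its width exceeds shoulder width times a critical ratio. The
    default ratio is an illustrative value from the affordance
    literature, not this paper's fitted parameter."""
    return aperture_m / shoulder_m >= critical_ratio
```

Because the judgment depends only on the ratio, narrow-shouldered perceivers judge narrower apertures passable, and the difference vanishes once aperture width is expressed in shoulder-width units, as the experiment found.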
... Multiple assistive technologies (AT; Hersh and Johnson, 2008a) have been designed to support visually impaired persons (VIPs; Csapó et al., 2015). One form of non-invasive AT for VIPs is sensory substitution devices (SSDs; Bach-y-Rita et al., 1969; Bach-y-Rita and Kercel, 2003; Visell, 2009). SSDs provide VIPs with visual information in the format of another preserved modality (Bach-y-Rita et al., 1969; Lenay et al., 2003) via cross-modal displays (Lloyd-Esenkaya et al., 2020). ...
Article
Sensory substitution is thought to be a promising non-invasive assistive technology for people with complete loss of sight because it provides inaccessible visual information via a preserved modality. However, Sensory Substitution Devices (SSDs) are still rarely used by visually impaired persons, possibly due to a lack of structured and supervised training that could be offered alongside these devices. Here, we developed and evaluated a training program that supports the usage of a recently developed colour-to-sound SSD – the Colorophone. Following our recently proposed theoretical model of SSD development, we propose that this training should help people with complete loss of sight to learn how to efficiently use the device by developing relationships between the components of the user-environment-technology system. We applied systematic case studies combined with a mixed-method approach to evaluate the efficacy of this SSD training program. Five blind users underwent ca. 22 hours of training, divided into four main parts: identification of the users’ individual characteristics and adaptations; sensorimotor training with the device; semi-structured explorations with the device; and evaluation of the training. We demonstrated that this training allows users to successfully acquire a set of skills (i.e., master the sensorimotor contingencies required by the device, develop visual-like perceptual skills, as well as learn about colours) and progress along developmental trajectories (e.g., switch from serial to parallel information processing, recognize more complex colours, increase environment and task complexity). Importantly, we identified individual differences in learning strategies (i.e., sensorimotor vs. metacognitive strategy) that had an impact on the users’ training progress and required the training assistants (TAs) to apply different assistive strategies. 
Additionally, we described the crucial role of a (non-professional) training assistant in the training progress: this person facilitates the development of relationships between elements of the user-environment-technology system by supporting a metacognitive learning strategy, thereby reducing the risk of abandonment of the SSD. Our study shows the importance for SSD development of well-designed, tailored training, and it provides new insights into the process of SSD-related perceptual learning.
... A large number of sensory substitution devices exist (for a review, see Dakopoulos & Bourbakis, 2010; Jones & Sarter, 2008; Visell, 2009; Veraart, 1998). Most devices use the tactile modality, notably because of the skin's strong predisposition to discriminate temporal and spatial stimuli and its effectiveness in mobilizing attention (Geldard, 1960; Van Erp, 2007) (White, Saunders, Scadden, Bach-y-Rita and Collins, 1970, p. 23) (cf. Figure 6). Since 1967, the TVSS has undergone numerous technical improvements, including miniaturization, improved image definition, and the shift to electrical stimulation (Aiello, 1998) (cf. Figure 7). ...
Thesis
Distal attribution refers to our capacity to assign a location in distal space to our proximal sensations (light on the retina, sound waves, etc.). Work on sensory substitution has made it possible to identify and test this process. The objective of this thesis is to investigate distal attribution through a joint, integrated approach drawing on sensorimotor and ideomotor theories. In two studies, we use a classical ideomotor procedure consisting of associating proximal sensory consequences with movements. The first study shows that the attribution of distal characteristics to proximal sensations is determined by the bidirectional association between sensations and movements. Moreover, we show that the orientation of the movement is consubstantial with the distal characteristics we confer on sensory consequences. The second study replicates these results and extends them to characteristics related to movement amplitude. These experiments lead us to define distal attribution as a sensorimotor process arising from an ideomotor mechanism, thereby placing our studies within a constructivist perspective. The theoretical and experimental limits of our studies then lead us to propose new experiments aimed at better defining the importance of movement in perception.
... Of main interest in this field is studying the differences between cases under sensorimotor contingency conditions, in which the scanning organ (actuator) is the same as the sensing organ (sensor), and cases under sensorimotor non-contingency conditions, in which the actuator is different from the sensing organ. Many studies have shown that under sensorimotor contingency conditions, active sensing outperforms the sensorimotor non-contingency conditions, probably due to the dependency on natural sensorimotor loops [4,[38][39][40]. Furthermore, it has been shown that sensorimotor contingency is important for the normal development of the visual system [41]. ...
Article
In this work, we study the enhancement of simulated prosthetic reading performance through “active photonic sensing” in normally sighted subjects. Three sensing paradigms were implemented: active sensing, in which the subject actively scanned the presented words using the computer mouse, with an option to control text size; passive scanning produced by software-initiated horizontal movements of words; and no scanning. Our findings reveal a 30% increase in word recognition rate with active scanning as compared to no or passive scanning and up to 14-fold increase with zooming. These results highlight the importance of a patient interactive interface and shed light on techniques that can greatly enhance prosthetic vision quality.
... The pad is coupled with another prototype, a lingual stimulator named Tongue Display Unit® (TDU), allowing feedback to the subject [76]. After the seminal work of Kazimierz Noiszewski [77][78][79][80][81], who developed the first device for substitutive vision by tactile stimulation, called Elektroftalm® or "artificial eye" (1897), followed by the Optophone® of Fournier d'Albe (1912) [82], devices were developed and validated first by Samsó Diaz (1962) and then by Bach-y-Rita (1969) in populations of deaf animals and blind humans, for which acoustic or visual information captured by microphones or video cameras was transcoded into electrotactile stimulations of the skin or tongue [83][84][85]. ...
Chapter
Full-text available
This chapter aims to show that big data techniques can serve for dealing with the information coming from medical signal devices such as bio-arrays, electro-physiologic recorders, mass spectrometers and wireless sensors in e-health applications, in which data fusion is needed for the personalization of Internet services allowing chronic patients, such as patients suffering cardio-respiratory diseases, to be monitored and educated in order to maintain a comfortable lifestyle at home or at their place of life. Therefore, after describing the main tools available in the big data approach for analyzing and interpreting data, several examples of medical signal devices are presented, such as physiologic recorders and actimetric sensors used to monitor a person at home. The information provided by the pathologic profiles detected and clustered thanks to big data algorithms, is exploited to calibrate the surveillance at home, personalize alarms and give adapted preventive and therapeutic education.
... In 1930, von Békésy started investigating the physiology behind tactile perception by drawing a parallel between the tactile and auditory channels in terms of their underlying perceptual mechanisms [53]. A thorough review of sensory substitution applications can be found in Visell [52]. In a musical context, several interfaces have been produced with the aim of translating sound into perceivable vibrations delivered via vibrotactile displays. ...
Chapter
Full-text available
Haptics, and specifically vibrotactile-augmented interfaces, have been the object of much research in the music technology domain: In the last few decades, many musical haptic interfaces have been designed and used to teach, perform, and compose music. The investigation of the design of meaningful ways to convey musical information via the sense of touch is a paramount step toward achieving truly transparent haptic-augmented interfaces for music performance and practice, and in this chapter we present our recent work in this context. We start by defining a model for haptic-augmented interfaces for music, and a taxonomy of vibrotactile feedback and stimulation, which we use to categorize a brief literature review on the topic. We then present the design and evaluation of a haptic language of cues in the form of tactile icons delivered via vibrotactile-equipped wearable garments. This language constitutes the base of a “wearable score” used in music performance and practice. We provide design guidelines for our tactile icons and user-based evaluations to assess their effectiveness in delivering musical information and report on the system’s implementation in a live musical performance.
... Since the early 1960s, the popularity of developing and testing devices that use alternative forms of sensory information has grown. As an illustration of this growth, a recent Google Scholar search using the term sensory substitution over the five decades between 1960 and 2009 yielded 13, 198, 373, 615, and 2,570 hits per decade, respectively, and 4,140 hits since 2010. Reviews of different sensory substitution devices (SSDs) include those by Jones and Sarter (2008), Dakopoulos and Bourbakis (2010), and Visell (2009). As can be noted in these reviews, SSDs are potentially useful in a wide range of situations. ...
Article
This study investigates how active exploration helps users of sensory substitution devices (SSDs) to detect action-relevant information. A vibrotactile SSD was developed that generates stimulation that is contingent on the users’ movements. Target direction was specified by the location of the vibratory stimulation, and target distance by the size and intensity of the pattern of stimulation. A series of experiments was performed with blindfolded participants. In Experiments 1a to 1c, participants used the SSD to align their central body axis with prespecified targets. These experiments differed in the number of actuators that were used and whether online perception–action coupling was present. In Experiment 2, participants approached targets with forward locomotion along a straight line. Experiment 3 combined the previous experiments and studied the concomitant walking and steering toward targets. Oscillatory movements, which facilitated information pickup, were observed in all experiments. The exploratory oscillations were shown to depend on the online perception–action coupling and were related to cases of hyperacuity, for which absolute errors were found to be smaller than the areas of sensitivity of the actuators. It is concluded that, to improve the utility of SSDs, future research with SSDs should pay more attention to the role of active information detection.
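The display logic described in this abstract (direction coded by the location of the stimulation, distance by its intensity) can be sketched as follows. The actuator count, angular layout, and distance range are hypothetical, not the device's actual configuration.

```python
import numpy as np

def actuator_commands(target_bearing_deg, target_dist_m,
                      n_actuators=8, max_dist=5.0):
    """Drive the actuator nearest the target's bearing, with intensity
    growing as the target gets closer. Layout and ranges are
    illustrative assumptions about the kind of SSD described above."""
    spacing = 360.0 / n_actuators
    idx = int(round((target_bearing_deg % 360.0) / spacing)) % n_actuators
    intensity = max(0.0, 1.0 - target_dist_m / max_dist)  # nearer = stronger
    out = np.zeros(n_actuators)
    out[idx] = intensity
    return out

cmd = actuator_commands(90.0, 2.5)   # target to the right, half range away
```

Because the stimulation is recomputed from the user's current pose, small oscillatory movements change the active actuator and intensity, which is exactly the online perception-action coupling the study manipulates.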
... Compared with force-reflecting interaction devices (which mostly have complicated kinematic structures), only comparatively small components are required here [16]. Consequently, compact, portable, and lightweight devices can be designed that restrict the user's freedom of movement little or not at all [170]. The stimulation can advantageously be applied over a large area (≈ 1.8–2.0 m²) of the human skin [100], and feedback can be delivered relatively easily at the various locations the application requires. ...
Thesis
This doctoral thesis deals with the development of a vibrotactile wristband for versatile information transmission in human-machine systems. In an iterative development process, an ergonomic and practical device was designed with holistic consideration of technical and ergonomic aspects as well as human perception, and realized as a near-production prototype. It is characterized by intuitive, location- and direction-differentiated information output with clearly interpretable stimulation patterns. The device's functionality, and thus its versatility, was extended by integrating distance sensors for obstacle detection in the user's immediate surroundings. This enables simultaneous sensing in different directions as well as intuitive feedback on the direction and distance of detected objects. User studies in various applications, such as force and collision feedback from virtual reality and telerobotics, attention guidance in complex work areas, novel training concepts with augmented reality, navigation, and support for blind people, demonstrate the successful use of the patented device and confirm this work's contribution to improving the flow of information in human-machine systems and to increasing user immersion. Keywords: vibrotactile feedback device, multimodal human-machine interface, tactile location- and direction-differentiated cues
... ETAs consist of three components (Visell, 2009). First, a sensory component that detects certain information from the environment that is not available to the user of the ETA because of the loss of sight. ...
... There are even reports of users who became so emotionally attached to the information from their sensory substitution device that removing access to it resulted in a feeling of loss [9]. In highly trained users, the resulting embodiment of sensory substitution stimulation is perceived as an extension of the senses, not as an outside apparatus [73]. Visual sensory substitution devices differ from one another in terms of their respective approaches to capturing, transforming, and sending information [9]. ...
Chapter
Full-text available
The human brain is a formidably complex and adaptable organ capable of rewiring itself or adjusting existing connections in order to learn and to maximize its survival edge. Studies using sensory substitution devices have had a big impact on the uncovering of the mechanisms subtending brain organization. Sensory substitution devices are capable of conveying information typically received through a specific sensory modality (e.g., vision) and transferring it to the user via a different sense (e.g., audition or touch). Experimental research exploring the perceptual learning of sensory substitution devices has revealed the ability of users to recognize movement and shapes, to navigate routes, to detect and avoid obstacles, and to perceive colors or depth via touch or sound, even in cases of full and congenital blindness. Using a combination of functional and anatomical neuroimaging techniques, the comparisons of performances between congenitally blind people and sighted people using sensory substitution devices in perceptual and sensory-motor tasks as well as in several recognition tasks uncovered the striking ability of the brain to rewire itself during perceptual learning and to learn to interpret novel sensory information even during adulthood. This review discusses the impact of invasive and noninvasive forms of artificial vision on brain organization with a special emphasis on sensory substitution devices and also discusses the implications of these findings for the visual rehabilitation of congenitally and late blind and partially sighted individuals while applying insights from neuroimaging and psychophysics.
... Following this, the connections between A1 and S1 became sparser on the lesioned side, but the connections between A1 and OP1/S2 became more … Occasionally, the term can be regarded as an expression subsumed under the concept of neural plasticity, which encompasses it. Indeed, it can refer to the flexibility with which an intrinsic sensory capacity is used and linked to another sensory capacity (Visell, 2008). In this perspective, sensory substitution offers the possibility of recovering abilities (for example in ...
Thesis
The focuses of my thesis work are: a) the analysis of the audio-tactile binding problems (experiments on the linguistic domain) (chpt.1), b) the analysis and investigation of the audio-tactile temporal binding window (experiments with non-predictable unisensorial and multisensorial audio-tactile streams) (chpt.2), c) the sensory substitution theory (specifically the possibility to vehiculate auditory linguistic information through a tactile sensory substitution device) (chpt.3). The aim of the work was to find a sensory substitution device (tactile stimulation) that can effectively help hearing-impaired people to better understand speech. The results did not confirm the influence of the tactile stimulation on the auditory perception and did not show evidence for audio-tactile binding. In the discussion I analysed the methodological limits (stimuli, procedure) and the limits in the current knowledge that are crucial for the aims of this work (e.g. not sufficient knowledge on auditory perception of speech, time plasticity and duration of training, training procedures).
... Enaction is a theory, close to situated cognition, that focuses on how organisms organize themselves in interaction with the environment. From an enactive perspective [29], we would like to provide the real user with all the visual information in its complexity. It is up to us to inject the appropriate stimuli into the environment, to encourage interaction and the development of the perception/action loop. ...
... The last decades have seen research on both haptics and tactile transducers steadily increasing. Haptics has grown into an interdisciplinary research field covering perception [1], psychophysics [2], vision substitution systems [3], [4], biomedical engineering [5], telecommunication [6], e-shopping [7], teleoperation [8], telerobotics [9], material recognition [10], Human-Computer Interaction (HCI) [11], [12], and Virtual Reality (VR) environments [5], [7], [13]-[15]. ...
Conference Paper
Full-text available
Haptic primary colors correspond to temperature, vibration, and force. Previous studies combined these three haptic primary colors to produce different types of cutaneous sensations without the need to touch a real object. This study presents a low-cost untethered hand wearable with temperature, vibration, and force feedback, made from low-cost, commercial off-the-shelf components. A 26 mm annular Peltier element with a 10 mm hole is coupled to an 8 mm mini disc vibration motor, forming vibro-thermal tactile feedback for the user. All the other fingertips have an 8 mm disc vibration motor strapped on them using Velcro. Moreover, kinesthetic feedback extracted from a retractable ID badge holder with a small solenoid stopper is used as force feedback that restricts the fingers' movement. Hand and finger tracking is done using a Leap Motion Controller interfaced to a virtual setup with different geometric figures developed using Unity software. We therefore argue that this prototype as a whole delivers cutaneous and kinesthetic feedback that would be useful in many virtual applications such as Virtual Reality (VR), teleoperated surgeries, and teleoperated farming and agriculture.
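The "haptic primary colors" idea can be sketched as a mapping from a (temperature, vibration, force) triple to actuator commands. The temperature range, normalization, and force threshold below are assumptions chosen for illustration, not the prototype's actual calibration.

```python
def haptic_primary_mix(temp_c, vib_amp, force_n):
    """Encode a target cutaneous sensation as commands for the three
    actuators described above: a Peltier element, a vibration motor,
    and a solenoid stopper. All ranges are hypothetical."""
    # Peltier drive: -1 (full cooling) .. +1 (full heating), neutral at 30 C
    peltier = max(-1.0, min(1.0, (temp_c - 30.0) / 15.0))
    # Vibration motor duty cycle, clamped to 0..1
    motor = max(0.0, min(1.0, vib_amp))
    # Engage the solenoid stopper above a force threshold
    solenoid_lock = force_n > 0.5
    return peltier, motor, solenoid_lock
```

A virtual-contact event from the hand tracker would supply the triple; the three outputs then drive the Peltier, the motor, and the stopper independently, which is what lets the three "primaries" combine into different sensations.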
... Braille), and "even a poor resolution sensory substitution system can provide the information necessary for the perception of complex images" [7,8]. For a thorough overview of sensory substitution see [9]. ...
Conference Paper
The real world is multisensory, and our experiences in this world are constructed by the stimulation of all our senses, including vision, audition, touch, olfaction, and taste. However, virtual environments, including virtual simulations and serious games, and the human-computer interface more generally, have focused on the visual and auditory senses. Simulating the other senses, such as the sense of smell (olfaction), can be beneficial for supporting learning. In this paper we present a simple and cost-effective olfactory interface constructed using an Arduino Uno microcontroller board, a small fan, and an off-the-shelf air freshener to deliver scents to the user of a serious game. A fuzzy logic system regulated the amount of scent delivered to the user based on their distance to the display. As a proof-of-concept, we developed a serious game intended to teach basic math (counting) skills to children. Learners (players) collect pineapples from the scene and then enter the number of pineapples collected. As the pineapples are collected, a pineapple scent is emitted from the olfactory interface, thus serving to supplement or complement the learner's senses and stimulate their affect and cognition. As part of our proof-of-concept, a 10-year-old learner played the game, provided us with feedback regarding the olfactory interface, and illustrated the potential of the system.
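The distance-based fuzzy regulation of scent delivery can be sketched with triangular membership functions and a simple weighted defuzzification. The breakpoints and output weights below are hypothetical, not the authors' actual rule base.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def scent_level(dist_m):
    """Fuzzy controller sketch for the olfactory display: 'near' calls
    for strong scent, 'far' for weak. Breakpoints (in metres) and
    output weights are illustrative assumptions."""
    near = tri(dist_m, -1.0, 0.0, 1.5)
    mid = tri(dist_m, 0.5, 1.5, 2.5)
    far = tri(dist_m, 1.5, 3.0, 10.0)
    w = near + mid + far
    if w == 0.0:
        return 0.0            # out of range: fan off
    # Weighted-average defuzzification onto a 0..1 fan duty cycle
    return (near * 1.0 + mid * 0.5 + far * 0.1) / w
```

On the hardware side, the returned level would simply be scaled to a PWM duty cycle for the fan.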
... Kaczmarek, Webster, Bach-y-Rita, and Tompkins (1991) found that areas of the skin have varying concentrations of receptors for different tactile stimuli; for example, vibrotactile receptors are more concentrated in the fingertips than in the palms. Visell (2009) concluded in their review that the design and mode of the human-system interface moderated the effectiveness of substitution systems. There is some consistency that the effective tactile field of view is limited to one finger from either hand, as tactile acuity decreases with more points of contact (Craig, 1985; Loomis, Klatzky & Lederman, 1991). ...
Preprint
Full-text available
Sensory substitution has influenced the design of many tactile visual substitution systems with the aim of offering visual aids for the blind. This paper focuses on whether a novel electromagnetic vibrotactile display, a four-by-four vibrotactile matrix of taxels, can serve as an aid for dynamic communication for visually impaired people. A mixed-methods approach was used to assess, first, whether pattern complexity affected undergraduate participants' perceptive success and, second, whether participants' total scores positively correlated with their perceived success ratings. A thematic analysis was also conducted on participants' experiences with the vibrotactile display and the methods of interaction they used. The results indicated that complex patterns were less accurately perceived than simple and linear patterns respectively, and no significant correlation was found between participants' scores and perceived success ratings. Additionally, most participants interacted with the vibrotactile display in similar ways, using one finger to feel one taxel at a time; arguably the most effective strategy according to previous research. This technology could have applications to navigational and communication aids for the visually impaired and road users.
... Vision and touch share a common sensory strategy: Both modalities are based on two-dimensional (2D) arrays of receptors that actively scan the environment (Pei et al., 2010;Ahissar and Arieli, 2001;Collins and Bach-Y-Rita, 1973;Borji et al., 2013). Humans, and primates in general, typically move their eyes and hands while perceiving external objects, in patterns whose interactions with the objects determine receptor activations (Ahissar and Vaadia, 1990;Connor and Johnson, 1992;Kelso, 1997;Gamzu and Ahissar, 2001;Visell, 2009;Yoshioka et al., 2011;Ahissar and Arieli, 2012;Mannan et al., 2009). In both systems, this active-sensing strategy is supported by an elaborated system of parallel, multi-level motor-sensorymotor loops (Ahissar and Assa, 2016) and induces spatiotemporal sensory coding Arieli, 2001, 2012;Pei et al., 2010;Saig et al., 2012;Ahissar and Vaadia, 1990;Gamzu and Ahissar, 2001;Hollins and Bensmaïa, 2007;Cascio and Sathian, 2001;Sathian, 1989). ...
Article
Full-text available
We examined the development of new sensing abilities in adults by training participants to perceive remote objects through their fingers. Using an Active-Sensing based sensory Substitution device (ASenSub), participants quickly learned to perceive rapidly via the new modality, and preserved their high performance for more than 20 months. Both sighted and blind participants exhibited almost complete transfer of performance from 2D images to novel 3D physical objects. Perceptual accuracy and speed using the ASenSub were, on average, 300% and 600% better than in previous reports for 2D images and 3D objects. This improvement is attributed to the ability of the participants to employ their own motor-sensory strategies. Sighted participants' dominant strategy was based on motor-sensory convergence on the most informative regions of objects, similar to fixation patterns in vision. Congenitally blind participants did not show such a tendency, and many of their exploratory procedures resembled those observed in natural touch.
... The experiments carried out by Bach-y-Rita on the TVSS led to the commercialization of "VideoTact" [179], a device that allows blind users to recognize the shapes of simple objects and the orientations of lines through an array of vibrators acting on different areas of the skin. Two other devices enabling vision substitution are the "Visotoner", which converts video input into haptic signals through vibrators placed on the fingers, and a camera system electronically interfaced with a flexible pad of vibrators placed on the skin of the stomach or back [180]. ...
Article
Purpose: The aim of this review is to analyze haptic sensory substitution technologies for deaf, blind and deaf–blind individuals. Method: The literature search was performed in the Scopus, PubMed and Google Scholar databases using selected keywords, analyzing studies from the 1960s to the present. The database search for scientific publications was accompanied by a web search for commercial devices. Results were classified by sensory disability and functionality, and analyzed by assistive technology. Complementary analyses were also carried out on the websites of public international agencies, such as the World Health Organization (WHO), and of associations representing sensory disabled persons. Results: The reviewed literature provides evidence that sensory substitution aids are able to mitigate in part the deficits in language learning, communication and navigation for deaf, blind and deaf–blind individuals, and that the tactile sense can serve as a means of communication, providing certain kinds of information to sensory disabled individuals. Conclusions: A lack of acceptance emerged from the discussion of the capabilities and limitations of haptic assistive technologies. Future research should move towards miniaturized, custom-designed and low-cost haptic interfaces, and towards integration with personal devices such as smartphones, for wider diffusion of sensory aids among the disabled. • Implications for rehabilitation • Systematic review of the state of the art of haptic assistive technologies for vision and audition sensory disabilities. • Sensory substitution systems for visual and hearing disabilities have a central role in the transmission of information for patients with sensory impairments, enabling users to interact with the non-disabled community in daily activities. • Visual and auditory inputs are converted into haptic feedback via different actuation technologies. The information is presented in the form of static or dynamic stimulation of the skin.
• Their effectiveness and ease of use make haptic sensory substitution systems suitable for patients with different levels of disability. They constitute a cheaper and less invasive alternative to implantable partial sensory restitution systems. • Future research is oriented towards optimization of the stimulation parameters, together with the development of miniaturized, custom-designed and low-cost aids operating in synergy in networks, aiming to increase patients' acceptance of these technologies.
... Following this, the connections between A1 and S1 became sparser on the lesioned side, but the connections between A1 and OP1/S2 became more ... Occasionally, the term can be regarded as an expression subsumed under the concept of neural plasticity, which includes it. Indeed, it can refer to the flexibility with which an intrinsic sensory capacity is used and linked to another sensory capacity (Visell, 2008). From this perspective, sensory substitution offers the possibility of recovering abilities (for example in ...
... Research on haptics and tactile transducers is an interdisciplinary field that covers perception [1], psychophysics [2], vision substitution systems for the blind and visually impaired [3], [4], virtual reality (VR), mechanism design and control, bio-medical engineering, mobile communication [5], telerobotics [6], [7], material recognition [8], and human-computer interaction (HCI) [9]. ...
Conference Paper
This study presents a novel 4x4 fingertip tactile matrix actuator that can be strapped on a finger. It is made from Dot Braille cells purchased from Dot Inc., Korea. The prototype has a surface area of 1.08 cm², a pin pitch of 2.6 mm, and operates from a 5 V supply. Each tactile pin can be controlled using an H-bridge motor driver and an Arduino microcontroller. The tactile matrix is coupled with a tactile matrix simulator that scans a binary image, or the edges of an image obtained using the Canny edge detector. The simulator has 16 sections corresponding to the 16 actuator pins. The integration of the simulator with the hardware prototype allows the user to feel a binary image of a plane geometric figure, or to feel the edges of an image as the scanning region of interest (ROI) moves across the visual screen. This fingertip tactile matrix display would be useful in many Virtual Reality (VR) applications to provide tactile feedback on the textures of virtual objects. The authors therefore suggest that this device will be beneficial in applications such as virtual surgery, virtual fashion, remote sensing, and telerobotics.
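The ROI-to-pin mapping described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the ROI size and the any-pixel-raises-the-pin rule are assumptions.

```python
# Sketch: divide a binary-image ROI into 16 sections (4x4) and raise a pin
# whenever its section contains at least one "on" pixel.

def roi_to_pins(image, top, left, size=8):
    """image: 2D list of 0/1 pixels. Returns a 4x4 list of pin states for a
    size x size ROI whose top-left corner is (top, left).
    `size` is assumed divisible by 4."""
    cell = size // 4
    pins = [[0] * 4 for _ in range(4)]
    for r in range(4):
        for c in range(4):
            section = [
                image[top + r * cell + i][left + c * cell + j]
                for i in range(cell) for j in range(cell)
            ]
            pins[r][c] = 1 if any(section) else 0
    return pins
```

Sliding `(top, left)` across the screen then yields the succession of 4x4 frames that the actuator would render as the ROI moves.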
Chapter
Cross-modal plasticity allows an ordinary sense to develop supernormal capability as a result of deprivation in another sensory modality. This ability of the brain has created a potentially fertile framework for the development of sensory substitution systems and brain-machine interfaces.
Conference Paper
Sensory substitution and augmentation (SSA) offer novel perceptual experiences to users, by taking sensor data appropriate to one modality and mapping it to another. This technology can be implemented as a new kind of digital game interface, bringing new skills and novel experience in games designed to take advantage of it. My research explores practical and theoretical obstacles, to establish the potential efficacy of this approach. At the same time, I aim to increase understanding of the nature of perceptual experience, especially in relation to real-world and virtual perceptual environments.
Chapter
In order to engineer haptic technologies for the hand, knowledge about what signals are felt during natural interactions is needed. The findings from Chaps. 3 and 4 demonstrate the utility of examining distributed tactile vibrations accompanying whole-hand haptic interactions. However, existing sensing devices cannot capture the full range of tactile information in the naturally behaving hand and are unable to match human abilities of perception and action. Thus, this chapter presents the design of a new sensor apparatus, comprising a 126-channel wearable tactile sensor array, that is adapted to the anatomy of the hand and corresponds to the frequency sensitivity range of human tactile sensing. This device permits tactile sensing in vivo without kinematic constraints on hand movements. It provides new methods for collecting tactile data outside of constrained laboratory experiments, physiologically informed signal processing methods for reconstructing whole-hand tactile signals, and new methods for distributed tactile sensing that may have applications in robotics, upper-limb prosthetics, and other domains.
Article
Recent advances in the rapidly growing field of soft robotics highlight the potential for innovations in wearable soft robotics to meet challenges and opportunities affecting individuals, society, and the economy. Some of the most promising application areas include wearable haptic interfaces, assistive robotics, and biomedical devices. Several attributes of soft robotic systems make them well-suited for use in human-wearable applications. Such systems can be designed to accommodate the complex morphology and movements of the human body, can afford sufficient compliance to ensure safe operation in intimate proximity with humans, and can provide context-appropriate haptic feedback or assistance to their wearers. Many soft robotic systems have been designed to resemble garments or wearables that are already widely used today. Such systems could one day become seamlessly integrated into a myriad of human activities and environments. Here, we review emerging advances in wearable soft robotic technologies and systems, including numerous examples from prior research. We discuss important considerations for the design of such systems based on functional concerns, wearability, and ergonomics. We describe an array of design strategies that have been adopted in prior research. We review wearable soft robotics applications in diverse domains, survey sensing and actuation technologies, materials, and fabrication methods. We conclude by discussing frontiers, challenges, and future prospects for soft, wearable robotics.
Chapter
Full-text available
Despite its behavioural significance and omnipresence throughout the animal kingdom, the sense of touch is still one of the least studied and understood modalities. There are multiple forms of touch, and the mechanosensory basis underlying touch perception must be divided into several distinct sub-modalities (such as vibration or pressure), as will be made clear by the contributions elsewhere in this encyclopaedia. The commonality of all touch sensing systems is that touch experience is mediated by specialised receptors embedded in the integument—the outer protective layers of the animal such as the mammalian skin or the arthropod cuticle. Comparative research on touch, and its neuroethology, is only just beginning to provide a larger picture of the different forms of touch sensing within the animal kingdom. We begin our volume by reviewing works on several different invertebrate and vertebrate species, focusing on mechanosensation, each one with a specific requirement for tactile information. The aim of this introductory overview is to give selected examples of research on important model organisms from various classes of the animal kingdom, ranging from the skin of worms to the feelers of insects, and from the whiskers of a rat to the human hand. We conclude by discussing forms of human touch and the possibility of its future extension via synthetic systems.
Chapter
Book link: https://books.google.ca/books?hl=en&lr=&id=ZfRNEAAAQBAJ&oi=fnd&pg=PA3&ots=KvImUNi961&sig=40HuC2lumwM6Ns0SMtVEja-W38g&redir_esc=y#v=onepage&q&f=false This chapter presents the same cost-effective Arduino-based olfactory interface and proof-of-concept counting game described in the preprint entry above.
Chapter
Full-text available
The article presents the results of a study of the dynamics of the currently most popular cryptocurrencies (Bitcoin, Ethereum and Ripple). The paper provides the opinions of various researchers regarding the economic nature of cryptocurrencies. Given that the most significant drawback of modern cryptocurrencies is the high volatility of their market prices, the goal was to find an adequate method for forecasting cryptocurrency rates. Having analyzed an extensive selection of scientific articles on this topic and tested the cryptocurrency market for its information efficiency, the authors chose the heterogeneous autoregressive model of realized volatility (HAR-RV) as a working tool for predicting cryptocurrency rates. To improve the accuracy of the forecast, it was additionally proposed to calculate the Shannon entropy based on the probabilities of a decrease in cryptocurrency market prices. The forecast accuracy of the proposed method exceeded even the most optimistic expectations. This method of calculating the market rate of cryptocurrencies will be useful to investors and speculators when making management decisions.
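The HAR-RV model named in the abstract is conventionally written as a regression of next-period realized volatility on daily, weekly, and monthly volatility averages; the horizons below (1, 5, and 22 trading days) are the standard choices in the HAR literature, not necessarily the authors' exact specification, and likewise the binary Shannon entropy of the down-move probability p is one natural reading of the entropy term they describe:

```latex
RV_{t+1} = \beta_0 + \beta_d \, RV_t + \beta_w \, RV_t^{(w)} + \beta_m \, RV_t^{(m)} + \varepsilon_{t+1},
\qquad
RV_t^{(w)} = \tfrac{1}{5}\sum_{i=0}^{4} RV_{t-i},
\quad
RV_t^{(m)} = \tfrac{1}{22}\sum_{i=0}^{21} RV_{t-i}

H(p) = -\,p \log_2 p \;-\; (1-p)\log_2(1-p)
```

The three lagged averages let a single linear model capture volatility persistence at daily, weekly, and monthly time scales, which is why HAR-RV is a common baseline for volatile assets such as cryptocurrencies.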
Article
Human-machine interface (HMI) techniques use bioelectrical signals to gain real-time synchronised communication between the human body and machine functioning. HMI technology not only provides a real-time control access but also has the ability to control multiple functions at a single instance of time with modest human inputs and increased efficiency. The HMI technologies yield advanced control access on numerous applications such as health monitoring, medical diagnostics, development of prosthetic and assistive devices, automotive and aerospace industry, robotic controls and many more fields. In this paper, various physiological signals, their acquisition and processing techniques along with their respective applications in different HMI technologies have been discussed.
Article
Tactile sensing is essential for skilled manipulation and object perception. Existing sensing devices cannot capture the full range of tactile information in the naturally behaving hand, and are unable to match human abilities of perception and action. Human touch sensing is mediated via contact-generated mechanical signals in the skin. Time-varying contacts elicit propagating mechanical waves that are captured via numerous vibration-sensitive neurons distributed throughout the hand, yielding a wealth of sensory information. Little engineering attention has been given to this important biological sensing system. Inspired by human sensing abilities, we present a wearable system based on a 126-channel sensor array capable of capturing high resolution tactile signals throughout the hand during natural manual activities. It employs a network of miniature three-axis sensors mounted on a flexible circuit whose geometry is adapted to the anatomy of the hand, allowing tactile data to be captured during natural manual interactions. Each sensor possesses a frequency bandwidth overlapping the entire human tactile frequency range. Data is acquired in real time via a custom FPGA and an I2C network. We also present physiologically informed signal processing methods for reconstructing whole-hand tactile signals. We report experiments that demonstrate the utility of this system for collecting rich tactile signals during manual interactions.
Article
This paper is an introduction to pseudo-haptics; that is, the use of touch-based illusions created by cross-modal perceptual interactions. Many studies have shown that it is possible to use visual or auditory stimuli to simulate the experience of touch, movement, and force. Pseudo-haptics is useful in many applications, particularly where the user may not have a haptic device available but where the sensation of haptic feedback is useful in providing information or creating a sense of presence. In this paper, we present an overview of the current state of the literature on pseudo-haptics.
Chapter
This chapter addresses sensors and actuators for three main sensory modalities: hearing, vision, and touch. Technology in recent years has often focused on flat displays, e.g., smartphones or smartwatches. In contrast, the focus of this chapter is on wearable technologies of the post-smartphone era, allowing rich sensory input and output modalities. New approaches for tactile feedback include fiber-based sensors and actuators for e-textiles, as well as complex 3D-fiber structures. Wearable sound sources and innovative sound-projection solutions using perceptual knowledge can replace complex distributed channel systems. One way to reduce latencies in vision is Organic Light-Emitting Diode (OLED) eyeables, which have the potential to significantly improve reaction time and power consumption. Combining expertise in human perception and action with expertise in sensors and actuators contributes to augmented perception and interaction, exchanging the perceptual knowledge needed for holistic sensor and actuator development. Moreover, sensors and actuators combined with tactile electronics, in terms of flexible electronics and circuits for interfacing/processing, open up new technologies for haptic and robotic systems.
Chapter
The sense of touch can be used for sensory substitution, i.e., to represent visual or auditory cues to impaired users. Sensory substitution often requires extensive training of subjects, leading to exhaustion and frustration over time. The goal of this paper is to investigate the ability of subjects to recognize alphanumeric letters on a 3 × 3 vibration array, where the subjects can fully personalize the variables, including spatial location, vibratory rhythm, burst duration and intensity. We present a vibrotactile device for delivering the spatiotemporal letter patterns while maintaining a high level of expressiveness. The results show that this system is an effective, low-cognitive-load solution for visually/hearing impaired people and for any context that would benefit from leaving the eyes/ears free for other tasks.
Article
Full-text available
Haptic interfaces enable person-machine communication through touch, and most commonly, in response to user movements. We comment on a distinct property of haptic interfaces, that of providing for simultaneous information exchange between a user and a machine. We also comment on the fact that, like other kinds of displays, they can take advantage of both the strengths and the limitations of human perception. The paper then proceeds with a description of the components and the modus operandi of haptic interfaces, followed by a list of current and prospective applications and a discussion of a cross-section of current device designs.
Article
Full-text available
During a 10-day taxi flight to the International Space Station (ISS) in 2004, Dutch astronaut André Kuipers is scheduled to test a multi-purpose vibrotactile vest. The main application of the vest is supporting the astronaut's orientation awareness. To this end, we employ an artificial gravity vector analogy: the location of vibration on the torso indicates the direction of a vector representing the standard ISS orientation. This application is hypothesised to increase the astronaut's performance and safety. A second application is designed to be used during rest. Additional vibrating elements are attached to the ankles, knees, elbows and wrists of the astronaut. By using specific, pre-programmed spatiotemporal patterns, the astronaut can get the feeling of whole-body stimulation. This will support the astronaut in sensing and locating his extremities in space. This application is hypothesised to compensate for the sensory deprivation of the proprioceptive system during weightlessness, resulting in increased comfort for the astronaut.
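The artificial-gravity-vector analogy can be sketched as picking the torso tactor nearest to the reference vector's direction. The 8-tactor ring and the angle convention (index 0 at the sternum, increasing clockwise) are assumptions for illustration, not the vest's actual layout:

```python
# Sketch: map the horizontal component of a reference "down" vector to the
# nearest of n tactors arranged in a ring around the torso.

from math import atan2, degrees

def tactor_for_vector(x, y, n_tactors=8):
    """Return the index (0..n_tactors-1) of the ring tactor closest to the
    direction of the (x, y) component of the reference vector.
    Index 0 points toward +y; indices increase clockwise."""
    ang = degrees(atan2(x, y)) % 360.0
    return round(ang / (360.0 / n_tactors)) % n_tactors
```

As the astronaut rotates relative to the standard ISS orientation, the active tactor migrates around the torso, which is the orientation cue the vest is meant to provide.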
Article
Full-text available
We studied the feasibility of focused ultrasound for a tactile feeling display. Our experiments showed that focused ultrasound at 1 MHz and 5 MHz affected our somatic sensation both through its radiation pressure and through direct stimulation of the nerve structures. The radiation pressure force generated by our apparatus was 0.23 gf/cm² per watt of power consumption and evoked tactile feeling stably, while direct nerve stimulation could not induce stable tactile feeling.
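For readers more used to SI units, the reported figure of 0.23 gf/cm² per watt converts to roughly 22.6 Pa per watt; a minimal conversion sketch:

```python
# Back-of-the-envelope conversion of the reported radiation pressure
# (0.23 gram-force per cm^2 per watt of power consumption) into pascals.

GF_TO_N = 9.80665e-3   # 1 gram-force in newtons
CM2_TO_M2 = 1e-4       # 1 cm^2 in m^2

def radiation_pressure_pa(power_w, gf_per_cm2_per_w=0.23):
    """Radiation pressure in pascals for a given power consumption (W)."""
    return gf_per_cm2_per_w * power_w * GF_TO_N / CM2_TO_M2
```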
Article
Full-text available
Virtual reality techniques allow one to interact with synthetic worlds, i.e. virtual environments. Tactile feedback means conveying parameters such as roughness, rigidity, and temperature. This information, and much more, is obtained through the sense of touch. Tactile feedback is of prime importance in many applications. In this paper we examine the state of the art in tactile interface design. Thorough reviews of the literature reveal a significant number of publications concerning tactile human-machine interfaces, especially from around 1990 onwards. This paper reports the progress in tactile interface technology in the following areas: teleoperation, telepresence, sensory substitution, 3D surface generation, Braille systems, laboratory prototypes, and games. The different parameters and specifications required to generate and feed back tactile sensation are summarized.
Article
Full-text available
Tactation is the sensation perceived by the sense of touch, and is based on the skin's receptors. The skin's nerves can be stimulated by mechanical, electrical or thermal stimuli. Apart from fibers for pain, the skin has six more types of receptors. A review of the state of the art is presented, covering the physiological and technological principles, considerations and characteristics, as well as the latest implementations, of micro-actuator-based tactile graphic displays and the associated software interface structures and representations. Fabrication technologies are reviewed in order to demonstrate their potential in tactile applications. Existing electronic Braille displays for accessibility are limited to text-based information. Graphic tactile displays enable viewing images by the sense of touch on a reusable surface, and substitution of the visual/auditory sense. Applications include education, engineering/artistic design, web surfing, and viewing of art and photographs. Tactile substitution can be used to augment accessibility for the blind or deaf in order: (a) to enhance access to computer graphical user interfaces, and (b) to enhance mobility in controlled environments. In general, tactation-based interfaces may allow communication of visual information to the brain in situations where the visual or hearing system is already overloaded, such as for race car drivers, airplane pilots, operating rooms, virtual reality and tele-presence.
Article
Full-text available
Studies using the sensory substitution devices designed for visually impaired persons reveal that perceptive activity itself is embodied in a lived body capable of movement and possessing its own spatial dimensions. We have used these prosthetic devices, based on the substitution of the visual sensory input by a tactile sensory input, in order to carry out a systematic study of perception, and in particular of spatial depth. We show that the spatial localisation of a target requires dynamic sensori-motor coupling, and that this activity involves the spatial dimensions of the lived body of the perceiving subject.
Article
Full-text available
We have been developing an electro-tactile display to present realistic tactile sensation for Virtual Reality and robot teleoperation. This paper shows quantitative evidence that Meissner corpuscles are selectively activated in our specific setup of electrical stimulation, dubbed "Meissner mode." We also show the fundamental limitation of electro-tactile displays, which is caused by the physical incapacity to selectively stimulate Pacinian corpuscles, which reside in the deeper tissue, because shallower parts are inevitably co-activated.
Article
Full-text available
EXPERIMENTAL QUESTIONS There have been many accounts of blind and blindfolded subjects using sensory substitution devices to behave successfully with respect to visual stimuli (Bach-y-Rita, 1972; Auvray et al., 2007). Yet there remain unanswered questions about why sensory substitution works: 1) Does sensory substitution enable subjects to perceive distal objects, or do they become aware of them as a result of cognitive inferences on the proximal stimulation? 2) How does self-movement facilitate distal attribution during sensory substitution?
METHODS Subjects: 31 sighted participants. Apparatus: A simplified sensory substitution device was used, similar to that of Lenay et al. (2001). The device consisted of a single finger-mounted photodiode that activated a small vibrating motor (1.3 cm diameter, worn on the back) whenever subjects directed it toward a light source. Experimental protocol: Blindfolded, seated subjects used the device to determine the distance of a fluorescent light placed randomly along a 1.93-m track. After 2 minutes, the light was removed, and the subject visually guided a remote-controlled target to the remembered location of the light. Experimental phases: (1) Learning phase: 60 trials over two sessions; no feedback about performance or the nature of the light source. (2) Transfer phase: 30 trials over one session; the device was altered prior to the start of this phase to determine whether learned abilities transfer to new conditions. Instructional conditions: Group CT ("conscious triangulation"): instructed to attend to proximal variables (e.g., arm angle) [N = 11]; Group DA ("distal attribution"): instructed to ignore proximal variables [N = 20].
RESULTS Responses to the question "how solid does the light feel?" were negatively correlated with mean unsigned error (p = 0.0005). The amount of attention subjects reported paying to their arms was positively correlated with error (p = 0.0387). Learning transferred to new limb configurations. Arm transfer [N = 11]: the photodiode was transferred to the index finger of the opposite hand; there was no significant change in accuracy, and mean unsigned error was significantly better than that of the first 20 trials (p = 0.0065, t-test). Rotation transfer [N = 9]: the subject's body was rotated 90°; there was no significant change in accuracy.
CONCLUSIONS • Biasing subjects toward attending to the light itself (DA), as opposed to proximal variables only (CT), improved distance judgments. Subjects who experienced the light as a "solid object" performed the task better. This is evidence that sensory substitution devices enable their users to perceive distal objects, rather than simply learn about the environment through conscious inferences on proximal stimulation. • Based on the results of the transfer phase, abilities gained during the learning phase were not disrupted by changes in the "sensorimotor contingencies" (O'Regan & Noë, 2001) involved in the task. The perceptual skills learned by subjects are not limited to a single motor system and are not disrupted by changes in the relationship between arm angle and the distance of the light.
REFERENCES Auvray M, et al. (2007) Learning to perceive with a visuo-auditory substitution system: Localisation and object recognition with 'the vOICe'. Perception, 36: 416-430. Bach-y-Rita P (1972) Brain Mechanisms in Sensory Substitution (New York: Academic Press). Lenay C, et al. (2001) The constitution of spatiality in relation to the lived body. Conf. on the Emergence and Development of Embodied Cognition (Beijing, China: August). O'Regan K and Noë A (2001) A sensorimotor account of visual consciousness. Behavioral and Brain Sciences, 24: 939-1031.
Technical Report
Full-text available
The blind and the visually impaired are in a unique position to appreciate and make functional use of haptic devices. Designing devices for the blind is, however, more arduous than many researchers and inventors expect. It is thus important to fully understand the needs and requirements of that community before attempting to create devices for them. It is also important to learn from past research and development in the application of technology for the blind. This survey provides an overview of current knowledge on blindness and rehabilitation technology relevant to the design of aids for the blind, and more particularly to the use of haptics with the blind. The survey begins with a demystification of blindness and a discussion of the differences between the blind and the sighted. There follows a broad overview of the many attempts at applying technological solutions to problems encountered by the blind. The survey ends with a discussion of lessons learned from previous failures and successes in rehabilitation technology, as well as speculation on the future of haptics and other technologies for people living with blindness.
Article
Full-text available
When an object rolls or slides inside a hand-held tube, a variety of cues are normally available to estimate its location inside the cavity. These cues are related to the dynamics of an object subjected to the laws of physics, such as gravity and friction. This may be viewed as a form of sensory-motor coupling which does not involve vision but which links motor output to acoustic and tactile inputs. The theory of sensory-motor contingency posits that humans exploit invariants about the physics of their environment and about their own sensory-motor apparatus to develop their perception of the outside world. We report on the design and the results of an experiment where subjects held an apparatus that simulated the physics of an object rolling or sliding inside a tubular cavity. The apparatus synthesized simple haptic cues resulting from rolling noise or impact on the internal walls. Given these cues, subjects were asked to discriminate between the lengths of different virtual tubes. The subjects were not trained at the task and had to make judgments from a single gesture. The results support the idea that the subjects mastered invariants related to the dynamics of objects under the influence of gravity and were able to use them to perceive the length of invisible cavities.
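The gravity invariant that subjects could exploit is easy to make concrete: in a tilted tube, the time between end-wall impacts grows with tube length. The frictionless, start-from-rest model below is a toy sketch of this relationship, not the apparatus's actual physics simulation:

```python
# Toy model: a ball starts at rest at one end of a tube tilted by theta and
# slides to the other end under gravity alone, so L = 0.5 * g*sin(theta) * t^2.
# The inter-impact time t therefore encodes the tube length L.

from math import sin, radians

G = 9.81  # gravitational acceleration, m/s^2

def time_to_impact(length_m, tilt_deg):
    """Time (s) for the ball to traverse a tube of the given length."""
    a = G * sin(radians(tilt_deg))
    return (2 * length_m / a) ** 0.5
```

Doubling the tube length multiplies the inter-impact time by √2 at any fixed tilt, which is exactly the kind of lawful relationship the sensory-motor contingency account says subjects can internalize.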
Article
Full-text available
In this work, the tactual information transmission capabilities of a tactual display designed to provide stimulation along a continuum from kinesthetic movements to cutaneous vibrations are assessed. The display is capable of delivering arbitrary waveforms to three digits (thumb, index, and middle finger) within an amplitude range from absolute detection threshold to about 50 dB sensation level and a frequency range from dc to above 300 Hz. Stimulus sets were designed at each of three signal durations (125, 250, and 500 msec) by combining salient attributes, such as frequency (further divided into low, middle, and high regions), amplitude, direction of motion, and finger location. Estimated static information transfer (IT) was 6.5 bits at 500 msec, 6.4 bits at 250 msec, and 5.6 bits at 125 msec. Estimates of IT rate were derived from identification experiments in which the subject's task was to identify the middle stimulus in a sequence of three stimuli randomly selected from a given stimulus set. On the basis of extrapolations from these IT measurements to continuous streams, the IT rate was estimated to be about 12 bits/sec, which is roughly the same as that achieved by Tadoma users in tactual speech communication.
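Static IT figures like those quoted above are conventionally estimated from a stimulus-response confusion matrix. The following sketch (not the authors' code; the example matrix is hypothetical) shows the standard maximum-likelihood estimate of transmitted information:

```python
import math

def information_transfer(confusions):
    """Estimate information transfer (bits) from a stimulus-response
    confusion matrix: IT = sum over (s, r) of p(s,r) * log2(p(s,r) / (p(s) * p(r)))."""
    n = sum(sum(row) for row in confusions)
    row_tot = [sum(row) for row in confusions]          # stimulus marginals
    col_tot = [sum(col) for col in zip(*confusions)]    # response marginals
    it = 0.0
    for i, row in enumerate(confusions):
        for j, n_ij in enumerate(row):
            if n_ij > 0:
                it += (n_ij / n) * math.log2(n_ij * n / (row_tot[i] * col_tot[j]))
    return it

# A perfectly identified 4-stimulus set transmits log2(4) = 2 bits.
perfect = [[25, 0, 0, 0], [0, 25, 0, 0], [0, 0, 25, 0], [0, 0, 0, 25]]
print(information_transfer(perfect))  # → 2.0
```

The estimate saturates at log2 of the stimulus-set size, which is why larger sets and longer durations are needed to reveal higher channel capacities.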
Conference Paper
Full-text available
Is there a way for an algorithm linked to an unknown body to infer by itself information about this body and the world it is in? Taking the case of space for example, is there a way for this algorithm to realize that its body is in a three-dimensional world? Is it possible for this algorithm to discover how to move in a straight line? And more basically: do these questions make any sense at all given that the algorithm only has access to the very high-dimensional data consisting of its sensory inputs and motor outputs? We demonstrate in this article how these questions can be given a positive answer. We show that it is possible to make an algorithm that, by analyzing the law that links its motor outputs to its sensory inputs, discovers information about the structure of the world regardless of the devices constituting the body it is linked to. We present results from simulations demonstrating a way to issue motor orders resulting in "fundamental" movements of the body as regards the structure of the physical world.
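A toy version of this idea can be sketched as follows (an illustration under assumed maps, not the authors' algorithm): probe the unknown motor-to-sensor law with small random motor perturbations and count the dimensions spanned by the resulting sensory variations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hidden "body": 10 motor channels drive a 3-D pose, which 40 sensors
# observe through a fixed nonlinearity. The agent sees only m and s.
B = rng.standard_normal((3, 10))   # motors -> pose (unknown to the agent)
A = rng.standard_normal((40, 3))   # pose -> sensors (unknown to the agent)

def sense(m):
    return np.tanh(A @ (B @ m))

m0 = rng.standard_normal(10)
s0 = sense(m0)

# Probe the sensorimotor law with small random motor perturbations.
deltas = np.array([sense(m0 + 1e-5 * rng.standard_normal(10)) - s0
                   for _ in range(200)])

# The numerical rank of the sensory variations reveals the dimension of
# the underlying space of body configurations, not the number of motors
# (10) or sensors (40).
sv = np.linalg.svd(deltas, compute_uv=False)
dim = int(np.sum(sv > 1e-3 * sv[0]))
print(dim)  # → 3
```

The agent recovers "3" without any prior knowledge of A or B, which is the flavor of the result described above.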
Conference Paper
Full-text available
Presents the design and testing of a multi-channel vibrotactile display. It is composed of a cylindrical handle with four embedded vibrating elements driven by piezoelectric beams. Vibrations are transmitted to the hands through arrays of pins. The device was tested in sensory substitution for conveying force information during a teleoperated peg insertion. Results show that the device is effective in reducing peak forces during the insertion task.
Article
Full-text available
We see with the brain, not the eyes (Bach-y-Rita, 1972); images that pass through our pupils go no further than the retina. From there image information travels to the rest of the brain by means of coded pulse trains, and the brain, being highly plastic, can learn to interpret them in visual terms. Perceptual levels of the brain interpret the spatially encoded neural activity, modified and augmented by nonsynaptic and other brain plasticity mechanisms (Bach-y-Rita, 1972, 1995, 1999, in press). However, the cognitive value of that information is not merely a process of image analysis. Perception of the image relies on memory, learning, contextual interpretation (e.g., we perceive intent of the driver in the slight lateral movements of a car in front of us on the highway), cultural, and other social factors that are probably exclusively human characteristics that provide "qualia" (Bach-y-Rita, 1996b). This is the basis for our tactile vision substitution system (TVSS) studies that, starting in 1963, have demonstrated that visual information and the subjective qualities of seeing can be obtained tactually using sensory substitution systems.
Article
Full-text available
Tactile arrays use a matrix of individually controllable elements to present spatial and temporal patterns of cutaneous information. Early devices of this type were in the field of sensory substitution, to replace vision or hearing for users with a sensory impairment. Many advances have been made due to the appropriation of tactile displays for telerobotics and virtual reality, to represent physical contact with a remote or simulated environment. However, many of these have been limited to engineering prototypes. The recent commercial availability of affordable, portable tactile pin arrays has provided renewed impetus to apply the technology to sensory substitution applications. Lack of access to digitally stored data can prove a significant barrier to blind people seeking careers in numerate disciplines. Tactile displays could potentially provide a discrete and portable means of accessing graphical information in an intuitive non-visual manner. Results are presented from experiments on tactual perception related to understanding graphs and simple visualisations with a commercially available tactile array device. It was found that subjects could discriminate positive or negative line gradient to within ±4.7° of the horizontal, compared to ±3.25° for results with a force feedback mouse and ±2.42° with a raised paper representation.
Article
Full-text available
This paper provides a general overview of tactual displays (i.e., devices that communicate to a user through the sense of touch) and issues concerning the development of such displays for wearable computing. A wearable tactile directional display is presented. It takes advantage of a sensory phenomenon called ‘sensory saltation’ and simulates directional lines and simple geometric patterns dynamically. Initial results indicate that this information is intuitive and easy to interpret even for first-time users, and interpretations are highly consistent among observers. Several applications of this tactile directional display for wearable computing are proposed.
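Sensory saltation is typically evoked by delivering brief pulses at a few discrete tactor sites with short, uniform inter-stimulus intervals; the pulses are then felt as evenly spaced taps sweeping along the skin rather than as three separate bursts. A minimal pulse scheduler for such a pattern can be sketched as follows (illustrative only; the tactor names and timing values are assumptions, not the authors' parameters):

```python
def saltation_schedule(tactors, pulses_per_tactor=3, isi_ms=50):
    """Return (time_ms, tactor) events: a burst of brief pulses at each
    tactor in turn, with a uniform inter-stimulus interval throughout."""
    events = []
    t = 0
    for tactor in tactors:
        for _ in range(pulses_per_tactor):
            events.append((t, tactor))
            t += isi_ms
    return events

# Three tactors along the forearm produce a line "drawn" toward the wrist.
schedule = saltation_schedule(["elbow", "mid_forearm", "wrist"])
print(schedule[:4])
# → [(0, 'elbow'), (50, 'elbow'), (100, 'elbow'), (150, 'mid_forearm')]
```

Because the percept depends on the uniform timing rather than on dense tactor placement, a sparse wearable array can simulate continuous directional lines, which is what makes the effect attractive for wearable displays.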
Article
Full-text available
The suggestion that the body surface might be used as an additional means of presenting information to human-machine operators has been around in the literature for nearly 50 years. Although recent technological advances have made the possibility of using the body as a receptive surface much more realistic, the fundamental limitations on the human information processing of tactile stimuli presented across the body surface are, however, still largely unknown. This literature review provides an overview of studies that have attempted to use vibrotactile interfaces to convey information to human operators. The importance of investigating any possible central cognitive limitations (i.e., rather than the peripheral limitations, such as those related to sensory masking, that were typically addressed in earlier research) on tactile processing for the most effective design of body interfaces is highlighted. The applicability of the constraints emerging from studies of tactile processing under conditions of unisensory (i.e., purely tactile) stimulus presentation, to more ecologically valid conditions of multisensory stimulation, is also discussed. Finally, the results obtained from recent studies of tactile information processing under conditions of multisensory stimulation are described, and their implications for haptic/tactile interface design elucidated.
Article
Full-text available
Tactile interfaces are used to communicate information through the sense of touch, an area of growing interest in the research community. Potential applications include virtual training for surgeons, remotely touching materials via the internet, the automotive industry, active interfaces for blind persons, and sensory substitution devices.
Article
The two-point threshold, or compass test, has long been used as a measure of tactile spatial resolution; however, since it was first developed, there have been problems associated with its use. Some of these problems include setting an appropriate criterion for responding "two," extreme variability both within and between subjects, and the ability of subjects to discriminate two points from one at separations well below the two-point threshold. Recent neurophysiological results have clarified some of the neural mechanisms responsible for spatial resolution and demonstrated the inadequacy of the two-point threshold as a measure of spatial mechanisms. Several new methods may overcome these problems and provide a valid measure of spatial resolution and a reflection of neural mechanisms.
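One family of "new methods" alluded to here replaces the experimenter-set criterion for responding "two" with forced-choice adaptive procedures, whose convergence point is fixed by the staircase rule rather than by the observer's criterion. A schematic two-down/one-up staircase run against a simulated observer illustrates the idea (all parameters and the psychometric function are hypothetical, not taken from the article):

```python
import math, random

def run_staircase(true_threshold_mm, seed=1, trials=200):
    """Two-down/one-up 2AFC staircase: converges near the 70.7%-correct
    point of the simulated observer's psychometric function."""
    rng = random.Random(seed)

    def p_correct(x):  # 2AFC psychometric function with a 50% chance floor
        return 0.5 + 0.5 / (1.0 + math.exp(-(x - true_threshold_mm) / 0.3))

    x, step = 5.0, 0.5           # starting separation and step size (mm)
    correct_run, reversals, going_down = 0, [], True
    for _ in range(trials):
        if rng.random() < p_correct(x):
            correct_run += 1
            if correct_run == 2:      # two correct in a row -> make it harder
                correct_run = 0
                if not going_down:
                    reversals.append(x)
                going_down = True
                x = max(x - step, 0.0)
        else:                          # one error -> make it easier
            correct_run = 0
            if going_down:
                reversals.append(x)
            going_down = False
            x += step
    # Threshold estimate: mean of the last few reversal points.
    tail = reversals[-8:]
    return sum(tail) / len(tail)

estimate = run_staircase(true_threshold_mm=2.0)
```

Because the forced-choice rule pins the convergence level, the estimate does not drift with the subject's willingness to say "two", which is the criterion problem the article identifies.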
Article
In implementing tactile feedback, we are considering the problems of tactile transduction, signal processing, and tactile stimulation. While some of the problems, e.g. sensing, have been addressed in the past, tactile stimulation, or 'tactile display', as an informational medium is yet at a very early stage of development. Our work has focused on the design of a 5 × 5 tactile display, and the fabrication problems surrounding high spatio-temporal resolution. Using pneumatic actuators and a mix of conventional and micromachining techniques, we have prototyped and characterized the display, and created a linked sensor-display system. The display was characterized in the usual manner of a linear system, and the ability of human subjects to discriminate patterns, forces, and displacements was measured. The display was found to have a maximum force output of 340 millinewtons at each element, a force resolution of 4.4 bits, and a frequency response of 7 Hz. Human subjects were able to recognize simple geometric patterns presented on the display, discriminate forces with 3.3 bits of resolution, and sense displacements of 0.1 mm (5% of the array spacing).
Article
This paper presents the design and testing of a multi-channel vibrotactile display composed of cylindrical handle with four embedded vibrating elements driven by piezoelectric beams. The experimental goal of the paper is to analyze the performance of the device during a teleoperated force controlled task. As a test bed, a teleoperator system composed of two PHANToM haptic devices is used to trace a rectangular path while the operator attempts to maintain a constant force at the remote manipulator's tip. Four sensory modalities are compared. The first is visual feedback alone. Then, visual feedback is combined with vibration, force feedback, and force feedback plus vibration. Comparisons among these four modes are presented in terms of mean force error. Results show that force feedback combined with vibration provide the best feedback for the task. They also indicate that the vibrotactile device provides a clear benefit in the intended application, by reducing the mean force errors by 35 percent when compared to visual feedback alone.
Article
Publisher Summary: Information derived from one sensory modality can be extracted and used in another; shape can be known by touch and then identified correctly by sight. The chapter discusses the issue of internal versus experiential influences in the organization of the brain. The impressions generated by different sensory modalities can be integrated into a richer percept. The ventriloquism effect broadly refers to this phenomenon, on which much of our knowledge of the world and an entire entertainment industry are based. Multimodal association areas that contain multisensory neurons are thought to provide a neural substrate for integrating sensory experiences, modulating the saliency of stimuli, assigning experiential and affective relevance, and providing the substrate for the true perceptual experience of the rich world. The chapter presents the hypothesis that the brain might actually represent a metamodal structure organized as operators that execute a given function or computation regardless of sensory input modality. Such operators might have a predilection for a given sensory input based on its relative suitability for the assigned computation. Such predilection might lead to operator-specific selective reinforcement of certain sensory inputs, eventually generating the impression of a brain structured in parallel, segregated systems processing different sensory signals.
Article
Current results of an evaluation of the speech reception of deaf adults who are making use of tactile aids for substantial periods of time in the field are described. Eight deaf adults were provided with a wearable multichannel tactile aid that they were asked to wear routinely in their everyday lives. These subjects, who did not receive any specific training in the use of the tactile aid, participated in periodic laboratory evaluations of their speech-reception performance with the aids and also provided responses to a communication-profile survey. The laboratory results reported here were concerned primarily with assessing speechreading ability with and without tactile aids for several types of connected-speech materials and tasks. The relative gain (difference between aided and unaided speechreading scores normalized by the maximum possible improvement) observed for the reception of words in isolated sentences averaged roughly 25%, with a range of 0-65% across subjects, and showed a tendency to increase with the speechreading-alone score. Small improvements, averaging six words/minute and ranging from -2 to 10 words/minute across subjects, were observed for connected-discourse tracking when tactile aids were used to supplement speechreading compared to speechreading alone. A comparison of performance for tactile aids, cochlear implants, and hearing aids is presented for aided speechreading of sentence materials.
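The "relative gain" measure used in this evaluation has a simple closed form: the aided-minus-unaided improvement normalized by the headroom left above the unaided score. A one-function sketch makes the definition concrete (the example scores are hypothetical, not the study's data):

```python
def relative_gain(unaided_pct, aided_pct):
    """Aided improvement normalized by the maximum possible improvement,
    i.e. (aided - unaided) / (100 - unaided), on percent-correct scores."""
    return (aided_pct - unaided_pct) / (100.0 - unaided_pct)

# A speechreader scoring 40% alone and 55% with the tactile aid:
print(relative_gain(40.0, 55.0))  # → 0.25 (25% relative gain)
```

Normalizing by the remaining headroom lets subjects with very different unaided speechreading scores be compared on a common scale.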
Article
This thesis describes the development of custom-built tactile feedback hardware and its integration with an available force-reflecting haptic interface. Design requirements were motivated strongly by the characteristics of the human tactile sense as well as the biomechanical characteristics of the human finger. The work explores the feasibility of various actuators, and selects a small solenoid actuator for application in a closed-loop force control tactile feedback system. An adaptive PI algorithm using continuously variable gain scheduling helps to compensate for nonlinearities in the solenoid actuator. The system demonstrates adequate closed-loop control, but the mass added to the force-reflecting haptic interface proves less than optimal. Design suggestions for future prototypes may reduce the mass added by the tactile feedback hardware by over 30%. The work concludes with recommendations for psychophysical research that will increase understanding of human performance in tasks using haptic feedback devices.
Article
What is the relation between perceptual experience and the suite of sensorimotor skills that enable us to act in the very world we perceive? The relation, according to 'sensorimotor models' (O'Regan and Noë 2001, Noë 2004), is tight indeed. Perceptual experience, on these accounts, is enacted via skilled sensorimotor activity, and gains its content and character courtesy of our knowledge of the relations between (typically) movement and sensory stimulation. I shall argue that this formulation is too extreme, and that it fails to accommodate the substantial firewalls, dis-integrations, and special-purpose streamings that form the massed strata of human cognition. In particular, such strong sensorimotor models threaten to obscure the computationally potent insensitivity of key information-processing events to the full subtleties of embodied cycles of sensing and moving.
Article
We introduce a distinction between cortical dominance and cortical deference, and apply it to various examples of neural plasticity in which input is rerouted intermodally or intramodally to nonstandard cortical targets. In some cases but not others, cortical activity 'defers' to the nonstandard sources of input. We ask why, consider some possible explanations, and propose a dynamic sensorimotor hypothesis. We believe that this distinction is important and worthy of further study, both philosophical and empirical, whether or not our hypothesis turns out to be correct. In particular, the question of how the distinction should be explained is linked to explanatory gap issues for consciousness. Comparative and absolute explanatory gaps should be distinguished: why does neural activity in a particular area of cortex have this qualitative expression rather than that, and why does it have any qualitative expression at all? We use the dominance/deference distinction to address the comparative gaps, both intermodal and intramodal (not the absolute gap). We do so not by inward scrutiny but rather by expanding our gaze to include relations between brain, body and environment.
Article
The objective of the research presented in this paper was to study the capabilities of sensory substitution for force feedback, presented to the operator of a teleoperation system, through the tactile and auditory senses. Traditional bilateral force feedback or force reflection, which applies forces to a human operator's hand or arm muscles, while generally beneficial can be limited in the operator's ability to perceive small force values. Sensory substitution for force feedback was shown in this study to increase the operator's ability to perceive small forces by allowing an increase in the effective feedback gain without risking instability or impeding the operator's inputs to the system.
Article
A vibrotactile display, consisting of eight vibrating elements or tactors mounted in a driver’s seat, was tested in a driving simulator. Participants drove with visual, tactile and multimodal navigation displays through a built-up area. Workload and the reaction time to navigation messages were measured for normal and high workload conditions. The results demonstrated that the tactile navigation display reduces the driver’s workload compared to the visual display, particularly in the high workload group. The fastest reaction was found with the multimodal display. It was concluded that this study quantitatively supports the claims that a localised vibration or tap is an intuitive way to present direction information, and that employing the tactile channel may release other heavily loaded sensory channels, therefore potentially providing a major safety enhancement.
Article
Blind subjects who learn to read Braille must acquire the ability to extract spatial information from subtle tactile stimuli. In order to accomplish this, neuroplastic changes appear to take place. During Braille learning, the sensorimotor cortical area devoted to the representation of the reading finger enlarges. This enlargement follows a two-step process that can be demonstrated with transcranial magnetic stimulation mapping and suggests initial unmasking of existing connections and eventual establishment of more stable structural changes. In addition, Braille learning appears to be associated with the recruitment of parts of the occipital, formerly 'visual', cortex (V1 and V2) for tactile information processing. In blind, proficient Braille readers, the occipital cortex can be shown not only to be associated with tactile Braille reading but also to be critical for reading accuracy. Recent studies suggest the possibility of applying non-invasive neurophysiological techniques to guide and improve functional outcomes of these plastic changes. Such interventions might provide a means of accelerating functional adjustment to blindness.
Conference Paper
In this paper, the concept and the first proving experiments of a navigation system for assistance during surgical interventions are presented. A novel aspect of this system is a new approach to transmitting the navigation information to the surgeon. In contrast to known navigation systems, the proposed approach does not rely on visual support alone but provides information primarily through tactile stimuli, so the surgeon is not compelled to avert the gaze from the field of surgery during the navigation process. The paper concludes with an experimental evaluation of signal perception and of a positioning task with tactile transmission of the navigation information.