Distribution of mechanoreceptors in the foot's sole, following [26].

Source publication
Article
This paper presents a novel wearable interface for the foot: a shoe-integrated tactile display that enables users to obtain information through the sense of touch via their feet. A 16-point array of actuators stimulates the sole of the foot by inducing different vibration frequencies. A series of experiments were conducted with 20 sighted and 5 bli...
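As an illustration of how a 16-point display like the one described above could encode directional cues, the sketch below maps a direction to a row or column of a 4x4 actuator grid and assigns a drive frequency to the selected points. The grid layout, the pattern set, and the driver-style interface are assumptions for illustration, not the paper's actual implementation.

```python
# Minimal sketch: encoding directional cues on a hypothetical 4x4
# tactile-foot display. Actuator indices run row-major, with row 0
# assumed under the toes and row 3 under the heel (an assumption,
# not the paper's layout).

GRID_SIZE = 4  # 4 x 4 = 16 actuation points

def pattern_for_direction(direction: str) -> set[int]:
    """Return the set of actuator indices to activate for a cue."""
    if direction == "forward":
        return {col for col in range(GRID_SIZE)}                   # toe row
    if direction == "backward":
        return {3 * GRID_SIZE + col for col in range(GRID_SIZE)}   # heel row
    if direction == "left":
        return {row * GRID_SIZE for row in range(GRID_SIZE)}       # left column
    if direction == "right":
        return {row * GRID_SIZE + 3 for row in range(GRID_SIZE)}   # right column
    raise ValueError(f"unknown direction: {direction}")

def render(direction: str, frequency_hz: float) -> list[tuple[int, float]]:
    """Pair each of the 16 points with a drive frequency (0 Hz = off)."""
    active = pattern_for_direction(direction)
    return [(idx, frequency_hz if idx in active else 0.0)
            for idx in range(GRID_SIZE * GRID_SIZE)]

if __name__ == "__main__":
    for point, freq in render("left", 40.0):
        print(f"actuator {point:2d}: {freq:4.1f} Hz")
```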

Citations

... For people with visual impairments, this is especially relevant, as this participatory relationship to the "visually biased" public space has been postulated as one of the most challenging interactions between humans and spaces (11). Exploring devices that exploit tactile-foot stimulation for improving directional navigation is an exciting and necessary venture (12). Unlike assistive devices such as canes, shoe-based systems do not interfere with the natural tactile feedback of the hands during exploration or with the auditory signals used for successful navigation. ...
Article
How humans perceive the texture of a surface can inform and guide how their interaction takes place. From grasping a glass to walking on icy steps, the information we gather from the surfaces we interact with is instrumental to the success of our movements. However, the hands and feet differ in their ability to explore and identify textures. Higher concentrations of mechanoreceptors in the fingertips provide tactile information that helps modulate force and grip, while the receptors of the feet help to convey surface texture and aid in balance. Cleland et al. (2024) explore the relationship between texture perception, the mode of exploration, and the region of the body used to explore a texture (hands vs. feet). This research is especially important for understanding how texture perception affects stability, how the hands and feet differ in their management and execution of tasks, and how this is adjusted in special populations such as visually impaired individuals.
... A-RAC is processed through the auditory cortex, aiding in gait coordination, while T-RAC, detected by Pacinian corpuscles, is processed through the somatosensory cortex, essential for interpreting tactile feedback. The integration of these stimuli in the association cortices aids in planning and executing voluntary movements [36,37]. The motor cortices, basal ganglia, and cerebellum refine motor commands for precise gait pattern adjustments, demonstrating the efficacy of these cueing methods in gait rehabilitation [38,39]. ...
... The device received signals from CAREN's D-Flow software and operated the motors at a verified frequency of 257.7 ± 6.7 Hz. The setup aimed to maximize participant comfort and safety: the vibration motors were placed under the second and third toes to maximize localization accuracy, since the tactile cue signals step initiation, and to target areas dense in Pacinian corpuscles, which are known for their sensitivity to high-frequency vibrations [36,37,43]. The wiring was arranged along the participants' legs and connected to the haptic device, which was securely attached to the participants' clothing. ...
... One way to enhance the user's ability to interact with and navigate through virtual or remote environments in real time is to integrate haptic feedback into footwear, such as vibrotactile shoes or shoes with actuators. These wearable interfaces for the foot can be used for a variety of purposes, such as delivering dynamic information (Velázquez et al., 2012), generating virtual materials (Strohmeier et al., 2020), making angular menu selections (Anlauff et al., 2018), performing pointing tasks (Horodniczy & Cooperstock, 2017), and transmitting language (Hill et al., 2014). ...
... This adds a design benefit by hiding the device inside the shoe, as opposed to wearing it openly, e.g., as a vibration cuff around the upper arm or the shank. The viability of such feedback has already been shown in mobile robot control (Jones et al., 2020) and in supported navigation while walking (Velázquez et al., 2012; Meier et al., 2015). Importantly, neither study reported that participants were burdened by, e.g., additional weight or induced gait disturbances. ...
... In that case, electrotactile stimulation at the soles of the feet provoked potentially unwanted muscle activity. Velázquez et al. (2012) also applied vibrotactile feedback in a walking navigation task and did not encounter any such undesired effect. We did not investigate the effect of FeetBack in standing or walking tasks. ...
Article
Introduction: Adding sensory feedback to myoelectric prosthetic hands has been shown to enhance the user experience in terms of controllability and device embodiment. Often this is realized non-invasively by adding devices, such as actuators or electrodes, within the prosthetic shaft to deliver the desired feedback. However, adding a feedback system in the socket adds weight, takes up valuable space, and may interfere with myoelectric signals. To circumvent these drawbacks, we tested for the first time whether force feedback from a prosthetic hand could be redirected to another, similarly sensitive part of the body: the foot.
Methods: We developed a vibrotactile insole that vibrates depending on the sensed force on the prosthetic fingers. This self-controlled clinical pilot trial included four experienced users of myoelectric prostheses. The participants solved two types of tasks with the artificial hands: 1) sorting objects depending on their plasticity with the feedback insole but without audio-visual feedback, and 2) manipulating fragile, heavy, and delicate objects with and without the feedback insole. The sorting task was evaluated with Goodman-Kruskal's gamma for ranked correlation. The manipulation tasks were assessed by the success rate.
Results: The sorting task with vibrotactile feedback showed a substantial positive effect. The success rates for manipulation tasks with fragile and heavy objects were high under both conditions (feedback on or off). The manipulation task with delicate objects revealed inferior success with feedback in three of four participants.
Conclusion: We introduced a novel approach to touch sensation in myoelectric prostheses. The results for the sorting task and the manipulation tasks diverged, which is likely linked to the availability of various feedback sources. Our results for feedback redirected to the feet are in line with previous, similar studies that applied feedback to the residual arm.
Clinical trial registration: Name: Sensor Glove and Non-Invasive Vibrotactile Feedback Insole to Improve Hand Prostheses Functions and Embodiment (FeetBack). Date of registration: 23 April 2019. Date the first participant was enrolled: 3 September 2021. ClinicalTrials.gov Identifier: NCT03924310.
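Since the FeetBack insole vibrates as a function of the force sensed on the prosthetic fingers, a simple force-to-intensity transfer could look like the following sketch; the force range, the linear mapping, and the function names are illustrative assumptions rather than the study's actual parameters.

```python
# Minimal sketch: map sensed fingertip force to a vibration intensity
# for a feedback insole. The force range and the linear mapping are
# illustrative assumptions, not the FeetBack study's parameters.

def force_to_intensity(force_n: float,
                       f_min: float = 0.5,
                       f_max: float = 20.0) -> float:
    """Return a drive intensity in [0, 1] for a sensed force in newtons."""
    if force_n <= f_min:          # below threshold: no feedback
        return 0.0
    if force_n >= f_max:          # saturate at maximum drive
        return 1.0
    return (force_n - f_min) / (f_max - f_min)

if __name__ == "__main__":
    for f in (0.2, 1.0, 5.0, 25.0):
        print(f"{f:5.1f} N -> intensity {force_to_intensity(f):.2f}")
```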
... Stimulation of the foot sole has been shown to be sufficient to elicit a walking experience (Turchet et al., 2013), due to its high sensitivity (Gu and Griffin, 2011). Furthermore, non-directional tactile cues have been shown to provide some self-motion cues (Terziman et al., 2012; Feng et al., 2016), while directional cues can be used to aid directional guidance, e.g., by using higher-density grids of vibrotactors under the foot sole (Velázquez et al., 2012). In some studies, vibration and audio cues have been studied in concert and showed cross-modal benefits in ground-surface perception (Marchal et al., 2013) and self-motion perception (vection) (Riecke et al., 2009). ...
... Upper and lower body parts can resonate at specific vibration frequencies caused by body-operated mechanical equipment (e.g., a jackhammer) (Rasmussen, 1983). The feet are reasonably sensitive to vibrations (Strzalkowski, 2015), even when denser grids of tactors are used (Velázquez et al., 2012). Even considering skin differences across the foot sole (e.g., skin hardness, epidermal thickness), low-threshold cutaneous mechanoreceptors have shown good results for stimulus discrimination. ...
... Results show that users could rather easily judge the different cues at a reasonably high granularity. Generally, the results are in line with previous findings on feet-based actuation regarding sensitivity and interpretation of cues (Velázquez et al., 2012; Strzalkowski, 2015; Kruijff et al., 2016). Namely, previous studies have shown that the feet can readily perceive directional feedback through vibration and can differentiate well between feedback locations. ...
Article
The visual and auditory quality of computer-mediated stimuli for virtual and extended reality (VR/XR) is rapidly improving. Still, it remains challenging to provide a fully embodied sensation and awareness of objects surrounding, approaching, or touching us in a 3D environment, even though such awareness can greatly aid task performance in a 3D user interface. For example, feedback can provide warning signals for potential collisions (e.g., bumping into an obstacle while navigating) or pinpoint areas to which one's attention should be directed (e.g., points of interest or danger). These events inform our motor behaviour and are often linked to perception mechanisms associated with our so-called peripersonal and extrapersonal space models, which relate our body to object distance, direction, and contact point/impact. We discuss these reference spaces to explain the role of different cues in the motor action responses that underlie 3D interaction tasks. However, providing proximity and collision cues can be challenging. Various full-body vibration systems have been developed that stimulate body parts other than the hands, but they can have limitations in applicability and feasibility due to their cost, the effort required to operate them, and hygienic considerations associated with, e.g., Covid-19. Informed by the results of a prior study using low frequencies for collision feedback, in this paper we look at an unobtrusive way to provide spatial, proximity, and collision cues. Specifically, we assess the potential of foot-sole stimulation to provide cues about object direction and relative distance, as well as collision direction and force of impact. Results indicate that vibration-based stimuli in particular could be useful within the frame of peripersonal and extrapersonal space perception to support 3DUI tasks. Current results favor the combination of continuous vibrotactor cues for proximity and bass-shaker cues for body collision. Results show that users could rather easily judge the different cues at a reasonably high granularity. This granularity may be sufficient to support common navigation tasks in a 3DUI.
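One way to realize the continuous proximity cue favored above is to scale vibrotactor amplitude with object distance, ramping up as an object approaches peripersonal space. The sketch below does this with an assumed 1 m peripersonal boundary and a linear ramp; neither value is taken from the paper.

```python
# Minimal sketch: scale proximity-cue amplitude by object distance.
# The 1 m peripersonal boundary and the linear ramp are illustrative
# assumptions, not parameters reported in the study.

PERIPERSONAL_RADIUS_M = 1.0   # assumed boundary of peripersonal space
MAX_RANGE_M = 5.0             # beyond this distance, no proximity cue

def proximity_amplitude(distance_m: float) -> float:
    """Return a vibrotactor amplitude in [0, 1] for an object distance."""
    if distance_m >= MAX_RANGE_M:
        return 0.0
    if distance_m <= PERIPERSONAL_RADIUS_M:
        return 1.0            # object inside peripersonal space: full cue
    # Linear ramp from the outer range down to the peripersonal boundary.
    return (MAX_RANGE_M - distance_m) / (MAX_RANGE_M - PERIPERSONAL_RADIUS_M)

if __name__ == "__main__":
    for d in (6.0, 4.0, 2.0, 0.5):
        print(f"distance {d:.1f} m -> amplitude {proximity_amplitude(d):.2f}")
```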
... The availability of increasingly mobile and wearable technology could enable an extension of humans' interaction capabilities through an assistive smart shoe [83]. Using the foot as an alternative location for providing feedback is now being explored in modern Human-Computer Interaction (HCI) research [58,86]. Input through the feet is also of interest in HCI [29,87]. ...
Conference Paper
Our feet are not just used to walk upright; they can also reveal important information about our physical constitution and context. In this research, we present a proof-of-concept multisensory approach to uncover the hidden information our feet hold. We instrumented a shoe with an accelerometer, a gyroscope, FSRs, a microphone, and a humidity sensor, and applied a conventional machine learning approach. Our results show that we can extract information on body posture (e.g., standing, sitting, kneeling), ambulation activity (e.g., walking, jumping, climbing stairs), gait abnormalities (e.g., overpronation, supination), and terrain (e.g., lawn, gravel, sand). These insights may point to a new direction for the future of smart shoes.
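As a rough illustration of the kind of conventional machine-learning pipeline the abstract refers to, the sketch below windows multi-channel shoe-sensor data, computes simple per-channel statistics, and trains a random-forest classifier with scikit-learn. The synthetic data, window length, and feature set are placeholders, not the authors' setup.

```python
# Minimal sketch of a conventional ML pipeline for shoe-sensor data:
# window the raw signals, compute simple statistics per channel, and
# train a random-forest classifier. The synthetic data and feature set
# are illustrative placeholders, not the authors' actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def window_features(signal: np.ndarray, win: int = 100) -> np.ndarray:
    """Split (n_samples, n_channels) data into windows of mean/std features."""
    n_windows = signal.shape[0] // win
    feats = []
    for i in range(n_windows):
        chunk = signal[i * win:(i + 1) * win]
        feats.append(np.concatenate([chunk.mean(axis=0), chunk.std(axis=0)]))
    return np.array(feats)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake recording: 6 channels (e.g., 3-axis accel + 3-axis gyro),
    # two activities with slightly different signal statistics.
    walk = rng.normal(0.0, 1.0, size=(5000, 6))
    stand = rng.normal(0.0, 0.2, size=(5000, 6))
    X = np.vstack([window_features(walk), window_features(stand)])
    y = np.array([0] * 50 + [1] * 50)  # 0 = walking, 1 = standing
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
```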
... Because the feet serve different functions from the hands (i.e., gait control, maintaining posture, body orientation, and walking), the distribution of afferent receptors and their frequency responses rely on population firing rather than on the individual-neuron firing that seems to characterize the hand (Strzalkowski et al., 2017; Reed and Ziat, 2018; de Grosbois et al., 2020). Haptic feedback on the feet has been used for multiple purposes, such as robotic telepresence (Jones et al., 2020), illusory self-motion (Riecke and Schulte-Pelkum, 2013), gait control in the elderly (Galica et al., 2009; Lipsitz et al., 2015), and improved situational awareness in blind people (Velázquez et al., 2012). The stimulation sites around the foot and the technology used differ from one study to another. ...
Article
Virtual reality has been used in recent years for artistic expression and as a tool to engage visitors by creating immersive experiences. Most of these immersive installations incorporate visuals and sounds to enhance the user's interaction with the artistic pieces. Very few, however, involve physical or haptic interaction. This paper investigates virtual walking on paintings using passive haptics. More specifically, we combined vibrations and ultrasound technology on the feet in four different configurations to evaluate users' immersion while they virtually walk on paintings that transform into 3D landscapes. Results show that participants with higher immersive tendencies experienced the virtual walking, reporting illusory movement of their body regardless of the haptic configuration used.
... This device is the fourth version of wearable electronic on-shoe tactile displays that our team has implemented; it incorporates the technological improvements of the three previous developments [43][44][45]. The four vibrating actuators were integrated into an inexpensive, commercially available foam insole. An experimental characterization of the actuators [43] confirmed that they can deliver axial forces of up to 13 mN and vibration frequencies between 10 and 55 Hz while drawing at most 400 mW from the power source. ...
Article
This paper reports on the progress of a wearable assistive technology (AT) device designed to enhance the independent, safe, and efficient mobility of blind and visually impaired pedestrians in outdoor environments. The device exploits the smartphone's positioning and computing capabilities to locate and guide users through urban settings. The navigation instructions needed to reach a destination are encoded as vibration patterns which are conveyed to the user via a foot-placed tactile interface. To determine the performance of the proposed AT device, two user experiments were conducted. The first asked a group of 20 normally sighted volunteers to recognize the feedback provided by the tactile-foot interface; recognition rates were above 93%. The second involved two blind volunteers who were assisted in finding target destinations along public urban pathways. Results show that the subjects successfully accomplished the task and suggest that blind and visually impaired pedestrians might find the AT device and its concept approach useful, friendly, fast to master, and easy to use.
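One plausible way to encode such navigation instructions is to quantize the relative heading to the next waypoint into a small set of vibration patterns. The sketch below does this with four coarse cues; the thresholds and pattern names are assumptions for illustration, not the device's actual coding scheme.

```python
# Minimal sketch: quantize the relative heading to the next waypoint
# into one of four vibration patterns for a foot-placed display.
# The pattern names and angle thresholds are illustrative assumptions.

def heading_error_deg(user_heading: float, bearing_to_target: float) -> float:
    """Signed heading error in degrees, wrapped to (-180, 180]."""
    err = (bearing_to_target - user_heading + 180.0) % 360.0 - 180.0
    return err if err != -180.0 else 180.0

def vibration_pattern(err_deg: float) -> str:
    """Map a heading error to a coarse navigation cue."""
    if abs(err_deg) <= 20.0:
        return "FRONT"        # keep going straight
    if abs(err_deg) >= 160.0:
        return "BACK"         # turn around
    return "RIGHT" if err_deg > 0 else "LEFT"

if __name__ == "__main__":
    for user, target in ((0, 10), (0, 90), (0, -120), (350, 175)):
        err = heading_error_deg(user, target)
        print(f"heading {user:3d} deg, bearing {target:4d} deg -> {vibration_pattern(err)}")
```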
... A nickel-metal hydride rechargeable battery bank (Radio ... This device is the third version of wearable electronic tactile-foot interfaces that our team has implemented. This last version incorporates the technological improvements of the two previous developments [23,24]. ... of the foot [22], are the most sensitive to low-frequency vibrating tactile stimulation. ...
Article
This paper presents a novel wearable system devoted to assisting the mobility of blind and visually impaired people in urban environments using only a smartphone and tactile feedback. The system exploits the positioning data provided by the smartphone's GPS sensor to locate the user in the environment in real time and to determine the directions to a destination. The resulting navigational directions are encoded as vibrations and conveyed to the user via an on-shoe tactile display. To validate the pertinence of the proposed system, two experiments were conducted with test users. The first involved a group of 20 normally sighted volunteers who were asked to recognize the navigational instructions displayed by the tactile-foot device; the results show high recognition rates for the task. The second consisted of guiding two blind volunteers along public urban spaces to target destinations. Results show that the task was successfully accomplished and suggest that the system enhances the independent, safe navigation of visually impaired and blind people. Moreover, the results show the potential of smartphones and tactile-foot devices in assistive technology. Keywords: assistive technology, GPS localization, mobility of blind people, tactile-foot stimulation, vibrotactile display, wearable system.
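Both systems above derive the direction to a destination from the smartphone's GPS fix. As a sketch of that step, the initial (forward) bearing between two latitude/longitude pairs can be computed with standard great-circle geometry, as below; this is generic geodesy, not code from the papers.

```python
# Minimal sketch: initial (forward) bearing from a GPS fix to a target,
# using standard great-circle geometry. Not taken from the papers.
import math

def initial_bearing_deg(lat1: float, lon1: float,
                        lat2: float, lon2: float) -> float:
    """Bearing in degrees (0 = north, 90 = east) from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

if __name__ == "__main__":
    # Example: bearing between two nearby points in Paris.
    print(f"{initial_bearing_deg(48.8566, 2.3522, 48.8606, 2.3376):.1f} deg")
```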
... This implies that a distance of 4 cm was not clearly distinguishable from a distance of 12 cm. Various studies on vibrotactile spatial acuity on the back have estimated the two-point threshold at between 13 and 60 mm, depending on the vibration motor type, sequential or simultaneous presentation, the exact location of stimulation, and the psychophysical method used [10,11,24-28]. Our largest distance of 12 cm was thus twice as large as the largest estimate of the two-point threshold. ...
Article
Vibrotactile displays worn on the back can be used as sensory substitution devices. Vibrotactile stimulation is often chosen because vibration motors are easy to incorporate and relatively cheap. When designing such displays, knowledge about vibrotactile perception on the back is crucial. In the current study we investigated distance perception. Biases in distance perception can explain spatial distortions that occur when, for instance, tracing a shape using vibration. We investigated the effect of orientation (horizontal vs. vertical), the effect of positioning with respect to the spine, and the effect of switching vibration motors on sequentially versus simultaneously. Our study includes four conditions. The condition with a horizontal orientation and both vibration motors switching on sequentially on the same side of the spine was chosen as the baseline condition; the other three conditions were compared to this baseline. We found that distances felt longer in the vertical direction than in the horizontal direction. Furthermore, distances were perceived to be longer when vibration motors were distributed on both sides of the spine than when they were on the same side. Finally, distances felt shorter when vibration motors were switched on simultaneously rather than sequentially. In the simultaneous case, a distance of 4 cm was not clearly perceived as different from a distance of 12 cm. When designing vibrotactile displays, these anisotropies in perceived distance need to be taken into account; otherwise the intended shape will not match the perceived shape. Also, dynamically presented distances are perceived more clearly than static distances. This finding supports recommendations made in previous studies that dynamic patterns are easier to perceive than static patterns.
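Following the abstract's recommendation to account for these anisotropies, a display designer could pre-scale the intended tactor spacing before placement. The sketch below does so with hypothetical perceived-length gains chosen only to illustrate the direction of the reported effects, not the magnitudes estimated in the study.

```python
# Minimal sketch: pre-scale intended tactor spacing on the back to
# compensate for perceptual anisotropies. The correction factors are
# hypothetical placeholders, not values estimated in the study.

# Assumed gain: how much longer a physical distance *feels* in each case,
# relative to the horizontal/sequential baseline.
PERCEIVED_GAIN = {
    ("vertical", "sequential"): 1.3,
    ("horizontal", "sequential"): 1.0,   # baseline
    ("vertical", "simultaneous"): 0.9,
    ("horizontal", "simultaneous"): 0.7,
}

def physical_spacing_cm(intended_perceived_cm: float,
                        orientation: str, onset: str) -> float:
    """Spacing to build so the *perceived* distance matches the intent."""
    gain = PERCEIVED_GAIN[(orientation, onset)]
    return intended_perceived_cm / gain

if __name__ == "__main__":
    for key in PERCEIVED_GAIN:
        print(key, f"{physical_spacing_cm(8.0, *key):.1f} cm")
```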