Conference Paper

Roles of Force Cues and Proprioceptive Joint Angles in Active Exploration of Compliant Objects


Abstract

We employ distinct exploratory procedures to improve our perceptual judgments of an object's properties. For instance, to judge compliance, we exert pressure against a resisting force. The present work investigates ties between strategies for active control of the finger and the resultant cues by which compliances may be discriminated. In particular, we employ elastic spheres that co-vary in compliance and radius, as these generate non-differentiable contact areas and are discriminable only in active touch with proprioceptive inputs. In human-subjects psychophysical experiments, we measure touch force, fingertip displacement, and joint kinematics under two active touch paradigms, with and without a force constraint. First, in behaviorally controlled conditions that render force cues uninformative, the results indicate that participants adopt a force-matching strategy between the compliant objects and rely upon displacement-related cues to differentiate them. We show these cues are directly tied to a proprioceptive mechanism, specifically the angle of the metacarpophalangeal (MCP) joint. In the fully active paradigm, however, participants instead control displacement and discriminate via force-related cues. Consistent with prior findings in passive touch, force-related cues are thus also used in active touch for the optimal and efficient discrimination of compliant objects.




... With respect to the dimension of compliance, cutaneous and proprioceptive cues are recruited and integrated into multimodal inputs, which converge in a perceptual space where compliances are recognized and discriminated [1], [7], [8]. Many efforts have focused on cues of contact area, skin deformation, and kinesthetic inputs of force and joint angles [1], [5], [7]. However, work with tactile displays suggests that replicating these stationary cues alone does not afford perceptual acuity comparable to that with naturalistic stimuli [1], [7], [9]–[11]. ...
... Indeed, physical cues of a time-dependent nature, e.g., the rate of change of skin deformation, surface penetration, and finger-joint dynamics, are suggested to improve the efficiency and fidelity of conveying compliance [5], [6], [10]. In fact, for the case of differentiating stimulus curvature and angle, the relative timing of just the first spikes elicited in tactile afferents reliably conveys such spatial information [12]. ...
... At the behavioral level, the availability of force-rate cues can make compliant objects more readily discriminable by reducing the amount of skin deformation required [10]. Moreover, when terminal contact area cues are non-distinct, displacements are volitionally matched so as to generate discriminable force-rate cues [5]. Furthermore, the duration and sequence of exploratory procedures also affect compliance discrimination. ...
Conference Paper
Our perception of compliance is informed by multi-dimensional tactile cues. Compared with stationary cues at terminal contact, time-dependent cues may afford optimal efficiency, speed, and fidelity. In this work, we investigate strategies by which temporal cues may encode compliances by modulating our exploration time. Two potential perceptual strategies are considered, inspired by memory representations within and between explorations. For either strategy, we introduce a unique computational approach. First, a curve similarity analysis, of accumulating touch force between sequentially explored compliances, generates a minimum time for discrimination. Second, a Kalman filtering approach derives a recognition time from progressive integration of stiffness estimates over time within a single exploration. Human-subjects experiments are conducted for both single finger touch and pinch grasp. The results indicate that for either strategy, by employing a more natural pinch grasp, time-dependent cues afford greater efficiency by reducing the exploration time, especially for harder objects. Moreover, compared to single finger touch, pinch grasp improves discrimination rates in judging plum ripeness. The time-dependent strategies as defined here appear promising, and may tie with the time-scales over which we make perceptual judgments.
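As an illustration of the second, within-exploration strategy described above, here is a minimal sketch (not the authors' implementation; the variable names, noise parameters, and stopping rule are our assumptions) of how a scalar Kalman filter could progressively integrate noisy per-sample stiffness estimates and yield a recognition time:

```python
import numpy as np

def stiffness_recognition_time(force, disp, dt, q=1e-4, r=0.5, var_stop=0.05):
    """Sketch: integrate noisy stiffness observations (force/displacement)
    with a scalar Kalman filter; return the time at which the posterior
    variance first falls below `var_stop`, a stand-in recognition time."""
    k_hat, p = 0.0, 1e3              # initial stiffness estimate and variance
    for i, (f, d) in enumerate(zip(force, disp)):
        if d < 1e-3:                 # skip samples before meaningful indentation
            continue
        z = f / d                    # per-sample stiffness observation (N/mm)
        p += q                       # predict: random-walk process noise
        kk = p / (p + r)             # Kalman gain
        k_hat += kk * (z - k_hat)    # update estimate toward the observation
        p *= (1.0 - kk)              # shrink posterior variance
        if p < var_stop:
            return (i + 1) * dt, k_hat
    return None, k_hat
```

In this sketch the recognition time simply reflects how quickly the running stiffness estimate becomes confident; harder objects, which produce larger force per unit displacement, would tend to cross the confidence criterion sooner.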
... This percept is informed by some combination of cutaneous inputs from mechanosensitive afferents signaling skin deformation and proprioceptive inputs signaling body movements. Efforts to define the precise cues within skin deformation and body movements have focused on contact area at the finger pad [20]–[24], spatiotemporal deformation of the skin's surface [25]–[27], and kinesthetic inputs of displacement, force, and joint angle [28]–[31]. This array of sensory contact inputs, mediated by independent cortical mechanisms, is recruited and integrated in the primary somatosensory cortex, and forms the perceptual basis from which compliances are recognized and discriminated [32]. ...
... The former was deemed the cutaneous cue [27], [33]–[35]. The latter was associated with the proprioceptive cue, where displacement approximates the change in muscle length and force is tied to muscle tension [30], [35]–[38]. ...
... A normalization procedure was required for data aggregation since participants exhibited distinct sensorimotor capabilities, ranges of finger movement, and finger pad dimensions [5], [26]. In particular, for each experimental task, all recordings of each tactile cue were normalized to the range (0, 1) by a sigmoidal membership function [5], [30]. The center of the transition region was set to the mean value of the data being normalized, and the logistic growth rate of the curve was set to 1. ...
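A minimal sketch of the normalization just described, assuming NumPy and a logistic (sigmoidal) membership function centered on the mean with growth rate 1 (the function name is ours):

```python
import numpy as np

def sigmoid_normalize(x, growth_rate=1.0):
    """Normalize a cue recording to the open range (0, 1) with a sigmoidal
    membership function whose transition center is the mean of the data and
    whose logistic growth rate is 1, as described above."""
    x = np.asarray(x, dtype=float)
    center = x.mean()
    return 1.0 / (1.0 + np.exp(-growth_rate * (x - center)))
```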
Article
Full-text available
Our sense of touch helps us encounter the richness of our natural world. Across a myriad of contexts and repetitions, we have learned to deploy certain exploratory movements in order to elicit perceptual cues that are salient and efficient. The task of identifying optimal exploration strategies and somatosensory cues that underlie our softness perception remains relevant and incomplete. Leveraging psychophysical evaluations combined with computational finite element modeling of skin contact mechanics, we investigate an illusion phenomenon in exploring softness, where small-compliant and large-stiff spheres are indiscriminable. By modulating contact interactions at the finger pad, we find this elasticity-curvature illusion is observable in passive touch, when the finger is constrained to be stationary and only cutaneous responses from mechanosensitive afferents are perceptible. However, these spheres become readily discriminable when explored volitionally with musculoskeletal proprioception available. We subsequently exploit this phenomenon to dissociate relative contributions from cutaneous and proprioceptive signals in encoding our percept of material softness. Our findings shed light on how we volitionally explore soft objects, i.e., by controlling surface contact force to optimally elicit and integrate proprioceptive inputs amidst indiscriminable cutaneous contact cues. Moreover, in passive touch, e.g., for touch-enabled displays grounded to the finger, we find those spheres are discriminable when rates of change in cutaneous contact are varied between the stimuli, to supplant proprioceptive feedback.
... Distinct efforts have focused upon cutaneous cues of contact area, skin deformation, and spatial distribution of pressure [3]–[6]. However, recent studies have shown that the terminal contact area is not readily discriminable, and additional cues likely augment our judgments of compliance [7]–[10]. Among those, our proprioceptive system provides vital inputs through the kinesthetic sense of joint angles [10], spatiotemporal patterns in cutaneous contact [11], and visual-haptic integration [12]–[14]. ...
... However, recent studies have shown that the terminal contact area is not readily discriminable, and additional cues likely augment our judgments of compliance [7]–[10]. Among those, our proprioceptive system provides vital inputs through the kinesthetic sense of joint angles [10], spatiotemporal patterns in cutaneous contact [11], and visual-haptic integration [12]–[14]. These physical cues are recruited and integrated into multimodal signals, fine-tuned by optimal exploratory strategies that modulate sensorimotor movements [15], and then transferred to a perceptual space where compliances may be discriminated. ...
... The ramp segments were then extracted according to the first-order derivatives. As reported previously, the ramp onset and end were defined from the peak derivatives [10]. Finally, a linear regression was fit to the ramp, and its slope was taken as the force-rate. ...
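A rough sketch of the ramp extraction and force-rate computation described above; the 10% derivative criterion and the function name are our assumptions rather than details confirmed by the cited work:

```python
import numpy as np

def force_rate(force, t):
    """Sketch: locate the loading ramp from the peak of the first derivative
    of force, then fit a line to the ramp; the slope is the force-rate (N/s)."""
    force = np.asarray(force, dtype=float)
    t = np.asarray(t, dtype=float)
    dfdt = np.gradient(force, t)
    peak = np.argmax(dfdt)                    # steepest point of loading
    thresh = 0.1 * dfdt[peak]                 # assumed 10% criterion
    onset = np.where(dfdt[:peak] < thresh)[0]
    onset = onset[-1] + 1 if onset.size else 0
    end = np.where(dfdt[peak:] < thresh)[0]
    end = peak + end[0] if end.size else len(force) - 1
    slope, _ = np.polyfit(t[onset:end + 1], force[onset:end + 1], 1)
    return slope
```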
... On-going efforts are refining our understanding of the physical and perceptual cues that help encode our sense of compliance. Distinct efforts have focused upon skin deformation and surface contact [1], [2], proprioception [3]–[5], as well as exploratory strategies [6]. Most of these studies use man-made materials, e.g., silicone-elastomers, foams, and robotic devices, to stand in for ecological materials. ...
... In general, the cues utilized by participants to form compliance judgments of the plums seem to align with those from studies done with silicone-elastomers, foams, and other non-natural objects. In particular, efforts by Drewing et al. have shown that maximum force is a reliable cue used in discrimination of softness [12], and our group likewise has reported on force-related cues, including force rate [5], [11]. As well, contact area has for years been associated with judging changes in compliance [1], and more recently has been associated with proprioceptive cues [4]. ...
Conference Paper
Full-text available
Touch interaction cues help inform our sense of compliance. Efforts to ascertain the most relevant cues are mostly done with materials such as silicone-elastomers, foams, and robotic devices. It is unclear how well these engineered substances approximate the intrinsic material properties of ecologically compliant objects. Herein we study human tactile interactions with a natural object, soft plum fruit varying in ripeness, an organic substance associated with everyday tasks. An ink-based method was used to mark the finger pad to capture contact interactions, and instrumented load cells and non-contact laser sensors captured imposed force and displacement. In human-subjects experiments, participants discriminated plums grounded to a table (single finger) and held in a pinch grasp (index and thumb). The results indicate that contact area and force relationships vary per trial. Touch force and finger displacement also vary but significantly differ between the plums. In two-finger pinch, discrimination improved slightly. Compared to the single-finger case, the flatter finger pad contacted the plum, as opposed to the fingertip, and led to larger and slightly more distinguishable contact areas. This work demonstrates a new technique for quantifying cues in human discrimination of natural objects, and articulates a path forward in working with such objects.
... Recent work suggests that force-related cues facilitate compliance discrimination [6]. Force-rate cues, in particular, have been shown to be very efficient in reducing object deformation necessary to discriminate compliances [7] and in illusion cases whereby radius of curvature and compliance are co-varied [8], [9]. ...
... Additionally, this initial work was done in passive touch, but the metrics, or cues, will likely see quite different levels of distinguishability in active touch. For instance, prior work has observed that stimuli can be distinguished earlier in active touch, yet the force rates in the present work (~1 N/s) are lower than those in active touch (~4-6 N/s) [8]. ...
Conference Paper
Full-text available
We regularly touch soft, compliant fruits and tissues. To help us discriminate them, we rely upon cues embedded in spatial and temporal deformation of finger pad skin. However, we do not yet understand, in touching objects of various compliance, how such patterns evolve over time, and drive perception. Using a 3-D stereo imaging technique in passive touch, we develop metrics for quantifying skin deformation, across compliance, displacement, and time. The metrics map 2-D estimates of terminal contact area to 3-D metrics that represent spatial and temporal changes in penetration depth, surface curvature, and force. To do this, clouds of thousands of 3-D points are reduced in dimensionality into stacks of ellipses, to be more readily comparable between participants and trials. To evaluate the robustness of the derived 3-D metrics, human subjects experiments are performed with stimulus pairs varying in compliance and discriminability. The results indicate that metrics such as penetration depth and surface curvature can distinguish compliances earlier, at less displacement. Observed also are distinct modes of skin deformation, for contact with stiffer objects, versus softer objects that approach the skin's compliance. These observations of the skin's deformation may guide the design and control of haptic actuation.
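The reduction of 3-D point clouds into stacks of ellipses could, for instance, be outlined as below; this is an illustrative sketch under our own assumptions (slicing the cloud by depth and summarizing each slice by its covariance axes), not the authors' exact pipeline:

```python
import numpy as np

def ellipse_stack(points, n_slices=20):
    """Sketch: slice an N x 3 point cloud of the deformed skin surface by
    depth (z) and summarize each slice by an ellipse (centroid plus
    covariance axes), yielding a compact stack that is easier to compare
    across trials and participants."""
    points = np.asarray(points, dtype=float)
    z = points[:, 2]
    edges = np.linspace(z.min(), z.max(), n_slices + 1)
    stack = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sl = points[(z >= lo) & (z < hi), :2]
        if sl.shape[0] < 5:                      # skip sparse slices
            continue
        center = sl.mean(axis=0)
        evals, _ = np.linalg.eigh(np.cov(sl, rowvar=False))
        evals = np.clip(evals, 0.0, None)
        a, b = 2.0 * np.sqrt(evals[::-1])        # approx. semi-axis lengths
        stack.append((0.5 * (lo + hi), center, a, b))
    return stack
```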
... Indeed, improved tactile acuity is positively correlated with differences in the proprioceptive cues employed for discrimination (Fig. 5C). Such a behaviorally controlled exploration strategy aligns with how we explore naturalistic soft objects [4], [22]: individuals utilize distinct fingertip displacements to readily differentiate compliances. Overall, in the context of natural exploration, individuals can actively control their movements to achieve optimal perceptual performance, even amidst inherent constraints of skin mechanics. ...
Conference Paper
Tactile acuity differs between individuals, likely a function of several interrelated factors. The extent of the impact of skin mechanics on individual differences is unclear. Herein, we investigate if differences in skin elasticity between individuals impact their ability to distinguish compliant spheres near limits of discriminability. After characterizing hyperelastic material properties of their skin in compression, the participants were asked to discriminate spheres varying in elasticity and curvature, which generate non-distinct cutaneous cues. Simultaneous biomechanical measurements were used to dissociate the relative contributions from skin mechanics and volitional movements in modulating individuals’ tactile sensitivity. The results indicate that, in passive touch, individuals with softer skin exhibit larger gross contact areas and higher perceptual acuity. In contrast, in active touch, where exploratory movements are behaviorally controlled, individuals with harder skin evoke relatively larger gross contact areas, which improve and compensate for deficits in their acuity as observed in passive touch. Indeed, these participants exhibit active control of their fingertip movements that improves their acuity, amidst the inherent constraints of their less elastic finger pad skin.
... Previous studies on haptic softness typically treat compliance as the physical correlate of softness (Okamoto et al., 2013; Higashi et al., 2019; Cellini et al., 2013; Bergmann Tiest & Kappers, 2009; Xu et al., 2019), which is measured as the deformation of an elastic object in response to an applied force (Di Luca, 2014; Zöller et al., 2019; Kaim & Drewing, 2011) (see Higashi et al., 2018 for an exception). In contrast, everyday experiences of soft materials seem to include a much broader range of physical correlates: from squeezing playdough to stroking a rabbit's fur to digging your fingers into the warm sand on the beach. ...
Thesis
Material perception is a crucial part of our everyday life. This becomes evident when deciding where to place our next step while walking in the rain, or when applying enough force to a bar of soap to grasp it without it slipping through our fingers. Over the past decade, material perception has attracted increasing attention, yet many questions remain unanswered. Material perception is a complex problem with numerous entry points. For instance, in haptic research, softness is generally equated with the compliance of objects. However, a recent study has shown that compliance is not the whole story: the perceived material dimensions underlying softness (Dovencioglu et al., 2021) include granularity, viscosity, surface softness, and deformability of the materials. Moreover, people adapt their hand movements according to the material. An open question is whether, in addition to extrinsic material properties, our purpose (i.e., the information to be gained) affects hand movements when haptically perceiving different softness dimensions. To this end, in Study 1 we investigated whether the task and the explored material modulate exploratory movements. First, our findings replicated the previously reported multiple perceptual dimensions of softness (Dovencioglu et al., 2021). More importantly, our results extend the literature by showing that people adapt their movements based on the material, the task, and the interaction between the two. Another entry point to material perception is to ask how different modalities provide information about the same material. In daily life, we usually see what we touch and touch what we see. Whether vision provides information about the various aspects of softness similar to that provided by haptics is an intriguing question. For instance, to judge the softness of a rabbit's fur, we can look at a picture of it or touch the fur directly. Another source of information could be watching someone else pet the rabbit. In contrast to merely looking at the rabbit's fur, watching someone else's action not only provides the visual features of the fur but also reveals how the material reacts to hand movements. It is unclear whether these three examples would yield similar interpretations of softness as a multidimensional construct. In Study 2, we investigated to what extent the perceived softness dimensions are similar in vision compared to haptics. Our results showed high overall consistency across haptics, static visual information (i.e., images), and dynamic visual information (i.e., the hand movements of another person exploring the materials). These similarities were strongest between haptics and dynamic visual information. In our daily experience, we touch objects not only with bare hands but also, at times, through intermediate surfaces (e.g., wearing gloves). Perceiving materials through another layer of material could reduce some of the haptic information, such as the thermal properties of the material in question. Despite our regular interaction with materials under such restrained conditions (e.g., wearing gloves in winter), it is largely unknown to what extent these restrictions affect our perception of the different aspects of softness. Therefore, another entry point to understanding material perception is to understand how it is affected by physical constraints.
It is almost ironic that augmented and mixed reality technologies for haptics generally construct the haptic experience through gloves or other restrictive proxies. Hence, understanding how physical constraints affect material perception is an important question for both theoretical and applied research. In Study 3, we sought to understand how haptic constraints affect perceived softness. Participants explored haptic stimuli under four conditions: bare hand, open-fingered glove, open-fingered glove with rigid sensors, and full glove. The general results suggest that softness perception was overall highly similar across conditions. However, on closer inspection, we found that the glove condition differed from the others, especially in terms of surface softness. So far, the entry points discussed have examined material perception in the sensory domain. However, as with other topics in perception, material perception depends not only on sensation but also on the agent's cognitive state, such as motivation and emotion. In a similar vein, previous studies have shown that sensory and affective properties are related. For instance, fine-grained materials like sand feel pleasant, while rough materials such as sandpaper feel unpleasant (Drewing et al., 2018). The origin of these relationships is another piece of the puzzle. To address this gap, in Study 4 we investigated whether the relationship between sensory material properties (e.g., granularity) and affective responses (e.g., feeling pleasant) can be modified by learning. We further investigated previously observed relationships: a positive relationship between granularity and pleasantness, and a negative relationship between roughness and valence. In a classical conditioning paradigm, the opposite affective relationship was reinforced instead of participants' existing material-emotion associations. The results showed a significantly weakened relationship between valence and granularity in the experimental group compared to the control group; however, the relationship between valence and roughness did not differ between the experimental and control groups. The results suggest that not all affective associations of the perceived material dimensions can be modified. We explain these findings by the difference between learned and hard-wired associations.
... Previous studies on haptic softness typically think of compliance as the physical correlate of softness [2]–[6], which is measured as the deformation of an elastic object in response to an applied force [7]–[9] (see [10] for an exception). In contrast, everyday experiences of soft materials seem to include a much broader range of physical correlates: from squeezing playdough to stroking a rabbit's fur to digging your fingers into the warm sand on the beach. ...
Article
Full-text available
Haptic research has traditionally often equated softness with the compliance of elastic objects. However, in a recent study we have suggested that compliance is not the only perceived object dimension underlying what is commonly called softness [1]. Here, we investigate how the different perceptual dimensions of softness affect how materials are haptically explored. Participants freely explored and rated 19 materials on 15 adjectives. Materials and adjectives were chosen to represent each dimension; some materials served as control. Hand movements were recorded and subsequently categorized into different exploratory procedures (EPs). A linear support vector machine successfully predicted material categories from EPs. A multivariate analysis of variance (MANOVA) yielded significant effects on EPs of material, the task and their interaction: for example, for viscous materials pulling is used to judge deformability and viscosity, however for furry materials it is only used to judge deformability (but not viscosity). Taken together, the results support the notion of multiple perceived dimensions of softness and suggest that participants actively adapt their EPs in very differentiated ways for judging different softness dimensions in different material categories.
Thesis
Our sense of touch is essential and permeates interactions involving natural exploration and affective communication. For instance, we routinely judge the ripeness of fruit at the grocery store, caress the arm of a spouse to offer comfort, and stroke textiles to gauge their softness. Meanwhile, interactive displays that provide tactile feedback are becoming ubiquitous in our daily lives and are extending rich, immersive interactions into augmented and virtual reality. To replicate touch sensation and make virtual objects feel tangible, such feedback will need to relay a sense of compliance, or "softness", one of the key dimensions underlying haptic perception. As our understanding of softness perception remains incomplete, this study seeks to identify exploratory strategies and perceptual cues that may optimally encode material softness. Specifically, we employ methods of computational finite element modeling, biomechanical experimentation, psychophysical evaluation, and data-driven analysis. First, we characterize the functional roles of physical contact cues by studying a tactile illusion phenomenon where small-compliant and large-stiff spheres are naturally indistinguishable. In modulating contact interactions between the finger pad and stimuli, we found that pressing an object into the finger does not fully reveal its softness, but pressing actively does. Thus, our percept of softness is a product of both sensation and volition and depends upon both tactile afferents in the skin and musculoskeletal proprioception. Second, in exploring both engineered and ecological soft objects, we investigate how cues are optimally evoked and integrated under one's active control. By varying exploration time, we observe that exploratory strategies are finely tuned to elicit efficient contact force and finger movements for superior performance. Third, considering inherent differences and constraints among individuals' skin mechanics, we investigate individual differences in eliciting tactile cues, exploratory strategies, and thus perceptual sensitivity. We characterized the skin material properties of individuals' finger pads and evaluated their contact interactions in performing discriminative touch. The results indicate that an individual's tactile acuity is constrained by their skin softness, but can be improved under volitional control of their exploratory movements.
Article
Full-text available
Active finger movements play a crucial role in natural haptic perception. For the perception of different haptic properties people use different well-chosen movement schemes (Lederman and Klatzky, 1987). The haptic property of softness is stereotypically judged by repeatedly pressing one’s finger against an objects’ surface, actively indenting the object. It has been shown that people adjust the peak indentation forces of their pressing movements to the expected stimulus’ softness in order to improve perception (Kaim and Drewing, 2011). Here, we aim to clarify the mechanisms underlying such adjustments. We disentangle how people modulate executed peak indentation forces depending on predictive vs. sensory signals to softness, and investigate the influence of the participants’ motivational state on movement adjustments. In Experiment 1, participants performed a two alternative forced-choice (2AFC) softness discrimination task for stimulus pairs from one of four softness categories. We manipulated the predictability of the softness category. Either all stimuli of the same category were presented in a blocked fashion, which allowed predicting the softness category of the upcoming pair (predictive signals high), or stimuli from different categories were randomly intermixed, which made prediction impossible (predictive signals low). Sensory signals to softness category of the two stimuli in a pair are gathered during exploration. We contrasted the first indentation (sensory signals low) and last indentation (sensory signals high) in order to examine the effect of sensory signals. The results demonstrate that participants systematically apply lower forces when softer objects (as compared to harder objects) are indicated by predictive signals. Notably, sensory signals seemed to be not as relevant as predictive signals. However, in Experiment 2, we manipulated participant motivation by introducing rewards for good performance, and showed that the use of sensory information for movement adjustments can be fostered by high motivation. Overall, the present study demonstrates that exploratory movements are adjusted to the actual perceptual situation and that in the process of fine-tuning, closed- and open-loop mechanisms interact, with varying contributions depending on the observer’s motivation.
Article
Full-text available
Background: Proprioceptive function can be affected after neurological injuries such as stroke. Severe and persistent proprioceptive impairments may be associated with a poor functional recovery after stroke. To better understand their role in the recovery process, and to improve diagnostics, prognostics, and the design of therapeutic interventions, it is essential to quantify proprioceptive deficits accurately and sensitively. However, current clinical assessments lack sensitivity due to ordinal scales and suffer from poor reliability and ceiling effects. Robotic technology offers new possibilities to address some of these limitations. Nevertheless, it is important to investigate the psychometric and clinimetric properties of technology-assisted assessments. Methods: We present an automated robot-assisted assessment of proprioception at the level of the metacarpophalangeal joint, and evaluate its reliability, validity, and clinical feasibility in a study with 23 participants with stroke and an age-matched group of 29 neurologically intact controls. The assessment uses a two-alternative forced choice paradigm and an adaptive sampling procedure to identify objectively the difference threshold of angular joint position. Results: Results revealed a good reliability (ICC(2,1) = 0.73) for assessing proprioception of the impaired hand of participants with stroke. Assessments showed similar task execution characteristics (e.g., number of trials and duration per trial) between participants with stroke and controls and a short administration time of approximately 12 min. A difference in proprioceptive function could be found between participants with a right hemisphere stroke and control subjects (p<0.001). Furthermore, we observed larger proprioceptive deficits in participants with a right hemisphere stroke compared to a left hemisphere stroke (p=0.028), despite the exclusion of participants with neglect. No meaningful correlation could be established with clinical scales for different modalities of somatosensation. We hypothesize that this is due to their low resolution and ceiling effects. Conclusions: This study has demonstrated the assessment's applicability in the impaired population and promising integration into clinical routine. In conclusion, the proposed assessment has the potential to become a powerful tool to investigate proprioceptive deficits in longitudinal studies as well as to inform and adjust sensorimotor rehabilitation to the patient's deficits.
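The abstract does not specify the adaptive rule used; as a generic illustration of how a two-alternative forced-choice difference threshold for joint angle can be estimated, here is a simple 2-down/1-up staircase sketch (all names and parameters are hypothetical, not the study's procedure):

```python
def two_down_one_up(trial_fn, start=8.0, step=1.0, floor=0.5,
                    n_reversals=8, max_trials=200):
    """Sketch of an adaptive 2AFC procedure: a 2-down/1-up staircase on the
    angular difference (deg) presented at the joint. `trial_fn(delta)` should
    return True when the participant responds correctly. The staircase
    converges near the 70.7%-correct point; the threshold estimate is the
    mean of the reversal values."""
    delta, streak, direction = start, 0, -1
    reversals = []
    for _ in range(max_trials):
        if len(reversals) >= n_reversals:
            break
        if trial_fn(delta):
            streak += 1
            if streak == 2:                 # two correct in a row -> harder
                streak = 0
                if direction == +1:
                    reversals.append(delta)
                direction = -1
                delta = max(floor, delta - step)
        else:                               # one error -> easier
            streak = 0
            if direction == -1:
                reversals.append(delta)
            direction = +1
            delta += step
    return sum(reversals) / len(reversals) if reversals else delta
```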
Conference Paper
Full-text available
We need to understand the physics of how the skin of the finger pad deforms, and its tie to perception, to accurately reproduce a sense of compliance, or 'softness,' in tactile displays. Contact interactions with compliant materials are distinct from those with rigid surfaces where the skin flattens completely. To capture unique patterns in skin deformation over a range of compliances, we developed a stereo imaging technique to visualize the skin through optically clear stimuli. Accompanying algorithms serve to locate and track points marked with ink on the skin, correct for light refraction through stimuli, and estimate aspects of contact between skin and stimulus surfaces. The method achieves a 3-D spatial resolution of 60-120 microns and temporal resolution of 30 frames per second. With human subjects, we measured the skin's deformation over a range of compliances (61-266 kPa), displacements (0-4 mm), and velocities (1-15 mm/s). The results indicate that the method can differentiate patterns of skin deformation between compliances, as defined by metrics including surface penetration depth, retention of geometric shape, and force per gross contact area. Observations of biomechanical cues of this sort are key to understanding the perceptual encoding of compliance.
Conference Paper
Full-text available
Although we hardly interact with objects that are purely elastic or viscous, haptic perception studies of deformable objects are mostly limited to stiffness and damping. Psychophysical investigation of materials that show both elastic and viscous behavior (viscoelastic materials) is challenging due to their complex, time- and rate-dependent mechanical behavior. In this study, we provide new insight into human perception of viscoelasticity in the frequency domain, where the force response of a viscoelastic material can be represented by its magnitude and phase angle. Using this framework, we estimated the point of subjective equality (PSE) of a Maxwell arm (a damper and a spring in series) to a damper and to a spring, using complex stiffness magnitude and phase angle, in two sets of experiments. A damper and a spring were chosen for the comparisons since they represent the limit cases for a viscoelastic material. We first performed 2I-2AFC adaptive staircase experiments to investigate how the perceived magnitude of complex stiffness changes in a Maxwell arm for small and large values of the time constant. Then, we performed 3I-2AFC adaptive staircase experiments to investigate how the PSE changes as a function of the phase angle in a Maxwell arm. The results show that, when the time constant was small, the magnitude of complex stiffness was underestimated due to the smaller phase lag (relative to a damper's) between the sinusoidal displacement applied by the participants to the Maxwell arm and the force felt at their finger, whereas no difference was observed for a large time constant. Moreover, the PSE values estimated for the lower bound of the phase angle were significantly closer to their actual limit (0°) than those of the upper bound were to 90°.
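For reference, the frequency-domain quantities named above can be computed for a Maxwell arm (spring of stiffness k in series with a damper of coefficient c) as in this short sketch; the expression for the complex stiffness follows from series combination of the two elements (their compliances add):

```python
import numpy as np

def maxwell_complex_stiffness(k, c, omega):
    """Complex stiffness K*(omega) of a Maxwell arm: spring k (N/m) in series
    with damper c (N*s/m). Since series compliances add,
    K* = (1j*omega*c*k) / (k + 1j*omega*c).
    Returns magnitude |K*| and phase angle in degrees; a pure spring gives
    0 degrees and a pure damper 90 degrees, the two limit cases above."""
    kstar = (1j * omega * c * k) / (k + 1j * omega * c)
    return np.abs(kstar), np.degrees(np.angle(kstar))
```

With the time constant tau = c/k, low values of omega*tau behave damper-like (phase near 90 degrees) and high values behave spring-like (phase near 0 degrees), which frames the small versus large time-constant comparison in the abstract.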
Conference Paper
Full-text available
In neuro-rehabilitation after stroke, the conventional constrained induced movement therapy (CIMT) has been well-accepted. Existing bilateral trainings are mostly on mirrored symmetrical motion. However, complementary bilateral movements are dominantly involved in activities of daily living (ADLs), and functional bilateral therapies may bring better skill transfer from trainings to daily life. Neurophysiological evidence is also growing. In this work, we firstly introduce our bilateral arm training system realized with a haptic interface and a motion sensor, as well as the tasks that have been designed to train both the manipulation function of the paretic arm and coordination of bilateral upper limbs. Then, we propose quantitative measures for functional assessment of complementary bilateral training performance, including kinematic behavior indices, smoothness, submovement and bimanual coordination. After that, we describe the experiments with healthy subjects and the results with respect to these quantitative measures. Feasibility and sensitivity of the proposed indices were evaluated through comparison of unilateral and bilateral training outcomes. The proposed bilateral training system and tasks, as well as the quantitative measures, have been demonstrated effective for training and assessment of unilateral and bilateral arm functions.
Article
Full-text available
Humans, many animals, and certain robotic hands have deformable fingertip pads [1, 2]. Deformable pads have the advantage of conforming to the objects that are being touched, ensuring a stable grasp for a large range of forces and shapes. Pad deformations change with finger displacements during touch. Pushing a finger against an external surface typically provokes an increase of the gross contact area [3], potentially providing a relative motion cue, a situation comparable to looming in vision [4]. The rate of increase of the area of contact also depends on the compliance of the object [5]. Because objects normally do not suddenly change compliance, participants may interpret an artificially induced variation in compliance, which coincides with a change in the gross contact area, as a change in finger displacement, and consequently they may misestimate their finger’s position relative to the touched object. To test this, we asked participants to compare the perceived displacements of their finger while contacting an object varying pseudo-randomly in compliance from trial to trial. Results indicate a bias in the perception of finger displacement induced by the change in compliance, hence in contact area, indicating that participants interpreted the altered cutaneous input as a cue to proprioception. This situation highlights the capacity of the brain to take advantage of knowledge of the mechanical properties of the body and of the external environment.
Conference Paper
Full-text available
Our capability to discriminate object compliance is based on cues both tactile and proprioceptive, in addition to visual. To understand how the mechanics of the fingertip skin and bone might encode such information, we used finite element models to simulate the task of differentiating spherical indenters varying in radius (4, 6, and 8 mm) and elasticity (initial shear modulus of 10, 50, and 90 kPa). In particular, we considered two response variables: the strain energy density (SED) at the epidermal-dermal interface, where Merkel cell end-organs of slowly adapting type I afferents reside, and the displacement of the fingertip bone necessary to achieve a certain surface contact force. The former variable ties to tactile cues while the latter ties to proprioceptive cues. The results indicate that distributions of SED are clearly distinct for most combinations of object radius and elasticity. However, for certain combinations, e.g., between 4 mm spheres of 10 kPa and 8 mm spheres of 90 kPa, spatial distributions of SED are nearly identical. In such cases where tactile-only cues are non-differentiable, we may rely on proprioceptive cues to discriminate compliance.
Article
Full-text available
Haptic perception essentially depends on the executed exploratory movements. It has been speculated that spontaneously executed movements are optimized for the computation of associated haptic properties. We investigated to what extent people strategically execute movements that are tuned for softness discrimination of objects with deformable surfaces. In Experiment 1, we investigated how movement parameters depend on expected stimulus compliance. In a discrimination task, we measured exploratory forces for less compliant (hard) stimuli and for more compliant (soft) stimuli. In Experiment 2, we investigated whether exploratory force also depends on the expected compliance difference between the two stimuli. The results indicate that participants apply higher forces when expecting harder objects as compared to softer objects, and they apply higher forces for smaller compliance differences than for larger ones. Experiment 3 examined how applied force influences differential sensitivity for softness as assessed by just noticeable differences (JNDs). For soft stimuli, JNDs did not depend on force. For hard stimuli, JNDs were "worse" (higher) if participants applied less force than they use naturally. We conclude that applying high force is a robust strategy to obtain high differential sensitivity, and that participants used this strategy if it was required for successful discrimination performance.
Article
Full-text available
The compliance of a material can be conveyed through mechanical interactions in a virtual environment and perceived through both visual and haptic cues. We investigated this basic aspect of perception. In two experiments, subjects performed compliance discriminations, and the mean perceptual estimate (PSE) and the perceptual standard deviation (proportional to the JND) were derived from psychophysical functions. Experiment 1 supported a model in which each modality acted independently to produce a compliance estimate, and the two estimates were then integrated to produce an overall value. Experiment 2 tested three mathematical models of the integration process. The data ruled out exclusive reliance on the more reliable modality and stochastic selection of one modality. Instead, the results supported an integration process that constitutes a weighted summation of two random variables, which are defined by the single-modality estimates. The model subsumes optimal fusion but provided valid predictions also if the weights were not optimal. Weights were optimal (i.e., minimized variance) when visual and haptic inputs were congruent, but not when they were incongruent.
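As a concrete reading of the weighted-summation model described above, the combined estimate can be written as a weighted average of the visual and haptic estimates, with reliability-optimal (variance-minimizing) weights as a special case; a brief sketch with hypothetical names:

```python
def fuse_estimates(mu_v, var_v, mu_h, var_h, w_v=None):
    """Weighted summation of visual and haptic compliance estimates.
    If w_v is None, use the reliability-optimal weight, which minimizes the
    variance of the combined estimate; otherwise any weight in [0, 1] gives
    a (possibly suboptimal) weighted-summation prediction."""
    if w_v is None:
        w_v = var_h / (var_v + var_h)      # optimal: weight by reliability
    w_h = 1.0 - w_v
    mu = w_v * mu_v + w_h * mu_h
    var = (w_v ** 2) * var_v + (w_h ** 2) * var_h   # assumes independence
    return mu, var
```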
Article
Full-text available
In these experiments, two plates were grasped between the thumb and the index finger and squeezed together along a linear track. The force resisting the squeeze, produced by an electromechanical system under computer control, was programmed to be either constant (in the case of the force discrimination experiments) or linearly increasing (in the case of the compliance discrimination experiments) over the squeezing displacement. After completing a set of basic psychophysical experiments on compliance resolution (Experiment 1), we performed further experiments to investigate whether work and/or terminal-force cues played a role in compliance discrimination. In Experiment 2, compliance and force discrimination experiments were conducted with a roving-displacement paradigm to dissociate work cues (and terminal-force cues for the compliance experiments) from compliance and force cues, respectively. The effect of trial-by-trial feedback on response strategy was also investigated. In Experiment 3, compliance discrimination experiments were conducted with work cues totally eliminated and terminal-force cues greatly reduced. Our results suggest that people tend to use mechanical work and force cues for compliance discrimination. When work and terminal-force cues were dissociated from compliance cues, compliance resolution was poor (22%) relative to force and length resolution. When work cues were totally eliminated, performance could be predicted from terminal-force cues. A parsimonious description of all data from the compliance experiments is that subjects discriminated compliance on the basis of terminal force.
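The two candidate cues dissociated in these experiments, mechanical work and terminal force, can be computed directly from a recorded squeeze; a minimal sketch (function name ours):

```python
import numpy as np

def squeeze_cues(force, disp):
    """Sketch of the two candidate cues discussed above, computed from a
    recorded squeeze: mechanical work (area under the force-displacement
    curve) and terminal force (force at maximum displacement)."""
    force = np.asarray(force, dtype=float)
    disp = np.asarray(disp, dtype=float)
    work = np.trapz(force, disp)           # J, if force in N and disp in m
    terminal_force = force[np.argmax(disp)]
    return work, terminal_force
```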
Article
Full-text available
The neural mechanisms underlying the sense of joint position and movement remain controversial. While cutaneous receptors are known to contribute to kinesthesia for the fingers, the present experiments test the hypothesis that they contribute at other major joints. Illusory movements were evoked at the interphalangeal (IP) joints of the index finger, the elbow, and the knee by stimulation of populations of cutaneous and muscle spindle receptors, both separately and together. Subjects matched perceived movements with voluntary movements of homologous joints on the contralateral side. Cutaneous receptors were activated by stretch of the skin (using 2 intensities of stretch) and vibration activated muscle spindle receptors. Stimuli were designed to activate receptors that discharge during joint flexion. For the index finger, vibration was applied over the extensor tendons on the dorsum of the hand, to evoke illusory metacarpophalangeal (MCP) joint flexion, and skin stretch was delivered around the IP joints. The strong skin stretch evoked the illusion of flexion of the proximal IP joint in 6/8 subjects (12 +/- 5 degrees, mean +/- SE). For the group, strong skin stretch delivered during vibration increased the perceived flexion of the proximal IP joint by eight times with a concomitant decrease in perceived flexion of the MCP joint compared with vibration alone (P < 0.05). For the elbow, vibration was applied over the distal tendon of triceps brachii and skin stretch over the dorsal forearm. When delivered alone, strong skin stretch evoked illusory elbow flexion in 5/10 subjects (9 +/- 4 degrees). Simultaneous strong skin stretch and vibration increased the illusory elbow flexion for the group by 1.5 times compared with vibration (P < 0.05). For the knee, vibration was applied over the patellar tendon and skin stretch over the thigh. Skin stretch alone evoked illusory knee flexion in 3/10 subjects (8 +/- 4 degrees) and when delivered during vibration, perceived knee flexion increased for the group by 1.4 times compared with vibration (P < 0.05). Hence inputs from cutaneous receptors, muscle receptors, and combined inputs from both receptors likely subserve kinesthesia at joints throughout the body.
Article
In our ability to discriminate compliant, or ‘soft,’ objects, we rely upon information acquired from interactions at the finger pad. We have yet to resolve the most pertinent perceptual cues. However, doing so is vital for building effective, dynamic displays. By introducing psychophysical illusions through spheres of various size and elasticity, we investigate the utility of contact area cues, thought to be key in encoding compliance. For both active and passive touch, we determine finger pad-to-stimulus contact areas, using an ink-based procedure, as well as discrimination thresholds. The findings indicate that in passive touch, participants cannot discriminate certain small compliant versus large stiff spheres, which generate similar contact areas. In active touch, however, participants easily discriminate these spheres, though contact areas remain similar. Supplementary cues based on stimulus rate and/or proprioception seem vital. One cue that does differ for illusion cases is finger displacement given a volitionally applied force.
Article
Understanding how we perceive differences in material compliance, or ‘softness,’ is a central topic in the field of haptics. The intrinsic elasticity of an object is the primary factor thought to influence our perceptual estimates. Therefore, most studies test and report the elasticity of their stimuli, typically as stiffness or modulus. However, many reported estimates are of very high magnitude for silicone-elastomers, which may be due to artifacts in characterization technique. This makes it very difficult to compare the perceptual results between the studies. The work herein defines a standardized and easy-to-implement way to characterize test stimuli. The procedure involves the unconstrained, uniaxial compression of a plate into cylindrical substrates 10 mm tall by 10 mm diameter. The resultant force-displacement data are straightforwardly converted into stress-strain data, from which a modulus is readily derived. This procedure was used to re-characterize stimuli from prior studies. The revised results from the validated method herein are 200-1,100% lower than modulus values either reported and/or approximated from stiffness. This is practically significant when differences of 10-15% are perceptually discriminable. The re-characterized estimates are useful in comparing prior studies and designing new studies. Furthermore, this characterization methodology may help more readily bridge studies on perception with those designing technology.
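A brief sketch of the characterization arithmetic described above, assuming the stated 10 mm tall by 10 mm diameter cylindrical sample; the linear-fit strain window is our assumption rather than part of the published procedure:

```python
import numpy as np

def elastic_modulus(force_N, disp_mm, height_mm=10.0, diameter_mm=10.0,
                    strain_window=(0.02, 0.10)):
    """Sketch of a standardized characterization: unconstrained uniaxial
    compression of a cylindrical sample. Engineering stress = F / A0 and
    strain = d / h0; the modulus is the slope of a linear fit to the
    stress-strain data within a small-strain window."""
    area_mm2 = np.pi * (diameter_mm / 2.0) ** 2
    stress_kpa = 1e3 * np.asarray(force_N, dtype=float) / area_mm2  # MPa -> kPa
    strain = np.asarray(disp_mm, dtype=float) / height_mm
    lo, hi = strain_window
    m = (strain >= lo) & (strain <= hi)
    slope, _ = np.polyfit(strain[m], stress_kpa[m], 1)
    return slope                                                    # kPa
```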
Article
1. We investigated the ability of humans to tactually discriminate the softness of objects, using novel elastic objects with deformable and rigid surfaces. For objects with deformable surfaces, we cast transparent rubber specimens with variable compliances. For objects with rigid surfaces ("spring cells"), we fabricated telescoping hollow cylinders with the inner cylinder supported by several springs. To measure the human discriminability and to isolate the associated information-processing mechanisms, we performed psychophysical experiments under three conditions: 1) active touch with the normal finger, where both tactile and kinesthetic information was available to the subject; 2) active touch with local cutaneous anesthesia, so that only kinesthetic information was available; and 3) passive touch, where a computer-controlled mechanical stimulator brought down the compliant specimens onto the passive fingerpad of the subject, who therefore had only tactile information. 2. We first characterized the mechanical behavior of the human fingerpad and the test objects by determining the relationship between the depth and force of indentation during constant-velocity indentations by a rigid probe. The fingerpad exhibited a pronounced nonlinear behavior in the indentation depth versus force trace such that compliance, as indicated by the local slope of the trace, decreased with increases in indentation depth. The traces for all the rubber specimens were approximately linear, indicating a constant but distinct value of compliance for each specimen. The fingerpad was more compliant than each of the rubber specimens. 3. All the human subjects showed excellent softness discriminability in ranking the rubber specimens by active touch, and the subjective perception of softness correlated one-to-one with the objectively measured compliance. The ability of subjects to discriminate the compliance of spring cells was consistently poorer compared with that of the rubber specimens. 4. For pairwise discrimination of a selected set of rubber specimens, kinesthetic information alone was insufficient. However, tactile information alone was sufficient, even when the velocities and forces of specimen application were randomized. In contrast, for discriminating pairs of spring cells, tactile information alone was insufficient, and both tactile and kinesthetic information were found to be necessary. 5. The differences in the sufficiency of tactile information for the discrimination of the two types of objects can be explained by the mechanics of contact of the fingerpad and its effect on tactile information. For objects with deformable surfaces, the spatial pressure distribution within the contact region depends on both the force applied and the specimen compliance. (Abstract truncated at 250 words.)
Article
Grasping and manipulating an object requires us to perceive its material compliance. Compliance is thought to be encoded by relationships of force, displacement, and contact area at the finger pad. Prior work suggests that objects must be sufficiently deformed to become discriminable, but the utility of time-dependent cues has not been fully explored. The studies herein find that the availability of force-rate cues improves compliance discriminability so as to require less deformation of stimulus and finger pad. In particular, we tested the impact of controlling force-rate and displacement-rate cues in passive touch psychophysical experiments. An ink-based method to mark the finger pad was used to measure contact area per stimulus, simultaneously with displacement and force. Compliances spanned a range harder and softer than the finger pad. The results indicated harder compliances were discriminable at lower peak forces when the stimulus control mode was displacement-rate (0.5 N) compared to force-rate (1.3 N). That is, when displacement-rate was controlled to be equal between the two compliances, the resultant force-rate psychophysical cues could be more readily discriminated. Extending prior studies, we find that while some magnitude of finger pad deformation may be sufficient for discriminability, temporal cues tied to force afford more efficient judgments.
Article
The classical view of somatosensory processing holds that proprioceptive and cutaneous inputs are conveyed to cortex through segregated channels, initially synapsing in modality-specific areas 3a (proprioception) and 3b (cutaneous) of primary somatosensory cortex (SI). These areas relay their signals to areas 1 and 2, where multimodal convergence first emerges. However, proprioceptive and cutaneous maps have traditionally been characterized using unreliable stimulation tools. Here, we employed a mechanical stimulator that reliably positioned animals' hands in different postures and presented tactile stimuli with superb precision. Single-unit recordings in SI revealed that most neurons responded to cutaneous and proprioceptive stimuli, including cells in areas 3a and 3b. Multimodal responses were characterized by linear and nonlinear effects that emerged during early (∼20 ms) and later (> 100 ms) stages of stimulus processing, respectively. These data are incompatible with the modality specificity model in SI, and provide evidence for distinct mechanisms of multimodal processing in the somatosensory system.
Article
To enable a realistic tactile interaction with remote or virtual objects, softness information represents a fundamental property to be rendered via haptic devices. What is challenging is to reduce the complexity of such information as it arises from contact mechanics, and to find suitable simplifications that can lead to the effective development of softness displays. A possible approach is to surrogate detailed tactile cues with information on the rate of spread of the contact area between the object and the finger as the contact force increases, i.e., the force/area relation. This paradigm is called Contact Area Spread Rate. In this paper we discuss how this paradigm has inspired the design of a tactile device (hereinafter referred to as the Fabric Yielding Display, FYD-2), which exploits the elasticity of a fabric to mimic different levels of stiffness, while the contact area on the finger indenting the fabric is measured. In this manner, the FYD-2 can be controlled to reproduce force-area characteristics. In this work, we describe the FYD-2 architecture and report a psychophysical characterization. The FYD-2 is shown to be able to accurately reproduce force-area curves of typical objects and to enable reliable softness discrimination by human users.
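As an illustration of the Contact Area Spread Rate idea, a force/area relation recorded from a real object can be summarized by a simple fitted curve that a display controller could then track; a sketch under our own assumptions (polynomial fit, hypothetical names), not the FYD-2 control law:

```python
import numpy as np

def area_force_curve(force, area, deg=2):
    """Sketch of the Contact Area Spread Rate idea: summarize how gross
    contact area grows with force by a low-order polynomial fit A(F);
    a display can then be commanded to track this curve as force rises."""
    coeffs = np.polyfit(np.asarray(force, float), np.asarray(area, float), deg)
    return np.poly1d(coeffs)                  # callable A(F)

# Usage (hypothetical data): predicted_area = area_force_curve(f, a)(2.0)
```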
Article
Manipulating objects with an upper limb prosthesis requires significantly more visual attention than doing the same task with an intact limb. Prior work and comments from individuals lacking proprioception indicate that conveying prosthesis motion through a nonvisual sensory channel would reduce and possibly remove the need to watch the prosthesis. To motivate the design of suitable sensory substitution devices, this study investigates the difference between seeing a virtual prosthetic limb move and feeling one's real limb move. Fifteen intact subjects controlled a virtual prosthetic finger in a one-degree-of-freedom rotational spring discrimination task. A custom haptic device was used to measure both real finger position and applied finger force, and the resulting prosthetic finger movement was displayed visually (on a computer screen) and/or proprioceptively (by allowing the subject's real finger to move). Spring discrimination performance was tested for three experimental sensory conditions (visual motion, proprioceptive motion, and visual plus proprioceptive motion) using the method of constant stimuli, with a reference stiffness of 290 N/m. During each trial, subjects sequentially pressed the right index finger on a pair of hard-surfaced virtual springs and decided which was stiffer. No significant performance differences were found between the three experimental sensory conditions, but subjects perceived proprioceptive motion to be significantly more useful than visual motion. These results imply that relaying proprioceptive information through a nonvisual channel could reduce visual attention during prosthesis control while maintaining task performance, thus improving the upper limb prosthesis experience.
Article
For the perception of the hardness of compliant materials, several cues are available. In this paper, the relative roles of force/displacement and surface deformation cues are investigated. We have measured discrimination thresholds with silicone rubber stimuli of differing thickness and compliance. Also, the influence of the finger span is assessed. When compliance is expressed as the Young's modulus, the thresholds in the different conditions follow Weber's law with a Weber fraction of 15 percent. When the surface deformation cue was removed, thresholds more than trebled. Under the assumption of optimal cue combination, this suggests that a large fraction of the information comes from the surface deformation cue. Using a matching experiment, we found that differences in object thickness are correctly taken into account. When cues appear to contradict each other, the conflict is resolved by means of a compromise.
Article
While it is known that softness discrimination relies on both kinesthetic and cutaneous information, relatively little work has been done on the realization of haptic devices replicating the two cues in an integrated and effective way. In this paper, we first discuss the ambiguities that arise in unimodal touch, and provide a simple intuitive explanation in terms of basic contact mechanics. With this as a motivation, we discuss the implementation and control of an integrated device, where a conventional kinesthetic haptic display is combined with a cutaneous softness display. We investigate the effectiveness of the integrated display via a number of psychophysical tests and compare the subjective perception of softness with that obtained by direct touch on physical objects. Results show that the subjects interacting with the integrated haptic display are able to discriminate softness better than with either a purely kinesthetic or a purely cutaneous display.
Article
Two experiments establish links between desired knowledge about objects and hand movements during haptic object exploration. Experiment 1 used a match-to-sample task, in which blindfolded subjects were directed to match objects on a particular dimension (e.g., texture). Hand movements during object exploration were reliably classified as “exploratory procedures,” each procedure defined by its invariant and typical properties. The movement profile, i.e., the distribution of exploratory procedures, was directly related to the desired object knowledge that was required for the match. Experiment 2 addressed the reasons for the specific links between exploratory procedures and knowledge goals. Hand movements were constrained, and performance on various matching tasks was assessed. The procedures were considered in terms of their necessity, sufficiency, and optimality of performance for each task. The results establish that in free exploration, a procedure is generally used to acquire information about an object property, not because it is merely sufficient, but because it is optimal or even necessary. Hand movements can serve as “windows,” through which it is possible to learn about the underlying representation of objects in memory and the processes by which such representations are derived and utilized.
Article
The need for a simply applied quantitative assessment of handedness is discussed and some previous forms reviewed. An inventory of 20 items with a set of instructions and response- and computational-conventions is proposed and the results obtained from a young adult population numbering some 1100 individuals are reported. The separate items are examined from the point of view of sex, cultural and socio-economic factors which might appertain to them and also of their inter-relationship to each other and to the measure computed from them all. Criteria derived from these considerations are then applied to eliminate 10 of the original 20 items and the results recomputed to provide frequency-distribution and cumulative frequency functions and a revised item-analysis. The difference of incidence of handedness between the sexes is discussed.