Annual Review of Control, Robotics, and
Autonomous Systems
Haptics: The Present and
Future of Artificial Touch
Sensations
Heather Culbertson,1,2 Samuel B. Schorr,1
and Allison M. Okamura1
1Department of Mechanical Engineering, Stanford University, Stanford, California 94305,
USA; email: sschorr@alumni.stanford.edu, aokamura@stanford.edu
2Department of Computer Science, University of Southern California, Los Angeles,
California 90089, USA; email: hculbert@usc.edu
Annu. Rev. Control Robot. Auton. Syst. 2018. 1:12.1–12.25
The Annual Review of Control, Robotics, and Autonomous Systems is online at control.annualreviews.org
https://doi.org/10.1146/annurev-control-060117-105043
Copyright © 2018 by Annual Reviews. All rights reserved.
Keywords
haptics, kinesthesia, tactile, vibration, virtual reality, teleoperation
Abstract
This article reviews the technology behind creating artificial touch sensa-
tions and the relevant aspects of human touch. We focus on the design and
control of haptic devices and discuss the best practices for generating distinct
and effective touch sensations. Artificial haptic sensations can present infor-
mation to users, help them complete a task, augment or replace the other
senses, and add immersiveness and realism to virtual interactions. We exam-
ine these applications in the context of different haptic feedback modalities
and the forms that haptic devices can take. We discuss the prior work, limi-
tations, and design considerations of each feedback modality and individual
haptic technology. We also address the need to consider the neuroscience
and perception behind the human sense of touch in the design and control
of haptic devices.
1. INTRODUCTION
Haptics—the sense of touch—enables humans to perform a wide variety of exploration and ma-
nipulation tasks in the real world. In virtual worlds and robot teleoperation scenarios, this sense
of touch must be artificially recreated by stimulating the human body (typically the hands) in a
manner that produces the salient features of touch needed to enhance realism and human perfor-
mance. This article focuses on the state of the art in design, control, and application of noninvasive
haptic devices that generate artificial human sensations. There are two other main areas of haptic
technology that are beyond the scope of this article but certainly deserve mention and can be
further studied using the works listed in the Related Resources section at the end of the arti-
cle: robot haptics (giving robots the sense of touch using force and tactile sensors and associated
processing/perception algorithms; e.g., see Reference 1) and invasive haptic stimulation (creating
haptic sensations in humans and other animals by electrically stimulating the peripheral nervous
system or the brain; e.g., see Reference 2).
1.1. Applications of Haptics
It is difficult to imagine life without haptics, in part because it is such a natural and integral part of
our lives. Without haptics, we would have great difficulty grasping and manipulating objects, be un-
able to determine many material or surface properties, and miss feeling the warmth of a loved one’s
hand. Thus, many of the applications of artificial haptics address scenarios where the sense of touch
is lost or greatly diminished compared with the experience of a healthy person in the real world.
Certain highly specialized professions can use augmented haptic feedback, such as an astronaut
teleoperating a robot outside the International Space Station to enable repair tasks while avoiding
a dangerous human space walk, or a surgeon using a robot to perform a delicate procedure at
a scale not achievable with the human hand. In such teleoperation scenarios, we often aim to
give human operators a sense of “telepresence” such that they feel they are directly manipulating
the environment with their own hands, rather than having their actions mediated by a robot and
communication/control system.
In some cases, we seek to replace a sense of touch that was lost owing to disease or accident.
An upper-limb amputee has completely lost the sense of touch through the loss of a hand; ideally,
a prosthetic hand would sense haptic interactions between itself and the environment and relay
that information back to the amputee, so that the amputee does not need to rely entirely on sight
in order to manipulate objects.
A more universally experienced lack of haptics is in interactive computing. Computers, tablets,
and smartphones have sensors to measure human inputs, but their outputs (displays) are limited
primarily to the visual and auditory channels. As discussed below, vibration feedback has made
inroads as a haptic display for human–computer interaction, but the quality of this interaction
leaves much room for improvement, and many other promising haptic feedback modalities have
yet to be implemented in commercial systems.
1.2. Human Haptic Perception
Unlike the four other senses (sight, hearing, taste, and smell), the sense of touch is not localized
to a specific region of the body; instead, it is distributed across the entire body through the touch
sensory organ, our skin, and in our joints, muscles, and tendons. The sense of touch is typically
described as being divided into two modalities: kinesthetic and tactile. Kinesthetic sensations, such
as forces and torques, are sensed in the muscles, tendons, and joints. Tactile sensations, such as
pressure, shear, and vibration, are sensed by specialized sensory end organs known as mechanore-
ceptors that are embedded in the skin. Each type of mechanoreceptor senses and responds to a
specific type of haptic stimulus.
The mechanoreceptors are characterized by their temporal resolution and the size of their
receptive fields. Fast-adapting mechanoreceptors capture transient signals, and slow-adapting
mechanoreceptors capture mostly static stimuli. For example, Meissner corpuscles are fast-
adapting mechanoreceptors that respond to low-frequency vibrations and sense the rate of skin
deformation (3). Pacinian corpuscles respond to a wider range of high-frequency vibrations and
provide information about transient contacts (4). Merkel disks are slow-adapting mechanorecep-
tors that detect edges and spatial features (4). Ruffini endings sense skin stretch and allow for the
perception of the direction of object motion or force (3).
The density of mechanoreceptors differs with the location on the body. Mechanoreceptors are
more dense in the glabrous skin of the hands and feet than in hairy skin, which makes touch easier
to localize on the glabrous skin (5). To create a truly effective haptic interaction, designers must
account for the location dependency and specialization of the mechanoreceptors in creating both
the device and the signals to drive it.
1.3. Haptic Devices
To introduce the breadth of haptic device design and control, we consider three major categories
of haptic systems: graspable, wearable, and touchable. Figure 1 gives examples of each of these
types of systems.
Graspable systems are typically kinesthetic (force-feedback) devices that are grounded (e.g., to
a table) and allow the user to push on them (and be pushed back) through a held tool. Graspable
devices can also be ungrounded (e.g., using flywheels to provide inertial forces) or can be tactile
devices that are held in the hand.
Wearable systems are typically tactile (cutaneous) devices that are mounted to the hands or
other parts of the body and display sensations directly to the skin. They can provide cues such
as vibration, lateral skin stretch, and normal skin deformation. They may also be body-grounded
devices, such as an exoskeleton, that provide a kinesthetic cue to the user by creating a reaction
force on a less sensitive part of the body. The wearability of the devices makes them attractive for use in mobile applications where users should be free and unencumbered to move about their environment.

Figure 1
Examples of graspable, wearable, and touchable haptic systems. These three categories describe the breadth of interaction modalities for kinesthetic and cutaneous stimulation in interactive haptic devices.
Touchable systems are encountered-type displays that allow the user to actively explore the
entire surface. They can be purely cutaneous devices that change their tactile properties based on
location, such as a surface with variable friction. Touchable devices can also be hybrid cutaneous
and kinesthetic devices that change their shape, mechanical properties, and surface properties.
For each of these categories, the mechanism of haptic feedback can vary. The remainder of this
review is organized mainly by these mechanisms, discussing kinesthetic devices (Section 2), skin
deformation devices (Section 3), vibration (Section 4), and haptic surfaces (Section 5). In addition,
owing to the increasing research and commercial interest in haptics for virtual and augmented
reality, we include an additional discussion of this application (Section 6).
2. KINESTHETIC DEVICES
Kinesthesia refers to the sensation of movement and force and is typically associated with force-
displacement relationships. Receptors involved include muscle spindles, which transduce muscle
stretch, and Golgi tendon organs, which sense change in muscle tension. Stimulating these recep-
tors can produce the illusion of movement and/or force. Kinesthetic or force-type haptic devices
are defined by their ability to apply force about a joint such that movement (and resistance of
that movement) is possible. Historically, kinesthetic feedback represented the bulk of haptic re-
search in terms of device design and rendering algorithms. Interestingly, less is known about
human kinesthesia, especially proprioception (the sense of self, usually referring to the sense of
bodily movements), than about cutaneous sensing. This disparity is likely due to the complexity of
developing experimental platforms for high-degree-of-freedom (high-DoF) force-displacement
relationships on the scale and precision of human force sensing and motor control. Kinesthetic
haptic devices are challenging to design and control because of this impressive human dynamic
range, but they are the basis for a huge body of literature.
2.1. Traditional Kinesthetic Haptic Devices
The Phantom Premium haptic device (originally commercially available from SensAble Technolo-
gies) was a milestone in the field, because it enabled three DoFs of high-force and high-bandwidth
force feedback with very low free-space impedance (6) [in other words, great Z-width (7)]. Nu-
merous other kinesthetic haptic devices, from one-DoF knobs to six-DoF manipulandums, have
been developed, with significant research effort put toward ensuring the realism and stability of
virtual environments rendered via programmed force-displacement relationships.
Traditional kinesthetic devices designed using rigid links are no longer considered at the
forefront of novel haptic device design, in large part because many of the research problems
have been solved and the relatively high cost–benefit ratio has lacked commercial potential. Yet
there are some novel designs and new rendering approaches worth briefly reviewing here. One
design concept to consider is the choice of control formulation: admittance versus impedance
control. Admittance control devices can apply much larger forces to the human operator for
applications such as rehabilitation, albeit with differing control challenges from impedance devices.
Alternatively, one can design devices that mechanically resist movement in controllable directions;
cobots are a great example of this (8). Improvements to rendering techniques (9) and better
understanding of human perception and cognition in the context of different haptic rendering
algorithms (10) will also continue to refine the capabilities of kinesthetic haptic devices.
2.2. Low-Cost Kinesthetic Devices
Many kinesthetic devices have focused on high-DoF systems using the highest-quality motors
with high force and low friction and cogging, linkages made from high-performance engineered
materials such as carbon fiber, and low-friction bearings at joints. However, a new generation
of kinesthetic haptics research is seeking to determine the minimal requirements for compelling
haptic displays.
For example, haptic devices are being designed for use in education, where one or two DoFs can
be sufficient to explain physical and mathematical phenomena in an intuitive, hands-on manner.
For integration into classrooms, such devices should be inexpensive, robust, and perhaps even
assembled by students or teachers. Haptics researchers are taking advantage of new and popular
manufacturing techniques such as 3-D printing (11) and layered manufacturing to make haptic
devices more accessible to nonengineers. The conflicting goals of low cost and high performance
generate important research questions for haptic device designers as well as those interested in
the application of haptics in training environments. For example, we do not fully understand
how the accuracy of perception of mechanical properties via a haptic device affects learning or
understanding of a concept or task (12).
2.3. Body-Grounded Kinesthetic Devices
Exoskeletons are typically body-grounded kinesthetic devices. How they should be designed and
controlled in order to optimize performance for applications such as rehabilitation and user as-
sistance is the subject of current research, which integrates design, control, biomechanics, and
neuroscience (13). For example, hand-grounded devices can assist surgeons in steadying an in-
strument for medical tasks (14). An exciting design development in exoskeletons has been the use
of soft robotic techniques and pneumatic actuation to make exoskeletons lighter, cheaper, and
more adaptable to the human body (15, 16).
We differentiate body-grounded kinesthetic devices from wearable tactile devices by whether
the device allows forces across a movable human joint. Pacchierotti et al. (17) provided a tax-
onomy that clarifies this difference. Figure 2 shows how world-grounded kinesthetic haptic de-
vices, exoskeletons (body-grounded kinesthetic devices), and wearable tactile devices differ in their
grounding.
Figure 2
Schematic connections between grounds and haptic device user contact points for (a) world-grounded kinesthetic haptic devices, (b) exoskeletons (body-grounded kinesthetic devices), and (c) wearable tactile devices. Blue arrows indicate forces and torques applied to the user, and red arrows indicate reaction forces and torques at the ground. Figure adapted from Reference 17; © 2017 IEEE, reprinted with permission.
2.4. Multimodal Haptics
Kinesthesia has been successfully integrated with tactile (especially vibrotactile) feedback as well
as other sensory modalities (particularly vision and sound) to produce compelling effects with
less stringent requirements for the fidelity of force rendering. The force display of any kinesthetic
haptic device is limited by the system’s sampling rate, computational time delays, and quantization
of position measurements (18).
In the 1990s, Srinivasan et al. (19) showed that visual rendering could significantly alter haptic
perception. For example, when a virtual wall is displayed with a certain mechanical stiffness, a classic
stiffness-rendering algorithm based on Hooke's law states that the force displayed (f) should be the stiffness (k) multiplied by the penetration into the wall (x): f = kx. If the cursor representing
the haptic device’s position is visually allowed to penetrate the wall (thus matching the haptic
rendering algorithm), the user will perceive a less stiff wall than if the visual display is modified to
make the cursor stop at the surface of the wall. (We revisit the dominance of vision over haptics
in Section 6.)
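To make the rendering rule concrete, the following minimal Python sketch implements one tick of the f = kx virtual wall together with the visually clamped cursor described above. It is our illustration rather than code from the review; the 500-N/m stiffness and all function names are assumed values.

    def render_wall(device_pos, wall_height=0.0, k=500.0):
        """One control tick of a penalty-based virtual wall.

        device_pos: device position along the wall normal (m); the wall
        occupies device_pos < wall_height. k: stiffness (N/m), an arbitrary
        example value. Returns (force command, cursor position to draw).
        """
        penetration = wall_height - device_pos  # > 0 when inside the wall
        if penetration > 0.0:
            force = k * penetration  # Hooke's law: f = kx
            # Drawing the cursor clamped at the wall surface, rather than at
            # the penetrating device position, makes the same k feel stiffer.
            cursor_pos = wall_height
        else:
            force = 0.0
            cursor_pos = device_pos
        return force, cursor_pos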
Lower-fidelity kinesthetic haptic systems can also be enhanced by combining low-frequency
force with high-frequency vibration feedback. In the real world, when we tap on a hard surface
with a finger or even a stylus, we feel not only kinesthetic force, but also high-frequency vibrations
resulting from the contact event. In a virtual environment, playing those vibrations with either the
kinesthetic device motors (20, 21) or a separate vibrotactile motor (22) can increase the perceived
hardness of a surface.
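One common realization, often called event-based haptics in the literature, superimposes a short velocity-scaled decaying sinusoid on the kinesthetic force when contact is detected. The Python sketch below is a hedged illustration of that idea; the 300-Hz frequency and decay rate of 90 per second are placeholder values that would normally be fit to recorded taps on the target material.

    import numpy as np

    def tap_transient(impact_velocity, freq_hz=300.0, decay_rate=90.0,
                      fs=2000.0, duration_s=0.05):
        """Decaying sinusoid to superimpose on the rendered force at the
        moment of virtual contact. Amplitude scales with the incoming
        velocity so that faster taps feel harder."""
        t = np.arange(0.0, duration_s, 1.0 / fs)
        return (impact_velocity * np.exp(-decay_rate * t)
                * np.sin(2.0 * np.pi * freq_hz * t))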
2.5. Applications
Kinesthetic haptic devices have yet to find their “killer app,” since the high-DoF devices that are
most compelling are expensive and not amenable to commercialization. Popular applications for
researchers have been medical simulation, rehabilitation, and computer-aided design. Attempts
have also been made at developing force-feedback joysticks for gaming applications, but the use
of kinesthetic feedback in these fields has not yet achieved widespread commercial success. A
subtle but common application of kinesthetic feedback in a consumer product is in the dashboard
control/navigation system in some high-end automobiles.
Teleoperation of robots that are remote in distance or scale can enhance human performance or
keep humans safe when performing tasks in dangerous environments. Typically, haptic feedback
for teleoperation has focused on kinesthesia, in large part because sensing force is simpler than
sensing distributed tactile information. Applications include situations in which remote operation
of a robot is desirable owing to challenges in access (especially distance and danger) and large
differences in scale relative to typical human manipulation.
3. SKIN DEFORMATION DEVICES
The kinesthetic force-feedback devices described in the preceding section represent a fairly com-
plete haptic experience because they not only display large forces but also stimulate the skin
through the held tool, effectively providing tactile feedback in addition to the actively controlled
kinesthetic feedback. In the last decade, researchers began to examine whether providing only
the tactile component of contact forces would have advantages for the size, wearability, and cost
of haptic devices without detracting from the perception that net forces are being applied to the
body. Prattichizzo et al. (23) called this “cutaneous force feedback” and acknowledged the removal
of the kinesthetic forces as a form of “sensory subtraction.”
3.1. Lateral Skin Deformation
Skin deformation is a promising feedback modality in which a shear force is applied to the user’s
finger pad, similar to the skin deformation that occurs naturally during haptic interactions. This
method takes advantage of the finger’s increased sensitivity to shear forces compared with normal
forces (24). In addition, two DoFs of shear can be displayed, enabling directional cues. In order to
display shear force, a device contacts the skin, presses on it to ensure sufficient friction, and moves
laterally to deform the skin in shear. Researchers have investigated the ability to discriminate
direction through both lateral skin stretch and lateral skin slip (25–27). Guzererler et al. (28)
investigated the sensitivity to skin stretch on the palm at different speeds and displacements, and
Bark et al. (29) and Wheeler et al. (30) showed that skin deformation is effective in communicating
proprioceptive information when used on the forearm.
Such skin deformation can be achieved with or without an aperture, which changes the
manner in which the device is grounded to the body (Figure 3). Many lateral skin deformation
devices have been created using aperture-based grounding, where the finger is grounded on an
outer aperture while a tactor moves against the skin, because it allows easy grasping without any
associated donning or doffing process. Aperture-based devices, although limited in workspace
and DoFs by the aperture constraints, can still provide sophisticated multi-DoF feedback, as
demonstrated by Guinan et al. (31), who used multiple devices with two-DoF tactors to convey
five-DoF directional cues.
Much of the work on lateral skin deformation has focused on passive perception of the skin
deformation feedback cues. Only in recent years have researchers begun to focus on superimposing
lateral skin deformation with dynamic force-feedback stimuli. Provancher & Sylvester (32) found
that rendering a small amount of skin stretch by moving a high-friction tactor against the skin of
the finger pad could increase the perception of friction. Quek et al. (33) extended this idea using
an aperture-based one-DoF device and found that adding skin deformation to kinesthetic force
feedback causes increased perception of stiffness. Further work by Quek et al. (34), using a three-
DoF device that combined skin-stretch with normal indentation, demonstrated that dynamic force
information could also be perceived in multiple DoFs. A more complicated six-DoF device design
was later used by Quek et al. (35) to provide force and torque information in teleoperated tasks using
the da Vinci surgical robot. These works show that an assortment of skin deformation feedback
DoF configurations, including two-DoF, three-DoF, and six-DoF devices, can supplement force
feedback (34–37).
Figure 3
Skin deformation feedback achieved by moving a point in contact with the skin in the normal (into the skin) or lateral (parallel to the skin) direction. (a) Skin deformation with an aperture constraint that localizes the deformation. (b) Skin deformation without an aperture constraint, which can provide a stronger sensation because more mechanoreceptors are engaged. Figure provided by William Provancher.
Despite a large body of work demonstrating the efficacy of combining skin deformation with
kinesthetic feedback, determining how users perceive lateral skin stretch feedback as a complete
substitute for force feedback is still an open question. Schorr et al. (38) found that these skin
deformation cues could be used to discriminate between various levels of stiffness without any
underlying kinesthetic cues. In subsequent work, Schorr et al. (39) showed that this form of
feedback performs similarly to kinesthetic feedback during a teleoperated palpation task.
3.2. Wearable Skin Deformation Devices
Although the tactile feedback devices discussed above act on different parts of the body, most of
them focus on the finger pad owing to the great density of mechanoreceptors present there (38).
A large body of work has investigated the use of small, wearable, finger-grounded devices for
deforming the finger pulp. Although there are many ways that these devices could be classified,
one way is to distinguish between the devices that render indentation normal to the finger pad
(often with control of orientation as well) and those that provide lateral translation (often with
normal indentation as well).
Recently, researchers have investigated the effects of wearable finger-grounded tactile haptic
devices that can move an end effector against the surface of the finger pad (Figure 4). One category
of these devices compresses a platform normally against the finger pad in addition to changing ori-
entation (44–46). These devices have been used to provide normal force sensory substitution to the
finger pad during teleoperation with the aim of preserving stability and improving transparency
(23, 41, 47, 48). Such devices are well suited to displaying orientation information but are not nec-
essarily capable of displaying arbitrary lateral forces. That is, the display of lateral force is directly
coupled with differential normal forces at each attachment point of the finger-pad platform.
Another category of wearable fingertip haptic devices focuses on designs that move a high-
friction tactor element, similar to that used in previous aperture-based skin deformation research,
against the finger pad. These designs, while less appropriate for displaying orientation informa-
tion, are appropriate for the display of lateral forces (49). Leonardis et al. (42) developed a wearable
servo-actuated three-DoF skin deformation device with revolute–spherical–revolute (RSR) kine-
matics. The device was used to render contact forces in a virtual pick-and-place task and resulted
in decreased grasping forces when compared with no haptic feedback. While the device has im-
pressively small dimensions, the tactor translational motion has inherently coupled changes in
orientation that cause the end effector to roll across the finger pad, possibly affecting the display
of lateral finger-pad forces. Other devices (43, 50) decouple these DoFs in a similar package.
Figure 4
Recently developed wearable haptic devices that invoke skin deformation without an aperture. Panel a reproduced from Reference 40; © 2010 IEEE, reprinted with permission. Panel b reproduced from Reference 41; © 2014 Association for Computing Machinery, reprinted with permission. Panel c reproduced from Reference 42; © 2015 IEEE, reprinted with permission. Panel d reproduced from Reference 43; © 2017 IEEE, reprinted with permission.
4. VIBRATION
Haptic feedback in consumer products is usually synonymous with vibration feedback. Vibrations
provide an added dimension to gaming by simulating the sensation of collisions or driving over a
rough surface, the buzzing of a phone allows us to stay in contact and receive private notifications
of messages even when in meetings, and vibrations have been used to simulate the feel of a button
click. Similarly, much haptics research has focused on the use of vibration to display information,
help the user complete a task, and add realism to virtual environments. Vibration’s popularity
can be explained by the widespread availability of actuators and the ease with which they can be
integrated into systems. The actuators’ small size, light weight, and low cost mean that vibration
systems are easily scalable and can potentially display large amounts of information to users.
4.1. Human Vibration Sensing
Humans have two distinct types of mechanoreceptors that sense vibration, each of which is re-
sponsible for sensing a specific frequency band. In glabrous (nonhairy) skin, the lower-frequency
vibrations (5–50 Hz) are sensed by the Meissner corpuscles, while the higher-frequency vibrations
(40–400 Hz) are sensed by the Pacinian corpuscles (3). During interactions with physical objects,
the Meissner corpuscles sense the rate of skin deformation caused by the slipping of a grasped ob-
ject or by surface discontinuities and edges moving under the finger (3). The Pacinian corpuscles
sense high-frequency vibrations caused by transient contacts with an object, as in collisions or a
tool dragging across a textured surface (4).
Common vibration actuators, such as eccentric rotating mass motors or linear resonant ac-
tuators, typically produce vibrations at frequencies above 100 Hz, which primarily activate the
Pacinian corpuscles. The Pacinian corpuscles have large receptive fields (4), so it is difficult for
users to distinguish actuators that are placed close together. This issue of discriminability is fur-
ther compounded by the propagation of vibrations through the skin (51). Pacinian corpuscles
do not have directional sensitivity, so humans cannot distinguish the direction of high-frequency
vibrations (52).
The density of both Pacinian and Meissner corpuscles in glabrous skin is high (3), which
lends itself well to easily perceivable haptic feedback cues. However, the spatial density of the
Pacinian corpuscles is significantly reduced in hairy skin (5), and the Meissner corpuscles are
completely absent (53). Instead, hairy skin contains C-tactile afferents, which are rapidly adapting
mechanoreceptors that respond preferentially to slow, stroking touch (53). Haptic designers must
understand the methods and limits of human haptic sensing to create haptic systems that effectively
stimulate the mechanoreceptors to achieve the desired sensation.
4.2. Information Display
Traditionally, vibration has conveyed binary information using a simple on–off state change. Arrays
of these actuators can convey directional information, usually with one actuator used for each DoF.
These arrays of tactors are typically mounted directly on the body (54), with common locations
being the hand (55), arm (56), and torso (57, 58). Array size and actuator placement are limited
by the ability of the user to distinguish the cues from multiple actuators. This distinguishability is
limited both by the receptive fields of the Pacinian corpuscles and by the propagation of vibration
through the skin (51). Owing to differences in the distribution of Pacinian corpuscles in glabrous
and hairy skin, arrays of actuators can be denser when mounted on the hand than when placed on
other parts of the body, such as the torso or the forearm (5).
Vibrotactile cues can also help users correct errors when completing a task. Feedback is pro-
vided if the user moves away from a desired set point or if their motion deviates from a desired
trajectory. The vibration can be displayed to direct the user to move either away from the cue
(repulsive feedback) or toward the cue (attractive feedback). Research has shown that users do not
exhibit a preference for either repulsive or attractive feedback (59), but they may respond more
quickly to attractive feedback (60). Error correction systems have been incorporated into handheld
devices (61), prosthetic hands (62), and wearable bands (63) in order to guide users toward desired
locations or body positions. Vibration has also been used to correct user error when following
trajectories for upper-arm motions (64) and during surgery (65).
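As an illustration of how such a system can be driven, the Python sketch below maps a two-dimensional position error to a command for a hypothetical four-tactor band (up, down, left, right). The deadband and saturation distances are arbitrary assumed values; real systems tune them per task.

    import numpy as np

    # Axes of a hypothetical four-tactor band: up, down, left, right.
    TACTOR_AXES = np.array([[0, 1], [0, -1], [-1, 0], [1, 0]], dtype=float)

    def error_correction_cue(pos, target, mode="attractive",
                             deadband_m=0.01, saturation_m=0.10):
        """Map deviation from a desired 2-D point to (tactor index, intensity).

        Attractive cues vibrate on the side the user should move toward;
        repulsive cues vibrate on the side of the error. Returns (None, 0.0)
        inside the deadband; intensity saturates at saturation_m."""
        error = np.asarray(target, float) - np.asarray(pos, float)
        dist = float(np.linalg.norm(error))
        if dist < deadband_m:
            return None, 0.0
        cue_dir = error / dist if mode == "attractive" else -error / dist
        tactor = int(np.argmax(TACTOR_AXES @ cue_dir))  # best-aligned tactor
        return tactor, min(1.0, dist / saturation_m)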
Finally, vibration can be used to provide information about events that occur at a distance,
namely during teleoperation. With such systems, the user controls the motion of a robot remotely
but is unable to feel the forces or vibrations experienced by the robot. Researchers have sought
to use vibrations to convey to the user information about the interaction between the robot and
its environment. This information transfer requires sensors on the robot (e.g., a force sensor or
accelerometer) and an actuator on the master controller (66). Systems have been created to provide
cues for remote surgery (67) and for teleoperation of a robot (68, 69).
4.3. Haptic Icons
Haptic designers often rely on vibration as a simple binary cue (on or off) in vibrotactile arrays.
However, vibration signals can be much more expressive and can present a wider range of informa-
tion by altering the amplitude, frequency, rhythm, and envelope of the vibration (70). By altering
these parameters, designers can create distinct haptic icons to relay abstract information, such as
the urgency of a message or the identity of the sender. These haptic icons are especially useful in
applications where providing the same information through visual or auditory means is not feasi-
ble or ideal. However, one downside of haptic icons is that the mapping from the sensation to the
meaning is often abstract, so the user must learn the meaning behind the icon. Recognizing and
interpreting these abstract cues may cause a delay in the user’s response to the cue. Despite this
downside, haptic icons are a promising method for creating a large set of distinguishable haptic
cues that present information to the user. Researchers have studied how to best create distinct
haptic icons based on human tactile perception (71), and tools have been created for designing
and testing distinct icons (72).
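The parameter space described above (amplitude, frequency, rhythm, and envelope) is easy to express as a small synthesis routine. The Python sketch below, including its two example icons, is purely illustrative; practical icon sets are tuned and validated in perception studies (71, 72).

    import numpy as np

    def haptic_icon(freq_hz, amplitude, rhythm, envelope="flat", fs=8000):
        """Synthesize a vibrotactile icon waveform.

        rhythm: list of (on_s, off_s) burst and gap durations in seconds.
        envelope "ramp" gives each burst a soft attack; "flat" a sharp one."""
        chunks = []
        for on_s, off_s in rhythm:
            t = np.arange(0.0, on_s, 1.0 / fs)
            burst = amplitude * np.sin(2.0 * np.pi * freq_hz * t)
            if envelope == "ramp":
                burst = burst * np.linspace(0.0, 1.0, t.size)
            chunks.append(burst)
            chunks.append(np.zeros(int(off_s * fs)))  # silence sets the rhythm
        return np.concatenate(chunks)

    # Hypothetical mapping: a gentle cue for a routine message and a
    # faster, higher-frequency cue for an urgent one.
    routine = haptic_icon(150.0, 0.4, [(0.10, 0.10)] * 2, envelope="ramp")
    urgent = haptic_icon(250.0, 1.0, [(0.05, 0.03)] * 5)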
Vibrations can also be used to display less abstract information, such as emotions. Patterns
of vibrations can be accurately matched to emoticons (73, 74) and emotions from facial expres-
sions (75). Similar to traditional haptic icons, these haptic emoticons are created by modulating
the temporal properties of the vibration signal. They still require training to successfully match
the cue to its intended meaning, but the mapping may be more intuitive because haptics can
affect the emotional state of the user (76).
4.4. Vibrotactile Illusions
In addition to displaying static sensations at discrete locations on the body, multiple vibration
actuators can also display illusory sensations, such as motion. The key to these haptic illusions
is the shape of the signals and the timing and spacing between individual actuators. Apparent
tactile motion, which was first described by Burtt (77), is the sensation that vibration travels in a
continuous motion across the skin. This illusion is created by controlling the length and delay of
actuation between individual actuators in a linear array. It has been successfully implemented in a
wearable navigation system (58) and in both one-dimensional (78, 79) and two-dimensional (80)
holdable devices.
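The timing logic behind apparent tactile motion is compact: each tactor turns on after a fixed stimulus onset asynchrony (SOA) that is shorter than the burst duration, so successive bursts overlap. The Python sketch below uses illustrative values; perceptually optimal SOAs depend on burst duration and body site.

    def apparent_motion_schedule(n_tactors, burst_s=0.12, soa_s=0.06):
        """Return (on, off) times in seconds for each tactor in a linear
        array. Overlapping bursts (soa_s < burst_s) produce the illusion
        of continuous motion rather than a series of discrete taps."""
        assert soa_s < burst_s, "bursts must overlap for smooth motion"
        return [(i * soa_s, i * soa_s + burst_s) for i in range(n_tactors)]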
A second haptic illusion created with vibration is the phantom tactile sensation. In this illusion,
two vibration actuators are activated simultaneously, and the user experiences a phantom vibration
sensation between the two actuators (81). The location of the phantom vibration is dependent on
the relative amplitudes of the two actuators. Israr & Poupyrev (82) combined the apparent motion
and phantom tactile sensation illusions to create smooth, two-dimensional tactile strokes with a
sparse vibrotactile array using an algorithm they called the tactile brush.
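The amplitude dependence of the phantom location can be written as a pan law. The energy-summation form sketched below, in which a parameter beta in [0, 1] slides the phantom from one actuator to the other while the summed vibration energy stays constant, is one common model rather than the only one.

    import math

    def phantom_amplitudes(beta, virtual_amplitude=1.0):
        """Energy-summation pan law for a phantom sensation between two
        tactors: beta = 0 places the phantom at actuator 1, beta = 1 at
        actuator 2. Because a1**2 + a2**2 is constant, perceived intensity
        stays roughly fixed as the phantom moves."""
        a1 = math.sqrt(1.0 - beta) * virtual_amplitude
        a2 = math.sqrt(beta) * virtual_amplitude
        return a1, a2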
Another haptic illusion created with vibration actuators is sensory saltation, or the “cutaneous
rabbit,” which was first presented by Geldard & Sherrick (83). In this illusion, a series of short
vibration pulses is successively presented at three discrete locations on the skin, and the user
feels as though a rabbit were hopping along the skin in a continuous motion. An illusory hop is
felt midway between the actuators. This illusion is quite robust (84) and can be expanded to two
dimensions (85).
4.5. Asymmetric Vibrations
Vibrations have also been successfully used to generate a pulling sensation by creating an asym-
metric vibration profile. These profiles are created by generating a large positive acceleration pulse
followed by a small negative acceleration pulse. Although the net force generated by the system is
zero, the user senses the large pulse more strongly than the small pulse, resulting in the sensation
of being pulled in the direction of the large pulse.
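One period of such a drive signal can be sketched as a brief strong pulse followed by a longer, weaker pulse of opposite sign, with durations chosen so the signal integrates to zero. In the Python sketch below, the 4:1 asymmetry and 40-Hz rate are arbitrary illustrative choices.

    import numpy as np

    def asymmetric_pulse_train(f_hz=40.0, ratio=4.0, fs=8000, duration_s=1.0):
        """Zero-mean asymmetric drive signal for a linear vibration actuator.

        Each period holds +ratio for 1/(1 + ratio) of the period and -1 for
        the remainder, so the mean is exactly zero (no net force), yet the
        strong pulse dominates perception and feels like a directional pull."""
        t = np.arange(0.0, duration_s, 1.0 / fs)
        phase = (t * f_hz) % 1.0
        strong_fraction = 1.0 / (1.0 + ratio)
        signal = np.where(phase < strong_fraction, ratio, -1.0)
        return signal / ratio  # normalize peak amplitude to 1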
The first systems to apply asymmetric vibrations were large mechanical devices that used a
slider–crank mechanism (86, 87) or spring (88) to move a mass with a specified motion profile.
Other systems have used an asymmetrically moving handle (89, 90). More recently, researchers
have turned to using smaller vibration actuators, including a linear resonant actuator (91), voice
coil (92–94), or speaker (95), to generate the asymmetric vibrations (Figure 5). These linear vi-
bration actuators rely on current sent to an electromagnet to move a mass to generate the desired
vibration. The small size of these actuators has allowed the asymmetric vibration principle to be
expanded to systems with multiple degrees-of-direction cues (94). The perception behind this
ungrounded pulling sensation is an area of active study, but researchers have hypothesized that it
is created by asymmetric lateral skin deformation that is sensed by the Meissner corpuscles (93).
Other researchers have proposed that the pulling sensation is induced by vibrations in the ten-
dons (95). Research into the perception of the sensation has indicated that the perceived strength
of the pulling sensation increases with the motion of the hand holding the actuator (96).
4.6. Vibrations for Increased Realism
Vibration has also been explored as a means of adding realism to virtual interactions. Vibration is
naturally produced during many interactions with the physical world, especially those that occur
through a tool. The human sense of touch excels at sensing and interpreting these vibrations
to gather information about interactions and the physical world (97). Vibrations produced when
dragging across a surface encode its roughness (98), and vibrations produced during tapping encode
its hardness (21). However, these high-frequency vibrations are missing during traditional haptic
rendering owing to limitations in the algorithms and devices (21, 99).
Researchers have worked to capture and recreate these high-frequency vibrations to create more
realistic and immersive virtual environments. Complete physics-based simulations to generate
these vibrations are too computationally complex for real-time rendering (100). Therefore, rather
than create models to simulate the vibrations from physics principles, many researchers have instead worked to directly model the vibrations produced from interactions with the physical surfaces in a process known as data-driven modeling (101). Data-driven models seek to capture the output response of a system (e.g., force and acceleration) given user inputs (e.g., position, velocity, and force).

Figure 5
Examples of ungrounded asymmetric vibration systems (each panel plots normalized drive-signal amplitude versus time in milliseconds): (a) a Traxion system that uses a linear resonant actuator driven with a pulse-width modulation (PWM) signal (2:8 duty cycle, F = 100 Hz); (b) a voice coil driven with a PWM signal (7:18 duty cycle, F = 40 Hz); (c) a speaker driven with an inverted sine wave signal (F = 50 Hz); and (d) a voice coil driven with a modified PWM signal (7:18 duty cycle, F = 40 Hz). All signal amplitudes shown are individually normalized. Panel a adapted from Reference 91; © 2013 Association for Computing Machinery, reprinted with permission. Panel b adapted from Reference 92; © 2014 Springer Nature, reprinted with permission. Panel c adapted from Reference 95; © 2016 IEEE, reprinted with permission. Panel d adapted from Reference 93; © 2016 IEEE, reprinted with permission.
Researchers captured and modeled vibrations to simulate the feel of interactions such as cut-
ting (102) and tapping (21). The most promising and widespread application of haptic data-driven
modeling has been to capture the feel of a textured surface. The simplest and most direct approach
has been to play back the recorded vibrations directly (103, 104). Direct playback of signals, how-
ever, often fails to fully capture the complexity of the real interaction. Following the principle of
distal attribution, for the virtual interaction to realistically mimic the real interaction, the rendered
signals must behave in a physically appropriate manner and must match the motions made by the
user (105). Therefore, the power and frequency content of the rendered signals must change if the
user’s force or speed changes in the virtual environment. This behavior of the rendered signals
requires that more complex models of the recorded data be created before rendering.
In one of the first examples, Okamura et al. (20) modeled the vibrations from patterned textures
as data-driven decaying sinusoids that depend on the user’s speed and applied force. Similarly,
Guruswamy et al. (106) created texture models based on a spatial distribution of infinite-impulse-
response filters that are fit with decaying sinusoids. Researchers have also created haptic tex-
ture models using autoregressive models that depended on the user’s normal force and scanning
speed (107, 108).
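As a sketch of this approach, the autoregressive (AR) loop below shapes white noise with a filter fit to a recorded texture's spectrum while scaling the excitation by the user's instantaneous force and speed, so the output varies with motion as distal attribution demands. The coefficients and gain are placeholders for values identified from recorded tool-surface accelerations.

    import numpy as np

    def texture_sample(history, ar_coeffs, force_n, speed_mps, gain=0.5):
        """Generate one sample of synthetic texture vibration.

        history holds the last len(ar_coeffs) outputs, most recent first.
        The AR filter imposes the recorded spectral shape; the noise
        excitation grows with user force and scanning speed."""
        excitation = np.random.randn() * gain * force_n * speed_mps
        sample = excitation - float(np.dot(ar_coeffs, history))
        history[1:] = history[:-1]  # shift history, discard the oldest
        history[0] = sample
        return sample

    # Hypothetical use at a 1-kHz haptic loop rate:
    coeffs = np.array([-1.2, 0.5])  # placeholder fitted AR coefficients
    hist = np.zeros_like(coeffs)
    out = [texture_sample(hist, coeffs, force_n=1.5, speed_mps=0.1)
           for _ in range(1000)]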
Traditional kinesthetic haptic devices do not have a high enough bandwidth to accurately recre-
ate the high-frequency vibrations needed for realistic texture modeling. Therefore, the majority of
researchers opt to use a dedicated vibration actuator, such as a voice coil, that is capable of accurate
and high-bandwidth vibration output. Some researchers who focus on the feel of textures with a
bare finger instead of through a tool opt to render the texture vibrations with a variable-friction
surface display (109).
4.7. Midair Vibration
The above-mentioned applications of haptics all require that the actuators directly contact the user
to transfer the mechanical signals to the skin. However, there has been recent interest in exploring
midair haptics, i.e., displaying haptic sensations to the user as they explore a virtual object in free
space without contacting the actuators. Focused ultrasound beams can be used to create a localized
sense of pressure on the fingertip in midair. An array of ultrasound transducers generates a focal
point by individually controlling the phase and intensity of each transducer in a phenomenon
known as acoustic radiation pressure. The radiation pressure is modulated at 1–1,000 Hz, so users
often report experiencing a vibratory-like stimulus in addition to the localized pressure (110). This
property of the signal has been exploited to provide midair localized vibration to the body (111).
The same principles have also been used to create floating screens with touchable icons (112),
multipoint haptic feedback (113), and three-dimensional shapes (114).
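At the heart of the focusing step is a per-transducer phase that compensates each element's path length to the focal point so that all wavefronts arrive in phase. The Python sketch below assumes a planar array and a 40-kHz carrier, which are typical but not required; amplitude-modulating the focal point within the perceivable 1–1,000-Hz range then makes it feel vibratory.

    import numpy as np

    def focal_phases(transducer_xy, focus_xyz, f_carrier_hz=40e3, c_mps=343.0):
        """Phase lead for each element of a flat transducer array (plane
        z = 0) so that emissions arrive at focus_xyz in phase, creating a
        focal point of acoustic radiation pressure."""
        xy = np.asarray(transducer_xy, dtype=float)     # shape (N, 2)
        xyz = np.column_stack([xy, np.zeros(len(xy))])  # lift to 3-D
        dists = np.linalg.norm(xyz - np.asarray(focus_xyz, float), axis=1)
        wavelength = c_mps / f_carrier_hz
        return (2.0 * np.pi * dists / wavelength) % (2.0 * np.pi)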
5. HAPTIC SURFACES
Another perspective on the creation of haptic experiences is that they should be real-world sur-
faces that can be actively explored by a human, enabling simultaneous kinesthetic and cutaneous
feedback. Such haptic surfaces would ideally feel just like natural objects in the environment,
except that they can arbitrarily change shape, mechanical properties, and surface texture. Owing
to limitations in scale, dimensionality, size, and weight of sensors and actuators, this ideal cannot
be perfectly achieved. However, through a combination of clever use of material properties and
control coupled with sensory illusions, several approaches are maturing and emerging to generate
highly compelling surface displays.
5.1. Pin Arrays
The traditional method of recreating a surface has been to control the vertical displacement of
pins laid out in a two-dimensional array. Because the actuators are linear, this rendering is often limited to 2.5-dimensional shapes: each "pixel" in the tactile image is a physical pin attached to a linear actuator that moves up and down. A variety of
technological approaches have been applied for actuation: DC motors with lead screws, rotational
servos, pneumatic actuators, and shape-memory alloys. For the majority of these displays, the
large number of actuators means that shape control primarily involves downsampling the desired
shape to the resolution of the display and positioning each element accordingly. The SmartMesh
multiloop mechanism (115) is based on extendable links arranged in a double-layer square grid,
and the formable object (116) uses a parallel rigid-body structure with kinematics optimized to
render basic shapes like cylinders and spheres. For a bed-of-pins display consisting of hydraulic
actuators arranged along the rows and columns of an array, the resolution of the interface can
more feasibly be increased because the number of actuators scales linearly rather than polynomially
with the size of the array (117). Another way of achieving perceived vertical displacement with
simpler actuation is to laterally bend a pin using a piezoelectric bimorph (118). Although each
pin shears (as opposed to normally displacing) the skin, the created strain field can feel like a
normal displacement. Interaction techniques and applications have been examined with large shape
displays (e.g., Figure 6), with a focus on supporting remote collaboration (119–121). However, shape displays using the bed-of-pins approach remain limited by their 2.5-dimensional nature and the fact that they are often large, table-scale devices owing to the size of their actuators.

Figure 6
Large-scale pin array. This device can be used in conjunction with projection to create visuo-haptic environments, teleoperated via three-dimensional measurements of human movement, and used as a distributed manipulator for objects placed on the surface. The pins are driven by a large array of underlying motors and linkages. Figure adapted from Reference 119; © 2013 Association for Computing Machinery, reprinted with permission.
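The downsampling step described above for pin arrays reduces to block-averaging a desired heightmap over each pin's footprint and clipping to the actuator's travel, as in the Python sketch below. The travel limit is an arbitrary example, and the heightmap is assumed to be sampled at least as finely as the pin grid.

    import numpy as np

    def heightmap_to_pin_commands(heightmap, n_rows, n_cols, max_travel_m=0.05):
        """Average the desired surface over each pin's footprint and clip to
        the actuator travel, yielding one position setpoint per pin."""
        h, w = heightmap.shape
        commands = np.empty((n_rows, n_cols))
        for r in range(n_rows):
            for c in range(n_cols):
                patch = heightmap[r * h // n_rows:(r + 1) * h // n_rows,
                                  c * w // n_cols:(c + 1) * w // n_cols]
                commands[r, c] = patch.mean()
        return np.clip(commands, 0.0, max_travel_m)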
5.2. Deformable Crust Devices
Rather than filling the volume of a shape with the body of a pin, one can control the shape
of a surface directly. This is the idea behind deformable crust topologies (122). A particularly
promising approach for creating deformable crusts is through an actuation approach often used in
soft robotics: pneumatics. For example, pneumatic soft composite actuators can undergo complex
shape changes with a single DoF (123). Particle jamming can be used to control the stiffness of
segments of a robot to lock segments, allowing for locomotion or shape changes (124, 125). Particle
jamming and pneumatic composites can be used for new human–computer interfaces (126, 127).
A combination of particle jamming cells on a nominally flat surface with air pressure applied from
below can create a surface with controllable and distributed geometry and mechanical properties
(128) (Figure 7).
5.3. Variable-Friction Surfaces
Given the shape of a surface, how do we control what it feels like? In the pin array and deformable
crust examples, the stiffness of the surface can be directly controlled. However, texture and friction
are also important to display realistic surfaces. This has been accomplished in recent years by
modulation of surface friction, which effectively displays changing shear force as the skin (usually
the fingertip) slides over a surface (Figure 8a). These displays change the friction between a
human finger and a plate by vibrating the plate at high (ultrasonic) frequencies (131, 132). When
the plate vibrates, the friction between the finger and the plate is drastically lower than it is when
the plate is static. The vibration frequencies (dozens of kilohertz) are high enough that they cannot
be heard or felt directly by a human. Measuring the position and speed of a finger touching the surface enables the friction to be modulated in order to display walls, textures, etc., on the surface.

Figure 7
Deformable crust using pneumatics and particle jamming. The pressure underneath the crust can raise and lower a cell, and the vacuum inside a cell filled with granular material (in this case, coffee grounds) changes the hardness of the cell. Multiple cells can be combined in an array with reels and solenoids to control the outlay of string pulling down on the crust at points between the cells. This results in a surface with controllable shape and mechanical properties. Figure adapted from Reference 128; © 2015 IEEE, reprinted with permission.
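A minimal version of the friction-rendering loop described in the text above maps the measured finger position to a friction setpoint. The Python sketch below renders a spatial grating of alternating low- and high-friction bands, which is felt as ridges during sliding; the 2-mm period and 50% duty cycle are arbitrary assumed values.

    def friction_setpoint(finger_x_m, period_m=0.002, low_friction_duty=0.5):
        """Map finger position to a normalized ultrasonic vibration amplitude:
        1.0 drives the plate (low friction), 0.0 leaves it still (high
        friction). Alternating bands read as a ridged texture when stroked."""
        phase = (finger_x_m / period_m) % 1.0
        return 1.0 if phase < low_friction_duty else 0.0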
Another way to modulate friction uses changing electrostatic forces (109, 130) (Figure 8b).
Here, the normal force between the finger and surface depends on the attraction of electrostatic
force, which is not strong enough to feel in the direction perpendicular to the surface but changes
the effective friction enough to display similar walls and textures. The effect is not typically as strong
as with vibrations, but electrostatics has the advantage that it uses no moving parts. Inherent in
these approaches is that the surface must be actively explored by the finger(s), and the surface
can dissipate energy only by resisting fingertip movement. There is a way to actively push on
the finger, akin to what a traditional kinesthetic haptic device does: An actuator moves the plate
laterally under the finger while friction is high, imparting a force to the finger. Since the lateral
motion cannot go on forever, the system can be reset by lowering the friction using either the
ultrasonic vibration or electrostatic technique and relocating the plate to its original position,
ready to display to the finger again (133). In the future, this effect might be achieved passively
with clever microstructure design of materials.
Figure 8
Variable-friction surfaces. (a) The open-source Tactile Pattern Display (TPaD) tablet. This tablet operates by vibrating the glass surface ultrasonically, which changes the friction between the finger and the surface and thus changes the shear force felt as a finger slides across the surface (as described in Reference 129). (b) Electrostatic vibration. This approach changes the friction by using electrostatic force to attract the finger to the surface. Although the forces are too small to feel directly, they change the friction felt by the user as the finger slides over the surface. Panel a provided by Joe Mullenbach. Panel b adapted from Reference 130; © 2010 Association for Computing Machinery, reprinted with permission.
Figure 9
Virtual reality scenarios. (a) A grasp-and-lift task with combined tactile and visual information to display virtual mass. (b) Haptic redirection. This technique can invoke tactile interaction with static walls in a virtual environment in conjunction with a warped graphical display to give the illusion of straight-line movement even when the actual movement is curved. Panel a reproduced from Reference 50; © 2017 Association for Computing Machinery, reprinted with permission. Panel b reproduced from Reference 134.
6. INTEGRATION WITH VIRTUAL AND AUGMENTED REALITY
Head-mounted virtual-reality displays, with associated movement tracking and the development of
creative virtual-reality content, have recently led to significant interest in using haptics to enhance
the quality and accessibility of virtual experiences. Indeed, the lack of realistic haptic feedback is a
significant barrier to immersivity during object contact and manipulation in virtual reality—it is
terribly disappointing to reach toward a highly compelling (graphically) rendered virtual object,
see your fingers close around it, and feel ... nothing. The spell of virtual reality is broken. Here,
we explore the haptics-related approaches that have been successfully used or are in development
to enhance interaction with virtual worlds (Figure 9).
6.1. Encountered-Type Haptic Devices
An encountered-type haptic device produces physical environments that users explore directly
with their entire hands, rather than requiring them to hold an intermediary device through
which they receive haptic feedback (135). The active surfaces described in the preceding section
fall into this category, but there is much work to be done to seamlessly integrate encountered-
type haptic devices with virtual reality. One of the main reasons to use an encountered-type haptic
device with virtual reality is that the human user will see the graphically rendered environment,
not the haptic device. This means that the haptic device does not need to have any realistic physical
appearance and is required to feel right only at the point(s) of contact. However, this presents a
significant sensing and control challenge: predicting where the user will want to touch the virtual
environment as the hand approaches an object, such that the haptic device can position and shape
itself as needed in order to provide the desired haptic experience.
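One simple way to frame this prediction problem is to treat the tracked hand's velocity as a ray and intersect it with the virtual geometry, commanding the device toward the predicted contact point before the hand arrives. The sketch below does this for a single virtual plane; the tracker and robot interfaces are hypothetical, and a practical system would add filtering and a richer model of user intent.

import numpy as np

PLANE_POINT = np.array([0.0, 0.0, 0.3])   # a point on the virtual surface
PLANE_NORMAL = np.array([0.0, 0.0, 1.0])  # its outward unit normal

def predicted_contact(hand_pos, hand_vel):
    # Intersect the hand's velocity ray with the virtual plane; return
    # the predicted contact point, or None if the hand is not currently
    # moving toward the surface.
    approach = PLANE_NORMAL @ hand_vel
    if approach >= -1e-6:                 # moving away or parallel
        return None
    t = (PLANE_NORMAL @ (PLANE_POINT - hand_pos)) / approach
    return hand_pos + t * hand_vel if t > 0 else None

# Each tracking update (interfaces are hypothetical):
#   pos, vel = tracker.hand_state()
#   target = predicted_contact(pos, vel)
#   if target is not None:
#       device.move_end_effector(target)  # pre-position the surface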
6.2. Pseudohaptics
Visuo-haptic illusions exploit haptic illusions (136) and the overall dominance of the visual
system (137, 138) to create the impression of haptic feedback using passive props and visual cues (139). This
pseudohaptic feedback can be used to render the perception of friction, stiffness, size, and weight.
Researchers have also sought to combine visuo-haptic illusions with active haptic feedback
systems. Pseudohaptic visual feedback has been shown to augment active cutaneous haptic
feedback, increasing the stiffness that can be rendered (140). Spatial manipulation with single-point,
underactuated kinesthetic haptic devices has been augmented with visuo-haptic feedback to
increase the perception of rotation alignment (141). Ban et al. (142) have investigated actively
changing a single physical ridge on a passive shape to influence haptic perception of bumps or
other features on a surface. However, this space has not been explored fully, especially with regard
to integration with encountered-type shape-changing interfaces, which can allow for many types
of haptic exploratory procedures.
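As a concrete example of the mechanism behind several of these results, pseudohaptic stiffness is often rendered by manipulating the control/display (C/D) ratio: for the same physical finger displacement into a passive prop, the virtual object is shown deforming less, which users tend to perceive as greater stiffness. The gain law below is an illustrative assumption, not a formula taken from the works cited.

def displayed_penetration(real_penetration_m, virtual_stiffness,
                          reference_stiffness=200.0):
    # The ratio reference_stiffness / virtual_stiffness acts as the C/D
    # gain: rendering a stiffer object shows proportionally less visual
    # penetration for the same physical displacement.
    return real_penetration_m * (reference_stiffness / virtual_stiffness)

# Pressing 10 mm into the prop while rendering a 400-N/m object shows
# only 5 mm of visual penetration, biasing perception toward "stiffer."
print(displayed_penetration(0.010, 400.0))   # -> 0.005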
6.3. Haptic Retargeting
Haptic retargeting combines the concepts of encountered-type haptic devices and pseudohaptics
described above. Moving the visual representation of the hand in the virtual environment, even
if the felt environment is flat, can imply curvature (143). This same effect can be used to display
angular information (144). Taking advantage of visual dominance in proprioception enables the
use of visual stimuli to exaggerate the angular displacement of the head, which can be used for
redirected walking (Figure 9b). More recently, this effect coupled with spatial warping of the
visually perceived hand location has been used for redirected touching (145) and grasping of
shape primitives (146).
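A minimal sketch of the body-warping form of this idea, in the spirit of Reference 146, appears below: the rendered hand is offset by a fraction of the prop-to-target mismatch that grows with reach progress, so the virtual hand arrives at the virtual target exactly when the real hand reaches the physical prop. The linear blend is an illustrative assumption; published systems use more carefully tuned warping functions.

import numpy as np

def warped_hand(real_hand, reach_start, real_prop, virtual_target):
    # Position at which to render the virtual hand during a reach. The
    # warp is zero at the start of the reach and equals the full
    # prop-to-target offset at contact, keeping each frame's shift small
    # enough to go unnoticed.
    total = np.linalg.norm(real_prop - reach_start)
    traveled = np.linalg.norm(real_hand - reach_start)
    alpha = min(max(traveled / max(total, 1e-9), 0.0), 1.0)
    offset = virtual_target - real_prop  # mismatch hidden from the user
    return real_hand + alpha * offset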
7. CONCLUSION
As the role of technology in communication, training, and entertainment increases, so does the
number of potential applications of haptics. Haptics has already shown great promise in mobile
communication and gaming, although the expressiveness of many commercial haptic devices has
been limited. The potential impact of haptics is greatest in areas where the sense of touch is critical
to the task or situation (e.g., remote surgery and prosthetic limbs), where touch can replace visual or
auditory cues (e.g., mobile communication and navigation), and where touch can enhance virtual
interactions (e.g., virtual reality and gaming). One of the largest factors limiting the impact of the
field of haptics has been the availability and expressiveness of hardware. Commercially available
actuators designed exclusively for haptic output are limited, which leads many researchers to
develop their own haptic devices using off-the-shelf components. The key to developing functional
and effective haptic hardware is to focus on the perceptual capabilities of the human sense of touch
during the design process. This human-centered design paradigm results in haptic hardware that
more effectively stimulates the mechanoreceptors to display the desired sensations.
FUTURE ISSUES
1. How will we identify and understand the perceptual basis of new haptic illusions to
facilitate the design of haptic systems with minimal sensing and actuation? Two key
challenges will be identifying haptic illusions that are robust across users and determining
the ideal actuation parameters for creating the strongest illusion.
2. Can we enable consumer haptic devices by decreasing cost, size and weight, and power
requirements, potentially via the use of novel actuators and smart materials? The chal-
lenges here will be reducing the size of haptic actuators while maintaining bandwidth,
force output, range of motion, and important degrees of freedom; powering haptic ac-
tuators in mobile and wearable devices; and enabling wireless communication with and
control of haptic devices.
3. Is it possible to create predictive models of human perception and cognition surround-
ing touch feedback, in order to decrease reliance on exhaustive human user studies? Past
psychophysics studies have focused on human touch perception at a small number of
locations on the body, including the hand, so it will be useful to expand our existing
knowledge of touch perception to cover other body locations. In addition, there is cur-
rently no standardized metric for evaluating the output characteristics of haptic devices.
DISCLOSURE STATEMENT
The authors are not aware of any affiliations, memberships, funding, or financial holdings that
might be perceived as affecting the objectivity of this review.
ACKNOWLEDGMENTS
We would like to acknowledge that there are numerous important works in haptics that could not
be cited here owing to space limitations. Ideas discussed in this review were formed in large part via
interesting discussions with researchers in the haptics field, including Katherine Kuchenbecker,
Karon MacLean, William Provancher, Sean Follmer, J. Edward Colgate, Andrew Stanley, and
Jacob Suchoski. We thank the following sponsors for supporting our haptics research: the National
Science Foundation, the National Institutes of Health, Oculus Research, and Ford.
LITERATURE CITED
1. Chu V, McMahon I, Riano L, McDonald CG, He Q, et al. 2015. Robotic learning of haptic adjectives
through physical interaction. Robot. Auton. Syst. 63:279–92
2. Flesher SN, Collinger JL, Foldes ST, Weiss JM, Downey JE, et al. 2016. Intracortical microstimulation
of human somatosensory cortex. Sci. Transl. Med. 8:361ra141
3. Johansson RS, Flanagan JR. 2009. Coding and use of tactile signals from the fingertips in object manip-
ulation tasks. Nat. Rev. Neurosci. 10:345–59
4. Johnson KO, Yoshioka T, Vega-Bermudez F. 2000. Tactile functions of mechanoreceptive afferents
innervating the hand. J. Clin. Neurophysiol. 17:539–58
5. Bolanowski SJ, Gescheider GA, Verrillo RT. 1994. Hairy skin: psychophysical channels and their phys-
iological substrates. Somatosens. Motor Res. 11:279–90
6. Massie TH, Salisbury JK. 1994. The phantom haptic interface: a device for probing virtual objects. In
Proceedings of the ASME Dynamic Systems and Control Division, Vol. 55, pp. 295–300. New York: IEEE
7. Colgate JE, Brown JM. 1994. Factors affecting the Z-width of a haptic display. In Proceedings of the 1994
IEEE International Conference on Robotics and Automation, pp. 3205–10. New York: IEEE
8. Peshkin MA, Colgate JE, Wannasuphoprasit W, Moore CA, Gillespie RB, Akella P. 2001. Cobot archi-
tecture. IEEE Trans. Robot. Autom. 17:377–90
9. Chan S, Conti F, Blevins NH, Salisbury K. 2011. Constraint-based six degree-of-freedom haptic ren-
dering of volume-embedded isosurfaces. In 2011 IEEE World Haptics Conference, pp. 89–94. New York:
IEEE
10. Walker JM, Colonnese N, Okamura AM. 2016. Noise, but not uncoupled stability, reduces realism and
likeability of bilateral teleoperation. IEEE Robot. Autom. Lett. 1:562–69
11. Orta Martinez M, Morimoto TK, Taylor AT, Barron AC, Pultorak JDA, et al. 2016. 3-D printed haptic
devices for educational applications. In 2016 IEEE Haptics Symposium, pp. 126–33. New York: IEEE
12. Minogue J, Jones MG. 2006. Haptics in education: exploring an untapped sensory modality. Rev. Educ.
Res. 76:317–48
13. Zhang J, Fiers P, Witte KA, Jackson RW, Poggensee KL, et al. 2017. Human-in-the-loop optimization
of exoskeleton assistance during walking. Science 356:1280–84
14. Stetten G, Wu B, Klatzky R, Galeotti J, Siegel M, et al. 2011. Hand-held force magnifier for surgical
instruments. In Information Processing in Computer-Assisted Interventions: IPCAI 2011, ed. RH Taylor,
GZ Yang, pp. 90–100. Berlin: Springer
15. Polygerinos P, Wang Z, Galloway KC, Wood RJ, Walsh CJ. 2015. Soft robotic glove for combined
assistance and at-home rehabilitation. Robot. Auton. Syst. 73:135–43
16. Wehner M, Quinlivan B, Aubin PM, Martinez-Villalpando E, Baumann M, et al. 2013. A lightweight soft
exosuit for gait assistance. In 2013 IEEE International Conference on Robotics and Automation, pp. 3362–69.
New York: IEEE
17. Pacchierotti C, Sinclair S, Solazzi M, Frisoli A, Hayward V, Prattichizzo D. 2017. Wearable haptic
systems for the fingertip and the hand: taxonomy, review, and perspectives. IEEE Trans. Haptics 10:580–
600
18. Diolaiti N, Niemeyer G, Barbagli F, Salisbury J. 2006. Stability of haptic rendering: discretization,
quantization, time delay, and coulomb effects. IEEE Trans. Robot. 22:256–68
19. Srinivasan MA, Beauregard GL, Brock DL. 1996. The impact of visual information on the haptic percep-
tion of stiffness in virtual environments. In Proceedings of the ASME Dynamic Systems and Control Division,
Vol. 58, pp. 555–59. New York: ASME
20. Okamura AM, Dennerlein JT, Howe RD. 1998. Vibration feedback models for virtual environments. In
1998 IEEE International Conference on Robotics and Automation, pp. 674–79. New York: IEEE
21. Kuchenbecker KJ, Fiene J, Niemeyer G. 2006. Improving contact realism through event-based haptic
feedback. IEEE Trans. Vis. Comput. Graph. 12:219–30
22. Hachisu T, Sato M, Fukushima S, Kajimoto H. 2012. Augmentation of material property by modulating
vibration resulting from tapping. In Haptics: Perception, Devices, Mobility, and Communication: EuroHaptics
2012, ed. P Isokoski, J Springare, pp. 173–80. Berlin: Springer
23. Prattichizzo D, Pacchierotti C, Rosati G. 2012. Cutaneous force feedback as a sensory subtraction
technique in haptics. IEEE Trans. Haptics 5:289–300
24. Biggs J, Srinivasan M. 2002. Tangential versus normal displacements of skin: relative effectiveness for
producing tactile sensations. In Proceedings of the 10th Symposium on Haptic Interfaces for Virtual Environ-
ment and Teleoperator Systems, pp. 121–28. New York: IEEE
25. Drewing K, Fritschi M, Zopf R, Ernst MO, Buss M. 2005. First evaluation of a novel tactile display
exerting shear force via lateral displacement. ACM Trans. Appl. Percept. 2:118–31
26. Gleeson B, Horschel S, Provancher W. 2010. Perception of direction for applied tangential skin dis-
placement: effects of speed, displacement, and repetition. IEEE Trans. Haptics 3:177–88
27. Webster RJ, Murphy TE, Verner LN, Okamura AM. 2005. A novel two-dimensional tactile slip display:
design, kinematics and perceptual experiment. ACM Trans. Appl. Percept. 2:150–65
28. Guzererler A, Provancher WR, Basdogan C. 2016. Perception of skin stretch applied to palm: effects
of speed and displacement. In Haptics: Perception, Devices, Control, and Applications: EuroHaptics 2016, ed.
F Bello, H Kajimoto, Y Visell, pp. 180–89. Berlin: Springer
29. Bark K, Wheeler J, Shull P, Savall J, Cutkosky M. 2010. Rotational skin stretch feedback: a wearable
haptic display for motion. IEEE Trans. Haptics 3:166–76
30. Wheeler J, Bark K, Savall J, Cutkosky M. 2010. Investigation of rotational skin stretch for proprioceptive
feedback with application to myoelectric systems. IEEE Trans. Neural Syst. Rehabil. Eng. 18:58–66
31. Guinan AL, Hornbaker NC, Montandon MN, Doxon AJ, Provancher WR. 2013. Back-to-back skin
stretch feedback for communicating five degree-of-freedom direction cues. In 2013 World Haptics Con-
ference (WHC), pp. 13–18. New York: IEEE
32. Provancher WR, Sylvester ND. 2009. Fingerpad skin stretch increases the perception of virtual friction.
IEEE Trans. Haptics 2:212–23
33. Quek ZF, Schorr SB, Nisky I, Okamura AM, Provancher WR. 2014. Augmentation of stiffness perception
with a 1-degree-of-freedom skin stretch device. IEEE Trans. Hum.-Mach. Syst. 44:731–42
34. Quek ZF, Schorr SB, Nisky I, Okamura AM, Provancher WR. 2014. Sensory substitution using 3-
degree-of-freedom tangential and normal skin deformation feedback. In 2014 IEEE Haptics Symposium,
pp. 27–33. New York: IEEE
35. Quek ZF, Schorr SB, Nisky I, Provancher WR, Okamura AM. 2015. Sensory substitution of force
and torque using 6-DoF tangential and normal skin deformation feedback. In 2015 IEEE International
Conference on Robotics and Automation (ICRA), pp. 264–71. New York: IEEE
36. Girard A, Marchal M, Gosselin F, Chabrier A, Louveau F, Lécuyer A. 2016. HapTip: displaying haptic
shear forces at the fingertips for multi-finger interaction in virtual environments. Front. ICT 3:6
37. Quek ZF, Schorr SB, Nisky I, Provancher WR, Okamura AM. 2015. Sensory substitution and augmen-
tation using 3-degree-of-freedom skin deformation feedback. IEEE Trans. Haptics 8:209–21
38. Schorr SB, Quek ZF, Romano RY, Nisky I, Provancher WR, Okamura AM. 2013. Sensory substitution
via cutaneous skin stretch feedback. In 2013 IEEE International Conference on Robotics and Automation,
pp. 2341–46. New York: IEEE
39. Schorr SB, Quek ZF, Nisky I, Provancher W, Okamura AM. 2015. Tactor-induced skin stretch as a
sensory substitution method in teleoperated palpation. IEEE Trans. Hum.-Mach. Syst. 45:714–26
40. Solazzi M, Frisoli A, Bergamasco M. 2010. Design of a novel finger haptic interface for contact and
orientation display. In 2010 IEEE Haptics Symposium, pp. 129–32. New York: IEEE
41. Pacchierotti C, Tirmizi A, Prattichizzo D. 2014. Improving transparency in teleoperation by means of
cutaneous tactile force feedback. ACM Trans. Appl. Percept. 11:4
42. Leonardis D, Solazzi M, Bortone I, Frisoli A. 2015. A wearable fingertip haptic device with 3 DoF
asymmetric 3-RSR kinematics. In 2015 IEEE World Haptics Conference, pp. 388–93. New York: IEEE
43. Schorr SB, Okamura AM. 2017. Three-dimensional skin deformation as force substitution: wearable
device design and performance during haptic exploration of virtual environments. IEEE Trans. Haptics
10:418–30
44. Brown JD, Ibrahim M, Chase EDZ, Pacchierotti C, Kuchenbecker KJ. 2016. Data-driven comparison
of four cutaneous displays for pinching palpation in robotic surgery. In 2016 IEEE Haptics Symposium,
pp. 147–54. New York: IEEE
45. Perez AG, Lobo D, Chinello F, Cirio G, Malvezzi M, et al. 2015. Soft finger tactile rendering for
wearable haptics. In 2015 IEEE World Haptics Conference, pp. 327–32. New York: IEEE
46. Prattichizzo D, Chinello F, Pacchierotti C, Malvezzi M. 2013. Towards wearability in fingertip haptics:
a 3-DoF wearable device for cutaneous force feedback. IEEE Trans. Haptics 6:506–16
47. Pacchierotti C, Meli L, Chinello F, Malvezzi M, Prattichizzo D. 2015. Cutaneous haptic feedback to
ensure the stability of robotic teleoperation systems. Int. J. Robot. Res. 34:1773–87
48. Pacchierotti C, Prattichizzo D, Kuchenbecker KJ. 2016. Cutaneous feedback of fingertip deformation
and vibration for palpation in robotic surgery. IEEE Trans. Biomed. Eng. 63:278–87
49. Tsetserukou D, Hosokawa S, Terashima K. 2014. LinkTouch: a wearable haptic device with five-bar
linkage mechanism for presentation of two-DOF force feedback at the fingerpad. In 2014 IEEE Haptics
Symposium, pp. 307–12. New York: IEEE
50. Schorr SB, Okamura AM. 2017. Fingertip tactile devices for virtual object manipulation and exploration.
In Proceedings of the 2017 ACM CHI Conference on Human Factors in Computing Systems, pp. 3115–19. New
York: ACM
51. Sofia KO, Jones L. 2013. Mechanical and psychophysical studies of surface wave propagation during
vibrotactile stimulation. IEEE Trans. Haptics 6:320–29
52. Bell J, Bolanowski S, Holmes MH. 1994. The structure and function of Pacinian corpuscles: a review.
Prog. Neurobiol. 42:79–128
53. Ackerley R, Carlsson I, Wester H, Olausson H, Wasling HB. 2014. Touch perceptions across skin sites:
differences between sensitivity, direction discrimination and pleasantness. Front. Behav. Neurosci. 8:54
54. Meier A, Matthies DJ, Urban B, Wettach R. 2015. Exploring vibrotactile feedback on the body and foot
for the purpose of pedestrian navigation. In Proceedings of the 2nd International Workshop on Sensor-Based
Activity Recognition and Interaction, art. 11. New York: ACM
55. Zelek JS, Bromley S, Asmar D, Thompson D. 2003. A haptic glove as a tactile-vision sensory substitution
for wayfinding. J. Vis. Impair. Blind. 97:621–32
56. Paneels S, Anastassova M, Strachan S, Van SP, Sivacoumarane S, Bolzmacher C. 2013. What’s around
me? Multi-actuator haptic feedback on the wrist. In 2013 IEEE World Haptics Conference, pp. 407–12.
New York: IEEE
57. Elliott LR, van Erp J, Redden ES, Duistermaat M. 2010. Field-based validation of a tactile navigation
device. IEEE Trans. Haptics 3:78–87
58. Jones LA, Lockyer B, Piateski E. 2006. Tactile display and vibrotactile pattern recognition on the torso.
Adv. Robot. 20:1359–74
59. Bark K, Khanna P, Irwin R, Kapur P, Jax SA, et al. 2011. Lessons in using vibrotactile feedback to guide
fast arm motions. In 2011 IEEE World Haptics Conference, pp. 355–60. New York: IEEE
60. Jansen C, Oving A, van Veen HJ. 2004. Vibrotactile movement initiation. In Proceedings of the EuroHaptics
International Conference (EuroHaptics ’04), pp. 110–17. Berlin: Springer
61. Culbertson H, Walker JM, Raitor M, Okamura AM, Stolka PJ. 2016. Plane assist: the influence of haptics
on ultrasound-based needle guidance. In Medical Image Computing and Computer-Assisted Intervention:
MICCAI 2016, ed. S Ourselin, L Joskowicz, M Sabuncu, G Unal, W Wells, pp. 370–77. Cham, Switz.:
Springer
62. Christiansen R, Contreras-Vidal JL, Gillespie RB, Shewokis PA, O’Malley MK. 2013. Vibrotactile
feedback of pose error enhances myoelectric control of a prosthetic hand. In 2013 IEEE World Haptics
Conference, pp. 531–36. New York: IEEE
63. Rotella MF, Guerin K, He X, Okamura AM. 2012. HAPI Bands: a haptic augmented posture interface.
In 2012 IEEE Haptics Symposium, pp. 163–70. New York: IEEE
64. Bark K, Hyman E, Tan F, Cha E, Jax SA, et al. 2015. Effects of vibrotactile feedback on human learning
of arm motions. IEEE Trans. Neural Syst. Rehabil. Eng. 23:51–63
65. Bluteau J, Dubois MD, Coquillart S, Gentaz E, Payan Y. 2010. Vibrotactile guidance for trajectory
following in computer aided surgery. In 2010 Annual International Conference of the IEEE Engineering in
Medicine and Biology, pp. 2085–88. New York: IEEE
66. Kontarinis DA, Howe RD. 1995. Tactile display of vibratory information in teleoperation and virtual
environments. Presence Teleoper. Virtual Environ. 4:387–402
67. McMahan W, Gewirtz J, Standish D, Martin P, Kunkel J, et al. 2011. Tool contact acceleration feedback
for telerobotic surgery. IEEE Trans. Haptics 4:210–20
68. Dennerlein JT, Millman PA, Howe RD. 1997. Vibrotactile feedback for industrial telemanipulators. In
Sixth Annual Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp. 189–95.
New York: ASME
69. Sibert J, Cooper J, Covington C, Stefanovski A, Thompson D, Lindeman RW. 2006. Vibrotactile feed-
back for enhanced control of urban search and rescue robots. In Proceedings of the 2006 IEEE International
Workshop on Safety, Security and Rescue Robotics. New York: IEEE
70. Brewster S, Brown LM. 2004. Tactons: structured tactile messages for non-visual information display.
In Proceedings of the Fifth Conference on Australasian User Interface, pp. 15–23. New York: ACM
71. Azadi M, Jones LA. 2014. Evaluating vibrotactile dimensions for the design of tactons. IEEE Trans.
Haptics 7:14–23
72. Schneider OS, MacLean KE. 2016. Studying design process and example use with macaron, a web-based
vibrotactile effect editor. In 2016 IEEE Haptics Symposium, pp. 52–58. New York: IEEE
73. Rovers L, van Essen HA. 2004. Design and evaluation of hapticons for enriched instant messaging.
Virtual Reality 9:177–91
74. Mathew D. 2005. vSmileys: imaging emotions through vibration patterns. In Alternative Access: Feeling
and Games 2005, pp. 75–80. Tampere, Finl.: Univ. Tampere
75. Krishna S, Bala S, McDaniel T, McGuire S, Panchanathan S. 2010. VibroGlove: an assistive technology
aid for conveying facial expressions. In CHI ’10: Extended Abstracts on Human Factors in Computing Systems,
pp. 3637–42. New York: ACM
76. Eid MA, Al Osman H. 2016. Affective haptics: current research and future directions. IEEE Access 4:26–40
77. Burtt HE. 1917. Tactual illusions of movement. J. Exp. Psychol. 2:371–85
78. Kang J, Lee J, Kim H, Cho K, Wang S, Ryu J. 2012. Smooth vibrotactile flow generation using two
piezoelectric actuators. IEEE Trans. Haptics 5:21–32
79. Seo J, Choi S. 2013. Perceptual analysis of vibrotactile flows on a mobile device. IEEE Trans. Haptics
6:522–27
80. Seo J, Choi S. 2015. Edge flows: improving information transmission in mobile devices using two-
dimensional vibrotactile flows. In 2015 IEEE World Haptics Conference, pp. 25–30. New York: IEEE
81. Alles DS. 1970. Information transmission by phantom sensations. IEEE Trans. Man-Mach. Syst. 11:85–91
82. Israr A, Poupyrev I. 2011. Tactile brush: drawing on skin with a tactile grid display. In CHI ’11: Proceedings
of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2019–28. New York: ACM
83. Geldard FA, Sherrick CE. 1972. The cutaneous “rabbit”: a perceptual illusion. Science 178:178–79
84. Cholewiak RW, Collins AA. 2000. The generation of vibrotactile patterns on a linear array: influences
of body site, time, and presentation mode. Atten. Percept. Psychophys. 62:1220–35
85. Yang GH, Ryu D, Park S, Kang S. 2012. Sensory saltation and phantom sensation for vibrotactile display
of spatial and directional information. Presence Teleoper. Virtual Environ. 21:192–202
86. Amemiya T, Ando H, Maeda T. 2005. Virtual force display: direction guidance using asymmetric
acceleration via periodic translational motion. In 2005 IEEE World Haptics Conference, pp. 619–22.
New York: IEEE
87. Amemiya T, Ando H, Maeda T. 2005. Phantom-DRAWN: direction guidance using rapid and asym-
metric acceleration weighted by nonlinearity of perception. In Proceedings of the 2005 ACM International
Conference on Augmented Tele-Existence, pp. 201–8. New York: ACM
88. Shima T, Takemura K. 2012. An ungrounded pulling force feedback device using periodical vibration-
impact. In Haptics: Perception, Devices, Mobility, and Communication: EuroHaptics 2012, ed. P Isokoski,
J Springare, pp. 481–92. Berlin: Springer
89. Tappeiner HW, Klatzky RL, Unger B, Hollis R. 2009. Good vibrations: asymmetric vibrations for
directional haptic cues. In 2009 IEEE World Haptics Conference, pp. 285–89. New York: IEEE
90. Imaizumi A, Okamoto S, Yamada Y. 2014. Friction sensation produced by laterally asymmetric vibrotac-
tile stimulus. In Haptics: Neuroscience, Devices, Modeling, and Applications: EuroHaptics 2014, ed. M Auvray,
C Duriez, pp. 11–18. Berlin: Springer
91. Rekimoto J. 2013. Traxion: a tactile interaction device with virtual force sensation. In Proceedings of the
26th Annual ACM Symposium on User Interface Software and Technology, pp. 427–32. New York: ACM
92. Amemiya T, Gomi H. 2014. Distinct pseudo-attraction force sensation by a thumb-sized vibrator that
oscillates asymmetrically. In Haptics: Neuroscience, Devices, Modeling, and Applications: EuroHaptics 2014,
ed. M Auvray, C Duriez, pp. 88–95. Berlin: Springer
93. Culbertson H, Walker JM, Okamura AM. 2016. Modeling and design of asymmetric vibrations to induce
ungrounded pulling sensation through asymmetric skin displacement. In 2016 IEEE Haptics Symposium,
pp. 27–33. New York: IEEE
94. Culbertson H, Walker JM, Raitor M, Okamura AM. 2017. WAVES: a wearable asymmetric vibration
excitation system for presenting three-dimensional translation and rotation cues. In Proceedings of the
2017 CHI Conference on Human Factors in Computing Systems, pp. 4972–82. New York: ACM
95. Tanabe T, Yano H, Iwata H. 2016. Properties of proprioceptive sensation with a vibration speaker-type
non-grounded haptic interface. In 2016 IEEE Haptics Symposium, pp. 21–26. New York: IEEE
96. Amemiya T, Gomi H. 2016. Active manual movement improves directional perception of illusory force.
IEEE Trans. Haptics 9:465–73
97. Klatzky RL, Lederman SJ, Hamilton C, Grindley M, Swendsen RH. 2003. Feeling textures through a
probe: effects of probe and surface geometry and exploratory factors. Atten. Percept. Psychophys. 65:613–31
98. Lederman SJ, Klatzky RL, Hamilton CL, Ramsay GI. 1999. Perceiving surface roughness via a rigid
probe: effects of exploration speed and mode of touch. Haptics-e 1:1
99. Campion G, Hayward V. 2005. Fundamental limits in the rendering of virtual haptic textures. In First
Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator
Systems, pp. 263–70. New York: IEEE
100. Otaduy MA, Lin MC. 2008. Rendering of textured objects. In Haptic Rendering: Foundations, Algorithms,
and Applications, ed. M Lin, M Otaduy, pp. 371–93. Boca Raton, FL: CRC
101. Okamura AM, Kuchenbecker KJ, Mahvash M. 2008. Measurement-based modeling for haptic display.
In Haptic Rendering: Foundations, Algorithms, and Applications, ed. M Lin, M Otaduy, pp. 443–67. Boca
Raton, FL: CRC
102. Okamura AM, Webster RJ III, Nolin JT, Johnson KW, Jafry H. 2003. The haptic scissors: cutting
in virtual environments. In 2003 IEEE International Conference on Robotics and Automation, pp. 828–33.
New York: IEEE
103. Takeuchi Y, Kamuro S, Minamizawa K, Tachi S. 2012. Haptic duplicator. In Proceedings of the 2012
Virtual Reality International Conference, art. 30. New York: ACM
104. Saga S, Raskar R. 2012. Feel through window: simultaneous geometry and texture display based on
lateral force for touchscreen. In SIGGRAPH Asia 2012 Emerging Technologies, art. 8. New York: ACM
105. Loomis JM. 1992. Distal attribution and presence. Presence Teleoper. Virtual Environ. 1:113–19
106. Guruswamy VL, Lang J, Lee WS. 2009. Modeling of haptic vibration textures with infinite-impulse-
response filters. In 2009 IEEE International Workshop on Haptic Audio Visual Environments and Games,
pp. 105–10. New York: IEEE
107. Romano JM, Kuchenbecker KJ. 2012. Creating realistic virtual textures from contact acceleration data.
IEEE Trans. Haptics 5:109–19
108. Culbertson H, Unwin J, Kuchenbecker KJ. 2014. Modeling and rendering realistic textures from un-
constrained tool-surface interactions. IEEE Trans. Haptics 7:381–93
109. Meyer DJ, Wiertlewski M, Peshkin MA, Colgate JE. 2014. Dynamics of ultrasonic and electrostatic
friction modulation for rendering texture on haptic surfaces. In 2014 IEEE Haptics Symposium, pp. 63–
67. New York: IEEE
110. Hoshi T, Takahashi M, Iwamoto T, Shinoda H. 2010. Noncontact tactile display based on radiation
pressure of airborne ultrasound. IEEE Trans. Haptics 3:155–65
111. Hasegawa K, Shinoda H. 2013. Aerial display of vibrotactile sensation with high spatial-temporal res-
olution using large-aperture airborne ultrasound phased array. In 2013 IEEE World Haptics Conference,
pp. 31–36. New York: IEEE
112. Monnai Y, Hasegawa K, Fujiwara M, Yoshino K, Inoue S, Shinoda H. 2014. HaptoMime: mid-air
haptic interaction with a floating virtual screen. In Proceedings of the 27th Annual ACM Symposium on User
Interface Software and Technology, pp. 663–67. New York: ACM
113. Carter T, Seah SA, Long B, Drinkwater B, Subramanian S. 2013. UltraHaptics: multi-point mid-air
haptic feedback for touch surfaces. In Proceedings of the 26th Annual ACM Symposium on User Interface
Software and Technology, pp. 505–14. New York: ACM
114. Long B, Seah SA, Carter T, Subramanian S. 2014. Rendering volumetric haptic shapes in mid-air using
ultrasound. ACM Trans. Graph. 33:181
115. Mazzone A, Kunz A. 2005. Sketching the future of the SmartMesh wide area haptic feedback device by
introducing the controlling concept for such a deformable multi-loop mechanism. In 2005 IEEE World
Haptics Conference, pp. 308–15. New York: IEEE
116. Klare S, Peer A. 2014. The formable object: a 24-degree-of-freedom shape-rendering interface.
IEEE/ASME Trans. Mechatron. 20:1360–71
117. Winck R, Kim J, Book WJ, Park H. 2012. Command generation techniques for a pin array using the
SVD and the SNMF. IFAC Proc. Vol. 45:411–16
118. Hayward V, Cruz-Hernandez M. 2000. Tactile display device using distributed lateral skin stretch. In
Proceedings of the Haptic Interfaces for Virtual Environment and Teleoperator Systems Symposium, Vol. 69,
pp. 1309–14. New York: ASME
119. Follmer S, Leithinger D, Olwal A, Hogge A, Ishii H. 2013. inFORM: dynamic physical affordances and
constraints through shape and object actuation. In Proceedings of the 26th Annual ACM Symposium on User
Interface Software and Technology, pp. 417–26. New York: ACM
120. Leithinger D, Follmer S, Olwal A, Ishii H. 2014. Physical telepresence: shape capture and display for
embodied, computer-mediated remote collaboration. In Proceedings of the 27th Annual ACM Symposium
on User Interface Software and Technology, pp. 461–70. New York: ACM
121. Leithinger D, Follmer S, Olwal A, Luescher S, Hogge A, et al. 2013. Sublimate: state-changing virtual
and physical rendering to augment interaction with shape displays. In CHI ’13: Proceedings of the SIGCHI
Conference on Human Factors in Computing Systems, pp. 1441–50. New York: ACM
122. Rossignac J, Allen M, Book W, Glezer A, Ebert-Uphoff I, et al. 2003. Finger sculpting with digital
clay: 3D shape input and output through a computer-controlled real surface. In 2003 Shape Modeling
International, pp. 229–31. New York: IEEE
123. Majidi C. 2014. Soft robotics: a perspective—current trends and prospects for the future. Soft Robot.
1:5–11
124. Steltz E, Mozeika A, Rembisz J. 2010. Jamming as an enabling technology for soft robotics. SPIE Proc.
7642:764225
125. Steltz E, Mozeika A, Rodenberg N, Brown E, Jaeger H. 2009. JSEL: jamming skin enabled locomotion.
In 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 5672–77. New York: IEEE
126. Follmer S, Leithinger D, Olwal A, Cheng N, Ishii H. 2012. Jamming user interfaces: programmable
particle stiffness and sensing for malleable and shape-changing devices. In Proceedings of the 25th Annual
ACM Symposium on User Interface Software and Technology, pp. 519–28. New York: ACM
127. Yao L, Niiyama R, Ou J, Follmer S, Della Silva C, Ishii H. 2013. PneUI: pneumatically actuated soft
composite materials for shape changing interfaces. In Proceedings of the 26th Annual ACM Symposium on
User Interface Software and Technology, pp. 13–22. New York: ACM
128. Stanley AA, Okamura AM. 2015. Controllable surface haptics via particle jamming and pneumatics.
IEEE Trans. Haptics 8:20–30
129. Mullenbach J, Shultz C, Piper AM, Peshkin MA, Colgate JE. 2013. Surface haptic interactions with a
TPad tablet. In Proceedings of the Adjunct Publication of the 26th Annual ACM Symposium on User Interface
Software and Technology, pp. 7–8. New York: ACM
130. Bau O, Poupyrev I, Israr A, Harrison C. 2010. TeslaTouch: electrovibration for touch surfaces. In
Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology, pp. 283–92.
New York: ACM
131. Winfield L, Glassmire J, Colgate JE, Peshkin M. 2007. T-PaD: tactile pattern display through variable
friction reduction. In 2007 IEEE World Haptics Conference, pp. 421–26. New York: IEEE
132. Takasaki M, Kotani H, Mizuno T, Nara T. 2005. Transparent surface acoustic wave tactile display. In
2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3354–59. New York: IEEE
133. Mullenbach J, Johnson D, Colgate J, Peshkin M. 2012. ActivePaD surface haptic device. In 2012 IEEE
Haptics Symposium, pp. 407–14. New York: IEEE
134. Matsumoto K, Ban Y, Narumi T, Yanase Y, Tanikawa T, Hirose M. 2016. Unlimited corridor: redirected
walking techniques using visuo haptic interaction. In ACM SIGGRAPH 2016 Emerging Technologies, art. 20. New York: ACM
135. Yokokohji Y. 2005. Designing an encountered-type haptic display for multiple fingertip contacts based
on the observation of human grasping behaviors. Int. J. Robot. Res. 24:717–29
136. Lederman SJ, Jones LA. 2011. Tactile and haptic illusions. IEEE Trans. Haptics 4:273–94
137. Klatzky RL, Lederman SJ, Reed C. 1987. There’s more to touch than meets the eye: the salience of
object attributes for haptics with and without vision. J. Exp. Psychol. Gen. 116:356
138. Rock I, Victor J. 1964. Vision and touch: an experimentally created conflict between the two senses.
Science 143:594–96
139. Lécuyer A. 2009. Simulating haptic feedback using vision: a survey of research and applications of
pseudo-haptic feedback. Presence Teleoper. Virtual Environ. 18:39–53
140. Jang I, Lee D. 2014. On utilizing pseudo-haptics for cutaneous fingertip haptic device. In 2014 IEEE
Haptics Symposium, pp. 635–39. New York: IEEE
141. Lécuyer A, Burkhardt JM, Le Biller J, Congedo M. 2005. “A4”: a technique to improve perception of
contacts with under-actuated haptic devices in virtual reality. In 2005 IEEE World Haptics Conference,
pp. 316–22. New York: IEEE
142. Ban Y, Narumi T, Tanikawa T, Hirose M. 2014. Displaying shapes with various types of surfaces
using visuo-haptic interaction. In Proceedings of the 20th ACM Symposium on Virtual Reality Software and
Technology, pp. 191–96. New York: ACM
143. Ban Y, Kajinami T, Narumi T, Tanikawa T, Hirose M. 2012. Modifying an identified curved surface
shape using pseudo-haptic effect. In 2012 IEEE Haptics Symposium, pp. 211–16. New York: IEEE
144. Ban Y, Kajinami T, Narumi T, Tanikawa T, Hirose M. 2012. Modifying an identified angle of edged
shapes using pseudo-haptic effects. In Haptics: Perception, Devices, Mobility, and Communication: EuroHaptics
2012, ed. P Isokoski, J Springare, pp. 25–36. Berlin: Springer
145. Kohli L. 2010. Redirected touching: warping space to remap passive haptics. In 2010 IEEE Symposium
on 3D User Interfaces (3DUI), pp. 129–30. New York: IEEE
146. Azmandian M, Hancock M, Benko H, Ofek E, Wilson AD. 2016. Haptic retargeting: dynamic repurpos-
ing of passive haptics for enhanced virtual reality experiences. In Proceedings of the 2016 CHI Conference
on Human Factors in Computing Systems, pp. 1968–79. New York: ACM
RELATED RESOURCES
Bensmaia SJ, Miller LE. 2014. Restoring sensorimotor function through intracortical interfaces:
progress and looming challenges. Nat. Rev. Neurosci. 15:313–25
Choi S, Kuchenbecker KJ. 2013. Vibrotactile display: perception, technology, and applications.
Proc. IEEE 101:2093–104
Kappassov Z, Corrales JA, Perdereau V. 2015. Tactile sensing in dexterous robot hands. Robot.
Auton. Syst. 74:195–220
Lederman SJ, Jones LA. 2011. Tactile and haptic illusions. IEEE Trans. Haptics 4:273–94
Lin MC, Otaduy M, eds. 2008. Haptic Rendering: Foundations, Algorithms, and Applications. Boca
Raton, FL: CRC