The Haptic Tabletop Puck:
Tactile Feedback for Interactive Tabletops
Nicolai Marquardt, Miguel A. Nacenta, James E. Young,
Sheelagh Carpendale, Saul Greenberg, Ehud Sharlin
Interactions Lab, University of Calgary,
2500 University Drive NW, Calgary, T2N 1N4, CANADA
{nicolai.marquardt, miguel.nacenta, james.young, sheelagh.carpendale, saul.greenberg, ehud.sharlin}
ABSTRACT
In everyday life, our interactions with objects on real tables
include how our fingertips feel those objects. In
comparison, current digital interactive tables present a
uniform touch surface that feels the same, regardless of
what it presents visually. In this paper, we explore how
tactile interaction can be used with digital tabletop
surfaces. We present a simple and inexpensive device – the
Haptic Tabletop Puck – that incorporates dynamic,
interactive haptics into tabletop interaction. We created
several applications that explore tactile feedback in the area
of haptic information visualization, haptic graphical
interfaces, and computer supported collaboration. In
particular, we focus on how a person may interact with the
friction, height, texture and malleability of digital objects.
Author Keywords
Haptic, tactile, tabletops, height, malleability, friction.
ACM Classification Keywords
H5.2. Information interfaces and presentation (User
Interfaces): Haptic I/O.
INTRODUCTION
Surfaces, and the objects we use on them, provide
important tactile information that is useful for different
tasks. For example, the texture of paper allows us to
differentiate it from the table, the relief of objects such as
rulers allows us to lay them out accurately (Figure 1.Left),
and sometimes it is possible to interpret information that is
encoded in the characteristics of the surface (Figure
1.Right). In contrast, most current interactive tabletops
sacrifice all of these tactile cues for digital objects, limiting
interaction to the uniform, unchanging tactile feel of the
table's surface.
Our overall research goal is to leverage dynamic and
interactive tactile information for use with digital tabletops.
In this paper, we specifically focus on the representation of
three sources of tactile information: vertical relief, the
malleability of materials, and horizontal friction with the
surface (Figure 2). As we will explain, we not only use
these variables to make the touch interface more realistic,
but also explore interaction techniques and abstract
mappings that extend the interface in new ways.
Ideally, we envision implementing future digital surfaces
that provide rich and consistent tactile feedback across the
entire interaction surface. However, as far as we know the
technology for this is not available and such a solution will
still require considerable research. For our explorations we
devised the Haptic Tabletop Puck (HTP), a low-cost haptic
device that can be used atop the table. This device allows
us to explore the following dimensions of tactile feedback
on the tabletop:
height and texture: the relief and tactile feel that
corresponds to elements displayed on the table. The
HTP creates these through a servo motor that
continually adjusts the height of a rod atop the device.
Figure 1. Using texture and relief for drawing (left).
Exploring a geological map (right).
Figure 2. The three information sources (from left to
right): height, malleability, and friction.
Permission to make digital or hard copies of all or part of this work for
personal or classroom use is granted without fee provided that copies are
not made or distributed for profit or commercial advantage and that copies
bear this notice and the full citation on the first page. To copy otherwise,
to republish, to post on servers or to redistribute to lists, requires prior
specific permission and/or a fee.
ITS '09, November 23-25 2009, Banff, Alberta, Canada
Copyright © 2009 978-1-60558-733-2/09/11...$10.00
malleability: how different materials respond to touch
and pressure. The HTP can reproduce these through
the haptic loop formed by the height of the rod and a
pressure sensor.
friction: resistance to movement in directions parallel
to the table plane. A servo in the HTP pushes a rubber
plate through its bottom against the surface.
location: different positions on the table should provide
appropriate haptic feedback. Fiducial markers on the
HTP base allow the surface to track the HTP's location
and orientation.
multiplicity: the ability to provide multi-user or multi-
point haptic feedback simultaneously, for example for
collaborative support. Several HTPs can be used
simultaneously on the table.
The contribution of this paper is our exploration of how
haptics can be brought to interactive tables. This
encompasses the design of the HTP but, most importantly,
the question of how it can be applied to real scenarios to
enhance information exploration, interaction, and
collaboration. In the remainder of this paper, we briefly
summarize related work, present the design and
implementation of the HTP, describe several scenarios and
interaction techniques that use the HTP, and finish with a
summary of our explorations and the lessons learned.
RELATED WORK
Research about haptic force-feedback [32] and tactile
interfaces [1] has a long history in the area of virtual reality
[13,7], telepresence [32,17], and interfaces for visually-
impaired people [15,2,35]. Physiological and psychological
studies [11,10,23] lay the foundation of how to use this
additional channel of haptic sensation.
Haptics have been explored by augmenting the traditional
GUI mouse interaction. Zeleznik’s pop through tactile
buttons [36] provide two distinct click levels, Schneider’s
magnetically-actuated mouse [31] can apply frictional force
during movement, and Chu’s haptic mousewheel [6]
enables pressure sensitive controls. These techniques all
rely on mouse input, making them impractical on
interactive tabletops.
The SensAble PHANToM [20] is a six-degree-of-freedom
force feedback device, often used for haptics in virtual
reality systems. Ikits's Visual Haptic Workbench [13]
combines it with 3D projections to allow interaction in
immersive environments. However, the PHANToM is a
relatively expensive device, has a limited actuation radius,
and the mechanics occupy a large space, which makes it
less suitable for multi-user tabletop interaction.
The Actuated Workbench [22] and Pico [25] enable
actuation of objects on a tabletop by applying magnetic
forces. More recent work by Fukushima [9] controls
friction of objects through vibration of the surface. These
are powerful actuation mechanisms for tangible user
interfaces, but they do not provide haptic vertical relief or
malleability feedback. Shape displays, such as FEELEX
[14] and Lumen [27], bring three-dimensionality to a table
surface by using a matrix of movable rods (located under
the tabletop surface) actuated by servos or shape memory
alloys. While there is ongoing research to increase the
number of haptic pixels in these systems [27], they
currently only provide a small number of actuated points
due to mechanical constraints (13x13 haptic pixels for
Lumen, 6x6 haptic pixels for FEELEX, both in an area of
around 25x25 cm).
There has been considerable work exploring the concept of
haptic icons (e.g., [18,19,24]), where abstract messages are
delivered through touch symbols. This approach has been
applied to situations when no visual information is
available (e.g., for non-visual control of mobile devices
[18,24]). Although this concept can be applied to our
explorations, our focus is on the input/output device and a
new platform (tabletops), rather than the explicit study of
the expressiveness of haptic symbolic languages.
The TouchEngine actuator [26,28] and the Active Click
mechanism [8] provide tactile feedback for small touch
screens. Due to the mechanical construction of these
feedback mechanisms, the vibrotactile feedback is always
applied to the complete screen surface, which makes this
approach much less scalable to multi-user applications on
larger surfaces. By using gloves with vibration actuators
[5] or finger mounted actuators as in Rekimoto’s
SenseableRays [29], it is possible to provide this
vibrotactile feedback independently of the actual screen
size. Lee [16] created a similar effect by using a solenoid
attached to a pen. While the vibrotactile feedback of these
systems can create diverse tactile sensations by changing
the intensity and frequency of the vibration, they do not
allow variations in vertical relief, malleability, or friction.
These diverse haptic feedback devices have been applied to
a wide variety of applications. One category is haptic
visualization [30], where people can explore different kinds
of information in addition to the graphical visualization.
Examples are tactile exploration of geographical maps [4],
images [15], graphs [35], or large scientific datasets [13].
Haptic feedback can also augment the interaction with
traditional GUI elements. Examples are dynamic physical
shapes as buttons (e.g., [12,27]), vibrotactile feedback to
augment clicking (e.g., [29,16]), or tactile feedback of
intervals on a slider [28].
The HTP was developed through a process of iterative
exploratory design, in which brainstorming and prototyping
were undertaken in parallel, mutually informing each other,
to arrive at the current HTP design. We will start by giving
a technical description of our prototype, followed by a
more in-depth discussion of the expressiveness in relation
to the use of our current prototype in applications.
Design and prototypes
The HTP concept is based on the combination of four main
elements: a mechanically activated rod that moves
vertically within a column to reach different heights above
the table, a sensor on top of the rod that detects the amount
of pressure being applied to its top, a standard fiducial
marker (Microsoft Surface Bytetag) underneath the device
that allows the interactive tabletop to locate the puck on the
surface, and a brake pad on the bottom that can increase the
friction against the table. The basic elements are displayed
on the diagram of the current HTP prototype (Figure 3.Left
and Center). A photograph of the current iteration is shown
in Figure 3.Right.
Our current implementation is designed to work on top of a
Microsoft Surface. The device uses Hitec HS-55/HS-65HB
micro servomotors to control the vertical position of the
rod and the brake. Figure 4 shows these two servos, each
using a mechanical arm to move the rod or the brake,
respectively. An Interlink Electronics Circular Force
Sensing Resistor, used as the pressure sensor, is located on
top of the rod. The components are encased in a wooden
box with a felt bottom. We chose a small rod as the moving part (as
opposed to making the whole top of the device mobile) so
that its vertical position with respect to the encasing could
be felt through simultaneous touch of the finger on rod and
encasing, or through the relationship between the sensing
finger and the other fingers holding the device. The height
range of the current prototypes is 9mm. We chose to align
the output vertically, because this makes the device stable
when pressing it from above, and allows for a good
measure of pressure without compromising the position of
the device on the table.
To provide friction, the device has a servo-activated rubber
surface that comes out of the bottom of the device through
a rectangular opening on the base. This mechanism
increases the friction of the bottom of the device against the
table, making it more difficult to drag around. Currently,
this mechanism is binary: the brake is either on or off.
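The binary brake can be sketched as a simple two-state servo command. The class, method names, and servo angles below are hypothetical stand-ins; the paper states only that a servo pushes the rubber pad through the base opening and that the mechanism is strictly on or off.

```python
# Sketch of the binary brake control with a stand-in servo class.
# Angles and names are hypothetical; only the on/off behavior is
# taken from the prototype description.

class BrakeServo:
    ENGAGED_DEG = 60    # pad pushed through the base opening against the table
    RELEASED_DEG = 0    # pad retracted; puck slides freely on its felt bottom

    def __init__(self):
        self.angle = self.RELEASED_DEG

    def set_brake(self, on):
        """Toggle between exactly two states, mirroring the prototype."""
        self.angle = self.ENGAGED_DEG if on else self.RELEASED_DEG
        return self.angle
```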
Limitations of the prototype
The size of our version of the device is dictated by the size
and technology of the components. Currently the device is
a 69x69x40mm box (see Figure 3 and 4). However, we
believe that by applying different technologies and a more
compact design it will be possible to build a much smaller
device that could be dragged with a single finger.
The servos that we use are inexpensive and small, and
therefore high-resolution haptic feedback is not possible.
The servo motor and the sensor of current prototypes are
controlled through Phidgets boards, and are therefore
tethered. This current implementation of HTP, however,
has allowed us to explore a multitude of applications.
It is possible that additional properties may arise from a
smaller footprint and a higher haptic resolution. However,
while we are already developing new, smaller, and wireless
versions of the HTP, the focus of our work is on the
concept and its applications, not on the particulars of the
hardware. As a consequence the rest of the paper will focus
on these factors.
The HTP provides a variety of different channels capable of
measuring and representing different types of information.
The most obvious output channel is the height of the rod
along the vertical axis. Using the position of the HTP to
Figure 3. Diagram of the current prototype from the top (left) and from the bottom (center),
and photograph of the current prototype (right).
Figure 4. Inside view of the prototype.
determine a height value gives a simple way to represent
relief in maps (see Section 4.1. and [4]). However, height
can also be used to represent more abstract information; for
example, dynamic sinusoidal vibration with varying
frequency or amplitude can convey information [30]. By
measuring the contact pressure between the finger and the
rod we get a continuous source of input that is more
expressive than the more common binary mouse button [3].
The combination of pressure and height can enable
dynamic force feedback, through which we simulate the
malleability of different materials, or mechanical devices
(e.g., buttons). The brake on the bottom of the device
provides extra friction along the directions parallel to the
surface plane. The friction can also be dependent on the
location of the device.
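One iteration of the haptic loop described above can be sketched as follows. The function names, the 0.0–1.0 input ranges, and the linear stiffness model are our assumptions; only the 9 mm height range comes from the prototype. A height map supplies the relief under the puck's tracked position, and the measured fingertip pressure compresses the rod according to a per-material stiffness, simulating malleability.

```python
# Hypothetical sketch of one haptic-loop iteration: sample relief at the
# puck's position, then let the rod yield under pressure for soft materials.

def base_height(height_map, x, y):
    """Relief (0.0-1.0) under the puck, sampled from a 2D grid."""
    return height_map[y][x]

def rod_height(relief, pressure, stiffness, max_travel_mm=9.0):
    """Target rod height in mm; soft materials yield more under pressure.

    relief and pressure are in 0.0-1.0; stiffness is in (0.0, 1.0], where
    1.0 is rigid (no yield). max_travel_mm matches the 9 mm prototype range.
    """
    yield_mm = (1.0 - stiffness) * pressure * max_travel_mm
    return max(0.0, relief * max_travel_mm - yield_mm)

# Example: a rigid vs. a soft material under the same press.
height_map = [[0.2, 0.8], [0.5, 1.0]]
relief = base_height(height_map, 1, 0)                    # 0.8
rigid = rod_height(relief, pressure=0.5, stiffness=1.0)   # no yield: 7.2 mm
soft = rod_height(relief, pressure=0.5, stiffness=0.2)    # yields to 3.6 mm
```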
The variables described so far can be controlled
independently and can be associated to the position of the
device on the table, dramatically increasing the versatility
of the otherwise single-point haptic feedback. Moreover,
since the table is massively multi-touch (MMT), many
HTPs can be used concurrently on the table, limited only
by the size of our prototype. Thus the haptic feedback can
react to other elements on the table (e.g., other pucks and
finger touches).
Prototyping has allowed us to explore many tactile
interaction techniques for tabletops. In this section we
showcase some of the new avenues that a haptic device like
HTP can open for interaction on tabletops. The techniques
are introduced through example scenarios loosely
organized into three groups: tactile exploration, tactile-
enhanced interaction, and collaboration. Please note that
the techniques presented here are not restricted to the
particular scenario that we use to introduce them.
Tactile Exploration
Besides using texture and height information to represent
relief, the tactile channel can also be used in other ways to
explore data. Tactile information can be particularly
effective when other channels, such as visual and aural, are
already crowded.
We have implemented a map application for the
exploration of geographical features (Figure 5.a,b). We
map the relief of the terrain to the height of the rod (Figure
5.c represents the height map), use different tactile material
responses to represent different types of terrain (e.g.,
vegetation is soft, mountains are “breaky”– Figure 5.d
shows the varying materials), and map ocean temperature
to different vibration frequencies (from 0 Hz for cold areas
to 5 Hz for warm water – Figure 5.e). As these multiple
properties influence the height of the single rod, we
combine them together in different ratios depending on the
application. For instance, an equal ratio would assign one
third of the height range to the relief information, one third
to the oscillation, and one third to the material response.
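The weighted combination of the three tactile layers into the single rod height can be sketched as below. The equal-ratio weighting and the 0–5 Hz temperature mapping follow the paper; the normalization of all inputs to 0.0–1.0 and the function names are our assumptions.

```python
import math

# Hypothetical sketch of combining relief, vibration, and material
# response into the rod's single height output.

def combined_height_mm(relief, vib_hz, material_resp, t,
                       weights=(1/3, 1/3, 1/3), max_travel_mm=9.0):
    """Weighted blend of the three layers, scaled to the rod's travel.

    relief and material_resp are in 0.0-1.0; vib_hz maps ocean
    temperature (0 Hz cold .. 5 Hz warm); t is time in seconds.
    """
    w_relief, w_osc, w_mat = weights
    # Vibration layer: a 0.0-1.0 sinusoid; at 0 Hz it sits at 0.5.
    osc = 0.5 * (1.0 + math.sin(2 * math.pi * vib_hz * t))
    value = w_relief * relief + w_osc * osc + w_mat * material_resp
    return value * max_travel_mm
```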
We have also observed that the large puck often made it
difficult to understand which map position was being
represented. Therefore, we attached a virtual arrow to the
puck to enable fine-grained selection. This arrow points to
a location a few centimeters in front of the HTP (Figure
The haptic channel can be used to offload information from
the visual channel. We developed a painting exploration
application where we map rod height to areas of interest, in
this case the difference between a painting and its infrared
(or x-ray) image. This allows the observer to explore, for
example, pentimenti (previous versions of a region of the
painting that have been painted over by the artist) in place
without multiple views or visual artifacts. The person
exploring the painting is able to directly feel the hidden
areas of the painting, and can visually reveal them in-place
by pressing on the rod to activate a circular lens that shows
the alternative layer (see Figure 6 and video figure).
Figure 5. Map exploration with HTP (A), geographical
map (B), and layers of tactile information (C,D,E). Images
courtesy of NASA.
Figure 6. The puck reveals hidden content in a particular area
of the painting by raising the rod (left). By pressing on it the
x-ray layer can be revealed (right).
Tactually-Enhanced Interaction
Haptics can be useful beyond exploration of data; for
example, tactile cues can help the creation of new data
(e.g., new pictures) as well as the interaction with GUI
elements [16]. We have created a painting application that
uses height feedback and pressure sensitivity to enhance
control. Different layers of the picture are signaled to the
user through different height levels. A slight pressure on
the rod will activate the brush; however, if the brush moves
across layers, it will not paint over the new layer unless the
pressure is increased (Figure 7 and video figure). This
allows effective control of the effects of the brush without
obscuring the drawing itself.
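The pressure-dependent brush rule can be sketched as two thresholds. Both threshold values and all function names are hypothetical; the paper specifies only that a slight press activates the brush and that crossing into another layer requires increased pressure before painting continues there.

```python
# Hypothetical sketch of the painting application's brush rule.

def brush_active(pressure, activate_thr=0.1):
    """A slight press on the rod activates the brush."""
    return pressure >= activate_thr

def paints_on_layer(pressure, stroke_layer, current_layer, cross_thr=0.6):
    """Whether the brush marks current_layer for a stroke begun on stroke_layer."""
    if not brush_active(pressure):
        return False
    if current_layer == stroke_layer:
        return True                  # still on the layer the stroke began on
    return pressure >= cross_thr     # crossing layers needs a harder press
```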
Height and malleability can also be used to enhance GUIs
[25,26]. For example, we implemented a sample webpage-
style button, where the state can be signaled by its
elevation. If the button is pressed slightly, it can have a
tactile response similar to that of real buttons, and if it is
pushed down with a certain pressure, the button (and the
rod) can stay down to both communicate state to the user
and to prevent it from being pushed again (see Figure 8).
The level of pressure required to push the button can also
signal the importance of the action (as suggested by Chu
and colleagues [6] for mouse interaction).
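The latching button can be sketched as a small state machine. The commit threshold, travel, and names are our assumptions; the behavior follows the description: light presses give a button-like yield, and a hard press latches the rod down so the button cannot be pushed again.

```python
# Hypothetical sketch of the latching haptic button.

class HapticButton:
    def __init__(self, commit_thr=0.7, up_mm=9.0):
        self.commit_thr = commit_thr
        self.up_mm = up_mm
        self.latched = False

    def rod_target_mm(self, pressure):
        """Target rod height for a given fingertip pressure (0.0-1.0)."""
        if self.latched:
            return 0.0                  # stays down: state is felt and seen
        if pressure >= self.commit_thr:
            self.latched = True         # the action fires exactly once
            return 0.0
        return self.up_mm * (1.0 - pressure)   # proportional, button-like yield
```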
Using height and pressure combined can also help with
object layout and manipulation. For example, in our office
layout application, furniture tends to stay in a particular
room unless the pressure applied is enough to bring the
object across the wall. To signal that the object cannot
move beyond a certain point, we apply Wigdor’s No touch
left behind approach [34], where the arrow stretches to
keep the connection between the puck and the object while
the object stays behind. Moreover, we provide feedback
through the puck’s brake, which makes the puck harder to
drag, immediately warning the user that their actions are
not having an effect (see sequence in Figure 9).
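The office-layout rule combining the brake and the stretched arrow can be sketched as a single decision. The pressure threshold and the returned tuple structure are our assumptions, not the paper's code.

```python
# Hypothetical sketch of the blocked-move feedback in the office layout.

def move_feedback(pressure, crossing_wall, cross_thr=0.8):
    """Return (object_moves, brake_on, arrow_stretched) for a drag step."""
    if not crossing_wall or pressure >= cross_thr:
        return True, False, False    # free move, or pressed hard enough to cross
    return False, True, True         # object stays; brake and stretched arrow warn
```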
The same approach can be used to enhance existing layout
interaction techniques such as alignment tools (e.g.,
guides), which often contribute to the visual clutter in
applications. Giving feedback through differences in height
seems particularly appropriate in these cases, as it simulates
the edges and grooves of real world tools.
Haptics for collaboration
Tabletops are often used by groups, where in particular
haptics can be leveraged for floor control and interaction
awareness. To explore this, we have extended the office
layout application to enable several people to work on it at
the same time.
We created a mode in which objects can only be
manipulated by one user at a time. Our tactile negotiation
mechanism creates the constraint that only one HTP at a
time can be pressed down on any given object (the one that
has control). When another person wants to take control,
they can push the rod down, which will push the rod of the
other (controlling) HTP upward, indicating a desire to take
the floor (see Figure 10.Left). This mechanism is subtle,
but difficult to ignore, and does not require any visual
symbols. Moreover, this technique can be used to negotiate
control in a richer manner; for example, a collaborator can
try to request control by subtly pushing on the rod, and the
current controller can easily deny it by keeping a strong
pressure on it.
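The tactile negotiation mechanism can be sketched as a comparison of the two pucks' pressures. The simple "higher pressure wins" rule is our reading of the description; the function name and units are hypothetical.

```python
# Hypothetical sketch of floor-control negotiation between two HTPs
# pressing on the same object.

def negotiate(controller_pressure, requester_pressure):
    """Return (controller_rod_raise, control_transfers).

    The requester's press raises the controller's rod in proportion --
    a subtle but hard-to-ignore signal. Control transfers only if the
    controller yields by pressing more weakly than the requester.
    """
    raise_amount = requester_pressure
    transfers = requester_pressure > controller_pressure
    return raise_amount, transfers
```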
We also used the office layout application to implement a
tactile awareness technique. In our prototype, any clicks or
dragging actions executed through a puck generate “tactile
ripples” that propagate across the table, making other HTPs
oscillate (Figure 10.Right). The oscillation parameters
(shape of the signal, amplitude and frequency) can also
transmit useful information to collaborators; for example
the identity and distance of a user generating an event.
During development we have noticed that small-amplitude
signals of moderate frequency can be easily noticed and do
not generate significant interference with the ongoing
interaction.
These haptic techniques can be useful to transmit
awareness information without requiring a shift in visual
attention. We believe that these techniques would be most
useful when other forms of awareness information (e.g.,
peripheral vision, sound, the arms and bodies of
collaborators) are harder to gather; for example, when
people work on very large tables [21], or if they work
Figure 7. Light pressure activates the brush (left). When the
puck enters a new layer, the height changes, and the brush is
only active on the lower layer (center). By pressing hard, the
brush becomes active in both layers (right).
Figure 8. Adding an item to the cart takes little pressure (left),
but actually making the purchase needs a strong click (center).
Once the purchase is done the button stays down and cannot
be pressed (right).
collaboratively with a shared tabletop workspace from
distant locations [33].
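The tactile-ripple signal felt at a remote puck can be sketched as a delayed, distance-attenuated sinusoid. The propagation speed, frequency, amplitude, and falloff values below are all hypothetical; the paper states only that events make other HTPs oscillate and that the oscillation parameters can encode, for example, identity and distance.

```python
import math

# Hypothetical sketch of a tactile ripple propagating across the table.

def ripple_height_mm(event_xy, puck_xy, t, speed=200.0, freq=4.0,
                     amp_mm=1.0, falloff=500.0):
    """Rod offset (mm) at puck_xy, t seconds after an event at event_xy.

    speed is the wavefront speed (mm/s); amp_mm is the small amplitude
    and freq the moderate frequency noted in the observations; amplitude
    fades linearly to zero at falloff mm from the event.
    """
    dist = math.hypot(puck_xy[0] - event_xy[0], puck_xy[1] - event_xy[1])
    delay = dist / speed
    if t < delay:
        return 0.0                       # ripple has not reached this puck yet
    amp = amp_mm * max(0.0, 1.0 - dist / falloff)   # fades with distance
    return amp * math.sin(2 * math.pi * freq * (t - delay))
```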
Although we have not yet conducted extensive formal
evaluation of the prototypes, we have had the opportunity
to informally observe many visitors of the lab and
colleagues (not directly involved in the project) using the
HTP. These informal and early observations have provided
valuable insights regarding how people approach such a
device and opened directions for future exploration.
We observed that people begin haptic exploration on the
tabletop in different ways. Most people began immediately
to move the device around the surface and actively
explored the haptic content. While vertical relief and
friction feedback were immediately evident, it took more
time to notice the malleability of objects (as it requires
people to apply actual pressure to the rod). However, after
people initially experienced the diversity of malleability,
they often continued to explore this dimension over the
tabletop surface and across applications. People reported
that the haptic feedback through touching was a very active
and engaging experience.
We also noticed people holding and grasping the device in
different ways. Most commonly, they placed the tip of their
index finger on the top of the rod. However, sometimes
they placed their entire finger or even the palm of their
hand across the rod. Usually when the rod moved, people
followed the up and down movements with their finger. In
a few situations, however, people left their finger statically
in the space above the rod, resulting in a mere tapping of
the rod against the finger. One person even preferred to just
visually observe the vertical height changes of the rod, for
instance the slow wave oscillations associated with the
water-ripples video, or the up and down movement when
exploring a visualized stone relief. People also sometimes
closed their eyes to explore the content solely through the
haptic sensation.
We noticed certain limits on how much information can be
perceived through the haptic feedback (as noted in earlier
research, e.g., in [28]). Complex combinations of
information can make it difficult to differentiate changes;
for example, simultaneous oscillation with high-amplitude
height changes can make it challenging to distinguish
materials through malleability.
The HTP has been a valuable device for our research in the
haptic tabletop space. During the design and different
iterations of the device we have come to appreciate how a
simple and inexpensive device can open up new areas of
exploration. By choosing small, low-cost components and
mobility over high-fidelity feedback and many degrees
of freedom, we have been able to quickly build many
interaction techniques and prototype multiple applications.
We have learned that the flexibility and possibilities of the
device come from the specific combination of variables. By
themselves, height, pressure and friction are already
powerful, but the combination amounts to more than the
sum of the parts; this is exemplified by the drawing and
layout examples, where height, pressure
and malleability form meaningful
interaction styles. We have also found that
the integration of input and output
capabilities within the same device offers
new exciting possibilities for the design of
novel interaction techniques. However, our
explorations have only begun to look at
this integration, and further research in this
area is needed.
What makes our approach powerful is that
the input and output haptic variables can
vary with the position of the puck on the
table, effectively multiplying the
interaction bandwidth of the device. We
find that the combination of tabletops and
haptics is particularly synergistic for
several reasons. First, people already make
use of tactile information naturally by
Figure 10. The floor control mechanism links two or more HTPs so that all
collaborators can know if they are in control or when control is being requested
from them (left). Actions in one part of the screen generate tactile ripples that
can be perceived by other pucks on the table (right). The ripples on the figure are
only visible for illustration purposes.
Figure 9. Light pressure grabs the object (left), but when
trying to move it beyond the room’s boundaries, horizontal
friction is applied, the object will not move, and the red arrow
stretches (center). If enough pressure is applied, the object
goes to the intended place.
touching and feeling surfaces on tables (e.g. Figure 1);
second, tabletops afford placing objects on top of them, and
pressing against the surface, which makes for a stable
haptic loop; and third, the tracking capabilities of the table
allow the device to be lightweight, simple, and easy to build.
One of the first impulses when building haptic tabletops is
to reproduce the tactile characteristics of objects on the
tabletop; however, we soon discovered that beyond
reproducing realistic tactile textures and mountain reliefs,
abstract haptic mappings can be powerful and compelling,
such as in the floor control example or the drawing and
layout applications.
By building the prototype applications we also found some
unexpected new possibilities and constraints. For example,
we found that mapping the haptic feedback to the location
right underneath the rod might feel like the most
“physically accurate” (it simulates an actual rod in contact
with a carved surface); however, in our applications the
haptic channel acts in conjunction with the visual channel,
and hiding the most relevant visual information under the
puck does not make sense. Instead, we draw an arrow that
connects the device with the sensed pixel. Since the arrow
maintains its connection with the puck, interaction still
feels direct, but there is almost no obstruction. Moreover,
since the arrow is dynamically drawn on the screen, it can
be used in new and creative ways to provide feedback. The
exploration of the space between direct and indirect input
devices is an interesting field of enquiry in itself.
Although the interaction techniques that we built for our
prototype applications were found very compelling by the
people who informally tested our system, it is difficult to
know if people will want to use the haptic channel in real
use situations. In general, there are few applications for
which we cannot find a visual-only alternative; however,
the haptic channel can be valuable for scenarios in which
visual clutter is an issue, or where the more compelling
tactile sensations are an asset.
Nevertheless, we believe that now is an appropriate time to
explore haptics for interactive tabletops; the interface of the
tabletop is still maturing and there is an opportunity to
make haptics one of the available modalities.
With the Haptic Tabletop Puck we introduced an
inexpensive hardware design for a device that allows us to
explore haptic feedback on tabletops. We have explored
applications that use three dimensions of haptic feedback:
vertical relief, malleability of materials, and horizontal
friction on the surface. Our prototype applications
showcase how adding a haptic layer to the interactive
surface experience can augment the existing visual and
touch modalities.
In the future, we intend to improve our understanding of
the value of haptics in the tabletop experience through
more formal evaluations. We are also in the process of
improving the hardware design of our device and we plan
to explore how different form factors can affect the user
experience. We have already received useful comments that
suggest that smaller and un-tethered HTPs could support an
even wider spectrum of new haptic tabletop applications;
for example, smaller pucks could be put under several or
all of the fingers of a person and support massively multi-
touch haptic interaction. General improvements in the
resolution of the haptic feedback (such as a faster haptic
loop, and multiple levels of friction) can also be valuable.
At the application level we are also considering other
collaborative scenarios that can make use of a variety of
haptic interaction techniques.
ACKNOWLEDGMENTS
This research is partially funded by the iCORE/NSERC/
SMART Chair in Interactive Technologies, by Alberta
Ingenuity, iCORE, NSERC, and by our industrial sponsor
SMART Technologies Inc.
REFERENCES
1. Benali-Khoudja, M., Hafez, M., Alex, J., and
Kheddar, A. Tactile Interfaces: a State-of-the-Art
Survey. Int. Symp. on Robotics, (2004), 721-726.
2. Biagiotti, L., Borghesan, G., and Melchiorri, C. A
Multimodal Haptic Mouse for Visually Impaired
People. Proc. of ENACTIVE '05, (2005).
3. Cechanowicz, J., Irani, P., and Subramanian, S.
Augmenting the mouse with pressure sensitive input.
Proc. of CHI '07, ACM (2007), 1385-1394.
4. Chang, A., Gouldstone, J., Zigelbaum, J., and Ishii, H.
Pragmatic haptics. Proc. of TEI '08, ACM (2008),
5. Chou Wusheng, Wang Tianmiao, and Hu Lei. Design
of data glove and arm type haptic interface. Proc. of
HAPTICS '03. (2003), 422-427.
6. Chu, G., Moscovich, T., and Balakrishnan, R. Haptic
Conviction Widgets. Proc. of GI '09, (2009).
7. Dominjon, L., Perret, J., and Lecuyer, A. Novel
devices and interaction techniques for human-scale
haptics. The Visual Computer 23, 4 (2007), 257-266.
8. Fukumoto, M. and Sugimura, T. Active click: tactile
feedback for touch panels. CHI '01 extended abstracts,
ACM (2001), 121-122.
9. Fukushima, S., Hashimoto, Y., and Kajimoto, H.
Tabletop interface using a table's circular vibration
and controllable friction. CHI '08 extended abstracts,
ACM (2008), 3801-3806.
10. Gibson, J. Observations on active touch.
Psychological Review 69, (1962), 477-491.
11. Hale, K. and Stanney, K. Deriving haptic design
guidelines from human physiological, psychophysical,
and neurological foundations. Computer Graphics and
Applications, IEEE 24, 2 (2004), 33-39.
12. Harrison, C. and Hudson, S.E. Providing dynamically
changeable physical buttons on a visual display. Proc.
of CHI '09, ACM (2009), 299-308.
13. Ikits, M. and Brederson, D. The Visual Haptic
Workbench. In The Visualization Handbook. Elsevier
Academic Press (2005), 431-449.
14. Iwata, H., Yano, H., Nakaizumi, F., and Kawamura,
R. Project FEELEX: adding haptic surface to graphics.
Proc. of SIGGRAPH '01, ACM (2001), 469-476.
15. Kurze, M. TGuide: a guidance system for tactile image
exploration. Proc. of ASSETS '98, ACM (1998), 85-
16. Lee, J.C., Dietz, P.H., Leigh, D., Yerazunis, W.S., and
Hudson, S.E. Haptic pen: a tactile feedback stylus for
touch screens. Proc. of UIST '04, ACM (2004), 291-
17. Li, K.A., Baudisch, P., Griswold, W.G., and Hollan,
J.D. Tapping and rubbing: exploring new dimensions
of tactile feedback with voice coil motors. Proc. of
UIST '08, ACM (2008), 181-190.
18. Luk, J., Pasquero, J., Little, S., MacLean, K.,
Levesque, V., and Hayward, V. A role for haptics in
mobile interaction: initial design using a handheld
tactile display prototype. Proc. of CHI '06, ACM
(2006), 171-180.
19. MacLean, K. and Enriquez, M. Perceptual Design of
Haptic Icons. Proc. of EuroHaptics '03, (2003).
20. Massie, T.H. and Salisbury, J.K. The phantom haptic
interface: A device for probing virtual objects. Proc.
of the Symposium on Haptic Interfaces for Virtual
Environment and Teleoperator Systems, (1994), 295-
21. Nacenta, M.A., Aliakseyeu, D., Subramanian, S., and
Gutwin, C. A comparison of techniques for multi-
display reaching. Proc. of CHI '05, ACM (2005), 371-
22. Pangaro, G., Maynes-Aminzade, D., and Ishii, H. The
actuated workbench: computer-controlled actuation in
tabletop tangible interfaces. Proc. of UIST '02, ACM
(2002), 181-190.
23. Parkes, A., Poupyrev, I., and Ishii, H. Designing
kinetic interactions for organic user interfaces.
Commun. ACM 51, 6 (2008), 58-65.
24. Pasquero, J., Luk, J., Levesque, V., Wang, Q.,
Hayward, V., and MacLean, K. Haptically Enabled
Handheld Information Display With Distributed
Tactile Transducer. IEEE Transactions on
Multimedia, 9, 4 (2007), 746-753.
25. Patten, J. and Ishii, H. Mechanical constraints as
computational constraints in tabletop tangible
interfaces. Proc. of CHI '07, ACM (2007), 809-818.
26. Poupyrev, I., Maruyama, S., and Rekimoto, J.
Ambient touch: designing tactile interfaces for
handheld devices. Proc. of UIST '02, ACM (2002),
27. Poupyrev, I., Nashida, T., Maruyama, S., Rekimoto, J.,
and Yamaji, Y. Lumen: interactive visual and shape
display for calm computing. SIGGRAPH '04 emerging
technologies, ACM (2004), 17.
28. Poupyrev, I., Okabe, M., and Maruyama, S. Haptic
feedback for pen computing: directions and strategies.
CHI '04 extended abstracts, ACM (2004), 1309-1312.
29. Rekimoto, J. SenseableRays: opto-haptic substitution
for touch-enhanced interactive spaces. Proc. of CHI
'09, ACM (2009), 2519-2528.
30. Roberts, J.C. and Paneels, S. Where are we with
Haptic Visualization? Proc. of EuroHaptics and
World Haptics '07. (2007), 316-323.
31. Schneider, C., Mustufa, T., and Okamura, A.M. A
magnetically-actuated friction feedback mouse. Proc.
of EuroHaptics '04, (2004), 330-337.
32. Stone, R.J. Haptic Feedback: A Potted History, From
Telepresence to Virtual Reality. Proc. of First Int.
Workshop on Haptic-Computer Interaction, (2000).
33. Tuddenham, P. and Robinson, P. Territorial
coordination and workspace awareness in remote
tabletop collaboration. Proc. of CHI '09, ACM (2009),
34. Wigdor, D. No touch left behind. Microsoft
Expression Video,
expression/newsletter/2009-01/default.aspx, 2009.
35. Yu, W., Ramloll, R., and Brewster, S.A. Haptic
Graphs for Blind Computer Users. Proc. of 1st Int.
Workshop on Haptic Human-Computer Interaction,
Springer (2001), 41-51.
36. Zeleznik, R., Miller, T., and Forsberg, A. Pop through
mouse button interactions. Proc. of UIST '01, ACM
(2001), 195-196.