A Low-cost Sensor Glove with Vibrotactile Feedback and Multiple
Finger Joint and Hand Motion Sensing for Human-Robot Interaction
P. Weber1, E. Rueckert1,2, R. Calandra1,2, J. Peters2,3 and P. Beckerle1,4,5
Abstract— Sensor gloves are widely adopted input devices
for several kinds of human-robot interaction applications.
Existing glove concepts differ in features and design, but
include limitations concerning the captured finger kinematics,
position/orientation sensing, wireless operation, and especially
cost. This paper presents the DAGLOVE, which
addresses the mentioned limitations with a low-cost design
(ca. 300 €). This new sensor glove allows separate measurements
of proximal and distal finger joint motions as well
as position/orientation detection with an inertial measurement
unit (IMU). Those sensors and tactile feedback induced by
coin vibration motors at the ﬁngertips are integrated within
a wireless, easy-to-use, and open-source system. The design
and implementation of hardware and software as well as
proof-of-concept experiments are presented. An experimental
evaluation of the sensing capabilities shows that proximal and
distal ﬁnger motions can be acquired separately and that hand
position/orientation can be tracked. Further, teleoperation of
the iCub humanoid robot is investigated as an exemplary
application to highlight the potential of the extended low-cost
glove in human-robot interaction.
I. INTRODUCTION
Sensor gloves have various uses in robotics and human-robot
interaction, such as learning manipulation tasks from
human demonstrations, rehabilitation, or
investigations of psychological issues. In manipulation,
the transformation from human to robot motions is an
important issue since these usually do not match perfectly,
e.g., due to kinematic differences. A possible solution to this
issue is active learning, which relies on a mapping from human
to robot kinematics. Alternatively, in passive approaches, the operator
directly controls the robot hand through an instantaneous
mapping from sensor glove measurements to control actions. Considering the latter class of
techniques, human operators can adapt and compensate for
limitations of the robot and kinematic mapping errors.
Providing additional degrees of freedom should lead to
better use of complex robots such as the humanoid robot
The research leading to these results has received funding from the
European Community’s Seventh Framework Programme (FP7/2007–2013)
under grant agreement #600716 (CoDyCo).
at Darmstadt, Darmstadt, Germany
2Intelligent Autonomous Systems Lab
3Robot Learning Group, Max-Planck Institute for Intelligent Systems,
Tuebingen, Germany email@example.com
4Institute for Mechatronic Systems in Mechanical Engineering,
Fig. 1. The low-cost sensor glove can be used for teleoperation of complex
robots with five-finger hands such as the iCub humanoid (shown in the
picture). If the robot hand is equipped with sensors, vibrotactile feedback can be
provided to the operator through motors at the fingertips.
iCub (see Figure 1). Furthermore, vibrotactile
feedback could improve human-robot interaction by giving
the user a feeling of ownership. Yet, potential benefits
depend on the application: for instance, a combined degree
of freedom per finger might be sufficient in certain rehabilitation
robots, while additional ones could improve
exploring body schema integration and can be
crucial in hand exoskeletons.
In contrast to commercial and rather high-priced data
gloves such as the Cyberglove, many low-cost gloves
do not provide more than one degree of freedom (DoF) per
finger. Besides resistive sensors, marker-based motion capturing, optical linear
encoders, or stretch sensors are used to track
finger motions. While one glove provides additional DoFs, it lacks other important features such as
hand position/orientation acquisition. Another glove provides a wireless interface to the host but only
five DoFs. Although combining more DoFs and hand position/orientation
sensing, the DJ Handschuh is limited to a single finger. Alternative approaches to motion
acquisition rely on external tracking of the human hand by
fusing visual and gyroscope data or using marker-based
measurement. The majority of gloves do not provide
vibrotactile feedback. One system that implements feedback
and combines it with the other features discussed above is
the VR Data Glove. This glove aims at virtual reality
applications and exhibits only a single degree of freedom per
finger. An alternative feedback implementation is found in
the Hands Omni Glove, which generates contact forces
by inflatable bladders but uses external motion capturing.
As no low-cost glove combines multi-DoF finger tracking,
vibrotactile feedback, hand position/orientation detection,
and wireless interfacing, this paper suggests the DAGLOVE.
The DAGLOVE is based on an earlier open-source and low-cost
sensor glove. It extends the existing concept to provide the mentioned features and
makes use of electronic components that are affordable
and that simplify the development. The hardware design
of the new DAGLOVE is presented based on a brief
description of the preliminary glove in Section II, before
data acquisition and software design are given in Section III.
Beyond presenting the improved glove design, the paper
qualitatively demonstrates the basic functionality of finger
and hand motion measurements in Section IV. To show
the benefits of extended kinematic sensing with the glove,
it is shown how the additional degrees of freedom can be
exploited in teleoperation of the iCub humanoid robot. The
paper concludes in Section V.
II. GLOVE DESIGN
In the following sections, the design of the preliminary
glove and the re-designed DAGLOVE are described. Key
features of the DAGLOVE are the consideration of interphalangeal
(IP) and metacarpophalangeal (MCP) joint flexion
of the index, middle, ring, and pinky fingers as well as
metacarpophalangeal I (MCP I) and carpometacarpal (CMC)
joint flexion of the thumb (see Figure 2).
A. Preliminary glove
The preliminary glove used a single 4.5 inch flex sensor
per finger to measure flexion and extension.
The orientation of the hand and its position in space
were acquired by a marker-based motion tracking system.
For this purpose, reflecting markers were fixed on
the back of the glove and captured by multiple infrared
cameras installed in the laboratory. Coin vibration motors
(Solarbotics VPM2) were attached to the fingertips to
provide vibrotactile feedback that can be controlled based
on tactile or pressure sensors of the teleoperated robotic
hand. In the preliminary setup, the feedback was implemented to be
proportional to the contact forces occurring at the robotic
Fig. 2. Overview of the human hand articulations.
Fig. 3. Component placement: a Placement of the ten flex sensors. Each
sensor is identified by a finger number X and a sensor number Y, where X stands for the finger number (1=thumb, 2=index,
3=middle, 4=ring and 5=pinky) and Y stands for the sensor number on
each finger (thumb: 1=CMC&MCP I and 2=IP; other fingers: 1=MCP and
2=PIP&DIP). b Location of the five vibration motors (VM), the inertial
measurement unit (IMU) and the plug of the glove.
fingertips and thereby inform the user during grasping. A
microcontroller board (ARDUINO MEGA 2560) read the
sensor values and communicated with a host computer.
To exchange sensor and feedback data, a USB/COM
port interface connects the Universal Serial Bus (USB)
of the host with the serial port (COM) of the microcontroller.
B. Requirements and redesign
The objectives of the re-design of the preliminary glove resulting
in the new DAGLOVE are:
•Sensing complex motions
•Improvement of position/orientation sensing
•Extended motion possibilities and improved ergonomics
(e.g., through wireless transmission)
For this purpose, the new DAGLOVE uses ten flex
sensors, the visual motion tracking is replaced by an
inertial measurement unit (IMU), the position of the
vibration motors is optimized, and the electronics as well
as the software are completely redesigned, as presented in the following.
1) Sensing complex motion: Ten 2.2 inch flex sensors
from SPECTRA SYMBOL are implemented as depicted in
Figure 3. This facilitates the detection of more complex
motion tasks and the acquisition of different joint motions
separately. Since flexion and extension of the distal and
proximal interphalangeal joints are coupled in the human
hand, it is sufficient to use one flex sensor placed
on the upper part of the finger. The second flex sensor
is placed at the lower part of the finger to measure the
flexion and extension of the metacarpophalangeal joint. At
the thumb, flexion of the interphalangeal joint is measured by
Fig. 4. Exemplary motion sequences exploiting the degrees of freedom of
the glove: a Starting position. b Flexion of the PIP and DIP joints of the
index and middle finger. c Flexion of the PIP, DIP and MCP joints of the
index, middle, ring and pinky finger. d Starting position. e Flexion of the
CMC joint of the thumb. f Flexion of the IP and MCP I joints of the thumb.
a single flex sensor. A second flex sensor detects the coupled
flexion and opposition movement of the metacarpophalangeal
I and carpometacarpal joints, exploiting their dependency
discussed in the literature.
The additional degrees of freedom are acquired to meet the
requirement of facilitating more complex movements given
above. Examples of hand motions that exploit these additional
degrees of freedom are illustrated in Figure 4. A critical
requirement is that the motion of each joint must be sensed
in isolation from the movement of the other joints. This is
realized by placing the flex sensors as shown in Figure 3a.
To fix the flex sensors while allowing them to move along
the finger axis but not orthogonal to it, they are slid into small
pockets that are sewn onto the top of the glove. The sensors
are fixed at the side of their electronic connection pins and
their motion is guided by the pocket. The guidance and
fixation of the sensors with these elastic pockets prevent
the sensors from being damaged during different finger
movements.
2) Position- and orientation-sensing: The DAGLOVE
further includes an IMU with nine degrees of freedom to
acquire hand motions and orientation. For this purpose, an
INVENSENSE MPU-9150 chip, which includes a 3-axis
accelerometer, a 3-axis gyroscope, and a 3-axis digital
compass (magnetometer), is implemented. To break out all
required pins of the MPU-9150 to standard 0.1 inch spaced
headers, a breakout version of this chip from SPARKFUN
(SPARKFUN 9 DEGREES OF FREEDOM BREAKOUT -
MPU-9150) is used. The board has an I2C interface and is
centered on the back of the hand as shown in Figure 3b.
3) System integration and ergonomic aspects: The system
is designed as a standalone, easy-to-use, untethered, and
integrated solution which consists of two main parts, i.e., the
sensory glove and the electronic box. The glove is connected
Fig. 5. Exploded view of the electronic box with its components:
1 Connector for the flat ribbon cable. 2 Elastic band. 3 ON/OFF button.
4 Power LED. 5 Exchangeable battery pack. 6 Circuit board including
the ARDUINO MICRO, BLUETOOTH transmitter, analog multiplexer (MUX)
and voltage regulation. 7 Cover for the battery pack. 8 Micro-USB port.
Dimensions: length 130 mm, width 120 mm, height 50 mm.
to the electronic box with a flexible flat ribbon cable to
avoid disturbing the user. The connector of the electronic
box is shown in Figure 5. If the cable is unplugged, the
glove can easily be put on or taken off without any mechanical
disturbances. The connector of the glove is shown in Figure
3b. The electronic box includes the circuit board and
an exchangeable, rechargeable battery pack. The box
with its components is illustrated in Figure 5. The circuit
board and its functionality are presented in Section III. The
electronic box can be connected to an external PC over a
BLUETOOTH interface, and together with its battery pack,
the sensory glove allows easy-to-use wireless operation.
Due to its special shape and low weight, the 3D-printed box
can be attached to the upper arm of the user with elastic
bands. The compact design of the different components and
the wireless connection allow high flexibility to use this
low-cost sensory glove in different operation and motion
scenarios.
Further improvements are made regarding the fixation of
the vibration motors to the glove. They are sewn onto the
glove at the tip of the fingertips, as close as possible to the
fingernails. These new positions provide a more compact
fixation, give the user a better feeling in the fingertips, and
do not disturb the user while grasping objects.
III. DATA ACQUISITION
In this section, the implemented electronic hardware as
well as the ﬁrmware that controls and monitors the whole
glove system are presented.
A. Glove electronic implementation
The whole system is controlled and monitored by an
ARDUINO MICRO board, which is chosen for its small size
and overall good compatibility. On the one hand, it reads all
the data from the sensors and streams them to the host computer,
as illustrated in Figure 6. On the other hand, it can receive
a command line from that device which defines the intensity
of the vibrotactile feedback for each finger. The connection to
the microcontroller can either be implemented over its own
Fig. 6. Simplified block diagram of the electronic implementation.
Micro-USB port or via BLUETOOTH. The BLUETOOTH
connection is provided by an additional transceiver module
(AUKRU HC-06) connected to the UART serial interface
of the ARDUINO MICRO. The IMU is connected to the
microcontroller through a serial I2C interface. The intensity
of the vibrotactile feedback depends on the frequency and
amplitude of the corresponding vibration motor. The frequency
and the amplitude are directly proportional
to the motor input voltage and can be varied in ranges
of 50 Hz to 220 Hz and 1.2 N to 2.4 N, respectively. For this
purpose, the input voltage is regulated by an amplified pulse
width modulation (PWM) signal of the microcontroller
board. The microcontroller provides 20 digital inputs and
outputs, of which 7 can be used as PWM outputs and
12 as analog inputs. The sensory glove needs five PWM
outputs (vibration motors), four serial-port pins (IMU and
BLUETOOTH module) and 11 analog inputs (monitoring the
battery level and 10 flex sensors). As three of the analog
inputs have to be used as PWM outputs, the remaining nine
analog pins are insufficient to read out all the flex sensors.
Therefore, a 16-channel analog multiplexer
(MUX) (CD74HC4067) is used in a breakout-board version
from SPARKFUN. This multiplexer allows reading 16 analog
channels using a single analog input (and five additional
digital outputs) on the ARDUINO board. The exchangeable
battery pack with 2400 mAh provides enough power for
constant operation of at least five hours. Figure 6 gives a
block diagram of the system, its components, and interfaces.
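Given the stated proportionality of vibration frequency and amplitude to the motor input voltage, a host-side setpoint computation can be sketched as below. This is a minimal illustration, not the paper's firmware: the function name, the normalized intensity input, and the 8-bit PWM resolution are assumptions; linearity over the 50-220 Hz and 1.2-2.4 N ranges is taken from the text.

```python
def vibration_setpoint(intensity: float) -> dict:
    """Map a normalized feedback intensity in [0, 1] to an 8-bit PWM duty
    value and the implied vibration frequency/amplitude, assuming the
    linear voltage relation described in the text (50-220 Hz, 1.2-2.4 N)."""
    s = min(max(intensity, 0.0), 1.0)  # clamp to the valid range
    return {
        "pwm_duty": round(s * 255),    # 8-bit value for an analogWrite-style call
        "frequency_hz": 50.0 + s * (220.0 - 50.0),
        "amplitude_n": 1.2 + s * (2.4 - 1.2),
    }
```

On the microcontroller side, the resulting duty value would drive one of the five PWM outputs amplifying the motor input voltage.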
B. Glove software interfaces
The software is written in the ARDUINO IDE using
additional libraries in C++. The measured values from
the IMU are read using modified versions of open-source
libraries published on GITHUB. Due to the analog
multiplexer (MUX), it is possible to read the values from
each flex sensor sequentially with one analog input only.
The microcontroller then checks for feedback intensities
coming from an external device in the form of a string
command, e.g., from tactile sensors at the operated robot's
finger tips. The command line can be detected on any
Fig. 7. Example of raw data recorded for both MCP and IP joints with
the flex sensors for various hand poses (each color corresponds to a different
finger). The flexion of the different fingers is visible from the data.
serial communication port, either on the Micro-USB port
or over the BLUETOOTH connection. Subsequently, the
values are extracted from this command line and the
vibration motors are controlled by a PWM output that
is proportional to these values. In addition, the
software monitors the current battery level. Finally, it writes
a string command with all the values of the flex sensors,
the IMU, the current battery level, and some control data
(current time stamp and feedback intensities). This string is
either sent over the Micro-USB port or the BLUETOOTH
connection with a baud rate of 115200 bit/s. The main
software loop runs at a frequency of 25 Hz. The glove
software interface, as well as detailed information about the
hardware implementation, are freely available at https:
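A host application receiving this stream has to split each line back into its fields. The sketch below is an assumed example: the paper lists the transmitted contents (time stamp, ten flex values, nine IMU values, battery level, five feedback intensities) but not the exact field layout, so the comma-separated format and the function name are hypothetical.

```python
def parse_glove_line(line: str) -> dict:
    """Parse one status line streamed by the glove at 25 Hz.

    Assumed layout (the paper lists the contents but not the format):
    timestamp, 10 flex values, 9 IMU values, battery level,
    5 feedback intensities -- 26 comma-separated numbers."""
    fields = [float(x) for x in line.strip().split(",")]
    if len(fields) != 26:
        raise ValueError(f"expected 26 fields, got {len(fields)}")
    return {
        "timestamp": fields[0],
        "flex": fields[1:11],       # ten flex sensor readings
        "imu": fields[11:20],       # accelerometer, gyroscope, magnetometer
        "battery": fields[20],
        "feedback": fields[21:26],  # echoed vibration intensities
    }
```

In practice, such a parser would be fed from a serial port opened at 115200 bit/s.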
IV. EXPERIMENTAL EVALUATION
As a ﬁrst proof-of-concept of the design and functionality
of the DAGLOVE, its capabilities and the performance of
its sensors are demonstrated. Moreover, a ﬁrst application
in teleoperation that beneﬁts from the separation of DIP/PIP
and MCP joints is presented.
A. Sensor capabilities
In the following sections, the functionality of the sensors
is presented by collecting the measured values from the
flex sensors, as well as the IMU, for exemplary finger and
hand motions.
1) Finger Sensors: This experiment demonstrates that the
placement of the sensors is appropriate to collect valuable
information about the flexion of the single phalanges. To
reduce the measurement noise of the flex sensors, a simple
moving average filter x̂_t = (1/n) Σ_{i=0}^{n-1} x_{t-i} with a window of n = 10
is applied. More advanced filtering techniques including
Fig. 8. Example of raw values recorded from the IMU gyroscope for
various arm motions and poses.
the widely used Butterworth filters in human grasping will
be investigated in future work. Figure 7 shows an example of
the filtered data collected from each individual flex sensor
for the MCP joints. The flexion of each finger is
visible from the data. Flexion of other joints and fingers is
observed for some DIP/PIP joint motions due to mechanical
couplings in the human hand.
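The moving average above can be written as a streaming filter; a minimal sketch follows. During the first n-1 samples this version averages over the samples seen so far (a warm-up choice not specified in the text), and the function name is illustrative.

```python
from collections import deque

def moving_average_stream(samples, n=10):
    """Causal moving-average filter x_hat_t = (1/n) * sum_{i=0}^{n-1} x_{t-i},
    as used to smooth the raw flex sensor readings (window n = 10)."""
    window = deque(maxlen=n)   # drops the oldest sample automatically
    out = []
    for x in samples:
        window.append(x)
        out.append(sum(window) / len(window))  # shorter average during warm-up
    return out
```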
2) IMU: To demonstrate the detection of the correct
orientation by the IMU, an example of the data collected
during various arm motions and poses is visualized in
Figure 8. As seen in the motion samples of Figure 8, the
shape of the angular velocity curves behaves as expected
for each axis. To validate the precision and drift effects of
the gyroscope, accelerometer, and magnetometer, further
experiments will be performed in the future.
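Raw gyroscope rates drift when integrated, so orientation estimates typically fuse them with the accelerometer. The paper does not specify a fusion method; the complementary filter below is only an illustrative baseline for processing MPU-9150 data, with all names chosen here.

```python
import math

def accel_pitch(ax, ay, az):
    """Pitch angle (rad) inferred from the gravity direction
    measured by the accelerometer (valid when not accelerating)."""
    return math.atan2(-ax, math.hypot(ay, az))

def complementary_filter(pitch_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """One update step: trust the integrated gyroscope rate on short
    time scales and the accelerometer angle on long time scales."""
    return alpha * (pitch_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

At the glove's 25 Hz loop rate, dt would be 0.04 s per update.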
Fig. 10. The humanoid robot iCub used in the experiments.
B. Teleoperation of the iCub humanoid robot
To investigate a human-robot interaction application, direct
teleoperation of a robotic hand through the DAGLOVE
is considered. The goal is to demonstrate the benefits of
the high number of flex sensors for grasping tasks. Therefore,
the humanoid robot iCub shown in Figure 10, which
possesses 9 DoF for each hand, is used as a hardware platform.
In the experiments, the flex sensor readings are directly
mapped to desired joint angles of the iCub. Let q denote
a joint angle with an operational range of [q_min, q_max]. A flex
sensor reading s is normalized and mapped to a desired joint
angle with q = q_min + (q_max - q_min) * (s - s_min)/(s_max - s_min).
The operational ranges of the iCub finger joints and the
glove were obtained in a pre-processing phase. Note that
the iCub possesses three DoF in the thumb (CMC, MCP I
and IP flexion) while the DAGLOVE only has two (IP and
coupled CMC/MCP I flexion). Thus, for the CMC and MCP I
joints the same coupled CMC/MCP I flexion signal was used.
Moreover, the ring and pinky ﬁngers of the iCub are coupled
Fig. 9. Various grasping poses with the DAGLOVE (left) and the
teleoperated iCub hand (right). The different grasp types make use of
different correlations between the finger joints, and as such benefit from
the use of two separate flex sensors for each finger.
and jointly controlled by a single DoF while the sensory
glove measures four separate DoF. Here we used the average
of these four readings for control.
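The normalization formula and the ring/pinky averaging above can be sketched as follows. The clamping of out-of-range readings is an added safeguard not mentioned in the text, and the function names are chosen here for illustration.

```python
def map_to_joint(s, s_min, s_max, q_min, q_max):
    """Linearly map a flex sensor reading s onto a joint angle in
    [q_min, q_max]: q = q_min + (q_max - q_min) * (s - s_min)/(s_max - s_min).
    Readings outside the calibrated range are clamped (an added safeguard)."""
    u = (s - s_min) / (s_max - s_min)
    u = min(max(u, 0.0), 1.0)
    return q_min + (q_max - q_min) * u

def ring_pinky_command(readings, s_min, s_max, q_min, q_max):
    """The iCub couples ring and pinky into a single DoF, so the four
    separate glove readings (ring/pinky MCP and PIP) are averaged first."""
    avg = sum(readings) / len(readings)
    return map_to_joint(avg, s_min, s_max, q_min, q_max)
```

The calibrated ranges [s_min, s_max] and [q_min, q_max] correspond to the pre-processing phase described above.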
Figure 9 shows three grasp poses achieved using the
DAGLOVE to teleoperate the hand of the iCub.
Performing all of these grasp types would not be possible
using a single, fixed correlation between the finger joints. For example, the grasp
in Figure 9b requires both MCP and IP joints to flex, while
the grasp in Figure 9c only makes use of the IP joints.
Further, the additional thumb sensor enables control
of the opposition of the thumb, which is crucial for stable
grasps of objects of different sizes and shapes.
V. CONCLUSION
This paper presents a new sensor glove: the DAGLOVE.
The DAGLOVE is designed for human-robot interaction
research and therefore combines 2-DoF kinematic sensing
for all fingers, vibrotactile feedback and hand position/orientation
acquisition. These aspects are integrated in
an easy-to-use and low-cost system with a wireless connection
to the host computer. The key components comprise ten flex
sensors, which separately measure proximal and distal
finger joint motions as well as the flexion of the thumb
and the thumb saddle joint. An inertial measurement unit
facilitates detecting hand position and orientation. Finally,
coin vibration motors are attached to the fingertips, providing
vibrotactile feedback. Despite these improvements, the
overall material costs of the DAGLOVE are less than 300 €.
As a proof-of-concept, preliminary experiments are performed
to qualitatively examine the features of the DAGLOVE
and their use. First, the separate acquisition of
proximal and distal finger motions as well as the tracking
of hand movements with the integrated IMU are studied.
Subsequently, the potential of the extended low-cost glove for
human-robot interaction is presented in a teleoperation scenario
with the iCub humanoid robot. Although a quantitative
assessment of sensor data quality is missing, these first experiments
demonstrate that the separate sensing of proximal and
distal finger joint motion can enable teleoperated grasps of
increased complexity. Moreover, the additional detection of
thumb saddle joint motions enables grasping of flat and soft
objects without deforming them.
Future work will focus on improving the electronics and
software to increase the operating frequency. The quality of
flex sensor and IMU data should be quantitatively assessed
and filter implementations for these data should be tested.
A further potential extension is the use of force feedback
instead of vibrotactile feedback.
REFERENCES
B. D. Argall, S. Chernova, M. Veloso, and B. Browning, “A survey
of robot learning from demonstration,” Robotics and Autonomous
Systems, vol. 57, no. 5, pp. 469–483, 2009.
 M. Fischer, P. van der Smagt, and G. Hirzinger, “Learning techniques
in a dataglove based telemanipulation system for the dlr hand,” in
Robotics and Automation, 1998. Proceedings. 1998 IEEE International
Conference on, vol. 2. IEEE, 1998, pp. 1603–1608.
 L. Fritsche, F. Unverzagt, J. Peters, and R. Calandra, “First-person
tele-operation of a humanoid robot,” in 15th IEEE-RAS International
Conference on Humanoid Robots, Nov. 2015, pp. 997–1002.
 G. Salvietti, I. Hussain, D. Cioncoloni, S. Taddei, S. Rossi, and
D. Prattichizzo, “Compensating hand function in chronic stroke pa-
tients through the robotic sixth ﬁnger.” IEEE Transactions on Neural
Systems and Rehabilitation Engineering, 2016.
 E. A. Caspar, A. de Beir, P. A. Magalhaes Da Saldanha da Gama,
F. Yernaux, A. Cleeremans, and B. Vanderborght, “New frontiers in
the rubber hand experiment: when a robotic hand becomes one’s own,”
Behavior Research Methods, vol. 6, pp. 1 – 12, 2014.
 C.-P. Tung and A. C. Kak, “Automatic learning of assembly tasks
using a dataglove system,” in Intelligent Robots and Systems 95.’Hu-
man Robot Interaction and Cooperative Robots’, Proceedings. 1995
IEEE/RSJ International Conference on, vol. 1. IEEE, 1995, pp. 1–8.
 R. Dillmann, O. Rogalla, M. Ehrenmann, R. Zollner, and M. Borde-
goni, “Learning robot behaviour and skills based on human demon-
stration and advice: the machine learning paradigm,” in ROBOTICS
RESEARCH-INTERNATIONAL SYMPOSIUM-, vol. 9, 2000, pp. 229–
K. Gräve, J. Stückler, and S. Behnke, “Learning motion skills from
expert demonstrations and own experience using Gaussian process
regression,” in Robotics (ISR), 2010 41st International Symposium on
and 2010 6th German Conference on Robotics (ROBOTIK). VDE,
2010, pp. 1–8.
 M. Henriques, “Planning and control of anthropomorphic robotic
hands using synergies.”
 E. Rueckert, R. Lioutikov, R. Calandra, M. Schmidt, P. Beckerle,
and J. Peters, “Low-cost sensor glove with force feedback
for learning from demonstrations using probabilistic trajectory
representations,” in ICRA 2015 Workshop ”Tactile & force sensing
for autonomous, compliant, intelligent robots”, 2015. [Online].
 M. D’Alonzo and C. Cipriani, “Vibrotactile sensory substitution elicits
feeling of ownership of an alien hand,” PloS one, vol. 7, no. 11, p.
 I. Hussain, G. Salvietti, L. Meli, and D. Prattichizzo, “Using the
robotic sixth ﬁnger and vibrotactile feedback for grasp compensation
in chronic stroke patients,” in IEEE International Conference on
Rehabilitation Robotics, 2015.
 A. De Beir, E. A. Caspar, F. Yernaux, P. A. Magalhaes Da Saldanha da
Gama, B. Vanderborght, and A. Cleermans, “Developing new frontiers
in the rubber hand illusion: Design of an open source robotic hand to
better understand prosthetics,” in IEEE International Symposium on
Robot and Human Interactive Communication, 2014.
 A. M. Schmidts, L. Dongheui, and A. Peer, “Imitation learning of
human grasping skills from motion and force data,” in IEEE/RSJ
International Conference on Intelligent Robots and Systems, 2011.
 CyberGlove Systems LLC. (2016) Cyberglove systems. [Online].
 R. Gentner and J. Classen, “Development and evaluation of a low-
cost sensor glove for assessment of human ﬁnger movements in
neurophysiological settings,” Journal of Neuroscience Methods, vol.
178, pp. 138 – 147, 2008.
Y. Han, “A low-cost visual motion data glove as an input device
to interpret human hand gestures,” IEEE Transactions on Consumer
Electronics, vol. 56, no. 2, pp. 501–509, 2010.
 K. Li, I.-M. Chen, S. H. Yeo, and C. K. Lim, “Development of ﬁnger-
motion capturing device based on optical linear encoder.” 2011.
 StretchSense. (2016) Stretchsense - smart, soft, stretchable sensors.
[Online]. Available: http://stretchsense.com/
 L. K. Simone, N. Sundarrajan, X. Luo, Y. Jia, and D. G. Kamper, “A
low cost instrumented glove for extended monitoring and functional
hand assessment,” Journal of Neuroscience Methods, vol. 160, pp. 335
– 348, 2007.
H. Lohse and J.-L. Tirpitz. (2014) DJ Handschuh. [Online].
Available: http://joanna.iwr.uni-heidelberg.de/projects/2014SS
A. P. L. Bó, M. Hayashibe, and P. Poignet, “Joint angle estimation
in rehabilitation with inertial sensors and its integration with Kinect,”
IEEE International Conference on Engineering in Medicine and
Biology.
Q. Fu and M. Santello, “Tracking whole hand kinematics using
extended Kalman filter,” IEEE International Conference on Engineering
in Medicine and Biology, 2010.
 P. Christian. (2014) Homebrew vr data glove with haptic
feedback. [Online]. Available: https://www.youtube.com/watch?v=
 D. Ruth and M. Williams. (2015) Hands omni haptic glove lets
gamers feel virtual objects. [Online]. Available: http://news.rice.edu/
2015/04/22/gamers-feel- the-glove- from-rice- engineers- 2/
M. Schünke, E. Schulte, and U. Schumacher, Prometheus - Allgemeine
Anatomie und Bewegungssystem: LernAtlas der Anatomie, 2nd ed.
Stuttgart: Thieme, 2007.
J. Leijnse, P. Quesada, and C. Spoor, “Kinematic evaluation of
the finger’s interphalangeal joints coupling mechanism – variability,
flexion-extension differences, triggers, locking swanneck deformities,
anthropometric correlations,” Journal of Biomechanics, vol. 43,
no. 12, pp. 2381–2393, 2010.
J. Leijnse and C. Spoor, “Reverse engineering finger extensor
apparatus morphology from measured coupled interphalangeal
joint angle trajectories – a generic 2D kinematic model,” Journal of
Biomechanics, vol. 45, no. 3, pp. 569–578, 2012.
Z.-M. Li and J. Tang, “Coordination of thumb joints during
opposition,” Journal of Biomechanics, vol. 40, no. 3, pp. 502–510,
2007.
 G. Metta, G. Sandini, D. Vernon, L. Natale, and F. Nori, “The icub
humanoid robot: an open platform for research in embodied cognition,”
in Proceedings of the 8th workshop on performance metrics for
intelligent systems. ACM, 2008, pp. 50–56.
 D. Prattichizzo, F. Chinello, C. Pacchierotti, and M. Malvezzi, “To-
wards wearability in ﬁngertip haptics: A 3-dof wearable device for
cutaneous force feedback,” IEEE Transactions on Haptics, vol. 6,
no. 4, pp. 506–516, 2013.