A Low-cost Sensor Glove with Vibrotactile Feedback and Multiple
Finger Joint and Hand Motion Sensing for Human-Robot Interaction
P. Weber1, E. Rueckert1,2, R. Calandra1,2, J. Peters2,3, and P. Beckerle1,4,5
Abstract— Sensor gloves are widely adopted input devices
for several kinds of human-robot interaction applications.
Existing glove concepts differ in features and design, but
include limitations concerning the captured finger kinematics,
position/orientation sensing, wireless operation, and especially
economical issues. This paper presents the DAGLOVE which
addresses the mentioned limitations with a low-cost design
(ca. 300 €). This new sensor glove allows separate measure-
ments of proximal and distal finger joint motions as well
as position/orientation detection with an inertial measurement
unit (IMU). Those sensors and tactile feedback induced by
coin vibration motors at the fingertips are integrated within
a wireless, easy-to-use, and open-source system. The design
and implementation of hardware and software as well as
proof-of-concept experiments are presented. An experimental
evaluation of the sensing capabilities shows that proximal and
distal finger motions can be acquired separately and that hand
position/orientation can be tracked. Further, teleoperation of
the iCub humanoid robot is investigated as an exemplary
application to highlight the potential of the extended low-cost
glove in human-robot interaction.
I. INTRODUCTION
Sensor gloves have various uses in robotics and human-
robot interaction such as learning manipulation tasks from
human demonstrations [1], [2], [3], rehabilitation [4], or
investigations of psychological issues [5]. In manipulation,
the transformation from human to robot motions is an
important issue since these usually do not match perfectly,
e.g., due to kinematic differences. A possible solution for this
issue is active learning which relies on mapping from human
to robot kinematics [6], [7], [8]. Alternatively, the operator
directly controls the robot hand through an instantaneous
mapping from sensor glove measurements to control actions
in passive approaches [2], [9]. Considering the latter class of
techniques, human operators can adapt and compensate for
limitations of the robot and kinematic mapping errors.
The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007–2013) under grant agreement #600716 (CoDyCo).
1 Technische Universität Darmstadt, Darmstadt, Germany, paul.weber@stud.tu-darmstadt.de, beckerle@ims.tu-darmstadt.de
2 Intelligent Autonomous Systems Lab, {rueckert, calandra}@ias.tu-darmstadt.de
3 Robot Learning Group, Max-Planck Institute for Intelligent Systems, Tuebingen, Germany, mail@jan-peters.net
4 Institute for Mechatronic Systems in Mechanical Engineering, beckerle@ims.tu-darmstadt.de
5 Member, IEEE

Fig. 1. The low-cost sensor glove can be used for teleoperation of complex robots with five-finger hands such as the iCub humanoid (shown in the picture). If the robot hand is equipped with sensors, vibrotactile feedback can be provided to the operator through motors at the fingertips.

Providing additional degrees of freedom should lead to better use of complex robots such as the humanoid robot
iCub [3], [10] (see Figure 1). Furthermore, vibrotactile
feedback could improve human-robot interaction by giving
the user a feeling of ownership [11]. Yet, potential benefits
depend on the application: for instance, a combined degree
of freedom per finger might be sufficient in certain reha-
bilitation robots [12], while additional ones could improve
exploring body schema integration [5], [13] and can be
crucial in hand exoskeletons [14].
In contrast to commercial and rather high-priced data
gloves such as the Cyberglove [15], many low-cost gloves
do not provide more than one degree of freedom (DoF) per
finger, e.g., [10], [13], [16]. Besides resistive sensors [10],
[13], marker-based motion capturing [17], optical linear
encoders [18], or stretch sensors [19] are used to track
finger motions. While the glove from [18] provides ad-
ditional DoFs, it lacks other important features such as
hand position/orientation acquisition. The glove introduced
in [20] provides a wireless interface to the host but only
five DoFs. Although combining more DoFs and hand po-
sition/orientation sensing, the DJ Handschuh [21] is lim-
ited to a single finger. Alternative approaches to motion
acquisition rely on external tracking of the human hand by
fusing visual and gyroscope data [22] or using marker-based
measurement [23]. Most gloves do not provide vibrotactile feedback. One system that implements feedback
and combines it with the other features discussed above, is
the VR Data Glove [24]. This glove aims at virtual reality
applications and only exhibits a single degree of freedom per
finger. An alternative feedback implementation is found in
the Hands Omni Glove [25] which generates contact forces
by inflatable bladders but uses external motion capturing.
As no low-cost glove combines multi-DoF finger tracking,
vibrotactile feedback, hand position/orientation detection
and wireless interfacing, this paper proposes the DAGLOVE.
The DAGLOVE is based on the open-source and low-cost sensor glove described in [10], [13]. It extends the
existing concept to provide the mentioned features and
makes use of electronic components that are affordable
and that simplify the development. The hardware design of the new DAGLOVE is presented in Section II, following a brief description of the preliminary glove, before data acquisition and software design are given in Section III.
Beyond presenting the improved glove design, the paper qualitatively demonstrates the basic functionality of finger and hand motion measurements in Section IV. To highlight the benefits of extended kinematic sensing with the glove, it is shown how the additional degrees of freedom can be exploited in teleoperation of the iCub humanoid robot. The paper is concluded in Section V.
II. GLOVE DESIGN
In the following sections, the design of the preliminary glove and the redesigned DAGLOVE are described. Key features of the DAGLOVE are the consideration of interphalangeal (IP) and metacarpophalangeal (MCP) joint flexion
of the index, middle, ring, and pinky fingers as well as
metacarpophalangeal I (MCP I) and carpometacarpal (CMC)
joint flexion of the thumb (see Figure 2).
A. Preliminary glove
The preliminary glove used a single 4.5 inch flex sensor
per finger to measure flexion and extension [10], [13].
The orientation of the hand and its position in space
were acquired by a marker-based motion tracking system
in [10]. For this purpose, reflecting markers were fixed on
the back of the glove and captured by multiple infrared
cameras installed in the laboratory. Coin vibration motors
(Solarbotics VPM2) were attached to the fingertips to
provide vibrotactile feedback that can be controlled based
on tactile or pressure sensors of the teleoperated robotic
hand. In [10], the feedback was implemented to be
proportional to the contact forces occurring at the robotic fingertips and thereby inform the user during grasping. A microcontroller board (ARDUINO MEGA 2560) read the sensor values and communicated with a host computer. To exchange sensor and feedback data, a USB/COM port interface connects the Universal Serial Bus (USB) of the host with the serial port (COM) of the microcontroller.

Fig. 2. Overview of the human hand articulations [26].

Fig. 3. Component placement: (a) Placement of the ten flex sensors. Sensor denominations are X.Y, where X stands for the finger number (1=thumb, 2=index, 3=middle, 4=ring, 5=pinky) and Y for the sensor number on each finger (thumb: 1=CMC & MCP I, 2=IP; other fingers: 1=MCP, 2=PIP & DIP). (b) Location of the five vibration motors (VM), the inertial measurement unit (IMU), and the plug of the glove.
B. Requirements and redesign
The objectives of the redesign of the preliminary glove, resulting in the new DAGLOVE, are:
- sensing complex motions,
- improvement of position/orientation sensing,
- system integration,
- extended motion possibilities and improved ergonomics (e.g., through wireless transmission).
For this purpose, the new DAGLOVE uses ten flex sensors, the visual motion tracking is replaced by an inertial measurement unit (IMU), the position of the vibration motors is optimized, and the electronics as well as the software are completely redesigned, as presented subsequently.
1) Sensing complex motion: Ten 2.2 inch flex sensors from SPECTRA SYMBOL are implemented as depicted in
Figure 3. This facilitates the detection of more complex
motion tasks and the acquisition of different joint motions
separately. Since flexion and extension of the distal and
proximal interphalangeal joints are coupled in the human
hand [27], [28], it is sufficient to use one flex sensor placed
on the upper part of the finger. The second flex sensor
is placed at the lower part of the finger to measure the
flexion and extension of the metacarpophalangeal joint. At
the thumb, flexion of the interphalangeal joint is measured by a single flex sensor. A second flex sensor detects the coupled flexion and opposition movement of the metacarpophalangeal I and carpometacarpal joints, exploiting their dependency discussed in [29].

Fig. 4. Exemplary motion sequences exploiting the degrees of freedom of the glove: (a) Starting position. (b) Flexion of the PIP and DIP joints of the index and middle finger. (c) Flexion of the PIP, DIP and MCP joints of the index, middle, ring and pinky finger. (d) Starting position. (e) Flexion of the CMC joint of the thumb. (f) Flexion of the IP and MCP I joints of the thumb.
The additional degrees of freedom are acquired to meet the
requirement of facilitating more complex movements given
above. Examples of hand motions that present movements
which exploit the additional degrees of freedom are illus-
trated in Figure 4. A critical requirement is that the motion
of each joint must be sensed in isolation from the movement of
the other joints. This is realized by placing the flex sensors
as shown in Figure 3a.
To fix the flex sensors and allow them to move along the finger axis but not orthogonally to it, they slide into small pockets that are sewn onto the top of the glove. The sensors
are fixed at the side of their electronic connection pins and
their motion is guided by the pocket. The guidance and
fixation of the sensors with these elastic pockets prevents
the sensors from being damaged during different finger
movements.
2) Position and orientation sensing: The DAGLOVE further includes an IMU with nine degrees of freedom to acquire hand motions and orientation. For this purpose, an INVENSENSE MPU-9150 chip, which includes a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis digital compass (magnetometer), is implemented. To break out all required pins of the MPU-9150 to standard 0.1 inch spaced headers, a breakout version of this chip from SPARKFUN (SPARKFUN 9 DEGREES OF FREEDOM BREAKOUT - MPU-9150) is used. The board has an I2C interface and is centered on the back of the hand as shown in Figure 3b.
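As an illustration of how the raw inertial data could be obtained, the following Arduino-style sketch reads the accelerometer and gyroscope registers of the MPU-9150 over I2C with the Wire library. It is a minimal sketch only: the device address 0x68 and the register addresses follow the common MPU-9150 register map, while the actual DAGLOVE firmware relies on the modified SPARKFUN libraries mentioned in Section III.

```cpp
// Minimal sketch: read raw MPU-9150 accelerometer/gyroscope data over I2C.
// Device address and registers follow the MPU-9150 register map; the DAGLOVE
// firmware itself uses modified SPARKFUN libraries (see Section III-B).
#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;        // AD0 pin low -> address 0x68

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);   // wake the device up
  Wire.write(0x6B);                   // PWR_MGMT_1 register
  Wire.write(0);                      // clear the sleep bit
  Wire.endTransmission(true);
}

void loop() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);                   // ACCEL_XOUT_H: start of the data block
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)14, (uint8_t)true);

  int16_t ax = (Wire.read() << 8) | Wire.read();
  int16_t ay = (Wire.read() << 8) | Wire.read();
  int16_t az = (Wire.read() << 8) | Wire.read();
  int16_t temp = (Wire.read() << 8) | Wire.read();
  (void)temp;                         // temperature not used here
  int16_t gx = (Wire.read() << 8) | Wire.read();
  int16_t gy = (Wire.read() << 8) | Wire.read();
  int16_t gz = (Wire.read() << 8) | Wire.read();

  Serial.print(ax); Serial.print(' '); Serial.print(ay); Serial.print(' ');
  Serial.print(az); Serial.print(' '); Serial.print(gx); Serial.print(' ');
  Serial.print(gy); Serial.print(' '); Serial.println(gz);
  delay(40);                          // roughly the 25 Hz loop rate of the glove
}
```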
3) System integration and ergonomic aspects: The system
is designed as a standalone, easy-to-use, untethered, and
integrated solution which includes two main parts, i.e.,
sensory glove and electronic box.

Fig. 5. Exploded view of the electronic box with its components. (1) Connector for the flat ribbon cable. (2) Elastic band. (3) ON/OFF button. (4) Power LED. (5) Exchangeable battery pack. (6) Circuit board including the ARDUINO MICRO, BLUETOOTH transmitter, analog multiplexer (MUX), and voltage regulation. (7) Cover for the battery pack. (8) Micro-USB port. Dimensions: length 130 mm, width 120 mm, height 50 mm.

The glove is connected to the electronic box with a flexible flat ribbon cable to
avoid disturbing the user. The connector of the electronic
box is shown in Figure 5. If the cable is unplugged, the
glove can easily be put on or off without any mechanical
disturbances. The connector of the glove is shown in Fig-
ure 3b. The electronic box includes the circuit board and
an exchangeable and rechargeable battery pack. The box
with its components is illustrated in Figure 5. The circuit
board and its functionality are presented in Section III. The
electronic box can be connected to an external PC over a BLUETOOTH interface and, together with its battery pack, the sensory glove allows easy-to-use wireless operation.
Due to its special shape and low weight, the 3D-printed box
can be attached to the upper arm of the user with elastic
bands. The compact design of the different components and
the wireless connection allow a high flexibility to use this
low-cost sensory glove in different operation and motion
scenarios.
Further improvements are made regarding the fixation of the vibration motors to the glove. They are sewn onto the glove at the tips of the fingers, as close as possible to the fingernails. These new positions provide a more compact fixation, give the user a better feeling in the fingertips, and do not disturb the user while grasping objects.
III. DATA ACQUISITION
In this section, the implemented electronic hardware as
well as the firmware that controls and monitors the whole
glove system are presented.
A. Glove electronic implementation
The whole system is controlled and monitored by an ARDUINO MICRO board, which is chosen for its small size and overall good compatibility. On the one hand, it reads all the data from the sensors and streams it to the host computer, as illustrated in Figure 6. On the other hand, it can receive a command line from that device which defines the intensity of the vibrotactile feedback of each finger. The connection to
the microcontroller can either be implemented over its own Micro-USB port or via BLUETOOTH.

Fig. 6. Simplified block diagram of the electronic implementation.

The BLUETOOTH connection is provided by an additional transceiver module (AUKRU HC-06) connected to the UART serial interface of the ARDUINO MICRO. The IMU is connected to the
microcontroller through a serial I2C-interface. The intensity
of the vibrotactile feedback depends on the frequency and
amplitude of the corresponding vibration motor. The frequency, as well as the amplitude, is directly proportional to the motor input voltage and can be varied in ranges of 50 Hz to 220 Hz and 1.2 N to 2.4 N, respectively. For this
purpose, the input voltage is regulated by an amplified pulse
width modulation (PWM) signal of the microcontroller-
board. The microcontroller includes 20 digital inputs and
outputs of which 7 can be used as PWM outputs and
12 as analog inputs. The sensory glove needs five PWM
outputs (vibration motors), four serial-port pins (IMU and
BLUETOOTH module) and 11 analog inputs (monitoring the
battery-level and 10 flex sensors). As three of the analog
inputs have to be used as PWM outputs, the remaining nine
analog pins are insufficient to read out all the flex sensors.
That is the reason why a 16-channel analog multiplexer (MUX) (CD74HC4067) is used in a breakout-board version from SPARKFUN. This multiplexer allows reading 16 analog channels using a single analog input (and five additional digital outputs) on the ARDUINO board. The exchangeable battery pack with 2400 mAh provides enough power for constant operation of at least five hours. Figure 6 gives a
block diagram of the system, its components, and interfaces.
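To illustrate the multiplexed read-out described above, the sketch below cycles a CD74HC4067 through ten channels using four select lines and one analog input. The pin numbers and the channel-to-sensor assignment are illustrative assumptions, not the actual DAGLOVE pin map.

```cpp
// Sketch: read ten flex sensors through a CD74HC4067 analog multiplexer
// with one analog input and four digital select lines.
// Pin numbers and channel assignment are illustrative assumptions only.
const uint8_t SELECT_PINS[4] = {2, 3, 4, 5};  // S0..S3 of the CD74HC4067
const uint8_t MUX_SIGNAL_PIN = A0;            // common signal pin of the MUX
const uint8_t NUM_FLEX_SENSORS = 10;

int flexRaw[NUM_FLEX_SENSORS];

void selectMuxChannel(uint8_t channel) {
  for (uint8_t bit = 0; bit < 4; ++bit) {
    digitalWrite(SELECT_PINS[bit], (channel >> bit) & 0x01);
  }
}

void setup() {
  Serial.begin(115200);
  for (uint8_t i = 0; i < 4; ++i) pinMode(SELECT_PINS[i], OUTPUT);
}

void loop() {
  for (uint8_t ch = 0; ch < NUM_FLEX_SENSORS; ++ch) {
    selectMuxChannel(ch);
    delayMicroseconds(10);             // let the MUX output settle
    flexRaw[ch] = analogRead(MUX_SIGNAL_PIN);
  }
  for (uint8_t ch = 0; ch < NUM_FLEX_SENSORS; ++ch) {
    Serial.print(flexRaw[ch]);
    Serial.print(ch + 1 < NUM_FLEX_SENSORS ? ',' : '\n');
  }
  delay(40);                           // approximately the 25 Hz main loop
}
```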
B. Glove software interfaces
The software is written in the ARDUINO IDE using
additional libraries in C++. The measured values from
the IMU are read using modified versions of the open
source libraries published on GITHUB¹. Due to the analog multiplexer (MUX), it is possible to read the values from each flex sensor sequentially with one analog input only. Then the microcontroller checks for feedback intensities coming from an external device in the form of a string command, e.g., from tactile sensors at the operated robot's fingertips.

¹ https://github.com/sparkfun/MPU-9150_Breakout

Fig. 7. Example of raw flex-sensor data recorded for both MCP (top panel) and DIP/PIP (bottom panel) joints for various hand poses (each color corresponds to a different finger). The flexion of the different fingers is visible from the data.

The command line can be detected on any
serial communication port, either the Micro-USB port or the BLUETOOTH connection. Subsequently, the values are extracted from this command line and the vibration motors are controlled by a PWM output that is proportional to these values. In addition, the software monitors the actual battery level. Finally, it writes a string command with all the values of the flex sensors, the IMU, the actual battery level, and some control data (actual time stamp and feedback intensities). This string is either sent over the Micro-USB port or the BLUETOOTH connection with a baud rate of 115200 bit/s. The main software loop runs with a frequency of 25 Hz. The glove
software interface, as well as detailed information about the
hardware implementation are freely available at https:
//github.com/TUDarmstadtPWeber/DAGlove.
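The listing below sketches how such a firmware loop could parse incoming feedback intensities and stream a sensor line back to the host. The comma-separated message format, the helper applyFeedbackCommand, and the pin assignments are illustrative assumptions; the actual protocol and implementation are documented in the repository linked above.

```cpp
// Sketch of the serial protocol handling: parse five feedback intensities
// ("f1,f2,f3,f4,f5\n", values 0-255) and stream a sensor line back.
// Message format, helper names, and pins are illustrative assumptions;
// the actual protocol is documented in the DAGlove repository.
const uint8_t VIBRATION_PINS[5] = {6, 9, 10, 11, 13};  // assumed PWM pins

void applyFeedbackCommand(const String &line) {
  int start = 0;
  for (uint8_t i = 0; i < 5; ++i) {
    int comma = line.indexOf(',', start);
    String token = (comma < 0) ? line.substring(start) : line.substring(start, comma);
    int intensity = constrain(token.toInt(), 0, 255);
    analogWrite(VIBRATION_PINS[i], intensity);           // PWM drives the coin motor
    if (comma < 0) break;
    start = comma + 1;
  }
}

void setup() {
  Serial.begin(115200);                                  // USB or BLUETOOTH link
  for (uint8_t i = 0; i < 5; ++i) pinMode(VIBRATION_PINS[i], OUTPUT);
}

void loop() {
  if (Serial.available()) {
    String command = Serial.readStringUntil('\n');
    applyFeedbackCommand(command);
  }
  // ... read flex sensors, IMU, and battery level here ...
  Serial.print(millis());                                // time stamp
  // append flex, IMU, battery values and feedback intensities here
  Serial.println();
  delay(40);                                             // ~25 Hz main loop
}
```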
IV. EXPERIMENTAL EVALUATION
As a first proof-of-concept of the design and functionality
of the DAGLOVE, its capabilities and the performance of
its sensors are demonstrated. Moreover, a first application
in teleoperation that benefits from the separation of DIP/PIP
and MCP joints is presented.
A. Sensor capabilities
In the following sections, the functionality of the sensors is presented by collecting the measured values from the flex sensors, as well as the IMU, for exemplary finger and
hand movements.
1) Finger Sensors: This experiment demonstrates that the
placement of the sensors is appropriate to collect valuable
information about the flexion of the individual phalanges. To
reduce the measurement noise of the flex sensors, a simple
moving average filter $\hat{x}_t = \frac{1}{n}\sum_{i=0}^{n-1} x_{t-i}$ with a window of $n = 10$ is applied.

Fig. 8. Example of raw values recorded from the IMU gyroscope (x-, y-, and z-axis angular velocities) for various movements.

More advanced filtering techniques, including the Butterworth filters widely used in human grasping, will
be investigated in future work. In Figure 7, an example of
the filtered data collected from each individual flex sensor
for the MCP joints is shown. The flexion of each finger is
visible from the data. Flexion of other joints and fingers is
observed for some DIP/PIP joint motions due to mechanical
couplings in the human hand.
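A minimal implementation of this moving average, as it could be applied per flex-sensor channel either on the microcontroller or on the host, is sketched below. The circular-buffer structure and the class name MovingAverage are implementation choices for illustration and are not taken from the DAGLOVE code.

```cpp
// Moving average filter x_hat_t = (1/n) * sum_{i=0}^{n-1} x_{t-i}, n = 10,
// applied independently to each flex-sensor channel.
#include <stddef.h>

template <size_t N = 10>
class MovingAverage {
 public:
  float update(float sample) {
    sum_ += sample - buffer_[index_];   // replace the oldest sample in the running sum
    buffer_[index_] = sample;
    index_ = (index_ + 1) % N;
    if (count_ < N) ++count_;
    return sum_ / static_cast<float>(count_);
  }

 private:
  float buffer_[N] = {0.0f};
  float sum_ = 0.0f;
  size_t index_ = 0;
  size_t count_ = 0;
};

// Usage: one filter per flex-sensor channel, e.g.
// MovingAverage<10> filters[10];
// float smoothed = filters[ch].update(static_cast<float>(flexRaw[ch]));
```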
2) IMU: To demonstrate the detection of the correct orientation by the IMU, an example of the data collected during various arm motions and poses is visualized in Figure 8. As seen in the motion samples of Figure 8, the angular velocity curves behave as expected for each axis. To validate the precision and drift effects of
the gyroscope, accelerometer, and magnetometer, further
experiments will be performed in the future.
B. Teleoperation
Fig. 10. The humanoid robot iCub used in the experiments.

To investigate a human-robot interaction application, direct teleoperation of a robotic hand through the DAGLOVE is considered. The goal is to demonstrate the benefits of the high number of flex sensors for grasping tasks. Therefore, the humanoid robot iCub shown in Figure 10, which possesses 9 DoF for each hand [30], is used as a hardware platform.
In the experiments, the flex
sensor readings are directly
mapped to desired joint an-
gles in the iCub. Let $q$ denote a joint angle with an operational range of $[q_{\min}, q_{\max}]$. A flex sensor reading $s$ is normalized and mapped to a desired joint angle with $q = q_{\min} + (q_{\max} - q_{\min})(s - s_{\min})/(s_{\max} - s_{\min})$. The operational ranges of the iCub finger joints and the glove were obtained in a pre-processing phase. Note that the iCub possesses three DoF in the thumb (CMC, MCP I, and IP flexion) while the DAGLOVE only has two (IP and coupled CMC/MCP I flexion). Thus, for the CMC and MCP I joints, the same coupled CMC/MCP I flexion signal was used.
Fig. 9. Various grasping poses with the DAGLOVE (left) and the teleoperated iCub hand (right). The different grasp types make use of different correlations between the finger joints, and as such benefit from the use of two separate flex sensors for each finger.

Moreover, the ring and pinky fingers of the iCub are coupled and jointly controlled by a single DoF while the sensory glove measures four separate DoF. Here we used the average of these four readings for control.
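The following sketch illustrates this direct mapping, including the averaging of the four ring/pinky readings and the reuse of the coupled thumb signal. The Range struct, function names, and calibration values are placeholders; the actual operational ranges were obtained in the pre-processing phase described above.

```cpp
// Sketch of the direct sensor-to-joint mapping
//   q = q_min + (q_max - q_min) * (s - s_min) / (s_max - s_min)
// with placeholder calibration values; the real ranges come from the
// pre-processing phase described in the paper.
#include <algorithm>

struct Range { float min, max; };

float mapToJointAngle(float s, const Range &sensor, const Range &joint) {
  float normalized = (s - sensor.min) / (sensor.max - sensor.min);
  normalized = std::min(1.0f, std::max(0.0f, normalized));   // clamp to [0, 1]
  return joint.min + (joint.max - joint.min) * normalized;
}

// Example: the iCub couples ring and pinky into one DoF, so the four
// corresponding glove readings are averaged before mapping.
float ringPinkyCommand(const float s[4], const Range &sensor, const Range &joint) {
  float avg = 0.25f * (s[0] + s[1] + s[2] + s[3]);
  return mapToJointAngle(avg, sensor, joint);
}

// The thumb CMC and MCP I joints of the iCub both receive the angle mapped
// from the single coupled CMC/MCP I flexion signal of the glove.
```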
In Figure 9, three grasp poses achieved using the
DAGLOVE to teleoperate the hand of the iCub are shown.
Performing all these grasp types would not be possible
using the same correlation matrix. For example, the grasp
in Figure 9b requires both MCP and IP joints to flex, while
the grasp in Figure 9c only makes use of the IP joints.
Further, the additional thumb sensor enables control of thumb opposition, which is crucial for stable grasps of objects of different sizes and shapes.
V. CONCLUSION
This paper presents a new sensor glove: the DAGLOVE. The DAGLOVE is designed for human-robot interaction research and therefore combines 2-DoF kinematic sensing for all fingers, vibrotactile feedback, and hand position/orientation acquisition. These aspects are integrated in an easy-to-use and low-cost system with a wireless connection to the host computer. The key components comprise ten flex sensors, which separately measure proximal and distal finger joint motions as well as the flexion of the thumb and the thumb saddle joint. An inertial measurement unit facilitates detecting hand position and orientation. Finally, coin vibration motors are attached to the fingertips, providing vibrotactile feedback. Despite these improvements, the overall material cost of the DAGLOVE is less than 300 €.
As a proof-of-concept, preliminary experiments are per-
formed to qualitatively examine the features of the DAGLOVE and their use. First, the separate acquisition of
proximal and distal finger motions as well as the tracking
of the hand movements with the integrated IMU are studied.
Subsequently, the potential of the extended low-cost glove for
human-robot interaction is presented in a teleoperation sce-
nario with the iCub humanoid robot. Although a quantitative
assessment of sensor data quality is missing, these first exper-
iments demonstrate that the separate sensing of proximal and
distal finger joint motion can enable teleoperating grasps with
increased complexity. Moreover, the additional detection of
thumb saddle joint motions enables grasping of flat and soft
objects without deforming them.
Future works will focus on improving the electronics and
software to increase the operating frequency. The quality of
flex sensor and IMU data should be quantitatively assessed
and filter implementations for these data should be tested.
A further potential extension is the use of force feedback instead of vibrotactile feedback [31].
REFERENCES
[1] B. D. Argall, S. Chernova, M. Veloso, and B. Browning, “A survey
of robot learning from demonstration,” Robotics and autonomous
systems, vol. 57, no. 5, pp. 469–483, 2009.
[2] M. Fischer, P. van der Smagt, and G. Hirzinger, “Learning techniques
in a dataglove based telemanipulation system for the dlr hand,” in
Robotics and Automation, 1998. Proceedings. 1998 IEEE International
Conference on, vol. 2. IEEE, 1998, pp. 1603–1608.
[3] L. Fritsche, F. Unverzagt, J. Peters, and R. Calandra, “First-person
tele-operation of a humanoid robot,” in 15th IEEE-RAS International
Conference on Humanoid Robots, Nov. 2015, pp. 997–1002.
[4] G. Salvietti, I. Hussain, D. Cioncoloni, S. Taddei, S. Rossi, and
D. Prattichizzo, “Compensating hand function in chronic stroke patients through the robotic sixth finger,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2016.
[5] E. A. Caspar, A. de Beir, P. A. Magalhaes Da Saldanha da Gama,
F. Yernaux, A. Cleeremans, and B. Vanderborght, “New frontiers in
the rubber hand experiment: when a robotic hand becomes one’s own,”
Behavior Research Methods, vol. 6, pp. 1 – 12, 2014.
[6] C.-P. Tung and A. C. Kak, “Automatic learning of assembly tasks
using a dataglove system,” in Intelligent Robots and Systems 95.’Hu-
man Robot Interaction and Cooperative Robots’, Proceedings. 1995
IEEE/RSJ International Conference on, vol. 1. IEEE, 1995, pp. 1–8.
[7] R. Dillmann, O. Rogalla, M. Ehrenmann, R. Zollner, and M. Borde-
goni, “Learning robot behaviour and skills based on human demon-
stration and advice: the machine learning paradigm,” in ROBOTICS
RESEARCH-INTERNATIONAL SYMPOSIUM-, vol. 9, 2000, pp. 229–
238.
[8] K. Gräve, J. Stückler, and S. Behnke, “Learning motion skills from
expert demonstrations and own experience using gaussian process
regression,” in Robotics (ISR), 2010 41st International Symposium on
and 2010 6th German Conference on Robotics (ROBOTIK). VDE,
2010, pp. 1–8.
[9] M. Henriques, “Planning and control of anthropomorphic robotic
hands using synergies.”
[10] E. Rueckert, R. Lioutikov, R. Calandra, M. Schmidt, P. Beckerle,
and J. Peters, “Low-cost sensor glove with force feedback
for learning from demonstrations using probabilistic trajectory
representations,” in ICRA 2015 Workshop ”Tactile & force sensing
for autonomous, compliant, intelligent robots”, 2015. [Online].
Available: http://tubiblio.ulb.tu-darmstadt.de/73567/
[11] M. D’Alonzo and C. Cipriani, “Vibrotactile sensory substitution elicits
feeling of ownership of an alien hand,” PloS one, vol. 7, no. 11, p.
e50756, 2012.
[12] I. Hussain, G. Salvietti, L. Meli, and D. Prattichizzo, “Using the
robotic sixth finger and vibrotactile feedback for grasp compensation
in chronic stroke patients,” in IEEE International Conference on
Rehabilitation Robotics, 2015.
[13] A. De Beir, E. A. Caspar, F. Yernaux, P. A. Magalhaes Da Saldanha da
Gama, B. Vanderborght, and A. Cleermans, “Developing new frontiers
in the rubber hand illusion: Design of an open source robotic hand to
better understand prosthetics,” in IEEE International Symposium on
Robot and Human Interactive Communication, 2014.
[14] A. M. Schmidts, L. Dongheui, and A. Peer, “Imitation learning of
human grasping skills from motion and force data,” in IEEE/RSJ
International Conference on Intelligent Robots and Systems, 2011.
[15] CyberGlove Systems LLC. (2016) Cyberglove systems. [Online].
Available: http://www.cyberglovesystems.com/
[16] R. Gentner and J. Classen, “Development and evaluation of a low-
cost sensor glove for assessment of human finger movements in
neurophysiological settings,” Journal of Neuroscience Methods, vol.
178, pp. 138 – 147, 2008.
[17] Y. Han, “A low-cost visual motion data glove as an input device
to interpret human hand gestures,” IEEE Transactions on Consumer
Electronics, vol. 56 (2), pp. 501 – 209, 2010.
[18] K. Li, I.-M. Chen, S. H. Yeo, and C. K. Lim, “Development of finger-
motion capturing device based on optical linear encoder.” 2011.
[19] StretchSense. (2016) Stretchsense - smart, soft, stretchable sensors.
[Online]. Available: http://stretchsense.com/
[20] L. K. Simone, N. Sundarrajan, X. Luo, Y. Jia, and D. G. Kamper, “A
low cost instrumented glove for extended monitoring and functional
hand assessment,” Journal of Neuroscience Methods, vol. 160, pp. 335
– 348, 2007.
[21] H. Lohse and J.-L. Tirpitz. (2014) DJ Handschuh. [On-
line]. Available: http://joanna.iwr.uni-heidelberg.de/projects/2014SS
DJHANDSCHUH/index.html
[22] A. P. L. Bó, M. Hayashibe, and P. Poignet, “Joint angle estimation
in rehabilitation with inertial sensors and its integration with kinect,”
IEEE International Conference on Engineering in Medicine and
Biology, 2011.
[23] Q. Fu and M. Santello, “Tracking whole hand kinematics using extended Kalman filter,” IEEE International Conference on Engineering in Medicine and Biology, 2010.
[24] P. Christian. (2014) Homebrew vr data glove with haptic
feedback. [Online]. Available: https://www.youtube.com/watch?v=
-b9UNLNkYFY&feature=youtu.be
[25] D. Ruth and M. Williams. (2015) Hands omni haptic glove lets
gamers feel virtual objects. [Online]. Available: http://news.rice.edu/2015/04/22/gamers-feel-the-glove-from-rice-engineers-2/
[26] M. Schünke, E. Schulte, and U. Schumacher, Prometheus - Allgemeine Anatomie und Bewegungssystem: LernAtlas der Anatomie, 2nd ed. Stuttgart: Thieme, 2007.
[27] J. Leijnse, P. Quesada, and C. Spoor, “Kinematic evaluation of
the finger’s interphalangeal joints coupling mechanism–variability,
flexion-extension differences, triggers, locking swanneck deformities,
anthropometric correlations.” Journal of Biomechanics, vol. 43,
no. 12, pp. 2381 – 2393, 2010. [Online]. Available: http:
//www.sciencedirect.com/science/article/pii/S0021929010002332
[28] J. Leijnse and C. Spoor, “Reverse engineering finger extensor
apparatus morphology from measured coupled interphalangeal
joint angle trajectories - a generic 2d kinematic model.” Journal of
Biomechanics, vol. 45, no. 3, pp. 569 – 578, 2012. [Online]. Available:
http://www.sciencedirect.com/science/article/pii/S0021929011006841
[29] Z.-M. Li and J. Tang, “Coordination of thumb joints during
opposition,” Journal of Biomechanics, vol. 40, no. 3, pp. 502 –
510, 2007. [Online]. Available: http://www.sciencedirect.com/science/
article/pii/S002192900600087X
[30] G. Metta, G. Sandini, D. Vernon, L. Natale, and F. Nori, “The icub
humanoid robot: an open platform for research in embodied cognition,”
in Proceedings of the 8th workshop on performance metrics for
intelligent systems. ACM, 2008, pp. 50–56.
[31] D. Prattichizzo, F. Chinello, C. Pacchierotti, and M. Malvezzi, “To-
wards wearability in fingertip haptics: A 3-dof wearable device for
cutaneous force feedback,” IEEE Transactions on Haptics, vol. 6,
no. 4, pp. 506–516, 2013.
Motion data gloves are frequently used input devices that interpret human hand gestures for applications such as virtual reality and human-computer interaction. However, commercial motion data gloves are too expensive for consumer use, and this has limited their popularity. This paper presents an inexpensive motion data glove to overcome this obstacle. To lower costs, we designed our glove to use single-channel video instead of expensive motion-sensing fibers or multi-channel video. Our visual motion data glove is composed of an inexpensive consumer glove with attached thin-bar-type optical indicators and a closed-form reconstruction algorithm that can overcome the common disadvantages of single-channel video approaches, i.e., occlusion and the need for inconvenient iterative reconstruction algorithms. Our low-cost visual motion data gloves are used to interpret human hand gestures, and the resulting performance is evaluated.