Copyright © 2025 JoVE Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported
License
jove.com March 2025 • 217 • e67766 • Page 1 of 14
Capturing Dynamic Finger Gesturing with High-resolution
Surface Electromyography and Computer Vision
Nitzan Luxembourg1, Dvir Ben-Dov1, Rufael Fekadu Marew2, Dvir Teitelbaum1, Ieva Vebraite Adereth1, Yael Hanein1,3
1School of Electrical Engineering, Tel Aviv University 2Department of Machine Learning, Mohamed bin Zayed University of Artificial Intelligence
(MBZUAI) 3X-trodes Ltd.
Corresponding Author
Yael Hanein
yaelha@tauex.tau.ac.il
Citation
Luxembourg, N., Ben-Dov, D., Fekadu
Marew, R., Teitelbaum, D., Vebraite
Adereth, I., Hanein, Y. Capturing
Dynamic Finger Gesturing with High-
resolution Surface Electromyography
and Computer Vision. J. Vis. Exp. (217),
e67766, doi:10.3791/67766 (2025).
Date Published
March 28, 2025
DOI
10.3791/67766
URL
jove.com/video/67766
Abstract
Finger gestures are a critical element in human communication, and as such, finger
gesture recognition is widely studied as a human-computer interface for state-of-the-
art prosthetics and optimized rehabilitation. Surface electromyography (sEMG), in
conjunction with deep learning methods, is considered a promising method in this
domain. However, current methods often rely on cumbersome recording setups and
the identification of static hand positions, limiting their effectiveness in real-world
applications. The protocol we report here presents an advanced approach combining
a wearable surface EMG and finger tracking system to capture comprehensive data
during dynamic hand movements. The method records muscle activity from soft
printed electrode arrays (16 electrodes) placed on the forearm as subjects perform
gestures in different hand positions and during movement. Visual instructions prompt
subjects to perform specific gestures while EMG and finger positions are recorded.
The integration of synchronized EMG recordings and finger tracking data enables
comprehensive analysis of muscle activity patterns and corresponding gestures. The
reported approach demonstrates the potential of combining EMG and visual tracking
technologies as an important resource for developing intuitive and responsive gesture
recognition systems with applications in prosthetics, rehabilitation, and interactive
technologies. This protocol aims to guide researchers and practitioners, fostering
further innovation and application of gesture recognition in dynamic and real-world
scenarios.
Introduction
Hand gesturing is essential in human communication,
making the recognition of finger gestures a crucial
area of research across fields such as human-computer
interaction, advanced prosthetics1,2,3,4, and rehabilitation technologies5,6. As a result, finger gesture recognition has
garnered significant attention for its potential to enhance
intuitive control systems and assistive devices. Surface
electromyography (sEMG) combined with deep learning
algorithms is emerging as a highly promising approach for
capturing and interpreting these gestures due to its ability to
detect the electrical activity of muscles associated with hand
movements7,8,9,10,11,12,13,14,15.
However, despite these advances, current approaches
face limitations in real-world applications. Most existing
systems require complex, cumbersome recording setups with
numerous electrodes5,7,9,16,17 and precise positioning3,18,
which are often difficult to implement outside of controlled
environments. Additionally, these systems tend to focus on
static hand positions13,18,19,20,21, limiting their ability to
interpret dynamic, fluid gestures that occur in daily activities.
The protocol aims to address these limitations by supporting
dynamic gesture recognition in more natural conditions. Such
methodology would enable more practical and user-friendly
applications in areas like prosthetics and rehabilitation, where
real-time, natural gesture interpretation is essential.
To address these challenges, developing more accurate and
adaptable algorithms requires datasets that reflect natural,
everyday conditions3,4. Such datasets must capture a
wide range of dynamic movements, various hand positions,
and large volumes of data to ensure model robustness.
Furthermore, the variability between training and test datasets
is crucial, allowing models to generalize across different
hand postures, muscle activation patterns, and motions.
Incorporating such diversity into the data will enable
algorithms to perform gesture recognition more accurately in
everyday, real-world applications22.
Overcoming these challenges will be essential for the future
development of more practical and widely applicable gesture
recognition systems. The study and protocol described
here stem from the need to have a portable, user-friendly
setup that can capture dynamic hand movements in natural
settings. Comprehensive datasets and advanced algorithms
are critical to fully unlocking the potential of sEMG and deep
learning in human-computer interfaces, neuroprosthetics,
and rehabilitation technologies. We expect this protocol to
contribute to the field by facilitating comprehensive data
collection to further enable the development of algorithm
models that generalize across diverse hand positions.
A significant challenge in gesture recognition lies in the
sensitivity of sEMG signals to hand positioning. While
many studies focus on fixed-hand positions for gesture
prediction, real-world applications demand models capable of
recognizing finger movements across various hand postures.
Recent approaches have addressed this by incorporating
computer vision as a ground truth reference, enhancing the
accuracy and flexibility of these models15,19. Additionally,
hybrid models that integrate sEMG signals with visual data
offer further improvements in recognition accuracy across
diverse scenarios23.
In this protocol, we present a synchronized approach to
data collection that enhances dynamic gesture recognition
by incorporating both EMG and hand-tracking data in real-
world-like conditions. Unlike traditional methods that restrict
gesture performance to static positions, this protocol includes
gestures performed across four distinct positions: hand down,
hand up, hand straight, and hand moving. The hand-tracking
camera tracks hand movements within a three-dimensional
interactive zone, identifying distinct hand elements and
capturing dynamic movements with high resolution. A soft array of 16 electrodes placed on the forearm records muscle activity, offering stable, wireless recordings without impeding participant mobility. The synchronized data
from these two sources provides a comprehensive foundation
for developing advanced gesture recognition algorithms
capable of operating in real-world conditions. The approach
specifically addresses the limitations of current setups by
facilitating free movement and stable signal recording in
realistic scenarios. This advancement supports gesture
recognition technologies for applications in prosthetics,
rehabilitation, and interactive technologies, where intuitive
control and flexibility are essential.
Protocol
Healthy participants (n = 18, aged 18-32 years, both males
and females) were recruited for this study, which was
approved by the Tel Aviv University Ethics Review Board
(Approval No. 0004877-3). The protocol adheres to the
board's guidelines for research involving human participants.
Informed consent was obtained from all participants in
accordance with institutional requirements.
1. Experimenter briefing
1. Ask participants to perform a series of 14 distinct finger gestures (see Figure 1) and repeat each gesture 7x in a random sequence. Ask them to maintain each gesture firmly for 5 s, followed by a 3 s rest period. The total duration of each session is therefore 14 gestures x 7 repetitions x (5 s + 3 s) = 784 s, i.e., 13:04 min.
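The stated session duration follows directly from the timing above; a quick arithmetic check with the protocol's values:

```python
# Session duration check: 14 gestures, 7 repetitions each,
# 5 s hold + 3 s rest per repetition (values from the protocol).
n_gestures, repetitions = 14, 7
gesture_s, rest_s = 5, 3

total_s = n_gestures * repetitions * (gesture_s + rest_s)
minutes, seconds = divmod(total_s, 60)
print(f"{total_s} s = {minutes}:{seconds:02d} min")  # 784 s = 13:04 min
```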
2. A large image of the gesture is displayed on a computer screen, accompanied by a countdown timer that indicates gesture performance. During the rest period, ask the participant to look at the small image of the upcoming gesture, shown along with a timer indicating the remaining rest time. Two distinct beep sounds signal the start and end of each gesture, helping participants prepare for the next gesture.
3. Ask each participant to execute the procedure in four
different positions, similar to those previously presented22:
Position 1: Participant standing. Hand down, straight, and
relaxed.
Position 2: Participant sitting in the armchair. Hand
extended forward at 90°, palm relaxed (a support device
may be used).
Position 3: Hand folded upwards (with an elbow resting
on the armchair), palm relaxed.
Position 4: Participant chooses one of the previous
positions and may move the hand freely within the
camera's detection range, monitored in real-time on a PC
screen (see step 1.4 for more details).
4. For each session, have the participant wear an
electromyography device on the arm and position a
hand-tracking camera towards them. Ask the participants
to ensure that their palms always face the camera. The
hand-tracking software is displayed on a separate screen
so that both the participant and the conductor can verify
that the hand is correctly recognized.
5. For each position, adjust the hand-tracking camera's
position and angle to ensure accurate hand recognition.
Additionally, assess the quality of the signals from the
electrodes using the spectrogram script.
Figure 1: Schematic representation of the data collection process. The subject is equipped with a soft electrode
array placed on the forearm (3), which captures high-resolution surface electromyography (sEMG) signals during gesture
performance. The subject performs 14 different finger gestures presented in random order on a computer display (4). The
EMG data is streamed wirelessly to a personal computer (PC) from the data acquisition unit (DAU; 1). Simultaneously, hand
kinematic data (HKD) representing finger joint angles is captured using a hand-tracking camera (2).
2. Setting up the data acquisition units
1. Open the GitHub repository at https://github.com/NeuroEngLabTAU/Fingers_Gestures_Recognition.git and follow the detailed instructions in the Installation section. Locate the primary Python file data_collection.py in the folder finger_pose_estimation/data_acquisition. Use this script to run the experiment, use the script spectrogram.py to assess EMG signal quality before the experiment begins, and use the script data_analysis.py for signal filtering and segmentation.
2. Ensure that the EMG Data Acquisition Unit (DAU) is fully
charged before each session and turn it on.
3. Connect the DAU to the PC through Bluetooth using the
dedicated application. Set the Bluetooth communication
rate to 500 samples per second (S/s).
4. Install and open the hand-tracking camera's software on
the PC. Connect the hand-tracking camera to the PC
using a cable.
5. Use one screen to always display the hand-tracking camera software. This way, the conductor and the participant will be able to ensure that the camera recognizes the hand correctly during the experiment.
3. Participant preparation
1. Introduction and consent
1. Briefly explain the study's relevance and the
experimental procedure to the participant. Obtain
informed consent following institutional guidelines
for research involving humans.
2. Electrode placement
1. Instruct the participant to flex their right hand by
forming a strong fist. While the participant flexes,
palpate the forearm by gently pressing along the
muscle to identify the spot where muscle activation
is most prominent. This location is easily identifiable
by feeling the area where the muscle bulges during
contraction.
2. Optional: Prepare the identified skin area by
cleaning with an alcohol fiber-less cloth, prep gel,
or water and soap. Allow the area to air dry. Avoid
excessive cleaning with alcohol, as it may dry
the skin. This step is optional; see the discussion
section.
3. Peel off the white protective layer from the EMG
electrode array and carefully attach the electrodes
to the identified forearm area as determined in step
3.2.1. (see Figure 1). Ensure the adhesive tape is
closer to the palm. Secure the electrode array to the
skin by gently tapping.
4. Once the electrode array is attached to the skin, peel
off the transparent support layer.
5. Insert the electrode array connector card into the
DAU's connector socket. Attach the DAU to the
adhesive tape next to the electrodes.
6. Run the custom Python spectrogram script spectrogram.py to verify real-time signal quality. A
window will appear displaying raw data (on the left)
and frequency domain (on the right) for all electrodes
(see Supplementary Figure 1 for reference).
1. Verify that all electrodes are detected and
function properly and that the signal is clean
from excessive noise and 50 Hz noise.
2. If needed, reduce 50 Hz noise by moving
away from electronic devices that may cause
interference and unplugging unnecessary
devices from the power. Allow time for the signal
to stabilize.
3. Verify EMG signal capture: instruct participant
to place an elbow on the armchair and move
fingers, then relax. Ensure that a clear EMG
signal is displayed followed by static baseline
noise.
4. Close the script once signal verification is
complete.
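As an illustration of the 50 Hz check in this step, the fraction of signal power concentrated near the line frequency can be estimated with Welch's method; this snippet is a sketch with assumed band limits and is not part of the protocol's scripts:

```python
import numpy as np
from scipy.signal import welch

def line_noise_ratio(channel, fs=500):
    """Fraction of 20-250 Hz power concentrated near 50 Hz.

    channel: 1-D array of samples from one electrode.
    Band limits (48-52 Hz around the mains frequency) are assumptions.
    """
    f, pxx = welch(channel, fs=fs, nperseg=fs)
    emg_band = (f >= 20) & (f < fs / 2)
    near_50 = (f >= 48) & (f <= 52)
    return pxx[near_50].sum() / pxx[emg_band].sum()
```

A channel dominated by mains interference yields a ratio near 1, whereas broadband EMG-like activity yields a small ratio, so a simple threshold can flag electrodes that need the mitigation described in step 3.2.6.2.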
3. Gesture and hand position review
1. Open Images folder by clicking on
Finger_pose_estimation > Data_acquisition.
Review the gesture images with the participants.
2. Ensure they understand each movement and can
perform them accurately. Explain the four hand
positions clearly to the participant.
3. Instruct the participant on how to hold the hand
before each session, ensuring proper posture and
positioning.
4. Participant and camera positioning
1. For hand position 1, instruct the participant to stand
straight about 1 m away from the table. Instruct
the participant to hold the right hand down, straight
and relaxed, with the palm facing the hand-tracking
camera. Fix the hand-tracking camera on the table
with a selfie stick and direct it to face the participant's
hand.
2. For hand position 2, instruct the participant to sit
comfortably in an armchair positioned 40-70 cm from
the monitors. Instruct the participant to extend the
right hand forward at 90° with a relaxed palm facing
the hand-tracking camera. Use a support device, if
needed, to hold the hand stable. Place the hand-
tracking camera on the table facing up.
NOTE: As the participant is requested to remain in
a fixed posture, it is important to find a comfortable
position they can maintain throughout the session.
3. For hand position 3, instruct the participant to sit as
described in step 3.4.2. Instruct the participant to
fold the hand upwards while resting the elbow on
the armchair. The palm should be relaxed, and the
participant should face the hand-tracking camera.
Fix the hand-tracking camera on the table facing the
participant's hand (use a selfie stick if necessary).
Ensure the participant's position is optimal for both
viewing the screens and being within the camera's
field of view.
4. Continuously monitor the screen displaying hand-
tracking data to ensure the camera detects the hand
and fingers throughout the experiment. Optional:
verify EMG signal quality (step 3.2.6.) in each hand
position before starting the experiment.
4. Data collection
1. Running the experiment
1. Open Python and load data_collection.py. Verify that the parameters num_repetition, gesture_duration, and rest_duration are set as desired.
1. num_repetition: Define the number of times each gesture image is shown. For this experiment, set it to 7, meaning each image is shown 7 times. gesture_duration: Specify the duration (in s) for which the participant performs the hand gesture. For this experiment, set it to 5 s, determining how long each gesture image is displayed. rest_duration: Specify the duration (in s) for which the participant relaxes their palm between gestures. For this experiment, set it to 3 s.
2. Adjust the hand-tracking camera position and angle
to the participant's hand position.
3. Run the data_collection.py script. A window will
appear to enter the participant's details (serial
number, age, sex, session number, and hand
position). Complete this information and press OK to
start the experiment automatically.
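The three timing parameters of step 4.1.1 can be pictured as follows; this is a hypothetical sketch of how a randomized trial sequence could be built from them, and data_collection.py's internal variable layout may differ:

```python
# Hypothetical sketch of building a randomized trial sequence from the
# parameters of step 4.1.1; the actual script's internals may differ.
import random

GESTURES = list(range(14))   # the 14 gesture images from step 1.1
num_repetition = 7           # each gesture shown 7 times
gesture_duration = 5         # seconds each gesture is held
rest_duration = 3            # seconds of rest between gestures

sequence = GESTURES * num_repetition  # 98 trials in total
random.shuffle(sequence)              # random presentation order
```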
2. Data collection
1. For each session, record EMG and hand-tracking
data which are automatically saved. Repeat the
experiment 4x for each participant, once per hand
position.
5. End of experiment and post-experiment data
handling
1. Once the experiment is completed, the data are automatically saved. Ensure the data are saved in a folder labeled with the participant's serial number. Each session is stored in a subfolder named S# (e.g., S1), with four subfolders for each hand position P# (P1, P2, P3, and P4). The folder size for a single session is approximately 160 MB.
2. If a participant completes multiple sessions, ensure all
data is saved in the corresponding session folder (e.g.,
S1, S2).
3. Data files
Ensure that each hand-position folder (P#)
contains the following files: EMG data
saved in an EDF file, named as follows:
fpe_pos{position number}_{subject number}_S{session
number}_rep0_BT; hand-tracking data saved in a
CSV file, named fpe_pos{position number}_{subject
number}_S{session number}_rep0_BT_full; and a log
file, log.txt, containing metadata about the session.
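The naming scheme can be captured in a small helper function (purely illustrative; the protocol's scripts generate these names internally):

```python
def session_paths(position, subject, session):
    """Build the per-position file names described in step 5.3.

    Illustrative helper; arguments are the position number, subject
    serial number, and session number from the folder scheme above.
    """
    stem = f"fpe_pos{position}_{subject}_S{session}_rep0_BT"
    return {
        "emg_edf": f"{stem}.edf",       # 16-channel EMG recording
        "hkd_csv": f"{stem}_full.csv",  # hand-tracking (HKD) data
        "log": "log.txt",               # session metadata
    }
```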
4. Data processing
NOTE: A user may choose how to proceed with
signal analysis and which tools to use. Here, we
provide a script for performing signal filtering and data
segmentation in Python. When using Python, ensure all dependencies (e.g., NumPy, pandas, SciPy, MNE, scikit-learn) are installed.
1. Open Python, load data_analysis.py and run the
script.
2. A prompt will appear in the console requesting the necessary parameters for data processing: the path to the EMG file, the path to the hand kinematic data, the path where the processed data will be saved, the sampling rate in Hz, the window duration in ms, and the stride interval in ms.
3. The script will then perform the data processing.
4. EMG signal filtering: Run the script as above. The script first filters the sEMG signal by applying a 4th-order Butterworth high-pass filter with a 20 Hz cutoff to remove non-EMG signals, then a notch filter to remove 50 Hz and 100 Hz harmonics. Additionally, the script applies normalization to the EMG signal.
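A minimal sketch of this filtering stage, assuming the 500 S/s sampling rate from step 2.3 and SciPy; the function name and the per-channel z-score normalization are illustrative choices, not the actual data_analysis.py implementation:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, iirnotch, filtfilt

def filter_emg(emg, fs=500):
    """emg: (n_channels, n_samples) array -> filtered, normalized copy."""
    # 4th-order Butterworth high-pass, 20 Hz cutoff (removes non-EMG drift)
    sos = butter(4, 20, btype="highpass", fs=fs, output="sos")
    out = sosfiltfilt(sos, emg, axis=-1)
    # Notch filters at 50 Hz and the 100 Hz harmonic
    for f0 in (50, 100):
        b, a = iirnotch(f0, Q=30, fs=fs)
        out = filtfilt(b, a, out, axis=-1)
    # Per-channel z-score normalization (one plausible choice)
    return (out - out.mean(axis=-1, keepdims=True)) / (
        out.std(axis=-1, keepdims=True) + 1e-12)
```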
5. EMG, HKD data, and instructed gesture segmentation: Run the script as above. The script applies segmentation, utilizing a rolling window technique defined by the specified window duration and stride interval. In this experiment, set them to 512 ms and 2 ms, respectively. The script then transforms the sEMG channel organization into a 4 x 4 spatial grid configuration while maintaining the electrode array layout. Finally, the script generates a dictionary containing metadata as a pickle file.
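At 500 S/s, a 512 ms window and a 2 ms stride correspond to 256 samples and 1 sample, respectively. The windowing and 4 x 4 regridding can be sketched as follows (illustrative names, not the script's API):

```python
import numpy as np

def segment(emg, fs=500, win_ms=512, stride_ms=2):
    """emg: (16, n_samples) -> windows of shape (n_win, 4, 4, win_len)."""
    win = int(win_ms * fs / 1000)                # 256 samples per window
    stride = max(1, int(stride_ms * fs / 1000))  # 1-sample stride
    starts = range(0, emg.shape[1] - win + 1, stride)
    windows = np.stack([emg[:, s:s + win] for s in starts])
    # Map the 16 channels onto the 4 x 4 spatial grid of the array
    return windows.reshape(len(windows), 4, 4, win)
```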
6. Data cleaning and validation steps
1. Identify and exclude segments containing
artifacts, noise, or inconsistent gesture labels
from the dataset.
2. Ensure segment completeness and temporal
continuity across windows to maintain data
reliability.
3. Cross-check gesture data against the HKD
for consistency. Remove windows displaying
gesture patterns that deviate from HKD session
standards.
4. Detect and discard outlier segments that fail to
conform to the expected kinematic patterns for
the session.
5. Perform further data analysis using advanced
algorithms. These are not provided in the
current protocol.
Representative Results
The dataset consists of two time-synchronized components:
a 16-channel EMG dataset and data from a hand-tracking
camera system. The 16-channel EMG data captures muscle
activity by recording electrical signals from different muscles
over time. The hand-tracking system provides 16 channels
of data corresponding to key points on a skeletal model of
the hand. While the model has 21 points, excluding the wrist,
this number was reduced to 16 due to motion constraints24.
The EMG and visual data were collected by running two
separate processes on the same computer during recording
to establish synchrony. A timestamp was used to mark the
start of each process, allowing data analysis code to align
muscle activity and hand movement data at the end of the
recording. Timestamp annotations were saved automatically
in both EDF and CSV files, marking the exact time when
specific finger gestures were instructed and facilitating the
alignment during data analysis. The filtered EMG signal (20 Hz 4th-order Butterworth high-pass filter) is characterized by a low baseline (grey-shaded areas), which typically falls within the range of 3-9 µV25. This baseline is observed when
the subject's hand is stationary and the muscles are at rest.
However, if muscle tone is present even in the rest position,
a distinct EMG signal can be detected. Mechanical artifacts
caused by movement usually manifest in the 10-20 Hz range
and should be filtered out accordingly. Significantly elevated
baseline values may indicate 50 Hz line interference and
should be avoided during the experimental setup stage. In
cases where moderate 50 Hz noise persists, a notch filter is
applied. Sharp movement artifacts, which are more difficult to
remove, often appear as pronounced high-amplitude spikes
in the signal (see asterisk in Figure 2A). The amplitude of the
EMG signal across the 16-electrode array varies, reflecting
the spatial distribution of the muscle activity over the region
measured. This variance provides valuable insight into the
heterogeneity of muscle contraction during hand gestures.
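Given the shared start timestamps described above, the two streams can be brought onto a common time base, for example by linearly interpolating the slower HKD onto the EMG sample times (a sketch assuming both streams reference the same clock; names are illustrative):

```python
import numpy as np

def align_hkd_to_emg(emg_times, hkd_times, hkd_angles):
    """Resample HKD onto the EMG time base.

    hkd_angles: (n_hkd_samples, n_joints) joint-angle traces.
    Returns an (n_emg_samples, n_joints) array, with each joint's
    trace linearly interpolated at the EMG sample times.
    """
    return np.column_stack([
        np.interp(emg_times, hkd_times, hkd_angles[:, j])
        for j in range(hkd_angles.shape[1])
    ])
```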
The hand-tracking camera provides direct information on finger angles (hand kinematic data, HKD), which are expected to correlate closely with the recorded EMG signals. During gestures, finger angles fall within the normal range26, depending on the
specific gesture. When the visual path between the hand-
tracking camera and the hand is unobstructed, the resulting
signal is stable and accurate, as demonstrated in Figure 2.
However, in instances where visual contact is lost or when
the system experiences technical limitations, the HKD may
become erratic, displaying jumps between incorrect values.
Such outlier data should be minimized during data collection
and discarded in the final analysis to maintain the integrity of
the results.
The HKD is intuitive and provides a direct comparison with the
actual gestures performed. It exhibits low variability between
subjects and across different hand positions. In contrast, the
EMG data tends to vary significantly between individuals due
to anatomical differences such as hand size and muscle
development27. Additionally, variability may be observed
between dominant and non-dominant hands. This subject-
specific variability can be addressed during offline analysis.
In Figure 2, it is evident that both the EMG and HKD are offset
relative to the instructed gesture trigger. This discrepancy
arises due to response time and natural movement execution28. In regression tasks, such variability could
contribute to the richness of the data, while in classification
tasks, it can be managed using a generalized likelihood ratio
approach, as applied in similar scenarios28.
Figure 2: Representative sEMG and HKD during finger abduction. Surface electromyography (sEMG) signals and hand
kinematic data (HKD) recorded during dynamic finger abduction and rest performed during hand position 1 (hand down,
straight, and relaxed) by a single participant. (A) Filtered EMG signals from 16 channels as a function of time. Asterisk (*)
denotes a mechanical artifact detected in the EMG recording of Channel 5. (B) HKD, showing the joint angles as a function
of time. Joint angles are measured at various joints: trapeziometacarpal (TMC), metacarpophalangeal (MCP), and proximal
interphalangeal (PIP). The phases of the experiment (rest and abduction) are indicated along the x-axis.
These representative results demonstrate the utility of the
synchronized EMG and HKD data in capturing hand gestures.
The alignment of EMG signals with corresponding HKD
allows mapping muscle activity to specific finger movements.
When constructing a predictive model, researchers can use
HKD as ground truth, iteratively verifying and refining EMG-
based gesture predictions. This approach highlights the
practical applicability of the protocol and suggests the need
for further research in more natural settings.
Supplementary Figure 1: Spectrogram windows
displayed during the signal verification step. The left
panels show raw EMG data, while the right panels show the
detected frequency domains. (A) Example of a very noisy
EMG signal with strong 50 Hz and 100 Hz interference. (B)
Example of the same EMG signal recording after moving the
participant further away from electrical devices, resulting in a
clean EMG signal with minimal interference.
Discussion
The protocol presented in this study outlines critical steps,
modifications, and troubleshooting strategies aimed at
enhancing hand gesture recognition through the combination
of sEMG signals and HKD. It addresses key limitations and
compares this approach to existing alternatives, highlighting
its potential applications in various research domains. One
of the most important aspects of the protocol is ensuring
the correct positioning and alignment of the hand-tracking
camera. Accurate gesture capture is highly dependent on the
angle and distance of the camera relative to the participant's
hand. Even slight deviations in camera positioning can lead
to tracking inaccuracies, reducing the fidelity of the gesture
data. This alignment must be carefully adjusted for each
participant and hand position to ensure consistent and reliable
data collection. Additionally, it is crucial that participants are
well-acquainted with the protocol to prevent junk data, where
gestures are either incorrectly executed or misaligned with the
experimental flow. Ensuring that participants are comfortable
and familiar with the gestures and the experimental setup can
minimize data noise and improve the quality of the recordings.
A common challenge in this type of study is noise
contamination in both sEMG and HKD. sEMG signals are
particularly sensitive to factors such as muscle fatigue, motion
artifacts, and environmental noise like electromagnetic
interference. Pre-processing techniques, such as band-pass
filtering, are essential for reducing noise and improving
signal clarity. Proper electrode placement and instructing
participants to maintain relaxed muscles during rest
phases can further mitigate motion artifacts. Despite these
precautions, some variability in sEMG signals is inevitable
due to individual differences in anatomy, hand strength, and
muscle activation patterns. This variability can be addressed
through flexible algorithms capable of normalizing these
differences across subjects and conditions.
A key factor in achieving high-quality sEMG signals is initial
signal verification. Traditional protocols using gel electrodes
require skin preparation, such as exfoliating or cleaning with
alcohol, to improve signal clarity. However, in a previous study
we showed that with dry electrodes, skin preparation may
not significantly impact signal quality25. In this protocol, skin
cleaning is optional and thus simplifies the process. Another
skin-related issue affecting signal quality is excessive and
thick arm hair. In such cases, we suggest either shaving the
area or excluding the subject from the study.
One of the critical challenges in using sEMG for gesture
recognition is its sensitivity to hand positioning. Even when
performing the same gesture, variations in hand orientation
can lead to different EMG signal patterns. To address
this issue, machine learning models that can accommodate
variability in hand positions are essential22. These models
must be trained with data from multiple hand postures to
improve robustness and generalizability. Synchronization of
visual and sEMG data is another important consideration.
Consistent timing of gestures is critical to avoid discrepancies
between the gesture execution and the data recording. This
protocol uses visual countdowns and auditory cues to help ensure accurate timing, and recalibration steps are employed when necessary to correct any misalignment during data collection.
Despite its strengths, this protocol has several limitations.
One major constraint is the limited field of view of the hand-
tracking camera, which requires the participant's hands to
remain within the camera's detection range. This restricts the analysis to a small set of movements. Experiments outside the lab will require more complex video imaging or the use of smart gloves. Participant fatigue also poses a
challenge during longer sessions, potentially affecting gesture
accuracy and muscle activation, which can degrade the
quality of the sEMG data. To mitigate these effects, it may
be necessary to limit the session length or introduce breaks
to minimize fatigue. Additionally, powerline interference can
introduce noise into the sEMG signals, particularly when the
participants are close to the PC for data capture. A wireless
version of the system could reduce such interference by
allowing participants to be farther from the computer.
A significant methodological limitation of EMG-based finger
gesture detection stems from the high inter-subject variability
in sEMG signals, which requires the development of custom
models for each participant. This subject-specific approach,
while more accurate, limits the protocol's scalability and
requires additional calibration and training time for each new
user. EMG and HKD data streams show minor temporal
synchronization differences due to dual-process recording.
These timing discrepancies have a minimal impact on the
static gesture analysis since the maintained poses are
temporally stable. The sustained nature of static gestures
provides adequate time for both EMG and kinematic features
to stabilize, unlike dynamic gestures, which require more
precise synchronization.
A key advantage of this method is its flexibility in capturing
gestures. Unlike other systems that require rigid setups
and strict gesture parameters, this protocol accommodates
dynamic and flexible hand positions19. This flexibility is
especially useful in studies aimed at analyzing a broad
range of motions, making it more adaptable to real-
world applications. Furthermore, this protocol is cost-
effective compared to more advanced motion capture and
sEMG systems, which often involve complex setups29. By
integrating a hand-tracking camera with semi-automated
sEMG algorithms, this method provides a viable alternative
for gesture recognition studies without compromising data
quality. Additionally, the system's potential for real-time data
processing opens possibilities for immediate feedback in
applications such as neuroprosthetics and rehabilitation,
where real-time responsiveness is essential. This protocol
has significant implications for several fields, particularly
neuroprosthetics. Accurate prediction of hand gestures from
sEMG signals is crucial for controlling prosthetic limbs, and
the flexibility in hand positioning offered by this method
makes it an ideal candidate for real-time prosthetic devices.
In rehabilitation, this protocol could be employed to monitor
and enhance motor recovery in patients with hand or
finger impairments. By analyzing muscle activation patterns
during gesture performance, this system could be used to
tailor rehabilitation exercises to individual needs, offering
a personalized approach to motor recovery. For human-
computer interaction (HCI), this method enables more natural
gesture-based control systems, improving the intuitiveness
and efficacy of user interfaces. Lastly, the protocol could be
applied to ergonomic studies to assess how different hand
positions and gestures influence muscle activity and fatigue,
potentially leading to advancements in workplace design and
user ergonomics.
To ensure consistent contraction strength across participants,
future studies could implement a glove with force-sensitive
resistors to measure force directly. This would allow for
standardized effort across subjects, improving the reliability
of EMG data. Additionally, integrating this force measurement
as a label in joint kinematics would provide a more detailed
representation of the muscle's internal state, potentially
enriching the analysis of muscle function and movement
patterns. This approach would not only enhance data
consistency but also offer deeper insights into the relationship
between muscle contraction and joint motion.
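A force-sensitive resistor (FSR) in a voltage divider could support the standardization proposed above by normalizing each reading between a subject's resting and maximal-contraction values. The sketch below is hypothetical: the divider topology (FSR on the high side, fixed resistor to ground), supply voltage, resistor value, and calibration readings are all illustrative assumptions, and a real FSR would additionally need per-sensor force calibration.

```python
def fsr_resistance(v_out, v_supply=3.3, r_fixed=10_000.0):
    """Infer FSR resistance from the divider output voltage.
    Assumes the FSR is on the high side and r_fixed pulls to ground."""
    return r_fixed * (v_supply - v_out) / v_out

def normalized_effort(v_out, v_rest, v_max, **kw):
    """Map a raw reading to [0, 1] between a subject's resting and
    maximal-contraction readings, standardizing effort across subjects.
    Works in conductance, which rises roughly monotonically with force."""
    g_rest = 1.0 / fsr_resistance(v_rest, **kw)
    g_max = 1.0 / fsr_resistance(v_max, **kw)
    g = 1.0 / fsr_resistance(v_out, **kw)
    return (g - g_rest) / (g_max - g_rest)
```

The per-subject normalization, rather than an absolute force target, is what would make contraction strength comparable across participants.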
In conclusion, this protocol provides a novel and flexible
approach to hand gesture recognition with broad applications
across neuroprosthetics, rehabilitation, HCI, and ergonomics.
Although the system has limitations, its flexibility, cost-
effectiveness, and potential for real-time use represent
substantial advancements over existing methods. These
strengths make it a promising tool for further development and
innovation in gesture recognition technologies.
Disclosures
Yael Hanein declares a financial interest in X-trodes
Ltd, which commercialized the screen-printed electrode
technology used in this paper. The other authors have no
other relevant financial involvement with any organization or
entity with a financial interest in or financial conflict with the
subject matter or materials discussed in the manuscript apart
from those disclosed.
Acknowledgments
This project was partially funded by grants from the ERC
(OuterRetina) and the ISF. The funders had no role in study
design, data collection and analysis, decision to publish,
or preparation of the manuscript. We thank David Buzaglo,
Cheni Hermon, Liron Ben Ari and Adi Ben Ari for their
assistance with designing the original version of the protocol.
References
1. Fang, B. et al. Simultaneous sEMG recognition of
gestures and force levels for interaction with prosthetic
hand. IEEE Trans Neural Syst Rehabil Eng. 30,
2426-2436 (2022).
2. Yadav, D., Veer, K. Recent trends and challenges
of surface electromyography in prosthetic applications.
Biomed Eng Lett. 13 (3), 353-373 (2023).
3. Sapsanis, C., Georgoulas, G., Tzes, A. EMG based
classification of basic hand movements based on time-
frequency features. 21st Mediterranean Conf Control and
Automation. 716-722 (2013).
4. Qaisar, S. M., Lopez, A., Dallet, D., Ferrero, F. J. sEMG
signal based hand gesture recognition by using selective
subbands coefficients and machine learning. 2022 IEEE
Int Instrument Measurement Technol Conf. 1-6 (2022).
5. Zhang, X., Zhou, P. High-density myoelectric pattern
recognition toward improved stroke rehabilitation. IEEE
Trans Biomed Eng. 59 (6), 1649-1657 (2012).
6. Guo, K. et al. Empowering hand rehabilitation with AI-
powered gesture recognition: a study of an sEMG-based
system. Bioengineering. 10 (5), 557 (2023).
7. Sun, T., Hu, Q., Libby, J., Atashzar, S. F.
Deep heterogeneous dilation of LSTM for transient-
phase gesture prediction through high-density
electromyography: towards application in neurorobotics.
IEEE Robot Autom Lett. 7 (2), 2851-2858 (2022).
8. Atzori, M. et al. Characterization of a benchmark
database for myoelectric movement classification. IEEE
Trans Neural Syst Rehabil Eng. 23 (1), 73-83 (2015).
9. Amma, C., Krings, T., Schultz, T. Advancing muscle-
computer interfaces with high-density electromyography.
Proc 33rd Ann ACM Conf Human Factors Computing
Sys. 929-938 (2015).
10. Geng, W. et al. Gesture recognition by instantaneous
surface EMG images. Sci Rep. 6 (1), 36571 (2016).
11. Wei, W. et al. A multi-stream convolutional neural
network for sEMG-based gesture recognition in muscle-
computer interface. Pattern Recognit Lett. 119, 131-138
(2019).
12. Padhy, S. A tensor-based approach using multilinear
SVD for hand gesture recognition from sEMG signals.
IEEE Sens J. 21 (5), 6634-6642 (2021).
13. Moin, A. et al. A wearable biosensing system with
in-sensor adaptive machine learning for hand gesture
recognition. Nat Electron. 4 (1), 54-63 (2021).
14. Côté-Allard, U. et al. Deep learning for electromyographic
hand gesture signal classification using transfer learning.
IEEE Trans Neural Syst Rehabil Eng. 27 (4), 760-771
(2019).
15. Liu, Y., Zhang, S., Gowda, M. NeuroPose: 3D hand pose
tracking using EMG wearables. Web Conference 2021
Proc World Wide Web Conf. 1471-1482 (2021).
16. Dere, M. D., Lee, B. A novel approach to surface EMG-
based gesture classification using a vision transformer
integrated with convolutive blind source separation. IEEE
J Biomed Health Inform. 28 (1), 181-192 (2024).
17. Chen, X., Li, Y., Hu, R., Zhang, X., Chen, X. Hand
gesture recognition based on surface electromyography
using convolutional neural network with transfer learning
method. IEEE J Biomed Health Inform. 25 (4), 1292-1304
(2021).
18. Lee, K. H., Min, J. Y., Byun, S. Electromyogram-based
classification of hand and finger gestures using artificial
neural networks. Sensors. 22 (1), 225 (2022).
19. Zhou, X. et al. A novel muscle-computer interface for
hand gesture recognition using depth vision. J Ambient
Intell Humaniz Comput. 11 (11), 5569-5580 (2020).
20. Zhang, Z., Yang, K., Qian, J., Zhang, L. Real-time surface
EMG pattern recognition for hand gestures based on an
artificial neural network. Sensors. 19 (14), 3170 (2019).
21. Nieuwoudt, L., Fisher, C. Investigation of real-time
control of finger movements utilizing surface EMG
signals. IEEE Sens J. 23 (18), 21989-21997 (2023).
22. Ben-Ari, L., Ben-Ari, A., Hermon, C., Hanein, Y. Finger
gesture recognition with smart skin technology and deep
learning. Flexible Printed Electron. 8 (2), 25012 (2023).
23. Yang, C., Xie, L. Gesture recognition method based
on computer vision and surface electromyography:
implementing intention recognition of the healthy side in
the hand assessment process. 2024 4th Int Conf Neural
Network Info Comm Eng. 663-668 (2024).
24. Lin, J., Wu, Y., Huang, T. S. Modeling the constraints
of human hand motion. Proc Workshop Human Motion.
121-126 (2000).
25. Arché-Núñez, A. et al. Bio-potential noise of dry
printed electrodes: physiology versus the skin-electrode
impedance. Physiol Meas. 44 (9), 95006 (2023).
26. Gracia-Ibáñez, V., Vergara, M., Sancho-Bru, J. L., Mora,
M. C., Piqueras, C. Functional range of motion of the
hand joints in activities of the International Classification
of Functioning, Disability and Health. J Hand Ther. 30 (3),
337-347 (2017).
27. Milosevic, B., Farella, E., Benatti, S. Exploring arm
posture and temporal variability in myoelectric hand
gesture recognition. 2018 7th IEEE Int Conf Biomed
Robotics Biomech. 1032-1037 (2018).
28. Gijsberts, A., Atzori, M., Castellini, C., Müller, H., Caputo,
B. Movement error rate for evaluation of machine
learning methods for sEMG-based hand movement
classification. IEEE Trans Neural Syst Rehabil Eng.
22 (4), 735-744 (2014).
29. Armitano-Lago, C., Willoughby, D., Kiefer, A. W. A SWOT
analysis of portable and low-cost markerless motion
capture systems to assess lower-limb musculoskeletal
kinematics in sport. Front Sports Act Living. 3, 809898
(2022).