Bulletin of Electrical Engineering and Informatics
Vol. 13, No. 6, December 2024, pp. 4403~4412
ISSN: 2302-9285, DOI: 10.11591/eei.v13i6.7181
Journal homepage: http://beei.org
Classification of human grasp forces in activities of daily living
using a deep neural network
Jesus Fernando Padilla-Magaña1, Isahi Sanchez-Suarez2, Esteban Peña-Pitarch3
1Department of Manufacturing Technologies, Polytechnic University of Uruapan, Michoacán, México
2Department of Physiotherapy, Polytechnic University of Uruapan, Michoacán, México
3Escola Politècnica Superior d'Enginyeria de Manresa (EPSEM), Polytechnic University of Catalonia (UPC), Barcelona, Spain
Article Info
ABSTRACT
Article history:
Received Jul 13, 2023
Revised Jun 24, 2024
Accepted Jun 26, 2024
The study of human grasp forces is fundamental for the development of
rehabilitation programs and the design of prosthetic hands in order to restore
hand function. The purpose of this work was to classify multiple grasp types
used in activities of daily living (ADLs) based on finger force data. For this
purpose, we developed a deep neural network (DNN) model using finger
forces obtained during the performance of six tests through a novel force
sensing resistor (FSR) glove system. A study was carried out with 25 healthy
subjects (mean age: 35.4±11.6 years), all right-handed. The DNN classifier showed
high overall performance, obtaining an accuracy of 93.19%, a precision of
93.33%, and an F1-score of 91.23%. Therefore, the DNN classifier in
combination with the FSR glove system is an important tool for
physiotherapists and health professionals to determine and identify finger
grasp force patterns. The DNN model will facilitate the development of
tailored and personalized rehabilitation programs for subjects recovering from
hand injuries and other hand diseases. In future work, prosthetic hand devices
can be optimized to more accurately reproduce natural grasping patterns.
Keywords:
Deep learning
Finger force
Force sensors
Human grasp
Rehabilitation
This is an open access article under the CC BY-SA license.
Corresponding Author:
Jesus Fernando Padilla-Magaña
Department of Manufacturing Technologies, Polytechnic University of Uruapan Michoacán
1200 Uruapan-Carapan Highway, Uruapan, Michoacán 60210, México
Email: fe.padilla@upu.edu.mx
1. INTRODUCTION
The human hand is one of the most complex parts of the human body, composed of 29 bones
combined with an advanced muscular and ligamentous system, which makes it difficult to study [1]. One of
the main functions of the human hand is object manipulation, which allows the performance of several
activities of daily living (ADLs). Grasping is an essential part of object manipulation, as it enables the initial
contact and control of the object. Once a proper grasp is achieved, the hand becomes capable of performing a
wide range of manipulation actions, facilitating interaction with both the object and the environment.
However, human grasping is a major challenge for people with prosthetic hands or who have suffered hand
diseases or injuries. The current state of commercially available prosthetic hands is far from approaching
human-level dexterity, even for relatively simple grasping activities [2]. The limited reliability, functionality
and durability of prosthetic hands have resulted in low utilization or abandonment of sophisticated devices
[3]. On the other hand, a typical consequence of traumatic brain injuries, degenerative brain diseases and
strokes is a decreased ability to grasp and manipulate objects [4]. People with impaired hand function have a
significantly lower quality of life because they are unable to perform ADLs, so the physical rehabilitation
process is of utmost importance. Therefore, the study of human grasping is an important subject in
biomechanics and medical rehabilitation, for the design of realistic prosthetic hands, the assessment of hand
function and the development of specific rehabilitation programs [5]–[7]. In recent years, human hand motion
(HHM) analysis has become an important tool for understanding human grasping. Several studies on HHM
have been conducted in the areas of robotics, biomechanics, occupational therapy, neuroscience and artificial
intelligence [8], [9]. HHM analysis uses several sensing technologies that provide information about hand
position, force, and velocity over time for the development of computational models to study these motions
[8]. Specifically, force information plays a vital role in the analysis of HHM, and particularly in the context
of human grasping. Among the most widely used sensors for determining hand forces in HHM are force
sensing resistors (FSRs). FSRs are robust polymer thick film (PTF) devices with piezoresistive sensing
technology, which exhibit a reduction in resistance when force is exerted on their active area [10]. FSRs
consist of a pair of split membranes with an adhesive layer that produces an air gap between them. One
membrane is coated with a special resistive ink, while the other is printed with an interdigitated circuit
composed of multiple electrically distinct traces. The resistance of the FSR is inversely proportional to the
applied force: when the sensor is pressed, its resistance decreases as the force increases, due to
the conductive ink within the sensor. Additionally, FSR sensors can measure both dynamic and static forces
[11]. Recently, FSR sensors have been used in several applications in medical rehabilitation and biomechanics
to measure and analyze human hand forces [12]–[17]. Lately, machine learning and
deep learning (DL) algorithms using force data from FSR, surface electromyography (sEMG), and force
myography (FMG) sensors have been applied to improve the accuracy of the detection and classification of
human grasp forces and motions. However, there has been limited research conducted in this area.
Li et al. [18] measured pressure distribution patterns using an array of 32 FSR sensors placed around the
forearm in combination with a support vector machine (SVM) algorithm for the classification of different
finger motions, including grasping motions. On the other hand, Wan et al. [19] classified 21 distinct hand
gestures by developing a k-nearest neighbor (kNN) classification algorithm using forearm electromyography
(EMG) signals acquired with a Myo armband and muscle pressure signals from the back of the hand acquired
with an array of five FSRs. Kakoty and Hazarika [20] used an SVM algorithm based on a radial basis
function (RBF) with forearm EMG signals to classify six different grasp types used during daily living
activities. Coskun et al. [21] proposed a convolutional neural network model (1D-CNN) to classify six grasp
types using sEMG signals as features. Jiang et al. [22] classified different types of grasping
by a linear discriminant analysis (LDA) model using an FMG system composed of an array of 16 FSR
sensors placed on the wrist. Therefore, in this study we propose the use of a deep neural network (DNN)
model for multiclass classification using fingertip forces as features, unlike previous studies that have
focused on forearm muscle force. Although traditional machine learning classification algorithms, such as
SVM, random forest (RF), and kNN, have been successfully used in clinical applications, DNNs
have significant advantages in transforming low-level features into complex high-level features across
neurons, and thus can learn more complex and nonlinear patterns [23], [24]. The aim of this work was to
classify several grasp types used in ADLs based on finger force data. Therefore, we present the
development of a low-cost, novel FSR glove system for finger force measurement. The force system
was evaluated during the performance of six tests involving several grasps used in ADLs by healthy
subjects. Subsequently, the dataset obtained was used to develop a DNN model for the
classification of the six grasp types, using as features the force data obtained with the FSR glove system. This
study contributes to the development of a novel high-performance DNN model capable of recognizing and
classifying finger force patterns associated with different types of grasping used in ADLs. These advances
offer significant advantages for the design of prosthetic hands and rehabilitation programs tailored to subjects
with upper extremity impairments.
2. MATERIALS AND METHODS
2.1. The proposed FSR glove system
The design of the FSR glove system is described in this section. The FSRs were placed at the distal
segments of the fingers because the purpose of the system was to study the force exerted on the fingertips
during the grasping of different items in diverse activities. Therefore, based on the size and functional
characteristics of the sensor, the FSR 07 model (Ohmite Manufacturing Company, USA) was chosen. The
FSR 07 model has the following features: a thickness of 0.375 mm (including adhesive), an active area of
14.7 mm, an overall length of 56.34 mm, and an overall width of 18.0 mm.
2.1.2. Signal conditioning
At this phase, a sensor conditioning circuit was built to generate a variable voltage as a function of
the force applied on the sensor. Common conditioning methods include using a voltage divider and the
inverting operational amplifier (op-amp) circuit. However, several studies [25]–[27] have demonstrated that
inverting op-amp circuits with FSRs exhibit linear behavior between voltage and pressure at both low and
high values. Since the study activities included objects of various sizes and the range of pressures would be
different in each fingertip, we decided to use an inverting op-amp for signal conditioning. In this
configuration, the output voltage (VOUT) exhibits an opposite polarity to the reference voltage (VREF).
Furthermore, as the resistance of the FSR (RFSR) increases, the VOUT decreases proportionally.
Consequently, when no pressure is applied in this configuration, the circuit generates zero VOUT as a result
of the high RFSR impedance [28]. Nonetheless, VOUT increases as force is exerted, either significantly or
minimally, based on the resistor value (RG) selected. The amplifier output is given by (1):

VOUT = -VREF × (RG / RFSR)     (1)
The inverting op-amp configuration included a combination of a voltage regulator, an op-amp, and a
resistor. We chose the LM324 quad op-amp device for its cost-effectiveness and low voltage requirements.
Subsequently, the appropriate RG was determined taking into account the following considerations. Since the
microcontroller analog inputs are limited to 5 V, exceeding this voltage would cause saturation and render
the circuit unusable. Additionally, the lowest RFSR value, reached at the highest fingertip force applied on the
FSRs, was determined with a multimeter to be 100 Ω. Therefore, an RG of 150 Ω was selected to ensure a VOUT
close to 5 V. Finally, we used a first-order active low-pass filter to attenuate high-frequency interference. A
cutoff frequency of 60 Hz was considered adequate for reducing the signal interference from the electrical
circuit. For this purpose, a 22 μF capacitor was used.
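For illustration, the following minimal Python sketch (not part of the original implementation) checks that the selected RG keeps VOUT just below the 5 V limit of the microcontroller ADC; the reference voltage of -3.3 V is an assumed value, since the paper does not state it, and only the RG/RFSR ratio matters.

```python
# Minimal sketch of the inverting op-amp relation VOUT = -VREF * (RG / RFSR).
# VREF = -3.3 V is an assumed reference voltage (not stated in the paper);
# RG and RFSR only need to be expressed in the same unit, since only their ratio matters.
V_REF = -3.3               # assumed negative reference voltage (V)
R_G = 150.0                # selected gain resistor
R_FSR_MAX_FORCE = 100.0    # FSR resistance reported at the highest fingertip force

v_out_max = -V_REF * (R_G / R_FSR_MAX_FORCE)
print(f"VOUT at maximum force: {v_out_max:.2f} V")   # ~4.95 V, just below the 5 V ADC limit

# With no pressure applied, RFSR becomes very large, so VOUT collapses toward 0 V.
R_FSR_NO_FORCE = 1e6
print(f"VOUT with no force: {-V_REF * (R_G / R_FSR_NO_FORCE):.4f} V")
```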
2.1.3. Calibration
Finally, FSR sensors underwent static calibration prior to application to minimize inaccuracies;
multiple studies have used comparable calibration procedures [14], [29], [30]. Throughout the test, the
VOUT was recorded using the Parallax data acquisition tool (PLX-DAQ) application. Calibrated
weights of different magnitudes were then mounted on each FSR to generate a plot of the relationship
between the exerted force and the VOUT, which resulted in the calibration curve equations. The results indicate that the
VOUT increases proportionally to the amount of force applied. In addition, the derived equations offer the
possibility to extend the findings to higher pressures as needed. Coefficients of correlation and calibration
equations are shown in Table 1.
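As an illustration of this static calibration step, the sketch below fits a first-order calibration curve relating VOUT to the applied force and reports the coefficient of correlation for one sensor; the weight and voltage readings are placeholders, not data from the paper.

```python
import numpy as np

# Placeholder calibration data for one FSR: forces from calibrated weights (N)
# and the corresponding output voltages recorded with PLX-DAQ.
applied_force_n = np.array([0.5, 1.0, 2.0, 3.0, 5.0, 7.5, 10.0])   # hypothetical values
v_out = np.array([0.31, 0.58, 1.10, 1.62, 2.55, 3.70, 4.80])       # hypothetical values

# First-order (linear) calibration curve: force = a * VOUT + b
a, b = np.polyfit(v_out, applied_force_n, deg=1)

# Coefficient of correlation between VOUT and the applied force
r = np.corrcoef(v_out, applied_force_n)[0, 1]
print(f"force ≈ {a:.3f} * VOUT + {b:.3f}  (r = {r:.4f})")
```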
Once the FSR signals were conditioned, we proceeded to the data acquisition phase. The
transformation from analog to digital signals followed these steps: the inverting op-amp output pins were
connected to the analog ports of an Arduino Nano to convert the analog voltage using a 10-bit analog-to-
digital converter (ADC). Next, using the equations found during the calibration process, we created a sketch
to read the analog inputs of each FSR sensor and convert them into force values. Once the force values were
acquired, they were transferred wirelessly to a user-friendly graphical user interface created in Unity v.2020.2.1.
The HC-05 Bluetooth module was used for this purpose, employing the serial port protocol (SPP). Sensor
data obtained during each of the tests were collected at a frequency of 50 Hz and stored in a
comma-separated values (CSV) file. Next, the data were filtered with a 5 Hz low-pass second-order Butterworth
filter, similar to previous studies of HHM analysis [31]–[34]. Finally, the FSR sensors were mounted on a
flexible glove over the distal segments of the fingers.
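A minimal Python sketch of this acquisition pipeline is given below; the per-sensor calibration coefficients and the synthetic input signal are placeholders, while the 10-bit ADC scaling, 50 Hz sampling rate, and 5 Hz second-order Butterworth low-pass filter follow the description above.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 50.0        # sampling frequency (Hz)
CUTOFF = 5.0     # low-pass cutoff frequency (Hz)
ADC_BITS = 10    # Arduino Nano ADC resolution
V_ADC_REF = 5.0  # ADC full-scale voltage (V)

def adc_to_force(adc_counts, slope, intercept):
    """Convert raw 10-bit ADC counts to force (N) through a linear calibration curve.
    slope and intercept are hypothetical per-sensor calibration coefficients."""
    v_out = adc_counts * V_ADC_REF / (2 ** ADC_BITS - 1)
    return slope * v_out + intercept

def lowpass(force, fs=FS, cutoff=CUTOFF, order=2):
    """Zero-phase second-order Butterworth low-pass filter of the force signal."""
    b, a = butter(order, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, force)

# Example with synthetic data: a slow grasp profile plus high-frequency noise.
t = np.arange(0, 4, 1 / FS)
raw_counts = 400 * np.clip(np.sin(np.pi * t / 4), 0, None) + 20 * np.random.randn(t.size) + 100
force = adc_to_force(raw_counts, slope=2.1, intercept=-0.2)   # hypothetical calibration
force_filtered = lowpass(force)
```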
Table 1. Calibration equations for FSR sensors
Sensor | Calibration equation | Coefficient of correlation (R²)
2.2. Participants
Twenty-five healthy individuals participated in this study, 13 women and 12 men (mean age:
35.4 ± 11.6 years; hand length (HL): 186.7 ± 13.1 mm; hand breadth (HB): 83.1 ± 7.6 mm). The inclusion criteria
included the following requirements: being right-handed, being at least 18 years old, having no history of hand
disorders or injuries, and reporting to be 100% functional with the right hand. The study was performed with
the approval of the Ethics Committee of the Universidad Politécnica de Uruapan (UPUCE/F004/2022) and
all subjects signed an informed consent form for inclusion after being informed of the protocol, which was in
accordance with the Declaration of Helsinki.
2.3. Experimental setup
The study was conducted at the Universidad Politécnica de Uruapan facilities. The HL and HB were
determined for each subject using a measuring tape. Participants were fitted with the FSR glove on the
right hand, the force system was mounted on the wrist with a Velcro strap, and Bluetooth communication with
the user interface was tested (Figure 1). Participants were then instructed to hold, grasp, and lift each item
using the wearable device. Each participant performed the six tests three times each. Six items commonly
used in ADLs were selected for testing in this study. The items varied in size, weight, and shape, requiring
different grasping configurations and forces during each test. The grasp types used during each test were
classified according to Cutkosky's grasp taxonomy [35]. The characteristics of the objects and the grasp
classification are shown in Table 2. The force data were recorded and stored in a CSV file throughout the
entire duration of each test for further statistical analysis. At the beginning of each trial, the hand was placed
horizontally on a table in a neutral position. Next, in the pre-grasp phase, no force was exerted on the object;
therefore, it was not taken into consideration in the analysis. Then, during the grasping phase, the maximum
force is reached when the object is lifted. Finally, as the participant drops the item, the force gradually
decreases and the fingers return to their initial position. As a result, the peak force values from the three trials
performed by each subject were averaged for each task. We identified these averaged values as the subject's
maximum finger force during a task. The maximum finger forces of all 25 subjects were then averaged.
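The aggregation described above can be expressed compactly with pandas; the sketch below assumes a long-format CSV with subject, test, trial, finger, and force columns (the file and column names are illustrative, not from the paper).

```python
import pandas as pd

# Assumed long-format table of filtered force samples, one row per sample.
df = pd.read_csv("fsr_force_samples.csv")   # hypothetical file with columns:
                                            # subject, test, trial, finger, force_n

# Peak force of each trial, per subject, test, and finger.
trial_peaks = (df.groupby(["subject", "test", "trial", "finger"])["force_n"]
                 .max()
                 .reset_index())

# Average of the three trial peaks = the subject's maximum finger force for the task.
subject_max = (trial_peaks.groupby(["subject", "test", "finger"])["force_n"]
                          .mean()
                          .reset_index())

# Average across the 25 subjects = mean maximum finger force per test (as in Table 4).
group_stats = subject_max.groupby(["test", "finger"])["force_n"].agg(["mean", "std"])
print(group_stats)
```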
Figure 1. A participant wearing the FSR glove system
Table 2. Objects description and grasp classification of each test
Test | Item | Size | Weight (g) | Grasp type
1 | Wooden block | 10 cm³ | 490 | Large diameter heavy wrap
2 | Wooden block | 7 cm³ | 195 | Medium wrap
3 | Wooden block | 2.5 cm³ | 6.5 | Tripod grasp
4 | Tennis ball | Diameter, 6.7 cm | 60 | Sphere precision grasp
5 | Marble | Diameter, 1.8 cm | 5.4 | Thumb+1 finger pinch
6 | Plastic tumbler with water | Diameter, 7 cm | 320 | Sphere power grasp
2.4. Deep neural network
DL is a subfield of machine learning that is based essentially on artificial neural networks
(ANNs). In turn, an ANN is a computer model inspired by the flow of data processed through neurons in
the human brain [36]. The structure of an ANN is represented as a set of layers, which are defined as input,
hidden, and output layers. The neurons in the input layer correspond to the number of features in the dataset
and pass them to the rest of the network. The hidden layers are intermediate layers between the input and
output layers and process the data. Finally, the number of neurons in the output layer matches the number of
outputs, and these neurons produce the final results. When an ANN has multiple hidden layers between the
input and output layers, it is defined as a DNN. DNNs are often employed for their ability to model complex
non-linear relationships accurately and adaptively in classification tasks [24], [37].
2.4.1. Data preprocessing
The DNN model was developed in Anaconda (Anaconda Inc., TX, USA). Once the force data were
collected with the FSR glove system, we proceeded to data pre-processing. Importantly, in the DNN
classifier we used the three maximum force values obtained for each subject during each of the six tests, so a
total of 450 samples composed the dataset. Initially, we distinguished the input and output variables within the
dataset, commonly referred to as features and responses, respectively. The response variables were the
classes corresponding to the grasp types, so the model has six labeled outputs. The dataset has 11 inputs, also
known as features, which include demographic characteristics such as age, gender, HB, and HL. In addition, the
weight of each object and the force values of each finger were also considered as features of the model. Next,
the categorical feature (gender) was converted to binary multidimensional vectors through the one-
hot encoding technique. The classes in the dataset were balanced, with 75 samples per class (25 subjects ×
three trials), and no missing values were found, so all 450 samples were employed in the DNN. Subsequently, the DNN's overall
performance was evaluated using two validation methods. First, we used the hold-out method, in which the
dataset is divided into two parts: a training set and a test set. Here, we used 75% of the data as the training set
and the remaining 25% as the test set. In addition, we used the k-fold cross-validation (CV) method. This method
enables us to evaluate and test the performance of our model in predicting data on the test set (unseen data).
The k-fold CV method is as follows: the data sample is split into k folds of equal size. Then, the model uses k−1
of the folds as the training set and the remaining fold as the test set; this process is repeated k times, using each
fold as the test set only once. During each iteration, a performance measure is computed, e.g., accuracy.
Therefore, the performance metric obtained using k-fold CV is the mean over all iterations.
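A minimal Python sketch of this preprocessing and validation setup is shown below; the file name and feature column names are assumptions, and the stratified splits are an implementation choice not stated in the paper.

```python
import pandas as pd
from sklearn.model_selection import train_test_split, StratifiedKFold

# Assumed dataset layout: 450 rows with demographic features, object weight,
# the five finger forces, and the grasp-type label (names are illustrative).
data = pd.read_csv("grasp_dataset.csv")   # hypothetical file
feature_cols = ["age", "gender", "hand_breadth", "hand_length", "object_weight",
                "thumb", "index", "middle", "ring", "little"]

X = pd.get_dummies(data[feature_cols], columns=["gender"]).astype("float32")  # one-hot encode gender
y = data["grasp_type"].astype("category").cat.codes                           # integer-coded classes

# Hold-out validation: 75% training set, 25% test set.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42)

# 5-fold cross-validation splitter for the second validation method.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
```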
2.4.2. Deep neural network configuration
The DNN classifier has five layers, which are the input layer, three hidden layers, and the output
layer, composed as follows. The input layer was defined according to the features in the dataset; therefore, 11
inputs were used. In addition, three hidden layers were used with the rectified linear unit (ReLU) activation
function and 30 neurons each. Furthermore, this is a multi-class classification problem with six outputs;
therefore, six neurons were used in the output layer with the softmax activation function. In the DNN we used the
dropout technique, which is a powerful method to prevent overfitting and efficiently combine a wide range of
neural network architectures [38]. A dropout rate of 20% was applied from the input layer to the first
hidden layer. Subsequently, the GridSearchCV (GSCV) technique was applied to the DNN classification model
to find the optimal performance of the model. Hyperparameters are adjustable parameters whose values
define the model performance and are set before the model training process. One of the most commonly used
computational methods to find the optimal hyperparameters is the GSCV technique. This method analyzes
every possible combination of parameter values of a given model using k-fold cross-validation. Finally, the
following metrics were used to evaluate the multi-class classification model: accuracy, precision, recall, and
F1-score. Unlike the accuracy metric, which is calculated in the same way as in a binary
classifier, the other multiclass classification metrics are calculated as the arithmetic mean of the
individual class metrics [39]. The formulas of the evaluation metrics are shown in Table 3.
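The architecture described above can be sketched in Keras as follows (an illustrative reconstruction rather than the authors' exact code; the loss function and integer label encoding are assumptions):

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_dnn(n_features=11, n_classes=6, optimizer="rmsprop"):
    """DNN classifier as described: 11 inputs, 20% dropout between the input and the
    first hidden layer, three hidden layers of 30 ReLU neurons, and a softmax output."""
    model = keras.Sequential([
        keras.Input(shape=(n_features,)),
        layers.Dropout(0.20),
        layers.Dense(30, activation="relu"),
        layers.Dense(30, activation="relu"),
        layers.Dense(30, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer=optimizer,
                  loss="sparse_categorical_crossentropy",  # assumes integer-coded grasp labels
                  metrics=["accuracy"])
    return model
```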
Table 3. Formulas of multi-class evaluation metrics
Metric | Formula
Accuracy | (TP + TN) / (TP + TN + FP + FN)
Precision (macro-averaged) | (1/K) Σ_k [TP_k / (TP_k + FP_k)]
Recall (macro-averaged) | (1/K) Σ_k [TP_k / (TP_k + FN_k)]
F1-score | 2 × (Precision × Recall) / (Precision + Recall)
TP=true positives; TN=true negatives; FP=false positives; FN=false negatives; K=number of classes
3. RESULTS AND DISCUSSION
3.1. Fingertip forces
In this section, we discuss the results obtained with the FSR glove system. The mean maximum
finger forces obtained from performing the six tests are shown in Table 4. Test 1, which used a large
diameter heavy wrap grasp, showed the highest forces in all five digits (thumb 6.2 N, index 3.32 N, middle 4.4 N,
ring 3.04 N, and little 1.85 N). In contrast, similar forces were exerted in tests 2 and 6 by the thumb,
index, middle, and ring fingers, as shown in Figure 2. On the other hand, test 3 (thumb 2.68 N, index 2.43 N, and
middle 1.60 N) and test 5 (thumb 1.83 N and index 1.46 N) showed the lowest forces. In addition, the results
demonstrated that the maximum total force of 18.81 N was applied during the performance of the large
diameter heavy wrap grasp. In this test a power grasp was executed; this grasp is used when it is necessary to
hold an object with considerable force and it is executed between the fingers and the palm of the hand.
Furthermore, the fingers flex more, employing flexion at all finger joints with the thumb acting as a buttress
[40]. In contrast, the results showed that the minimum total finger force was found during the performance of
the thumb+1 finger pinch (4.38 N) and the sphere precision grasp (6.71 N). In these tests, a precision grasp
was executed using the terminal pads of the thumb and one or more of the remaining fingers [41]. The precision
grasp involves the execution of delicate and precise movements, requiring the use of less force [42]. On the
other hand, the finger force distribution during each test using different grasp types is shown in Figure 2. The
finger force distribution makes it possible to determine the number of fingers used at the moment of grasping an object and
thus classify the type of grasp performed. In addition, we observed a relationship between the number of
fingers used and the object's size; similar results were found in other studies [43], [44].
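As a quick numeric check, summing the mean finger forces of test 1 reported in Table 4 reproduces the 18.81 N maximum total force quoted above:

```python
# Mean maximum finger forces for test 1 (large diameter heavy wrap), taken from Table 4.
test1_means_n = {"thumb": 6.20, "index": 3.32, "middle": 4.40, "ring": 3.04, "little": 1.85}
print(sum(test1_means_n.values()))   # 18.81 N, the maximum total force reported
```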
Table 4. Descriptive statistics of fingertip forces during the performance of the six tests
Test | Grasp type | Thumb mean (SD) | Index mean (SD) | Middle mean (SD) | Ring mean (SD) | Little mean (SD)
1 | GT_1 | 6.20 (2.19) | 3.32 (1.79) | 4.40 (1.92) | 3.04 (1.75) | 1.85 (0.99)
2 | GT_2 | 4.38 (2.16) | 1.99 (1.89) | 2.14 (1.12) | 1.35 (1.02) | 0.03 (0.11)
3 | GT_3 | 2.68 (1.64) | 2.43 (1.23) | 1.60 (1.20) | 0.18 (0.42) | 0.01 (0.04)
4 | GT_4 | 3.80 (2.02) | 1.64 (1.49) | 2.11 (0.79) | 1.05 (0.77) | 0.05 (0.18)
5 | GT_5 | 1.83 (1.26) | 2.45 (1.53) | 0.01 (0.04) | 0.12 (0.25) | 0.02 (0.08)
6 | GT_6 | 4.43 (2.07) | 1.46 (1.05) | 2.56 (1.07) | 1.09 (0.89) | 0.12 (0.26)
All forces in newtons (N); GT_1=large diameter heavy wrap; GT_2=medium wrap; GT_3=thumb+1 finger pinch; GT_4=sphere power
grasp; GT_5=sphere precision grasp; GT_6=tripod grasp; and SD=standard deviation
Figure 2. Boxplots of the maximum force values of all the fingers during each test
3.2. Deep neural network model
This section presents the results obtained from a DNN model used to classify a set of six different
grasp types commonly used in ADLs. The model utilizes force data collected with the FSR glove system as
features. The classifier was implemented in the following environment: operating system, macOS Sonoma
14; central processing unit (CPU), Intel Core i7 (2.6 GHz); and memory, 16 GB RAM.
3.2.1. Hyperparameters selection
In order to obtain the best classifier performance, the GSCV technique was applied to the DNN model
using five-fold CV. GSCV has proven to be efficient in numerous clinical investigations where machine
learning classification models were implemented [45], [46]. The hyperparameter values obtained were as
follows: batch_size: 32, epochs: 150, and optimizer: RMSprop.
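The sketch below reproduces the spirit of this hyperparameter search with a manual grid over batch size, epochs, and optimizer scored by five-fold CV accuracy (a hedged stand-in for GridSearchCV applied to the Keras model; the candidate values other than those reported above are illustrative, and build_dnn, X_train, and y_train come from the earlier sketches).

```python
import itertools
import numpy as np
from sklearn.model_selection import StratifiedKFold

# Candidate hyperparameter values (illustrative grid; only the best combination is reported above).
param_grid = {"batch_size": [16, 32, 64],
              "epochs": [50, 100, 150],
              "optimizer": ["adam", "rmsprop", "sgd"]}

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
best_score, best_params = -np.inf, None

for batch_size, epochs, optimizer in itertools.product(*param_grid.values()):
    fold_acc = []
    for train_idx, val_idx in cv.split(X_train, y_train):
        model = build_dnn(optimizer=optimizer)   # model builder from the earlier sketch
        model.fit(X_train.iloc[train_idx], y_train.iloc[train_idx],
                  batch_size=batch_size, epochs=epochs, verbose=0)
        _, acc = model.evaluate(X_train.iloc[val_idx], y_train.iloc[val_idx], verbose=0)
        fold_acc.append(acc)
    if np.mean(fold_acc) > best_score:
        best_score, best_params = np.mean(fold_acc), (batch_size, epochs, optimizer)

print("Best (batch_size, epochs, optimizer):", best_params, "CV accuracy:", round(best_score, 4))
```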
3.2.2. Deep neural network performance in the test set
The best-performing hyperparameters obtained were employed to evaluate the DNN classifier
performance in predicting results on the testing set. Table 5 presents the classification report with several
evaluation metrics for the DNN model. The results of the DNN classifier showed 100% precision, recall,
and F1-score for the classes GT_1, GT_2, GT_4, and GT_6. In contrast, the classes GT_3 and GT_5 showed
lower precision (92% and 83%), recall (80% and 94%), and F1-score (86% and 88%), respectively. The
overall performance of the DNN classifier on the test set was similar across weighted precision, weighted
recall, weighted F1-score, and accuracy, achieving 96%. Furthermore, the confusion matrix
in Figure 3 shows excellent performance in classifying the GT_1, GT_2, GT_4, and GT_6 classes. The
classification of class GT_3 showed good performance, since only 10% of the samples were
categorized as the GT_5 class. In contrast, the classification of class GT_5 showed moderate performance,
since 20% of the samples were categorized as the GT_3 class.
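For reference, the per-class report and confusion matrix of Table 5 and Figure 3 can be generated with scikit-learn as sketched below (an illustrative snippet building on the earlier sketches; the class-name ordering assumes labels encoded as GT_1 to GT_6):

```python
import numpy as np
from sklearn.metrics import classification_report, confusion_matrix

# Train the final model with the selected hyperparameters and evaluate on the hold-out test set.
final_model = build_dnn(optimizer="rmsprop")
final_model.fit(X_train, y_train, batch_size=32, epochs=150, verbose=0)

y_pred = np.argmax(final_model.predict(X_test), axis=1)
print(classification_report(y_test, y_pred,
                            target_names=[f"GT_{i}" for i in range(1, 7)]))
print(confusion_matrix(y_test, y_pred))
```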
Table 5. DNN classification report
Class | Precision | Recall | F1-score
GT_1 | 1.00 | 1.00 | 1.00
GT_2 | 1.00 | 1.00 | 1.00
GT_3 | 0.92 | 0.80 | 0.86
GT_4 | 1.00 | 1.00 | 1.00
GT_5 | 0.83 | 0.94 | 0.88
GT_6 | 1.00 | 1.00 | 1.00
Accuracy | | | 0.96
Macro avg | 0.96 | 0.96 | 0.96
Weighted avg | 0.96 | 0.96 | 0.96
Figure 3. DNN model confusion matrix
3.2.3. Cross validation
Finally, five-fold CV was implemented to evaluate and validate the overall performance of the
DNN model. The average values and standard deviations obtained for several multi-class evaluation metrics
are shown in Table 6. The DNN model showed excellent performance across several metrics, achieving an
accuracy of 93.19%, a weighted precision of 93.33%, and a weighted F1-score of 91.23%. Coskun et al. [21] presented
similar results using DL methods, obtaining an accuracy of 94.94% in the classification of six hand
movements. In contrast, Jiang et al. [22] used an LDA to classify 16 different grasps, achieving an average
accuracy of 82%. However, each of these studies used different types of sensors and information for the
development of the classification models. In our study we used FSR data. In contrast, Coskun et al. [21]
used as features sEMG signals obtained from two forearm electrodes, while Jiang et al. [22] used FMG data.
Therefore, each of the sensors used in these studies measured a different type of force. In our study, we used
FSR sensors to measure the pressure or normal force applied to the sensing element. In contrast, sEMG
sensors record electrical activity during muscle contractions using surface electrodes, while FMG relies on
FSR sensors placed on the skin or integrated into wearable devices to detect volumetric changes in the
underlying muscles during contractions and movements [47]. However, the DNN model demonstrated
lower precision in the classification of the thumb+1 finger pinch and sphere precision grasp, as shown in
the classification report in Table 5. We consider that the accuracy is lower for these
tasks because the maximum forces at the thumb and index fingertips were similar in the two tests, as shown
in Table 4 and Figure 2. Therefore, the results obtained demonstrate that utilizing DNNs for human grasp
classification based on finger forces may offer significant advantages for future work in prosthetic hand
design and the implementation of personalized rehabilitation procedures. Prosthetic devices can be improved
to accurately replicate natural grasp patterns, which will improve the user experience by providing users with
greater comfort, functionality, and confidence in performing ADLs. In addition, by identifying and classifying finger
force patterns during the performance of several types of grasping, rehabilitation programs can be tailored to
individual needs. Physiotherapists can use this information to develop interventions aimed at improving
specific grasping functions based on each person's abilities and difficulties.
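A hedged sketch of this five-fold cross-validation, computing the mean and standard deviation of the weighted metrics reported in Table 6, is given below (it reuses build_dnn, X, and y from the earlier sketches):

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
scores = {"accuracy": [], "precision": [], "recall": [], "f1": []}

for train_idx, test_idx in cv.split(X, y):
    model = build_dnn(optimizer="rmsprop")
    model.fit(X.iloc[train_idx], y.iloc[train_idx], batch_size=32, epochs=150, verbose=0)
    y_pred = np.argmax(model.predict(X.iloc[test_idx]), axis=1)
    y_true = y.iloc[test_idx]
    prec, rec, f1, _ = precision_recall_fscore_support(y_true, y_pred, average="weighted")
    scores["accuracy"].append(accuracy_score(y_true, y_pred))
    scores["precision"].append(prec)
    scores["recall"].append(rec)
    scores["f1"].append(f1)

for name, vals in scores.items():
    print(f"{name}: {100 * np.mean(vals):.2f}% ± {100 * np.std(vals):.2f}%")
```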
Table 6. Results of the evaluation metrics using a five-fold cross validation
Metric | Mean (%) | SD (%)
Accuracy | 93.19 | 1.9
Precision weighted | 93.33 | 1.7
Recall weighted | 92.31 | 3.2
F1-score weighted | 91.23 | 2.4
3.3. Limitations
Nevertheless, the present work had some limitations. We were restricted to using five FSR sensors
due to the limited number of analog inputs of the Arduino board utilized. Therefore, we believe that for future research, it would
be imperative to incorporate a data acquisition board to expand the number of available analog inputs, thus
enabling a more complete analysis of the hand. On the other hand, only FSR sensors were used in this study;
however, FMG or sEMG sensors could be included in future research. Using sEMG sensors to capture the
electrical signals from the muscles, along with FMG sensors to detect the actual muscle contractions, can
enhance the feature representation of DL models. Therefore, combining these signals with data on finger
forces obtained with FSRs could improve the classification accuracy and robustness of the models. Finally,
future research would benefit from the incorporation of a motion capture system using data gloves or inertial
measurement units (IMUs), as proven in other studies [32], [48], [49].
4. CONCLUSION
The DNN demonstrates high performance in classifying and evaluating finger force patterns during
the execution of various grasping tasks commonly used in ADLs, achieving an accuracy of 93.19%. In
addition, the model can provide real-time feedback to patients, helping them to achieve correct grasping
patterns and improve their motor skills through specific rehabilitation programs. Therefore, the combination
of the DNN classifier and the FSR glove system is an important tool for health professionals. On the other
hand, the proposed DNN model is also useful in other areas such as robotics and prosthetics. The DNN model
could be used in robotics to determine the appropriate grasp type based on object characteristics, as well as to
determine the distribution and magnitude of the finger forces necessary during object manipulation. Likewise,
prosthetic hand technology will enable more precise emulation of natural hand movements, thereby further
enhancing the quality of life for individuals using prosthetic devices.
ACKNOWLEDGMENTS
The authors acknowledge the Polytechnic University of Uruapan, Michoacán for providing the
facilities for this study and extend their gratitude to all the participants involved. This research was partially
supported by the National Council of Humanities, Sciences and Technologies (CONAHCYT) of Mexico,
grant number A1-S-44382 of CB2017-2018.
REFERENCES
[1] Ł. Jaworski and R. Karpiński, “Biomechanics of the human hand,” Journal of Technology and Exploitation in Mechanical
Engineering, vol. 3, no. 1, pp. 2833, Jun. 2017, doi: 10.35784/jteme.536.
[2] F. Cordella et al., “Literature review on needs of upper limb prosthesis users,” Frontiers in Neuroscience, vol. 10, May 2016, doi:
10.3389/fnins.2016.00209.
[3] Q. Fu and M. Santello, “Improving fine control of grasping force during hand-object interactions for a soft synergy-inspired
myoelectric prosthetic hand,” Frontiers in Neurorobotics, vol. 11, Jan. 2018, doi: 10.3389/fnbot.2017.00071.
[4] M. Nilsson, J. Ingvast, J. Wikander, and H. Von Holst, “The soft extra muscle system for improving the grasping capability in
neurological rehabilitation,” in 2012 IEEE-EMBS Conference on Biomedical Engineering and Sciences, Dec. 2012, pp. 412417,
doi: 10.1109/IECBES.2012.6498090.
[5] B. Abbasi, E. Noohi, S. Parastegari, and M. Zefran, “Grasp taxonomy based on force distribution,” in 25th IEEE International
Symposium on Robot and Human Interactive Communication, Aug. 2016, pp. 10981103, doi: 10.1109/ROMAN.2016.7745245.
[6] T. Feix, J. Romero, H. B. Schmiedmayer, A. M. Dollar, and D. Kragic, “The GRASP taxonomy of human grasp types,” IEEE
Transactions on Human-Machine Systems, vol. 46, no. 1, pp. 6677, Feb. 2016, doi: 10.1109/THMS.2015.2470657.
[7] A. García Álvarez, A. Roby-Brami, J. Robertson, and N. Roche, “Functional classification of grasp strategies used by hemiplegic
patients,” PLoS ONE, vol. 12, no. 11, p. e0187608, Nov. 2017, doi: 10.1371/journal.pone.0187608.
[8] Z. Ju and H. Liu, “Human hand motion analysis with multisensory information,” IEEE/ASME Transactions on Mechatronics, vol.
19, no. 2, pp. 456466, Apr. 2014, doi: 10.1109/TMECH.2013.2240312.
[9] Y. Xue, Z. Ju, K. Xiang, J. Chen, and H. Liu, “Multimodal human hand motion sensing and analysis-a review,IEEE Transactions on
Cognitive and Developmental Systems, vol. 11, no. 2, pp. 162175, Jun. 2019, doi: 10.1109/TCDS.2018.2800167.
[10] C. Guo, C. Wu, B. Wang, and H. Liu, “A two-dimensional deflection sensor based on force sensing resistors,” Journal of Sensors,
vol. 2017, pp. 18, 2017, doi: 10.1155/2017/1241280.
[11] A. S. Sadun, J. Jalani, and J. A. Sukor, “Force sensing resistor (FSR): a brief overview and the low-cost sensor for active
compliance control,” in First International Workshop on Pattern Recognition, , vol. 10011, p. 1001112 Jul. 2016, doi:
10.1117/12.2242950.
[12] H. Liu et al., “A glove-based system for studying hand-object manipulation via joint pose and force sensing,” in IEEE
International Conference on Intelligent Robots and Systems, Sep. 2017, vol. 2017, pp. 66176624, doi:
10.1109/IROS.2017.8206575.
[13] A. Nikonovas, A. J. L. Harrison, S. Hoult, and D. Sammut, “The application of force-sensing resistor sensors for measuring forces
developed by the human hand,” Proceedings of the Institution of Mechanical Engineers, Part H: Journal of Engineering in
Medicine, vol. 218, no. 2, pp. 121126, Feb. 2004, doi: 10.1243/095441104322984013.
[14] Q. Ye, M. Seyedi, Z. Cai, and D. T. H. Lai, “Force-sensing glove system for measurement of hand forces during motorbike riding,”
International Journal of Distributed Sensor Networks, vol. 2015, no. 11, p. 545643, Nov. 2015, doi: 10.1155/2015/545643.
[15] E. Battaglia et al., “ThimbleSense: a fingertip-wearable tactile sensor for grasp analysis,” IEEE Transactions on Haptics, vol. 9,
no. 1, pp. 121133, Jan. 2016, doi: 10.1109/TOH.2015.2482478.
[16] S. Ganeson, R. Ambar, and M. M. A. Jamil, “Design of a low-cost instrumented glove for hand rehabilitation monitoring system,”
in Proceedings-6th IEEE International Conference on Control System, Computing and Engineering, 2017, pp. 189192, doi:
10.1109/ICCSCE.2016.7893569.
[17] M. Hoda, Y. Hoda, B. Hafidh, and A. El Saddik, “Predicting muscle forces measurements from kinematics data using kinect in stroke
rehabilitation,” Multimedia Tools and Applications, vol. 77, no. 2, pp. 18851903, Jan. 2018, doi: 10.1007/s11042-016-4274-5.
[18] N. Li, D. Yang, L. Jiang, H. Liu, and H. Cai, “Combined use of FSR sensor array and SVM classifier for finger motion
recognition based on pressure distribution map,” Journal of Bionic Engineering, vol. 9, no. 1, pp. 3947, Mar. 2012, doi:
10.1016/S1672-6529(11)60095-4.
[19] B. Wan, R. Wu, K. Zhang, and L. Liu, “A new subtle hand gestures recognition algorithm based on EMG and FSR,” in
Proceedings of the 2017 IEEE 21st International Conference on Computer Supported Cooperative Work in Design, Apr. 2017,
pp. 127132, doi: 10.1109/CSCWD.2017.8066682.
[20] N. M. Kakoty and S. M. Hazarika, “Recognition of grasp types through principal components of DWT based EMG features,” in
IEEE International Conference on Rehabilitation Robotics, Jun. 2011, pp. 16, doi: 10.1109/ICORR.2011.5975398.
[21] M. Coskun, O. Yildirim, Y. Demir, and U. R. Acharya, “Efficient deep neural network model for classification of grasp types
using sEMG signals,” Journal of Ambient Intelligence and Humanized Computing, vol. 13, no. 9, pp. 44374450, Sep. 2022, doi:
10.1007/s12652-021-03284-9.
[22] X. Jiang, L. K. Merhi, and C. Menon, “Force exertion affects grasp classification using force myography,” IEEE Transactions on
Human-Machine Systems, vol. 48, no. 2, pp. 219226, Apr. 2018, doi: 10.1109/THMS.2017.2693245.
[23] X. Y. Liu, Y. Fang, L. Yang, Z. Li, and A. Walid, “High-performance tensor decompositions for compressing and accelerating
deep neural networks,” in Tensors for Data Processing: Theory, Methods, and Applications, Elsevier, 2021, pp. 293340.
[24] Y. Lecun, Y. Bengio, and G. Hinton, Deep learning,” Nature, vol. 521, no. 7553, pp. 436444, May 2015, doi:
10.1038/nature14539.
[25] S. K. Bahadir, “Identification and modeling of sensing capability of force sensing resistor integrated to E-textile structure,” IEEE
Sensors Journal, vol. 18, no. 23, pp. 97709780, Dec. 2018, doi: 10.1109/JSEN.2018.2871396.
[26] K. Sarangam, T. Appala, B. V. Vani, and A. S. Kumar, “An efficient FSR-based approach for self notifying chairs to minimize
long duration sitting postures,” in 2023 14th International Conference on Computing Communication and Networking
Technologies, Jul. 2023, pp. 15, doi: 10.1109/ICCCNT56998.2023.10306812.
[27] A. Erwin, F. Sergi, V. Chawda, and M. K. O’Malley, “Interaction control for rehabilitation robotics via a low-cost force sensing
handle,” in ASME 2013 Dynamic Systems and Control Conference, Oct. 2013, vol. 2, doi: 10.1115/DSCC2013-4073.
[28] C. Madeu, F. J. Garcia, and Y. Ye Lin, “Desarrollo de un sistema para la monitorización de contracción muscular en base a un
sensor de presión en superficie [trabajo final de grado en Internet],” Universidad politécnica de Valencia, 2016.
[29] J. A. Flórez and A. Velásquez, Calibration of force sensing resistors (fsr) for static and dynamic applications,” in 2010 IEEE
ANDESCON Conference Proceedings, Sep. 2010, pp. 16, doi: 10.1109/ANDESCON.2010.5633120.
[30] W. C. Hsu, T. Sugiarto, J. W. Chen, and Y. J. Lin, “The design and application of simplified insole-based prototypes with plantar pressure
measurement for fast screening of flat-foot,” Sensors (Switzerland), vol. 18, no. 11, p. 3617, Oct. 2018, doi: 10.3390/s18113617.
[31] V. Gracia-Ibáñez, M. Vergara, J. L. Sancho-Bru, M. C. Mora, and C. Piqueras, “Functional range of motion of the hand joints in
activities of the international classification of functioning, disability and health,” Journal of Hand Therapy, vol. 30, no. 3, pp.
337347, Jul. 2017, doi: 10.1016/j.jht.2016.08.001.
[32] V. Gracia-Ibáñez, P. J. Rodríguez-Cervantes, V. Bayarri-Porcar, P. Granell, M. Vergara, and J. L. Sancho-Bru, “Using sensorized
gloves and dimensional reduction for hand function assessment of patients with osteoarthritis,” Sensors, vol. 21, no. 23, p. 7897,
Nov. 2021, doi: 10.3390/s21237897.
[33] A. Roda-Sales, M. Vergara, J. L. Sancho-Bru, V. Gracia-Ibáñez, and N. J. Jarque-Bou, “Human hand kinematic data during
feeding and cooking tasks,” Scientific Data, vol. 6, no. 1, p. 167, Sep. 2019, doi: 10.1038/s41597-019-0175-6.
[34] N. J. Jarque-Bou, M. Vergara, J. L. Sancho-Bru, V. Gracia-Ibanez, and A. Roda-Sales, “Hand kinematics characterization while
performing activities of daily living through kinematics reduction,” IEEE Transactions on Neural Systems and Rehabilitation
Engineering, vol. 28, no. 7, pp. 15561565, Jul. 2020, doi: 10.1109/TNSRE.2020.2998642.
[35] M. R. Cutkosky, “On grasp choice, grasp models, and the design of hands for manufacturing tasks,” IEEE Transactions on
Robotics and Automation, vol. 5, no. 3, pp. 269279, Jun. 1989, doi: 10.1109/70.34763.
[36] O. A. M. López, A. M. López, and J. Crossa, “Fundamentals of artificial neural networks and deep learning,” in Multivariate
Statistical Machine Learning Methods for Genomic Prediction, Cham: Springer International Publishing, 2022, pp. 379425.
[37] A. Halnaut, R. Giot, R. Bourqui, and D. Auber, “Compact visualization of DNN classification performances for interpretation and
improvement,” in Explainable Deep Learning AI: Methods and Challenges, Elsevier, 2023, pp. 3554.
[38] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, “Dropout: a simple way to prevent neural networks
from overfitting,” Journal of Machine Learning Research, vol. 15, no. 1, pp. 19291958, 2014.
[39] M. Grandini, E. Bagli, and G. Visani, “Metrics for multi-class classification: an overview,” 2020, doi:
10.48550/arXiv.2008.05756.
[40] J. Hamill and K. Knutzen, Biomechanical basis of human movement, 3rd ed. Lippincott Williams & Wilkins, 2009.
[41] D. Falk, L. Aiello, and C. Dean, “An introduction to human evolutionary anatomy.,” Man, vol. 27, no. 2, p. 410, 1992, doi:
10.2307/2804064.
[42] Y. Yang, C. Fermüller, Y. Li, and Y. Aloimonos, “Grasp type revisited: a modern perspective on a classical feature for vision,” in
Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Jun. 2015, vol. 07-12-June-
2015, pp. 400408, doi: 10.1109/CVPR.2015.7298637.
[43] E. Peña-Pitarch, J. F. P. Magaña, N. Ticó-Falguera, A. Al Omar, I. A. Larrión, and J. V. Costa, “Virtual human hand: grasps and
fingertip deformation,” in Advances in Intelligent Systems and Computing, vol. 975, 2020, pp. 484492.
[44] J. W. Lee and K. Rim, “Measurement of finger joint angles and maximum finger forces during cylinder grip activity,” Journal of
Biomedical Engineering, vol. 13, no. 2, pp. 152162, Mar. 1991, doi: 10.1016/0141-5425(91)90062-C.
[45] N. M. Ali, R. Besar, and N. A. A. Aziz, “A case study of microarray breast cancer classification using machine learning
algorithms with grid search cross validation,” Bulletin of Electrical Engineering and Informatics (BEEI), vol. 12, no. 2, pp. 1047
1054, Apr. 2023, doi: 10.11591/eei.v12i2.4838.
[46] G. N. Ahmad, H. Fatima, Shafiullah, A. S. Saidi, and Imdadullah, “Efficient medical diagnosis of human heart diseases using
machine learning techniques with and without GridSearchCV,” IEEE Access, vol. 10, pp. 8015180173, 2022, doi:
10.1109/ACCESS.2022.3165792.
[47] E. Cho, R. Chen, L. K. Merhi, Z. Xiao, B. Pousett, and C. Menon, “Force myography to control robotic upper extremity
prostheses: a feasibility study,” Frontiers in Bioengineering and Biotechnology, vol. 4, Mar. 2016, doi:
10.3389/fbioe.2016.00018.
[48] H. S. Nam, W. H. Lee, H. G. Seo, Y. J. Kim, M. S. Bang, and S. Kim, “Inertial measurement unit based upper extremity motion
characterization for action research arm test and activities of daily living,” Sensors (Switzerland), vol. 19, no. 8, p. 1782, Apr.
2019, doi: 10.3390/s19081782.
[49] B. O’Flynn, J. Torres, J. Connolly, J. Condell, K. Curran, and P. Gardiner, “Novel smart sensor glove for arthritis rehabiliation,”
in 2013 IEEE International Conference on Body Sensor Networks, May 2013, pp. 16, doi: 10.1109/bsn.2013.6575529.
BIOGRAPHIES OF AUTHORS
Jesus Fernando Padilla-Magaña received the Ph.D. degree in the program of
automatics, robotics and vision from the Polytechnic University of Catalonia in Barcelona,
Spain. Currently, he is a full-time Professor at the Polytechnic University of Uruapan, Mexico,
in the Department of Manufacturing Technologies Engineering. He holds the distinction
of National Researcher Level I granted by CONAHCYT in the National System of
Researchers of Mexico. His research is focused on medical rehabilitation, analysis of
human motion, wearable sensors, development of medical devices, and the development of
algorithms in the area of artificial intelligence (machine learning and deep learning). He can
be contacted at email: fe.padilla@upu.edu.mx.
Isahi Sanchez-Suarez received the Ph.D. degree from the Universidad
Michoacana de San Nicolás de Hidalgo at the Institute of Physics and Mathematics. He did a
postdoctoral stay at the Mathematics Department of UNAM in Morelia, Michoacán. He holds
the distinction of National Researcher Level II granted by CONAHCYT in the
National System of Researchers of Mexico. He has been attached to the Polytechnic University
of Uruapan, Michoacán, as a full-time Professor C since 2016. He can be contacted at
email: i.sanchez@upu.edu.mx.
Esteban Peña-Pitarch received the Ph.D. degree from the Polytechnic
University of Catalonia in Barcelona, Spain. He has carried out his teaching work at the
Technical College of Manresa (EPSEM) since 1988 and belongs to the Department of
Mechanical Engineering. He has collaborated with the Institute of Industrial and Control
Engineering (IOC), UPC, since 2008, in the robotics division. His research is focused on
rehabilitation and simulation of stroke survivors, the creation of medical devices and the
application of kinematics and dynamics to the human body by way of mathematical tools
used in robotics. He can be contacted at email: esteban.pena@upc.edu.