Ambulatory Measurement and Analysis of the Lower Limb 3D Posture Using Wearable Sensor System

Conference Paper · September 2009
DOI: 10.1109/ICMA.2009.5245982 · Source: IEEE Xplore
Conference: 2009 International Conference on Mechatronics and Automation (ICMA 2009)
Abstract
An original approach for ambulatory measurement and analysis of lower-limb 3D gait posture is presented, together with a wearable sensor system developed according to the approach. To characterize the lower-limb posture, thigh orientation angles are calculated from a virtual sensor at the hip joint and a pair of analog inertial sensors (MAG³) on the thigh; the knee joint angle in the sagittal plane is calculated from the angular accelerations and angular velocities measured by the two MAG³ units on the thigh and shank, using the same virtual-sensor-based algorithm. The developed wearable sensor system was evaluated on the lower limb. Because neither the thigh orientation angles nor the knee joint angle requires integration of angular acceleration or angular velocity, the calculated results are not distorted by offset and drift. Placing virtual sensors at the hip and knee joints proved simpler, more practical, and more effective than fixing physical sensors at these joints. Comparison with a reference system showed that measurements from the developed wearable sensor system are feasible for gait analysis of patients in daily life, and the method can also be applied in other settings, such as measuring rigid-segment posture with fewer sensors and a high degree of accuracy.
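The drift-free idea in the abstract — recovering segment kinematics from the difference between two accelerometer readings on the same rigid segment, rather than integrating gyroscope output — can be illustrated with a minimal planar sketch. This is a generic illustration of the double-accelerometer principle under a 2D sagittal-plane simplification, not the authors' published algorithm; the function name and sensor layout are assumptions.

```python
def segment_kinematics(a_prox, a_dist, d):
    """Estimate angular acceleration and squared angular velocity of a
    rigid segment (e.g. the thigh) from two accelerometers mounted a
    known distance d apart along the segment, in the sagittal plane.

    a_prox, a_dist: (tangential, radial) acceleration pairs in m/s^2,
    expressed in the segment frame. No numerical integration is used,
    so the estimates carry no accumulating offset or drift.
    """
    # Rigid-body relation between two points of one segment:
    #   a_dist - a_prox = alpha x r + omega x (omega x r)
    # With r = (d, 0) along the segment, the tangential difference
    # equals alpha*d and the radial difference equals -omega^2 * d.
    alpha = (a_dist[0] - a_prox[0]) / d
    omega_sq = -(a_dist[1] - a_prox[1]) / d
    return alpha, omega_sq
```

Because gravity and joint translation affect both accelerometers equally, they cancel in the difference, which is what makes the estimate driftless.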
    • "Actually a small tilt or misplacement of the sensor would result in large variation on the measured data. It should be noted that some research works completely ignore this problem by assuming that the IMUs can be mounted precisely in a predefined orientation towards the joint [93,94]. In this study, the IMUs sensors were very securely attached to the participant's body using special straps provided by the Xsens Company. "
    ABSTRACT: This paper presents a review of different classification techniques used to recognize human activities from wearable inertial sensor data. Three inertial sensor units were used in this study and were worn by healthy subjects at key points of upper/lower body limbs (chest, right thigh and left ankle). Three main steps describe the activity recognition process: sensors’ placement, data pre-processing and data classification. Four supervised classification techniques namely, k-Nearest Neighbor (k-NN), Support Vector Machines (SVM), Gaussian Mixture Models (GMM), and Random Forest (RF) as well as three unsupervised classification techniques namely, k-Means, Gaussian mixture models (GMM) and Hidden Markov Model (HMM), are compared in terms of correct classification rate, F-measure, recall, precision, and specificity. Raw data and extracted features are used separately as inputs of each classifier. The feature selection is performed using a wrapper approach based on the RF algorithm. Based on our experiments, the results obtained show that the k-NN classifier provides the best performance compared to other supervised classification algorithms, whereas the HMM classifier is the one that gives the best results among unsupervised classification algorithms. This comparison highlights which approach gives better performance in both supervised and unsupervised contexts. It should be noted that the obtained results are limited to the context of this study, which concerns the classification of the main daily living human activities using three wearable accelerometers placed at the chest, right shank and left ankle of the subject.
    Full-text · Article · Dec 2015
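The supervised pipeline this excerpt describes — windowed features extracted from accelerometer streams, then classified with k-NN — can be sketched minimally as follows. The window length, the per-axis mean/standard-deviation feature set, and the function names are illustrative assumptions, not the cited paper's exact configuration.

```python
import numpy as np

def window_features(accel, win=50):
    """Per-window mean and standard deviation for each accelerometer
    axis: a common, minimal feature set for activity recognition.
    accel: (n_samples, n_axes) array; returns (n_windows, 2*n_axes)."""
    n = (len(accel) // win) * win               # drop the ragged tail
    w = accel[:n].reshape(-1, win, accel.shape[1])
    return np.concatenate([w.mean(axis=1), w.std(axis=1)], axis=1)

def knn_predict(X_train, y_train, x, k=3):
    """Classify one feature vector by majority vote among its k
    nearest training vectors (Euclidean distance)."""
    nearest = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]
```

Evaluation in terms of correct classification rate, precision, and recall then follows by comparing the predicted labels for held-out windows against ground-truth activity annotations.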
    • "While both methods might also be adapted and employed for abduction/adduction and inversion/eversion angle measurements, we focus only on flexion/extension. As mentioned above, this is in accordance with numerous authors [9,13,17,18,24]. Nevertheless, small additional rotations in the other dimensions do not affect any of the geometrical arguments used in the algorithms above. "
    ABSTRACT: This contribution is concerned with joint angle calculation based on inertial measurement data in the context of human motion analysis. Unlike most robotic devices, the human body lacks even surfaces and right angles. Therefore, we focus on methods that avoid assuming certain orientations in which the sensors are mounted with respect to the body segments. After a review of available methods that may cope with this challenge, we present a set of new methods for: (1) joint axis and position identification; and (2) flexion/extension joint angle measurement. In particular, we propose methods that use only gyroscopes and accelerometers and, therefore, do not rely on a homogeneous magnetic field. We provide results from gait trials of a transfemoral amputee in which we compare the inertial measurement unit (IMU)-based methods to an optical 3D motion capture system. Unlike most authors, we place the optical markers on anatomical landmarks instead of attaching them to the IMUs. Root mean square errors of the knee flexion/extension angles are found to be less than 1° on the prosthesis and about 3° on the human leg. For the plantar/dorsiflexion of the ankle, both deviations are about 1°.
    Full-text · Article · Apr 2014
    • "As the difference of both integrals, a joint angle estimate is obtained that is highly accurate up to slow drift. In a second step, a noisy but driftless joint angle estimate is calculated from the measured accelerations and the joint position vectors, as explained in [8]. Finally, these two estimates are combined by a Kalman filter to obtain a highly accurate, driftless flexion/extension angle. "
    Full-text · Conference Paper · Jan 2013 · Sensors
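The two-step fusion described in this excerpt — a gyro-integrated angle that is accurate short-term but drifts, corrected by a noisy but driftless acceleration-derived angle — can be approximated by a first-order complementary filter. This sketch replaces the Kalman filter named in the excerpt with a fixed-gain blend, and the time constant `tau` is an assumed tuning parameter.

```python
import numpy as np

def complementary_filter(gyro_rate, accel_angle, dt, tau=0.5):
    """Fuse a gyroscope angular rate (low noise, but its integral
    drifts) with a noisy, drift-free angle estimate derived from
    accelerations. Larger tau trusts the gyro integral for longer.

    gyro_rate, accel_angle: equal-length 1D arrays (rad/s and rad).
    Returns the fused angle at each time step.
    """
    gain = tau / (tau + dt)          # fixed blend weight
    angle = accel_angle[0]           # initialize from the driftless source
    fused = []
    for w, a in zip(gyro_rate, accel_angle):
        # High-pass the integrated gyro, low-pass the accel angle.
        angle = gain * (angle + w * dt) + (1.0 - gain) * a
        fused.append(angle)
    return np.array(fused)
```

With a constant gyro bias, pure integration grows without bound, while the filtered angle settles at a small bounded offset, which is the qualitative behavior the excerpt's Kalman combination achieves with an adaptive gain.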