Conference Paper

Validity of deep learning based motion capture using DeepLabCut to assess proprioception

... The entire training process consisted of 10,000 iterations, and the model's evaluation metrics were saved every 10 iterations. During training, the optimization of the model parameters involved three primary loss functions [19,20]. First, the location refinement loss (locref loss), the core loss function in DLC, minimizes the difference between the model's predicted offsets and the true offsets to improve the prediction accuracy of each body part's location. ...
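
To make the role of these losses concrete, the following is a minimal NumPy sketch of the two core terms the excerpt refers to: a cross-entropy on the part-detection scoremaps and a Huber-style penalty on the predicted location-refinement offsets. The function names, array shapes, and weighting factor are illustrative assumptions, not DLC's internal implementation.

# Minimal NumPy sketch of two DeepLabCut-style training losses: a sigmoid
# cross-entropy on the part-detection scoremaps and a Huber loss on the
# location-refinement (locref) offsets. Shapes and names are illustrative,
# not DLC's internal implementation.
import numpy as np

def scoremap_loss(logits, targets):
    """Sigmoid cross-entropy between predicted scoremap logits and binary
    target maps (1 near a labeled body part, 0 elsewhere)."""
    probs = 1.0 / (1.0 + np.exp(-logits))
    eps = 1e-9
    return -np.mean(targets * np.log(probs + eps)
                    + (1 - targets) * np.log(1 - probs + eps))

def locref_loss(pred_offsets, true_offsets, mask, delta=1.0):
    """Huber loss on the predicted (dx, dy) offsets, evaluated only where
    the mask marks pixels close to a labeled body part."""
    err = np.abs(pred_offsets - true_offsets)
    huber = np.where(err <= delta, 0.5 * err ** 2, delta * (err - 0.5 * delta))
    return np.sum(huber * mask) / max(float(np.sum(mask)), 1.0)

def total_loss(logits, targets, pred_off, true_off, mask, locref_weight=0.05):
    # The total loss is a weighted sum; the weight is a training hyperparameter.
    return scoremap_loss(logits, targets) + locref_weight * locref_loss(pred_off, true_off, mask)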
Article
Full-text available
With the ongoing development of computer vision technologies, the automation of lameness detection in dairy cows urgently requires improvement. To address the challenges of detection difficulties and technological limitations, this paper proposes an automated scoring method for cow lameness that integrates deep learning with keypoint tracking. First, the DeepLabCut tool is used to efficiently extract keypoint features during the walking process of dairy cows, which enables the automated monitoring and output of positional information. Then, the extracted positional data are combined with temporal data to construct a scoring model for cow lameness. The experimental results demonstrate that the proposed method accurately tracks the keypoints of cow movement in visible-light videos and satisfies the requirements for real-time detection. The model classifies the walking states of the cows into four levels, i.e., normal, mild, moderate, and severe lameness (corresponding to scores of 0, 1, 2, and 3, respectively). The detection results obtained in real-world environments exhibit high extraction accuracy for keypoint positional information, with an average error of only 4.679 pixels and an overall accuracy of 90.21%. The detection accuracy was 89.0% for normal cows, 85.3% for mild lameness, 92.6% for moderate lameness, and 100.0% for severe lameness. These results demonstrate that applying keypoint detection technology to the automated scoring of lameness provides an effective solution for intelligent dairy management.
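
As a rough illustration of how keypoint trajectories and temporal data could be turned into a four-level score, the sketch below loads a DeepLabCut output table and applies two placeholder gait features with hand-picked thresholds; the file name, body-part labels, feature choices, and cutoffs are assumptions and do not reproduce the paper's scoring model.

# Hypothetical sketch: derive simple temporal gait features from DeepLabCut
# keypoint trajectories and map them onto a 0-3 lameness score. File name,
# body-part labels, and thresholds are placeholders, not the paper's model.
import numpy as np
import pandas as pd

def load_keypoint(h5_path, bodypart):
    """DeepLabCut writes an HDF5 table with (scorer, bodypart, coord) columns."""
    df = pd.read_hdf(h5_path)
    scorer = df.columns.get_level_values(0)[0]
    return df[scorer][bodypart][["x", "y"]].to_numpy()

def lameness_score(h5_path):
    head = load_keypoint(h5_path, "head")        # assumed body-part label
    hoof = load_keypoint(h5_path, "rear_hoof")   # assumed body-part label
    # Example temporal features: vertical head-bob amplitude and stride irregularity.
    head_bob = np.std(head[:, 1])
    stride = np.abs(np.diff(hoof[:, 0]))
    irregularity = np.std(stride) / (np.mean(stride) + 1e-6)
    severity = head_bob * irregularity
    # Placeholder cutoffs standing in for the paper's learned scoring model.
    for score, cutoff in enumerate([1.0, 3.0, 6.0]):
        if severity < cutoff:
            return score
    return 3

# Example usage: print(lameness_score("cow_walk_DLC.h5"))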
Article
Full-text available
Noninvasive behavioral tracking of animals during experiments is critical to many scientific pursuits. Extracting the poses of animals without using markers is often essential to measuring behavioral effects in biomechanics, genetics, ethology, and neuroscience. However, extracting detailed poses without markers in dynamically changing backgrounds has been challenging. We recently introduced an open-source toolbox called DeepLabCut that builds on a state-of-the-art human pose-estimation algorithm to allow a user to train a deep neural network with limited training data to precisely track user-defined features that match human labeling accuracy. Here, we provide an updated toolbox, developed as a Python package, that includes new features such as graphical user interfaces (GUIs), performance improvements, and active-learning-based network refinement. We provide a step-by-step procedure for using DeepLabCut that guides the user in creating a tailored, reusable analysis pipeline with a graphical processing unit (GPU) in 1–12 h (depending on frame size). Additionally, we provide Docker environments and Jupyter Notebooks that can be run on cloud resources such as Google Colaboratory. This protocol describes how to use an open-source toolbox, DeepLabCut, to train a deep neural network to precisely track user-defined features with limited training data. This allows noninvasive behavioral tracking of movement.
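
For readers unfamiliar with the pipeline the protocol walks through, the script below condenses the main DeepLabCut API calls into one pass; the project name, experimenter, and video paths are placeholders, default keyword arguments are omitted, and the active-learning refinement steps described in the protocol are left out for brevity.

# Condensed sketch of the documented DeepLabCut workflow; paths and names are
# placeholders. Run on a machine with a configured GPU, or in the provided
# Docker/Colab environments.
import deeplabcut

config_path = deeplabcut.create_new_project(
    "reach-task", "experimenter", ["videos/session1.mp4"], copy_videos=True)

deeplabcut.extract_frames(config_path)            # sample frames to label
deeplabcut.label_frames(config_path)              # opens the labeling GUI
deeplabcut.create_training_dataset(config_path)
deeplabcut.train_network(config_path)
deeplabcut.evaluate_network(config_path)
deeplabcut.analyze_videos(config_path, ["videos/session1.mp4"])
deeplabcut.create_labeled_video(config_path, ["videos/session1.mp4"])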
Article
Full-text available
Quantifying behavior is crucial for many applications in neuroscience. Videography provides easy methods for the observation and recording of animal behavior in diverse settings, yet extracting particular aspects of a behavior for further analysis can be highly time consuming. In motor control studies, humans or other animals are often marked with reflective markers to assist with computer-based tracking, but markers are intrusive, and the number and location of the markers must be determined a priori. Here we present an efficient method for markerless pose estimation based on transfer learning with deep neural networks that achieves excellent results with minimal training data. We demonstrate the versatility of this framework by tracking various body parts in multiple species across a broad collection of behaviors. Remarkably, even when only a small number of frames are labeled (~200), the algorithm achieves excellent tracking performance on test frames that is comparable to human accuracy.
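
The statement that test-frame tracking matches human labeling accuracy is typically quantified as the mean Euclidean pixel distance between predicted and hand-labeled keypoints; the sketch below shows that computation on synthetic arrays and is not taken from the paper.

# Minimal sketch of the test-frame evaluation described above: mean Euclidean
# pixel error between predicted and human-labeled keypoints. Arrays are shaped
# (n_frames, n_bodyparts, 2); the data here are synthetic.
import numpy as np

def mean_pixel_error(pred, labels):
    """Euclidean x/y distance per keypoint, averaged over frames and body parts."""
    return float(np.mean(np.linalg.norm(pred - labels, axis=-1)))

rng = np.random.default_rng(0)
human = rng.uniform(0, 640, size=(200, 4, 2))          # ~200 labeled frames
network = human + rng.normal(0, 3, size=human.shape)   # small simulated tracking error
print(f"mean error: {mean_pixel_error(network, human):.2f} px")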
Article
Full-text available
The purpose of this study was to identify differences in knee proprioceptive accuracy between subjects with early knee osteoarthritis (OA), established knee OA, and healthy controls. Furthermore, the relation between proprioceptive accuracy on the one hand and functional ability, postural balance, and muscle strength on the other hand was also explored. New MRI-based classification criteria showing evidence of beginning joint degeneration have been used to identify subjects with early knee OA. A total of 45 women with knee OA (early OA, n = 21; established OA, n = 24) and 20 healthy female control subjects participated in the study. Proprioceptive accuracy was evaluated using the repositioning error of a knee joint position sense test using a three-dimensional motion analysis system. Subjective and objective functional ability was assessed by the knee injury and osteoarthritis outcome score, the timed "Up & Go" test, and the stair climbing test. The sensory organization test measured postural control. Muscle strength was measured by isokinetic dynamometry. Early OA subjects showed no significant differences in proprioceptive accuracy compared to healthy controls. In contrast, established OA subjects showed a higher repositioning error compared to early OA subjects (+29 %, P = 0.033) and healthy controls (+25 %, P = 0.068). Proprioceptive accuracy was not significantly associated with functional ability, postural balance, and muscle strength. Knee joint proprioceptive deficits were observed in established OA but not in early OA, suggesting that impaired proprioception is most likely a consequence of structural degeneration, rather than a risk factor in the pathogenesis of knee OA. Impaired proprioceptive accuracy was not associated with disease-related functionality in knee OA patients. Treatment strategies designed to address proprioceptive deficits may not be effective in preventing knee OA progression and may have no impact on patients' functionality. However, this should be confirmed further in well-designed clinical trials.
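
For context, the repositioning error used in such joint position sense tests is usually the absolute difference between a target joint angle and the angle the participant actively reproduces, averaged over trials; the short illustration below uses made-up values and is not the study's data.

# Illustrative knee joint position sense (JPS) repositioning error: the mean
# absolute difference between target and reproduced knee flexion angles.
# The angle values below are made up.
import numpy as np

target_angles = np.array([30.0, 45.0, 60.0])   # target knee flexion, degrees
reproduced    = np.array([33.5, 41.0, 63.0])   # angles reproduced by the subject

repositioning_error = np.mean(np.abs(reproduced - target_angles))
print(f"mean repositioning error: {repositioning_error:.1f} deg")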
Article
The Standardization and Terminology Committee (STC) of the International Society of Biomechanics (ISB) proposes a general reporting standard for joint kinematics based on the Joint Coordinate System (JCS), first proposed by Grood and Suntay for the knee joint in 1983 (J. Biomech. Eng. 105 (1983) 136). There is currently a lack of standard for reporting joint motion in the field of biomechanics for human movement, and the JCS as proposed by Grood and Suntay has the advantage of reporting joint motions in clinically relevant terms. In this communication, the STC proposes definitions of JCS for the ankle, hip, and spine. Definitions for other joints (such as shoulder, elbow, hand and wrist, temporomandibular joint (TMJ), and whole body) will be reported in later parts of the series. The STC is publishing these recommendations so as to encourage their use, to stimulate feedback and discussion, and to facilitate further revisions. For each joint, a standard for the local axis system in each articulating bone is generated. These axes then standardize the JCS. Adopting these standards will lead to better communication among researchers and clinicians.
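
To give a sense of what the JCS specifies, the block below sketches the Grood and Suntay construction for the knee: one axis fixed in each articulating bone plus a mutually perpendicular floating axis, with the three clinical rotations taken about those axes. The notation condenses the standard; the exact axis definitions and sign conventions for each joint (and for left versus right limbs) are given in the ISB recommendation itself.

% Sketch of the Grood-Suntay joint coordinate system for the knee.
\begin{aligned}
\hat{e}_1 &= \text{axis fixed in the proximal segment (femoral medio-lateral axis)}\\
\hat{e}_3 &= \text{axis fixed in the distal segment (tibial longitudinal axis)}\\
\hat{e}_2 &= \frac{\hat{e}_3 \times \hat{e}_1}{\lVert \hat{e}_3 \times \hat{e}_1 \rVert}
          \quad \text{(the floating axis)}\\[4pt]
\text{flexion--extension} &: \text{rotation about } \hat{e}_1\\
\text{ab--adduction} &= \beta - \tfrac{\pi}{2}, \qquad \cos\beta = \hat{e}_1 \cdot \hat{e}_3\\
\text{internal--external rotation} &: \text{rotation about } \hat{e}_3
\end{aligned}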