Joint angles calculation through augmented reality
F. C. H. PASTURA*1,2, G. L. DE ALMEIDA1,3 and G. CUNHA1,4
1Coordenação dos Programas de Pós-Graduação em Engenharia (COPPE/UFRJ), Rio de Janeiro, Brazil
2Instituto Nacional de Tecnologia (INT/MCTI), Rio de Janeiro, Brazil
3Pontifícia Universidade Católica do Rio de Janeiro (PUC/Rio), Rio de Janeiro, Brazil
4Laboratório de Métodos Computacionais em Engenharia (LAMCE/COPPE/UFRJ), Rio de Janeiro, Brazil
*Corresponding author. Email: flavia.pastura@int.gov.br
Abstract
The aim of this study was to develop a computational system that assists in the analysis of working postures by calculating the joint angles of the human body. This is done by tracking markers positioned on specific anatomical points of the body. Each marker is placed on an anatomical landmark that defines a body segment of a biomechanical model in the sagittal plane. By means of computer vision using a webcam, the Processing programming language and the NyARToolkit and GSVideo libraries, the markers are recognized in sequence. After recognizing the markers, the system identifies the top left corner of each marker, draws line segments (vectors) between these vertices and calculates the joint angles defined by two vectors that share a common vertex. The system also performs postural analysis with reference to the postural angles defined in the REBA (Rapid Entire Body Assessment) tool. The main contribution of this study is that, with a low cost infrastructure comprising a webcam and paper printed markers, it was possible to develop a real time joint angle calculation and assessment tool.
Keywords: Postural analysis, Joint angles, Augmented reality.
1. Introduction
In order to conduct a postural analysis of working postures it is frequently necessary to apply auxiliary tools, such as the REBA (Rapid Entire Body Assessment) tool. However, conducting a postural analysis with REBA requires technical knowledge of how to determine joint angles from visual observation of body posture (see Figure 1).
This knowledge belongs to the specific domain of health professionals, particularly physiotherapists. For the non-health professional it is quite difficult to determine joint angles from visual observation; hence the need to develop a system that calculates these angles so that the information can be used in REBA or in other postural analysis tools.
Figure 1: REBA postural angles and scores (Hignett and McAtamney, 2000).
The aim of this study was to develop a computational system that assists in the postural analysis of working postures through real time joint angle calculation in the sagittal plane (see Figure 2), by means of tracking markers placed on specific anatomical landmarks.
A second goal was to implement the postural analysis with reference to the postural angles defined in the REBA tool. The third goal was to develop a low cost, low complexity, easy to use system.
Figure 2: Planes definition for body movement recording (http://limatreinamento.blogspot.com.br/2010_09_01_archive.html).
2. Fundamentals
According to Amadio (1985), biomechanics research methods can be classified as Kinemetry, Anthropometry, Dynamometry and Electromyography (EMG). Kinemetry deals with movement analysis from images, whether captured by video, photography or optical-electronic systems. Anthropometry determines the characteristics and properties of the locomotor system (segment dimensions; joint and center of gravity positions; lever arm lengths; mass and moment of inertia). Dynamometry deals with force measurement (external and internal forces of the locomotor system) and with pressure measurement (applied force per area ratio). EMG is the recording of the electrical activity associated with muscle contractions (see Figure 3).
Figure 3: Biomechanics research methods (Amadio, 1985).
The Kinemetry method allows the kinematic postural analysis of the human body in motion and is an essential phase of the biomechanical analysis, because it allows the description of human motion. This postural analysis is characterized by describing the body segments in space through two important kinds of angle: the joint angles, defined as the angles between body segments, which describe the joint amplitude; and the segmentary angles, defined as the inclination angles of the body segments in relation to one of the absolute axes, "x" or "y".
Postural analysis is an important stage in the study of work activities. The risk of injury and/or musculoskeletal disorders is often associated with the adopted posture(s), which may be an important factor to be modified when implementing work system changes. Therefore, the availability of field tools that facilitate postural analysis during work activities is of great help for many professionals, such as physiotherapists and ergonomists. Through this analysis it is possible to evaluate the adequacy of the human/workstation interface and to identify adjustments to be made to the workstation.
Among the whole body postural analysis methods applied to the analysis of work activities is the method called REBA - Rapid Entire Body Assessment. This method is characterized by being a postural analysis system sensitive to musculoskeletal risks in a variety of tasks; by dividing the body into segments to be coded individually, with reference to movement planes; by providing a scoring system for muscle activity caused by static, dynamic, rapidly changing
or unstable postures; by recognizing that coupling is important in the handling of loads but may not always occur via the hands; and by providing an action level with an indication of urgency.
The REBA method, together with Augmented Reality (AR) technology, formed the basis of this study.
Augmented Reality can be defined as the overlapping of virtual objects onto the physical environment, shown to the user in real time through a technological device, with the interface of the real environment adjusted so that real and virtual objects can be visualized and manipulated (Kirner, C.; Kirner, T. G., 2008).
In a recent search for Brazilian systems that use AR for human movement tracking, two systems were found: the ARPhysio system and the MMSS system, an upper limb movement measurement and recording system. The ARPhysio system is a support tool for physiotherapists that uses AR techniques to capture images of the patient, track the evolution of the patient's movements and overlay, in real time, information related to the activity being performed. This system was applied to knee joint motion analysis (Lima, J. P. S. M. et al., 2006). The upper limb movement measurement and recording system (MMSS) measures and records the upper limb joint angles during a worker's activity. The measurement and recording are made during task observation by ergonomists (Oliveira, A. S., 2008).
With regard to human movement analysis systems, the National Institute of Technology (INT) of the Ministry of Science, Technology and Innovation (MCTI), Brazil, developed an occupational biomechanical analysis system that aims to assist designers in obtaining and applying biomechanical parameters. The system works with image files (pictures and movies) and, through the inclusion of biomechanical models in the sagittal, frontal and transverse planes, allows the biomechanical analysis of movement and postural analysis, resulting in the analysis of kinematic and kinetic parameters of the human activity (see Figure 4).
Figure 4: INT biomechanics analysis system - version 1.1
3. Methods
In this study the Processing programming language, version 1.5.1, and the GSVideo and NyARToolkit libraries were used. Initially, eight anatomical landmarks were selected to define a biomechanical model and to establish the number of markers to be used in the system programming (see Figure 5). Table A presents those anatomical points (P).
Figure 5: Anatomical landmarks and biomechanical
model definition.
Table A: Anatomical landmarks defined in the system
P1 = ear canal
P2 = deltoid muscle insertion (shoulder joint)
P3 = humerus lateral epicondyle (elbow joint)
P4 = ulnar styloid process (wrist joint)
P5 = trochanter (hip joint)
P6 = femur lateral condyle (knee joint)
P7 = tibia lateral malleolus (ankle joint)
P8 = hallux tip (big toe tip)
For each anatomical landmark a corresponding binary marker was established (see Table B).
Table B: Anatomical landmarks and corresponding binary markers
P1 = Hiro marker
P2 = Kanji marker
P3 = NyID marker 0
P4 = NyID marker 1
P5 = NyID marker 2
P6 = NyID marker 3
P7 = NyID marker 4
P8 = NyID marker 5
The second stage of the study involved marker recognition through a digital camera (webcam) and the implementation of the NyARToolkit and GSVideo libraries. The camera used was a Logitech Pro 9000 HD. At this stage, the multiMarker example software, which identifies two markers ("patt.hiro" and "patt.kanji"), was taken as the basis. By applying the function "nya.addNyIdMarker", another six markers were included in the system programming, so that eight corresponding markers were created. The third stage of the study was to identify the marker vertices because, through that identification, it would be possible to construct line segments (vectors) whose endpoints are the vertices, connected in numerical sequence according to the biomechanical model (see Figure 6). This goal was achieved by applying the function "PVector p1[] = nya.getMarkerVertex2D(0)".
Figure 6: Marker vertices identification and their connection by line segments.
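A minimal Processing sketch of these two stages is given below. It assumes the MultiMarker class of the NyAR4psg (NyARToolkit for Processing) library and the GSCapture class of GSVideo; the constructor arguments, the camera parameter file "camera_para.dat" and method names such as isExistMarker follow the library examples of that period and may differ between versions, so the sketch should be read as an illustration rather than the exact code used in the study.

  import jp.nyatla.nyar4psg.*;    // NyARToolkit for Processing (NyAR4psg)
  import codeanticode.gsvideo.*;  // GSVideo capture library

  GSCapture cam;
  MultiMarker nya;

  void setup() {
    size(640, 480, P3D);
    cam = new GSCapture(this, 640, 480);       // webcam capture
    cam.start();
    nya = new MultiMarker(this, width, height,
                          "camera_para.dat", NyAR4PsgConfig.CONFIG_PSG);
    nya.addARMarker("patt.hiro", 80);          // P1 - ear canal
    nya.addARMarker("patt.kanji", 80);         // P2 - shoulder joint
    for (int id = 0; id < 6; id++) {
      nya.addNyIdMarker(id, 80);               // P3..P8 - NyId markers 0 to 5
    }
  }

  void draw() {
    if (!cam.available()) return;
    cam.read();
    image(cam, 0, 0);
    nya.detect(cam);                           // marker recognition on the current frame
    PVector[] prev = null;
    for (int i = 0; i < 8; i++) {              // markers in the biomechanical model order
      if (!nya.isExistMarker(i)) { prev = null; continue; }
      PVector[] v = nya.getMarkerVertex2D(i);  // four 2D vertices; v[0] is the top left corner
      if (prev != null) {
        line(prev[0].x, prev[0].y, v[0].x, v[0].y);  // segment between consecutive landmarks
      }
      prev = v;
    }
  }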
In the fourth stage of the study, the angle between two vectors that share a common point (x, y) was calculated (see Figure 7). For that, the arc tangent trigonometric function (θ) was applied (see Figure 8).
Figure 7: Joint angles defined between vectors.
Figure 8: Right triangle trigonometric functions (http://processing.org/learning/trig/).
During the system tests, a variation in the result of the arc tangent calculation was observed: sometimes it returned the "internal" joint angle and sometimes the "external" (supplementary) joint angle. This variation was due to the positioning of the vectors in the quadrants. Accordingly, it was decided that both angles, internal and external, should be calculated and presented by the system. That was done for all of the joint angles listed below (see Figure 7); a sketch of this calculation is given after the list:
Z = elbow angle (arm-forearm angle)
X = shoulder angle (arm-trunk angle)
Y = hip angle (trunk-thigh angle)
K = knee angle (thigh-leg angle)
W = ankle angle (leg-foot angle)
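As an illustration, a joint angle such as the elbow angle Z can be computed in Processing from the three top left corner vertices involved (P2, P3 and P4). The sketch below is not the study's original code and its variable names are illustrative; it takes atan2 of each of the two segments and the difference of the two orientations, which sidesteps the quadrant problem described above, and returns both the internal angle and its supplement.

  // Angle at the joint vertex "mid" formed by the segments mid->a and mid->b.
  // Returns the internal angle and its supplementary ("external") angle, in degrees.
  float[] jointAngles(PVector a, PVector mid, PVector b) {
    float angA = atan2(a.y - mid.y, a.x - mid.x);   // orientation of segment mid->a
    float angB = atan2(b.y - mid.y, b.x - mid.x);   // orientation of segment mid->b
    float internal = degrees(abs(angA - angB));
    if (internal > 180) internal = 360 - internal;  // fold into the range [0, 180]
    float external = 180 - internal;
    return new float[] { internal, external };
  }

  // Example: elbow angle Z at P3, between the arm (P2-P3) and the forearm (P3-P4),
  // where p2, p3 and p4 hold the top left corner vertices of the respective markers.
  // float[] elbowZ = jointAngles(p2, p3, p4);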
The final programming stage presents, in real time, all the angles calculated simultaneously, assigning them score values based on the postural angles established in the REBA tool.
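By way of example, the fragment below shows how a calculated angle could be mapped to a REBA segment score. The thresholds are those of the published REBA scoring sheet (Hignett and McAtamney, 2000) for the lower and upper arm; the function names and the sign convention (positive values for flexion, negative for extension) are assumptions made for this illustration, not the study's original code.

  // REBA lower arm score from the elbow flexion angle:
  // 60-100 degrees of flexion scores 1, anything outside that range scores 2.
  int rebaLowerArmScore(float elbowFlexion) {
    return (elbowFlexion >= 60 && elbowFlexion <= 100) ? 1 : 2;
  }

  // REBA upper arm base score from the shoulder angle (flexion positive,
  // extension negative), before the adjustments for abduction, rotation or arm support.
  int rebaUpperArmScore(float shoulderAngle) {
    if (shoulderAngle > 90) return 4;                         // more than 90 deg flexion
    if (shoulderAngle > 45) return 3;                         // 45-90 deg flexion
    if (shoulderAngle > 20 || shoulderAngle < -20) return 2;  // 20-45 deg flexion or >20 deg extension
    return 1;                                                 // 20 deg extension to 20 deg flexion
  }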
4. Results
The developed system identifies, from images captured by a webcam, markers positioned on specific anatomical points of the body that define the human body segments. By joining the markers with lines, a virtual biomechanical model of the body in the sagittal plane is built and the joint angle calculation is performed. The result is presented on screen in real time.
To evaluate the system functionality, an application was implemented using an articulated dummy (see Figures 9 and 10). As the positioning of the dummy's component parts (head, trunk, arms and legs) is changed, the system displays the joint angle values as well as the score values defined in the REBA tool.
The camera used in the study was a Logitech Pro 9000 with 640x480 pixel resolution and a maximum frame rate of 30 frames per second. The computer used an AMD Athlon II Neo K325 Dual-Core 1.30 GHz processor with 4 GB of RAM.
Figures 9 and 10: Software final result.
In a second application, daily activities were performed to evaluate the system's capacity to track the postures adopted while performing these activities.
The experimental activities were: typing, reading on screen, mouse use, searching for a briefcase in a box, and grasping and lifting a heavy travel bag (see Figures 11 to 18).
Figure 11: Keyboard typing.
Figure 12: Text reading on screen.
Figure 13: Mouse use.
F. Pastura, Joint angles calculation through augmented reality
* Corresponding author. Email: flavia.pastura@int.gov.br 6
Figure 14: Pulling organizing box.
Figure 15: Briefcase searching in the box.
Figure 16: Pushing organizing box.
Figure 17: Heavy travel bag grasping.
Figure 18: Heavy travel bag lifting.
Some problems were observed in the second application. They are listed below:
- when the posture changed, the distance between the camera and the person being filmed also varied; the camera autofocus function lost its tuning and hence the marker detection. This occurred during rapid body movements. Once the autofocus function was disabled, which was done manually, the problem was solved.
- laser printing produces reflections under artificial light that interfere with marker detection. In tests performed with markers printed on inkjet printers, that problem did not occur.
- depending on the adopted posture, some markers were not detected because they were occluded by body parts.
- for the system to operate, all the markers need to be detected, and for that reason the entire performed action needs to be fully framed in the scene. That required setting the camera on a photographic tripod with a bubble level.
- the markers were fixed to the body and to the person's clothes with tape. This method proved not to be very effective because during joint movement, for instance elbow movement, the marker also moved, getting out of its correct position over the body joint. To minimize that problem, each marker's position was checked after the posture was adopted.
5. Discussion
The results achieved were satisfactory considering the objective established for the study: the development of a system that calculates body joint angles in the sagittal plane through tracking markers placed at specific anatomical points.
That is the system's main attribute and also its limitation, since the joint angle calculation is done for only one of the three orthogonal movement planes. For the evolution of the system, the calculation of joint angles and segmental angles in both the lateral and the frontal planes should be studied.
With regard to the second objective, to perform postural analysis taking into account the postural angles defined in the REBA tool, there are
adjustments to be made. Biomechanics specialists observed an error related to the neck and trunk angle calculation according to the REBA tool. These segmental angles should be calculated with respect to the y axis, not with respect to the adjacent body segments as defined for the joint angle calculation in the system. This should be adjusted in the evolution of the system programming.
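A possible form of this adjustment, sketched below under the same assumptions as the earlier fragments, is to measure the trunk (or neck) inclination against the vertical image axis instead of against the adjacent body segment. The marker variable names are illustrative, and screen coordinates are assumed to grow downward in y, as in Processing.

  // Inclination of a body segment, from its lower landmark to its upper landmark,
  // measured with respect to the vertical (y) axis: 0 when the segment is upright.
  float segmentalAngleFromVertical(PVector upper, PVector lower) {
    float dx = upper.x - lower.x;
    float dy = lower.y - upper.y;      // flipped because screen y grows downward
    return degrees(atan2(dx, dy));
  }

  // Example: trunk inclination for the REBA trunk score, taking the shoulder (P2)
  // and hip (P5) marker corners as the upper and lower landmarks of the trunk.
  // float trunkFlexion = segmentalAngleFromVertical(p2, p5);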
Another aspect of the system that should be implemented is the capacity to store images and results together. That is important because it allows the description of human movement and enables the development of reference movement databases.
An important characteristic of the system is that it can complement the data obtained through other postural analysis methods, such as the RULA and OWAS methods; the system is not restricted to the REBA method. The possibility of comparatively evaluating the system's functionality and results against the INT system is also quite interesting. The INT system does not perform real time calculation, which makes a comparative evaluation between the two systems worth conducting.
6. Conclusion
The main aspect of this study is that, with a low cost infrastructure comprising a webcam and paper printed markers, it was possible to develop a real time joint angle calculation and assessment tool. This can be presented as an alternative and inexpensive solution for health care professionals who work with postural analysis and require simple and agile tools.
References
http://processing.org/
http://processing.org/reference/libraries/
http://nyatla.jp/nyartoolkit/wp/?page_id=361
http://www.sixwish.jp/AR/Marker/idMarker/
http://processing.org/learning/pvector/
http://www.ncbi.nlm.nih.gov/pubmed/10711982
http://vr.isdale.com/vrTechReviews/AugmentedReality_Nov2000.html
http://grva.lamce.coppe.ufrj.br/realidade_aumentada/index.php
http://realidadeaumentada.com.br/home/index.php?option=com_content&task=view&id=1&Itemid=27
http://www.ckirner.com/realidadevirtual/?DEFINI%C7%D5ES
http://limatreinamento.blogspot.com.br/2010_09_01_archive.html
Amadio, A. C., Biomechanische Analyse des Dreisprungs. Köln, 275 p. Dissertation (Doktor der Sportwissenschaften), Deutsche Sporthochschule Köln, 1985.
Amadio, A. C., Metodologia Biomecânica para o estudo das forças internas ao aparelho locomotor: importância e aplicações no movimento humano. In: Amadio, A. C.; Barbanti, V. J. (Orgs.), A Biodinâmica do movimento humano e suas relações interdisciplinares. p. 45-70, São Paulo, Editora Estação Liberdade, 2000.
Hignett, S.; McAtamney, L., Technical note: Rapid Entire Body Assessment (REBA). Applied Ergonomics 31, p. 201-205, 2000.
Oliveira, A. S.; Motta, R. A. S. M.; Oliveira, S. B.; Cunha, G. G., Uma alternativa de baixo custo para análise da atividade ergonômica: medição e registro de movimentos dos membros superiores (MMSS), 2008.
Waters, T. R.; Putz-Anderson, V.; Garg, A.; Fine, L. J., Revised NIOSH equation for the design and evaluation of manual lifting tasks. Ergonomics 36 (7), p. 749-776, 1993.
ERGOKIT, Manual de Aplicação de Dados Antropométricos. Instituto Nacional de Tecnologia, Unidade de Programas de Desenho Industrial, 1995.
Iida, I., Ergonomia: Projeto e Produção. Ed. Edgard Blücher Ltda., 1990.
Karhu, O.; Kansi, P.; Kuorinka, I., Correcting working postures in industry: a practical method for analysis. Applied Ergonomics 8 (4), p. 199-201, 1977.
Kirner, C.; Kirner, T. G., Virtual Reality and Augmented Reality Applied to Simulation Visualization. In: El Sheikh, A. A. R.; Al Ajeeli, A.; Abu-Taieh, E. M. O. (Eds.), Simulation and Modeling: Current Technologies and Applications. 1st ed., Hershey, NY: IGI Publishing, 2008, v. 1, p. 391-419. Retrieved March 14, 2010 from http://www.igi-global.com/Bookstore/Chapter.aspx?TitleId=28994
Oliveira, A., S., Motta, R., A., S., M., Oliveira, S., B., Cunha, G., G., Uma alternativa de baixo custo para análise da atividade ergonômica: medição e registro de movimentos dos membros superiores (MMSS), 2008.