The Use and Usage of Virtual Reality
Technologies in Planning and
Implementing New Workstations
René REINHARD a,b,1, Peter MÅRDBERG c, Francisco GARCÍA RIVERA d, Tobias
FORSBERG e, Anton BERCE f, Mingji FANG f and Dan HÖGBERG d
a Fraunhofer-Institut für Techno- und Wirtschaftsmathematik ITWM,
67663 Kaiserslautern, Germany
b University of Kaiserslautern, Center for Cognitive Science, 67663 Kaiserslautern,
Germany
c Fraunhofer-Chalmers Centre, 412 88 Gothenburg, Sweden
d University of Skövde, School of Engineering Science, 541 28 Skövde, Sweden
e Industrial Path Solutions AB, 412 58 Gothenburg, Sweden
f China-Europe Vehicle Technology CEVT, 417 55 Gothenburg, Sweden
Abstract. Virtual reality (VR) technologies can support the planning and implemen-
tation of new workstations in various industry sectors, including in automotive as-
sembly. Starting in the early planning stages, VR can help in identifying potential
problems of new design ideas, e.g. through ergonomics analyses. Designers can then
quickly change the virtual representations of new workstations to test solutions for
the emerging difficulties. For this purpose, the actions and motions of prospective
workers can be captured while they perform the work tasks in VR. The information
can also be used as input for digital human modelling (DHM) tools, to instruct bio-
mechanical human models. The DHM tools can then construct families of manikins
that differ on anthropometric characteristics, like height, to simulate work processes.
This paper addresses both existing technologies for gathering data on human actions
and motions during VR usage and ways in which these data can be used to assist in
designing new workstations. Here, a novel approach to translate a VR user’s actions
into instructions for DHM tools through an event-based instruction sampling
method is presented. Further, the challenges for utilizing VR are discussed through
an industrial use case of the manual assembly of flexible cables in an automotive
context.
Keywords. Digital Human Modelling, Virtual Reality, Motion Tracking, Ergonom-
ics, Assembly Path Generation, Automated Manikins, Flexible Cables, Automotive
Assembly
1. Introduction
Virtual reality (VR) technologies find increasing application in many industrial settings,
including automotive assembly, construction, and energy technologies [1]. This includes
not only the assistance of product design through virtual prototypes [2] and VR-
1 Corresponding Author, Email: rene.reinhard@itwm.fraunhofer.de.
DHM2020
L. Hanson et al. (Eds.)
© 2020 The authors and IOS Press.
This article is published online with Open Access by IOS Press and distributed under the terms
of the Creative Commons Attribution Non-Commercial License 4.0 (CC BY-NC 4.0).
doi:10.3233/ATDE200047
based personnel training [3], but also the planning, evaluation, and optimization of as-
sembly processes [4, 5]. For these purposes, VR technologies allow for the interaction
with digital prototypes and with virtual representations of planned workstations even
during the early planning stages [6]. Industrial case studies indicate the usefulness of VR-
assisted design of manufacturing workstations compared to design work with only a
desktop computer set-up [7]. Thus, in the planning and design of workstations, VR can
be used to generate feedback from relevant groups quickly and early in the process, lead-
ing to more rapid iterations without the need for costly physical prototypes.
On the one hand, experts in subjects like ergonomics can experience and interact with
the proposed workstations. Here, the virtual setups can further be used to visualize the
results from path planning and digital human modelling (DHM) tools. Thus, the experts
can assess the results of assembly simulations and the predicted human motions in an
intuitive manner and help evaluate the feasibility, accessibility, and visibility during the
installation of digital prototypes, and provide impressions for ergonomic analyses. The
gained insights can then be used to adjust the digital work site, to further refine the as-
sembly path and to add to or alter the constraints for the prediction of human motion in
DHM tools.
On the other hand, members of the prospective workforce can perform the planned
actions in VR while their motions are being tracked [8]. This allows for ergonomics
evaluations that take individual characteristics of the workers, their specific constraints
and abilities into account. But it also allows for the extraction of data from their tracked
movements, which can in turn be used as input for DHM tools.
In this paper, we review currently available VR and tracking technology options and
explore their usage in planning new and evaluating existing workstations. In this context,
we present a novel event-based instruction sampling approach, in which the actions of
VR users are tracked while they are interacting with the digital workstation. This is then
used to create a simulation of the manufacturing task, where a digital manikin is in-
structed based on the actions the VR user performed. This makes it possible to evaluate
ergonomics and to repeat the simulation using manikins with different anthropometrics
than the original VR user. Moreover, this approach reduces the time needed to set up a
DHM simulation and offers non-expert users a more intuitive way to construct simulations.
2. Capturing human motions and actions in VR
In order for a VR user to experience a virtual environment and interact with the virtual
objects therein, both visualization and tracking technologies are required. In an industrial
context, the most commonly utilized VR visualization approaches are projection-based
systems and head-mounted displays (HMD) [1, 9]. Projection VR systems include single
or multiple projector-based powerwalls, as well as surrounding, walk-in setups, based on
multiple projection screens (e.g. Cave Automatic Virtual Environment (CAVE) or
CAVE-like systems). However, the current paper focuses on HMD solutions, i.e. display
devices which are affixed to the VR user’s head and typically include one or two displays
as the image source, as well as collimating optics between the eyes and the display.
These systems also typically include on-board inertial measurement units (IMUs) to
track rotational movements of the head which are then translated into corresponding ori-
entation changes in the virtual environment [10]. This can further be combined with
methods that track the position of the HMD to allow for full 6-DoF movement. This
positional tracking usually uses accelerometer dead reckoning as its basis, which is com-
bined with various additional tracking methods to correct the inertial measurement drift
[11]. These methods may make use of external hardware, e.g. an external camera that
detects infrared signals sent from the HMD [12], or lighthouse tracking, where external
base stations with stationary LED arrays and active spinning laser emitters send out LED
flashes followed by laser sweeps that are registered by photodiodes on the headset [11,
12]. The positional tracking can also be based on inside-out tracking methods, where
cameras on the device estimate the motions of the camera itself relative to the environ-
ment they model based on the recorded input [13].
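To make the combination of dead reckoning and drift correction more concrete, the following sketch fuses an accelerometer-integrated position estimate with occasional absolute fixes from an external reference (such as the camera- or lighthouse-based methods mentioned above) via a simple complementary-filter update. It is a hypothetical Python illustration of the general principle, not the tracking code of any particular headset; the gain value and update rates are arbitrary.

    import numpy as np

    class PositionalTracker:
        """Toy fusion of accelerometer dead reckoning with occasional absolute fixes."""

        def __init__(self, correction_gain=0.05):
            self.position = np.zeros(3)   # metres in the tracking volume
            self.velocity = np.zeros(3)   # metres per second
            self.gain = correction_gain   # how strongly an external fix pulls the estimate

        def predict(self, acceleration, dt):
            """Dead reckoning: integrate acceleration twice; drifts without correction."""
            self.velocity += np.asarray(acceleration, dtype=float) * dt
            self.position += self.velocity * dt

        def correct(self, measured_position):
            """Blend in an absolute fix (e.g. from lighthouse or camera tracking)."""
            error = np.asarray(measured_position, dtype=float) - self.position
            self.position += self.gain * error
            self.velocity += self.gain * error   # also damp the drift in velocity

    # Example: 90 Hz inertial updates, with an external fix every third frame.
    tracker = PositionalTracker()
    dt = 1.0 / 90.0
    for step in range(90):
        tracker.predict(acceleration=[0.0, 0.0, 0.1], dt=dt)
        if step % 3 == 0:
            tracker.correct(measured_position=[0.0, 0.0, 0.0])
    print(tracker.position)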
Besides head tracking, modern VR devices often integrate motion-tracked handheld
controllers that are tracked and visualized in 3D space and also allow for abstract inputs
via button presses [14]. In doing so, they also track the approximate spatial position of
the hand holding the controller. Newer VR controller concepts expand these capabilities
with, as yet limited, finger tracking based on sensors in the controller [15]. This already
allows some natural motions to be used to interact with virtual objects, like
grasping a virtual object by gripping the controller. Some motions, like a pinch grip, can,
however, be hindered by the geometry of the controller in the user’s hand. Finger track-
ing can be further extended by other hand and finger tracking technologies, like data
gloves or optical tracking, which can translate more natural hand and finger motions into
the virtual world. Some data gloves can also expand the feeling of touching virtual ob-
jects through tactile feedback, such as actuators on the glove that touch the hand across
its surface, or through force feedback, i.e. mechanical forces applied to the finger tips to
provide a resistance consistent with touching the object [16].
Lastly, body movements of workers at workstations can be captured. This can be
relevant for a range of questions, from posture for ergonomics evaluations and upper
body movements for capturing workers performing a task, to gait analyses for logistics
studies on a factory floor. The motion capture can be achieved by a wide
variety of approaches: optoelectronic measurements, image processing systems, ultra-
sonic localization systems, and electromagnetic- or IMU-based systems [17]. Different
motion capture systems may be more or less appropriate for certain use cases. As an
example, optoelectronic measurements, i.e. active or passive marker-based tracking with
usually fixed cameras, offer the most accurate tracking, but can be negatively impacted
by obstructions to the line-of-sight or large distances from the cameras [17]. By contrast,
IMU systems do not need additional external apparatus, are useful in a mobile context,
and are capable of capturing highly dynamic motions, but also need additional infor-
mation, e.g. from human rigid-body models, to actually offer positional data [17].
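To illustrate why IMU-based systems need such a rigid-body model to deliver positions, the sketch below chains segment orientations (as an IMU suit would provide) with assumed segment lengths to reconstruct joint positions. It is a deliberately simplified planar example in Python; real systems use full 3D rotations and calibrated body dimensions.

    import math

    def arm_joint_positions(shoulder_xy, segment_lengths, segment_angles_deg):
        """Chain rigid segments: IMUs give each segment's orientation, and the
        rigid-body model (segment lengths) turns orientations into joint positions."""
        x, y = shoulder_xy
        positions = [(x, y)]
        for length, angle_deg in zip(segment_lengths, segment_angles_deg):
            angle = math.radians(angle_deg)
            x += length * math.cos(angle)
            y += length * math.sin(angle)
            positions.append((round(x, 3), round(y, 3)))
        return positions  # shoulder, elbow, wrist

    # Illustrative values: upper arm 0.30 m pointing 60 degrees below horizontal,
    # forearm 0.25 m close to horizontal.
    print(arm_joint_positions((0.0, 1.4), [0.30, 0.25], [-60.0, -5.0]))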
In industrial use cases, more elaborate tracking options have been established in the
assessment of ergonomics. For example, Daria et al. [18] combined an IMU-based mo-
tion capture system connected to Siemens Jack with ErgoLog to perform ergonomics
evaluations for workstation simulations. Similarly, Caputo et al. [19] also used an IMU-
based motion capture system with Siemens Process Simulate to track posture, which,
together with risk screening methods, was used as the basis for ergonomics evaluations.
The tracked movements of VR users can also be useful for other use cases, e.g.
VR-based training, and, notably, as inputs for DHM tools. For example, Peruzzini et al. [20]
used a Vicon optical motion capture system for posture tracking with Delmia V5-6 for
workstation digitalization. Here, they
used Catia manikin digitalization and Haption RTI Delmia to connect the VR users’ real
movements to the virtual manikin’s movements. Similarly, García Rivera et al. [21] used IPS
VR with IPS IMMA manikins and Smart Textiles to track user movement. These appli-
cations in ergonomics evaluations and manikin instruction will be the focus of the fol-
lowing sections.
2.1. VR assisted ergonomics evaluations
The cost of work-related musculoskeletal disorders is considerable both for companies
and for the afflicted workers [22]. This includes both direct costs, such as healthcare [23],
and indirect losses, e.g. through reduced quality in the production [24]. Such disorders
are also psychologically taxing for those who suffer them [25]. The combination of mo-
tion capture and VR can help in developing new methods to address these problems.
As the health risks are closely related to the posture of the workers while they per-
form their jobs, corrective approaches can target posture either via active or passive
measures. In active corrections, the operators are informed that their posture is poten-
tially harmful [26], while in passive correction, the workstation’s design is improved to
facilitate better posture [27]. In order to design workstations that minimize the workers’
health risks, standardized ergonomics evaluation methods such as rapid upper limb as-
sessment (RULA), rapid entire body assessment (REBA), or the Ovako working posture
analysis system (OWAS) are being utilized [28]. To apply these standards, experts have
to either simulate the workers’ movements at the workstation using DHM tools, or run
tests by observing real-life motions. Traditionally, this design process makes use of 2D
screens and physical prototypes. Both approaches have advantages and disadvantages.
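As a rough illustration of how such observation-based methods turn measured or simulated postures into discrete risk scores, the sketch below bands two joint angles into score levels and sums them. The thresholds are indicative only and do not reproduce the official RULA, REBA, or OWAS scoring tables.

    def upper_arm_score(elevation_deg):
        """Band the upper-arm elevation angle into an indicative risk level (1-4)."""
        if elevation_deg < 20:
            return 1
        if elevation_deg < 45:
            return 2
        if elevation_deg < 90:
            return 3
        return 4

    def trunk_score(flexion_deg):
        """Band the trunk flexion angle into an indicative risk level (1-4)."""
        if flexion_deg < 5:
            return 1
        if flexion_deg < 20:
            return 2
        if flexion_deg < 60:
            return 3
        return 4

    def posture_risk(upper_arm_elevation_deg, trunk_flexion_deg):
        """Combine the partial scores; higher totals call for corrective measures."""
        return upper_arm_score(upper_arm_elevation_deg) + trunk_score(trunk_flexion_deg)

    # A posture with the arm raised to 75 degrees and the trunk bent 30 degrees forward.
    print(posture_risk(upper_arm_elevation_deg=75, trunk_flexion_deg=30))  # -> 6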
DHM tools can be used to economically test design ideas even during early phases
of the process and thus spot and address potential problems early on, since they allow
for rapid design changes to the workstation, to the performed task and to the anthropo-
metric characteristics of the simulated workers. Yet, the DHMs need to be instructed,
which requires expertise, time, and effort to come to representative results. Tests with
physical prototypes make direct and easy observations of workers performing the tasks
possible, but are also more costly and make changes to the workstation design more
complicated. Physical tests will therefore often be used later in the design process.
By including VR in the design process, designers can conduct ergonomics studies
earlier, without costly physical prototypes, and with the possibility of rapid
changes. For this purpose, VR has often been combined with motion capturing technol-
ogies [18, 19]. The virtual environment also offers a high degree of control over the
situation, including over factors like lighting and noise. However, the use of VR can also
have drawbacks. The sense of touch, on which assembly processes rely heavily, and the
physical resistance expected when interacting with virtual structures can often not be
adequately simulated. Further, some people may feel less present in the
virtual environment, or may even react adversely to VR usage, by developing motion
sickness-like symptoms [29]. These individual reactions to VR can in turn impact task
performance [29]. Consequently, the face validity of motions tracked in VR may not be
as clearly established as it is for real life tests on prototypes. Still, VR can be especially
useful for early tests of design ideas and to support the creation of simulations in DHM
tools. In order for VR to assist in working with DHM tools, the information about the
VR user’s actions and motions has to be made usable within the tools. In the following,
a new event-based instruction sampling approach is presented, to show how this can be
achieved and what requirements it entails.
2.2. VR assisted ergonomics evaluations in DHM tools: An event-based instruction
sampling approach
For VR to be of assistance in working with DHM tools, the relevant information from
the VR session has to first be recorded and then translated into a form that can be utilized
in later simulations. Here, we present a method that approaches this issue through the
analysis of a VR user’s actions at the virtual workstation, which results in instructions
for a DHM tool that uses IPS IMMA [30]. This approach has several requirements, both
for the used VR technology and for the DHM tool.
On the side of the adopted VR technology, information about the VR user’s move-
ments and interactions in the virtual space has to be captured. This can be realized even
with the minimal tracking equipment provided by most current HMD-based VR options,
namely the HMD itself and standard handheld VR controllers. While additional tracking
information, e.g. from finger or full body tracking, could be gainfully employed, the fol-
lowing will not presuppose access to any additional VR or tracking technology beyond
this typical setup, i.e. an HMD that is capable of both rotational and positional tracking
and motion tracked VR controllers. On the side of the DHM tool, two functionalities are
fundamental to this new method:
• An automated manikin that can interpret and automatically perform instructions
with ergonomically sound postures and motions.
• An instruction language that can be mapped against the events in the VR session
and that can be interpreted by digital manikins and other objects in the scene.
2.2.1. Automated manikins
A manikin in a DHM system can be said to be automated if it is able to automatically
perform an assembly operation. Thus, if instructed, it will perform a task automatically,
without any additional help from the user of the DHM tool. Moreover, the task needs to
be performed with ergonomically sound postures and motions which, for instance, need
to consider the balance and weight of the manikin’s body parts and of any carried objects
[31, 32]. The simulation should also consider external forces and torques, while ensuring
that the postures and motions are collision free with respect to both the manikin’s body
parts and the objects in the environment [31, 32].
2.2.2. Instruction language
The instruction language should not be limited to only manikins, since the manikin may
interact with other objects in the simulation. All objects used in the simulation can be
seen as actors that are performing a set of instructions. Such actors may include geome-
tries, manikins, or mechanical structures, with each actor having its own set of instruc-
tions to execute. Thus, simulations with both manikins and other objects in the simulation
can be created from the same instruction language [33, 34].
Figure 1. Example of events in the IPS instruction language. a) Interactions between the manikin and an object
in the scene with which a task is performed. b) Examples of possible actions for a manikin and object.
The set of instructions that the actors may perform during a simulation should de-
pend on the current state of the actor and on the objects in the simulation. For instance,
if the manikin grasps an object with the right hand, then it is not possible for the manikin
to grasp another object with the same hand unless the first object has been released [33,
34]. An example of such instructions in the IPS software is illustrated in Figure 1.
Moreover, each instruction must have a corresponding action event in the simulation.
For instance, a grasp instruction may only be used if there is an object that is available
for the manikin to take a hold of. Hence, properties of an object, such as grip, view, and
attach points, also define the set of possible instructions for the actors.
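A minimal sketch of how such a state-dependent instruction set could be represented is given below. The class names, the hand-state bookkeeping, and the grasp/release vocabulary are illustrative placeholders rather than the actual IPS instruction language.

    class SceneObject:
        """Anything in the simulation that an actor can interact with."""
        def __init__(self, name, graspable=True):
            self.name = name
            self.graspable = graspable

    class Actor:
        """A simulation participant (manikin, geometry, mechanism) executing instructions."""
        def __init__(self, name):
            self.name = name
            self.held = {"left": None, "right": None}  # state that gates the instruction set

        def allowed_instructions(self, scene_objects):
            """The valid instructions depend on the actor's state and the scene objects."""
            allowed = []
            for hand, held_obj in self.held.items():
                if held_obj is None:
                    allowed += [("grasp", hand, obj) for obj in scene_objects if obj.graspable]
                else:
                    allowed.append(("release", hand, held_obj))
            return allowed

        def execute(self, instruction, scene_objects):
            if instruction not in self.allowed_instructions(scene_objects):
                raise ValueError(f"{instruction} is not possible in the current state")
            action, hand, obj = instruction
            self.held[hand] = obj if action == "grasp" else None

    clip = SceneObject("clip_1")
    manikin = Actor("manikin")
    manikin.execute(("grasp", "right", clip), [clip])    # allowed: right hand is empty
    manikin.execute(("release", "right", clip), [clip])  # allowed: right hand holds the clip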
Depending on the state of the manikin and on the objects in the environment, work
tasks are translated to a sequence of low-level instructions in a controller structure [35].
By following the instruction sequence, a DHM tool like IPS automatically generates col-
lision-free and ergonomically sound manikin motions that accomplish the assembly tasks
[35].
2.2.3. Instruction of DHM manikins through event sampling actions in VR
To instruct the manikin, the presented approach uses a sampling procedure that considers
all the events that occur while the VR user manipulates virtual objects with the control-
lers. Each manipulation corresponds to at least one sampled event. Such events may for
example include the interaction when the user takes the virtual object, followed by the
motion of the object while the VR user holds and moves it to another location. During
the VR session, this interaction would correspond to the user pressing a button on the
controller to hold the object, moving it with the motion-tracked controllers, and then
releasing the button to let go of the object.
In the presented method, the interaction events in the VR session are translated to
the instruction language by mapping them to a fixed set of manikin actions. As an exam-
ple, when the user is taking and placing an object, this is translated into corresponding
actions of the instruction language, such as grasp and release. The IMMA manikin uses
grip, view, and attach points to interact with objects in the environment. Predefined grip
points are automatically created when a VR controller is used to manipulate an object.
Currently, a new predefined grip is created at the location where the object is grasped, as
soon as the VR user takes hold of it.
Moreover, the movements of all objects that the VR user manipulates are also doc-
umented in IPS. As an example, during the VR session, the motions of a tracked control-
ler are translated onto a gripped object. The resulting motions are logged and then move-
ment trajectories are created accordingly. These trajectories can then be included in the
simulation and correspond to a follow instruction in the instruction language. Since all
events and trajectories are time stamped, it is possible to construct an instruction se-
quence for the manikin and all other objects used in the VR session. By following the
resulting instruction sequence, a digital simulation of the VR session is created, where
the user is represented by the manikin.
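The following sketch outlines one way in which such time-stamped interaction events could be turned into an ordered instruction sequence. The event names, the grasp/follow/release mapping, and the data structures are simplified placeholders for the actual sampling mechanism described above.

    from dataclasses import dataclass

    @dataclass
    class VREvent:
        time: float           # seconds since the start of the VR session
        kind: str             # "grab", "move" or "release", as sampled from the controller
        obj: str              # name of the manipulated object
        pose: tuple = None    # controller pose for "move" events

    def sample_instructions(events):
        """Map recorded VR events onto manikin/object instructions. Consecutive
        'move' samples of one object are collected into a single 'follow' trajectory."""
        instructions, trajectory = [], []
        for ev in sorted(events, key=lambda e: e.time):
            if ev.kind == "grab":
                instructions.append((ev.time, "grasp", ev.obj, None))
            elif ev.kind == "move":
                trajectory.append(ev.pose)
            elif ev.kind == "release":
                if trajectory:
                    instructions.append((ev.time, "follow", ev.obj, list(trajectory)))
                    trajectory.clear()
                instructions.append((ev.time, "release", ev.obj, None))
        return instructions

    events = [
        VREvent(0.0, "grab", "clip_1"),
        VREvent(0.1, "move", "clip_1", (0.10, 1.20, 0.40)),
        VREvent(0.2, "move", "clip_1", (0.10, 1.30, 0.40)),
        VREvent(0.3, "release", "clip_1"),
    ]
    for time, action, obj, payload in sample_instructions(events):
        print(time, action, obj, payload)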
2.2.4. Discussion of an event-based instruction sampling approach
In the presented event-based instruction sampling approach, it is possible to capture the
actions of VR users to quickly create simulations in a DHM tool, thereby lowering the
expertise requirements for using these software options while allowing for their effective
usage. In the resulting simulations, it is possible to change the anthropometric character-
istics of the manikin and then repeat the simulation for different workers. IPS IMMA
contains a built-in functionality to simulate an entire manikin family [36, 37]. Thus, this
approach offers a straightforward and cost-effective path to ergonomics evaluations of
new workstation designs that consider workforce diversity through limited motion cap-
ture efforts in VR.
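The re-use of the recorded instruction sequence across a manikin family can be pictured as in the loop below. The driver function and the anthropometric values are hypothetical placeholders and do not correspond to the IPS IMMA API or to any specific anthropometric data set.

    # Hypothetical driver loop: re-run one recorded instruction sequence for several
    # manikins that differ in anthropometric characteristics.
    manikin_family = [
        {"name": "small",  "stature_mm": 1530},   # illustrative statures only
        {"name": "median", "stature_mm": 1755},
        {"name": "tall",   "stature_mm": 1880},
    ]

    def run_simulation(manikin, instruction_sequence):
        """Placeholder: a DHM tool would generate collision-free, ergonomically sound
        motions for this manikin and return e.g. an ergonomics score per instruction."""
        return {"manikin": manikin["name"], "instructions": len(instruction_sequence)}

    instruction_sequence = ["grasp clip_1", "follow trajectory_1", "release clip_1"]
    for manikin in manikin_family:
        print(run_simulation(manikin, instruction_sequence))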
Still, there remain challenges to this approach. When a user picks up an object, a
grip point is created. The currently automatically chosen grip type is similar to the way
that one grasps the handheld controller, but this might not reflect the VR user’s intentions.
While grip types can be adjusted later in the DHM tool, this is an additional time demand.
One approach to overcome this issue could be to let the user select a grip from a list of
predefined grip types which is then automatically aligned, adjusted, and attached to the
object. This would reduce the time that is later spend on adjusting the grips in the DHM
software, but it also constitutes an action besides naturally interacting with the objects in
VR and necessitates a certain degree of expertise to select the correct grip type. Another
approach would be to use information from finger tracking technologies to automatically
select a fitting grip type. With classical VR controllers, even with the newer models that
try to estimate finger positions based on sensors on the controller, this can be a problem.
Currently, these controllers can e.g. let VR users naturally grasp a spherical object in the
palm of their hand, which corresponds to gripping the controller, but other grip types are
not as intuitively translated into VR. In these cases, other finger tracking methods, like
data gloves [16], could be useful. However, these options are more costly and come with
higher initial time demands for equipping and calibrating the devices. New optical track-
ing options based on cameras in the HMD could also be useful, but they require a clear
line-of-sight to the hand while performing the task, which may not always be possible.
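A simple illustration of how finger-tracking data could drive automatic grip selection is sketched below; the normalised flexion values, the thresholds, and the two grip labels are invented for the example and would need tuning against real glove or camera data.

    def classify_grip(flexion):
        """Guess a grip type from normalised finger flexion values
        (0 = fully extended, 1 = fully bent). Thresholds are illustrative only."""
        thumb, index, middle, ring, pinky = (
            flexion[f] for f in ("thumb", "index", "middle", "ring", "pinky")
        )
        if thumb > 0.5 and index > 0.5 and max(middle, ring, pinky) < 0.3:
            return "pinch"          # thumb and index closed, remaining fingers open
        if min(index, middle, ring, pinky) > 0.6:
            return "power"          # all long fingers wrapped around the object
        return "undetermined"       # fall back to manual grip selection in the DHM tool

    print(classify_grip({"thumb": 0.7, "index": 0.8, "middle": 0.1, "ring": 0.1, "pinky": 0.1}))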
The presented implementation could also be extended by motion tracking for body
movements to include complex postural data and actions like squatting or kneeling in the
instruction sequence. Future implementations will also focus on capturing interactions
with complex objects, including collaborative robots and flexible cables.
Figure 2. An IPS VR user fastening a flexible cable with clips during an assembly task. Courtesy of CEVT.
2.3. Challenges for using VR in planning and implementing workstations exemplified
by a use case from the automotive industry
The manual assembly of flexible cables is a common task in the automotive industry
today. A corresponding industrial use case from the China Euro Vehicle Technology AB
(CEVT) is illustrated in Figure 2. In this use case, the worker had to assemble a cable
from the floor to the roof of a car along its B-pillar. The cable was fastened at particular
positions with clips, which are specifically designed for these circumstances. To assemble
the cable, it needed to be unfolded, routed, and fastened in a certain order, while the
internal torques and forces of the cable, as well as the forces of the clips, had to be
considered. An assembly of this type may also be performed in narrow regions and it
may lead to uncomfortable postures for the assembly worker.
As shown in Figure 2, the assembly can be performed by a VR user who guides a
manikin to assemble each of the clips along the pillar. The cable can be realistically
fastened to the car by stepwise changing the boundary conditions of the cable during the
simulation, i.e. by adding constraints to the clips on the cable. This example showcases
many of the challenges that VR can encounter when it is applied to complex industrial
use cases. To work with the clips, the manikin was instructed to utilize a pinch grip,
which did not correspond to the VR user’s grip on the controller. Further, the assembly
occurred at the B-pillar of a car, and the VR users would have expected its resistance
when affixing the clips and may have wished to lean on the structure during the assembly.
While VR users can be shown a visual impression of their avatar leaning on a virtual
object, they themselves are not provided with its physical support. Even current haptic
feedback options cannot accurately let the VR users touch and interact with such virtual
structures, especially for demanding actions like bodily leaning against them. In addition,
the work task required physically correct behavior from the flexible cable in real time at
a high frame rate, which, while possible in IPS’s VR implementation, can become per-
formance intensive. While some of these challenges can be overcome with new advances
in DHM and VR software, as well as with tracking-related hardware, complex bodily
interactions with virtual structures can likely only be approached by the introduction of
real-life elements, like a B-pillar replica in a mixed-reality setup.
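The stepwise change of the cable's boundary conditions described above can be pictured as in the following outline, where each fastened clip adds a fixed constraint before the cable's shape is recomputed. The solver call is a placeholder for the physically based cable simulation in a tool such as IPS.

    # Hypothetical outline of stepwise boundary-condition updates during clip assembly.
    clip_positions = [(0.0, 0.2, 1.1), (0.0, 0.2, 1.5), (0.0, 0.2, 1.9)]  # along the B-pillar

    def solve_cable_shape(constraints):
        """Placeholder for the cable solver: it would minimise the cable's internal
        energy subject to the clip positions that are currently fixed."""
        return f"equilibrium shape with {len(constraints)} fixed clip(s)"

    constraints = []
    for index, position in enumerate(clip_positions, start=1):
        # Fastening the next clip turns its position into a fixed boundary condition.
        constraints.append({"clip": index, "position": position})
        print(f"after clip {index}: {solve_cable_shape(constraints)}")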
3. Conclusion
VR offers many potential benefits for the planning, design, and implementation of new
workstations. It can allow for ergonomic assessments even early in the planning phases
and, through new methods like the presented event-based instruction sampling approach,
VR can also support the work in DHM-related software. The usage of VR does, however,
also come with challenges that should be considered. VR can especially be of help in
rooting out potential problems early in the design process, while during later stages
real-life prototype tests and ergonomics assessments may still have clear advantages in some
complex use cases.
Acknowledgements
This work has been made possible with support from the Swedish Governmental Agency
for Innovation Systems (VINNOVA) within the VIVA and SUMMIT projects, and the
Knowledge Foundation within the Synergy Virtual Ergonomics (SVE) project, and the
Eurostar project ED-VIMA (E!113330) supported by VINNOVA under the grant num-
ber (2019-03534), as well as by the participating organizations. It is also part of the Sus-
tainable Production Initiative and the Production Area of Advance at Chalmers Univer-
sity of Technology. This support is gratefully acknowledged.
References
[1] Berg LP, Vance JM. Industry use of virtual reality in product design and manufacturing: a survey. Vir-
tual Reality, 2017, 21, 1–17.
[2] Wolfartsberger J. Analyzing the potential of Virtual Reality for engineering design review. Automation
in Construction, 2019, 104, 27–37.
[3] Gorecky D, Khamis M, Mura K. Introduction and establishment of virtual training in the factory of the
future. International Journal of Computer Integrated Manufacturing, 2017, 30, 182–190.
[4] Maropoulos PG, Ceglarek D. Design verification and validation in product lifecycle. CIRP Annals,
2010, 59, 740–759.
[5] Abidi MH, Ahmad A, Darmoul S, Al-Ahmari AM. Haptics Assisted Virtual Assembly. IFAC-
PapersOnLine, 2015, 48, 100–105.
[6] Pontonnier C, Dumont G, Samani A, Madeleine P, Badawi M. Designing and evaluating a workstation
in real and virtual environment: toward virtual reality based ergonomic design sessions. Journal on
Multimodal User Interfaces, 2014, 8, 199–208.
[7] Peruzzini M, Pellicciari M, Gadaleta M. A comparative study on computer-integrated set-ups to design
manufacturing systems. Robotics and Computer-Integrated Manufacturing, 2019, 55, 265–278.
[8] Grajewski D, Górski F, Zawadzki P, Hamrol A. Application of Virtual Reality techniques in design of
ergonomic manufacturing workplaces. Procedia Computer Science, 2013, 25, 289–301.
[9] Liagkou V, Salmas D, Stylios C. Realizing Virtual Reality learning environment for Industry 4.0. Pro-
cedia CIRP, 2019, 79, 712–717.
[10] Desai PR, Desai PN, Ajmera KD, Mehta K. A review paper on Oculus Rift: A Virtual Reality headset.
International Journal of Engineering Trends and Technology, 2014, 13, 175–179.
[11] Niehorster DC, Li L, Lappe M. The Accuracy and precision of position and orientation tracking in the
HTC Vive Virtual Reality system for scientific research. Iperception, 2017, 8, 2041669517708205.
[12] Borrego A, Latorre J, Alcañiz M, Llorens R. Comparison of Oculus Rift and HTC Vive: Feasibility for
Virtual Reality-based exploration, navigation, exergaming, and rehabilitation. Games for Health Jour-
nal, 2018, 7, 151–156.
[13] Gourlay MJ, Held RT. Head-Mounted-Display tracking for Augmented and Virtual Reality. Infor-
mation Display, 2017, 33, 6–10.
[14] Reski N, Alissandrakis A. Open data exploration in Virtual Reality: A comparative study of input tech-
nology. Virtual Reality, 2020, 24, 1–22.
[15] Arimatsu K, Mori H. Evaluation of machine learning techniques for hand pose estimation on handheld
device with proximity sensor. In: CHI Conference on Human Factors in Computing Systems, 2020,
New York, USA, 1–13.
[16] Perret J, Poorten EV. Touching Virtual Reality: A review of haptic gloves. In: Borgmann H, editor.
ACTUATOR 2018: 16th International Conference on New Actuators, 2018, Bremen, Germany, 1–5.
[17] van der Kruk E, Reijne MM. Accuracy of human motion capture systems for sport applications: State-
of-the-art review. European Journal of Sport Science, 2018, 18, 806–819.
[18] Daria B, Martina C, Alessandro P, Fabio S, Valentina V, Zennaro I. Integrating mocap system and im-
mersive reality for efficient human-centred workstation design. IFAC-PapersOnLine, 2018, 51, 188–193.
[19] Caputo F, Greco A, d’Amato E, Notaro I, Spada S. A preventive ergonomic approach based on virtual
and immersive reality. In: Rebelo F, Soares M, editors. Advances in Ergonomics in Design, Springer
International Publishing, Cham, Switzerland, 2018, 3–15.
[20] Peruzzini M, Carassai S, Pellicciari M. The benefits of human-centred design in industrial practices:
Re-design of workstations in pipe industry. Procedia Manufacturing, 2017, 11, 1247–1254.
[21] García Rivera F, Brolin E, Syberfeldt A, Högberg D, Iriondo Pascual A, Perez Luque E. Using Virtual
Reality and smart textiles to assess the design of workstations. In: Proceedings of the 9th Swedish Pro-
duction Symposium (SPS2020), 2020, Oct 6-9, Jönköping, Sweden.
[22] Bhattacharya A. Costs of occupational musculoskeletal disorders (MSDs) in the United States. Interna-
tional Journal of Industrial Ergonomics, 2014, 44, 448–454.
[23] Ramos DG, Arezes PM, Afonso P. Analysis of the return on preventive measures in musculoskeletal
disorders through the benefit-cost ratio: A case study in a hospital. International Journal of Industrial
Ergonomics, 2017, 60, 14–25.
[24] Falck A-C, Rosenqvist M. A model for calculation of the costs of poor assembly ergonomics (part 1).
International Journal of Industrial Ergonomics, 2014, 44, 140–147.
[25] Jose JA. Outcome measures and prognosis of WRMSD. Work, 2012, 41 Suppl 1, 4848–4849.
[26] Mahdavian N, Lind CM, Diaz-Olivares J, Pascual A, Högberg D, Brolin E, et al. Effect of giving feed-
back on postural working techniques. In: Advances in transdisciplinary engineering, 2018, 247-252.
[27] Shikdar AA, Al-Hadhrami MA. Smart workstation design: An ergonomics and methods engineering
approach. International Journal of Industrial and Systems Engineering, 2005, 2, 363–374.
[28] David GC. Ergonomic methods for assessing exposure to risk factors for work-related musculoskeletal
disorders. Occupational Medicine, 2005, 55, 190–199.
[29] Weech S, Kenny S, Barnett-Cowan M. Presence and cybersickness in Virtual Reality are negatively
related: A review. Frontiers in Psychology, 2019, 10, 158.
[30] Högberg D, Hanson L, Bohlin R, Carlson JS. Creating and shaping the DHM tool IMMA for ergo-
nomic product and production design. International Journal of the Digital Human, 2016, 1, 132–152.
[31] Bohlin R, Delfs N, Hanson L, Högberg D, Carlson JS. Automatic creation of virtual manikin motions
maximizing comfort in manual assembly processes. In: Hu SJ, editor. 4th CIRP Conference on Assem-
bly Technologies and Systems, 2012, Ann Arbor, USA, 209–212.
[32] Delfs N, Bohlin R, Hanson L, Högberg D, Carlson J. Introducing stability of forces to the automatic
creation of digital human postures. In: 2nd International Digital Human Modeling Symposium
(DHM2013), 2013.
[33] Mårdberg P, Carlson JS, Bohlin R, Delfs N, Gustavsson S, Keyvani A, Hanson L. Introducing a formal
high-level language for instructing automated manikins. In: 2nd International Digital Human Modeling
Symposium (DHM2013), 2013.
[34] Mårdberg P, Carlson JS, Bohlin R, Delfs N, Gustafsson S, Hanson L. Using a formal high-level lan-
guage to instruct manikins to assemble cables. Procedia CIRP, 2014, 23, 29–34.
[35] Mårdberg P, Yan Y, Bohlin R, Delfs N, Gustafsson S, Carlson JS. Controller hierarchies for efficient
virtual ergonomic assessments of manual assembly sequences. Procedia CIRP, 2016, 44, 435–440.
[36] Hanson L, Högberg D, Carlson JS, Delfs N, Brolin E, Mårdberg P, et al. Chapter 11 - Industrial Path
Solutions – Intelligently Moving Manikins. In: Scataglini S, Paul G, editors. DHM and posturography:
Academic Press, Cambridge, 2019, 115–124.
[37] Brolin E. Anthropometric diversity and consideration of human capabilities: Methods for virtual prod-
uct and production development [PhD Thesis]. Göteborg: Chalmers University of Technology, 2016.