Maura Mengoni's Lab
Institution: Marche Polytechnic University
Featured research (4)
The use of eXtended Reality (XR) technologies, including augmented reality (AR), virtual reality (VR), and mixed reality (MR), has become increasingly popular in museums to enhance the visitor experience. However, the impact of XR technologies on Learning Performance in the context of archeological museums needs to be better understood. This study aims to investigate the relationships between Usability, Presence and Learning Performance by developing XR experiences showcasing archeological artefacts and conducting user testing to evaluate their effectiveness. A laboratory test is conducted to compare a VR application with a mobile AR one, presenting the digital models of five archeological findings. Descriptive statistics are used to compare the two case studies, providing valuable insights into the impact of XR technologies on the visitor experience from a learning perspective. The study confirms that Usability has a more significant effect on learning than Presence and can help designers and museum managers better understand the factors contributing to a successful XR experience. The findings suggest that while Presence is an important factor in improving visitors’ experience, Usability should be the priority when designing XR experiences for museums.
Keywords: Cultural heritage; XR Technologies; Technological Benchmarking
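The abstract reports a descriptive comparison of the two XR case studies; the paper's instruments and data are not reproduced here. The snippet below is only a minimal sketch of such a comparison in Python, assuming hypothetical per-participant scores for usability (e.g., SUS), presence and a learning quiz; the column names and values are placeholders, not the study's data.

```python
import numpy as np
import pandas as pd

# Hypothetical per-participant scores for the two conditions (VR vs. mobile AR).
# Measures are placeholders: usability e.g. SUS (0-100), presence e.g. a 1-7
# questionnaire mean, learning e.g. a post-experience knowledge quiz (0-10).
data = pd.DataFrame({
    "condition": ["VR"] * 4 + ["AR"] * 4,
    "usability": [82, 75, 90, 68, 71, 79, 65, 88],
    "presence":  [5.8, 6.1, 5.2, 6.4, 4.3, 4.9, 4.1, 5.0],
    "learning":  [8, 7, 9, 6, 7, 8, 6, 9],
})

# Descriptive comparison of the two case studies: mean and standard deviation
# of each measure per condition.
print(data.groupby("condition").agg(["mean", "std"]))

# A simple way to probe which factor tracks learning more closely:
# Pearson correlation of usability and presence with the learning score.
for factor in ("usability", "presence"):
    r = np.corrcoef(data[factor], data["learning"])[0, 1]
    print(f"corr({factor}, learning) = {r:.2f}")
```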
This study aims at comparing three assembly training applications based on different XR technologies characterized by different degrees of immersion (i.e., an MR application based on HoloLens 2, a desktop AR application and a digital handbook visualized on a monitor). A total of 54 subjects, recruited among students and personnel of Università Politecnica delle Marche, were involved. They were assigned to three groups matched by age and gender. Each group was asked to complete the training related to the assembly of a commercial Lego set (i.e., LEGO 10593), using one of the three considered applications. Results allow us to observe the effects of immersion on recall performance, assessed in terms of recall completion time, assembly mistakes, picking mistakes and sequence mistakes.
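The abstract does not state which statistical analysis was applied; purely as an illustrative sketch, the snippet below compares the three groups on one of the listed outcomes (recall completion time) with a one-way ANOVA, an assumed choice for a between-subjects design. The group labels come from the abstract; the numbers and group sizes are invented placeholders.

```python
from scipy import stats

# Hypothetical recall completion times (seconds); values and group sizes are
# placeholders, not the study's data.
recall_time = {
    "MR (HoloLens 2)":  [412, 398, 455, 430, 389, 441],
    "Desktop AR":       [468, 452, 497, 480, 445, 471],
    "Digital handbook": [505, 489, 530, 512, 478, 521],
}

# One-way ANOVA across the three training conditions (assumed analysis).
f_stat, p_value = stats.f_oneway(*recall_time.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# The same comparison could be repeated for assembly, picking and sequence mistakes.
```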
Featured Application
We introduce a motion capture tool that uses at least one RGB camera, exploiting an open-source deep learning model with low computational requirements, already used to implement mobile apps for mobility analysis. Experimental results suggest the suitability of this tool to perform posture analysis aimed at assessing the RULA score in a more efficient way.
Abstract
This paper introduces a low-cost and computationally lightweight marker-less motion capture system based on the acquisition of frame images through standard RGB cameras. It exploits the open-source deep learning model CMU, from the tf-pose-estimation project. Its numerical accuracy and its usefulness for ergonomic assessment are evaluated through an experiment designed and performed to: (1) compare the data it provides with those collected from a gold-standard motion capture system; (2) compare the RULA scores obtained with its data with those obtained with data provided by the Vicon Nexus system and those estimated through video analysis by a team of three expert ergonomists. Tests have been conducted in standardized laboratory conditions and involved a total of six subjects. Results suggest that the proposed system can predict angles with good consistency and give evidence of the tool’s usefulness for ergonomists.
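The paper's angle-extraction and RULA-scoring code is not reproduced here. The following is a minimal sketch of the general idea, assuming 2D keypoints such as those produced by a pose estimator like tf-pose-estimation and showing only the base RULA upper-arm score; the keypoint coordinates, the 2D side-view approximation and the omission of RULA adjustment factors are all simplifications, not the paper's method.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (degrees) at keypoint b formed by the segments b->a and b->c,
    with a, b, c given as 2D (x, y) image coordinates."""
    v1 = np.asarray(a, float) - np.asarray(b, float)
    v2 = np.asarray(c, float) - np.asarray(b, float)
    cos_t = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

def rula_upper_arm_score(flexion_deg):
    """Base RULA upper-arm score from flexion/extension (degrees);
    adjustments (shoulder raised, abduction, arm supported) are omitted."""
    if -20.0 <= flexion_deg <= 20.0:
        return 1
    if 20.0 < flexion_deg <= 45.0 or flexion_deg < -20.0:
        return 2
    if 45.0 < flexion_deg <= 90.0:
        return 3
    return 4

# Hypothetical 2D keypoints (pixels) for hip, shoulder and elbow, as a pose
# estimator might return for a side view of the subject.
hip, shoulder, elbow = (320, 480), (322, 300), (400, 340)

# Upper-arm flexion approximated as the angle between the upper-arm segment
# (shoulder->elbow) and the trunk direction pointing down from shoulder to hip.
flexion = joint_angle(hip, shoulder, elbow)
print(f"upper-arm flexion ~ {flexion:.1f} deg -> RULA upper-arm score "
      f"{rula_upper_arm_score(flexion)}")
```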