Use of Virtual Reality as a Tool for Evaluating a Lunar Habitat
Corrado Testi∗, David Nagy†, Joshua Dow‡
SICSA, Houston, TX, 77004
and
Dr. Olga Bannova§
SICSA, Houston, TX, 77004
This paper investigates the use of Extended Reality (XR) technologies, including Virtual
Reality (VR), Augmented Reality (AR), and Mixed Reality (MR), in the iterative design
and evaluation of lunar habitats. Conducted over a four-month period at the University
of Houston’s Sasakawa International Center for Space Architecture (SICSA) and NASA’s
Marshall Space Flight Center (MSFC), the study aimed to create a comprehensive design
process that incorporates surface operations scenarios (ConOps), evaluation methodologies,
and human-centered design analysis. The research focused on a Surface Vertical Habitat,
a Node Surface Module, and two Horizontal Modules designed by the LASERR team. By
integrating VR into the design process, the study allowed for continuous monitoring and
real-time feedback, providing detailed insights into habitat configurations, interior layouts,
and operational scenarios from an immersive human perspective. The findings demonstrate
that XR technologies can significantly enhance design validation, offering faster and more
cost-effective methods compared to traditional approaches. This paper presents the results of
the study, discusses the limitations of XR applications, and outlines future steps for advancing
XR technologies in space habitat development.
I. Nomenclature
XR = Extended Reality
AR = Augmented Reality
MR = Mixed Reality
VR = Virtual Reality
SICSA = Sasakawa International Center for Space Architecture
MSFC = Marshall Space Flight Center
TLX = Task Load Index
SUS = System Usability Scale
MSUS = Modified Simulator Sickness Questionnaire
EVA = Extra-Vehicular Activity
IVA = Intra-Vehicular Activity
LASERR = Lunar Approach: Standardized, Expandable, Reusable, Relocatable
II. Introduction
This paper explores the integration of Extended Reality (XR) technologies, encompassing Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) [1], into the design and evaluation processes for lunar habitats.
Conducted over a four-month period at the University of Houston’s Sasakawa International Center for Space Architecture
(SICSA) and NASA’s Habitation System Development Office at Marshall Space Flight Center (MSFC) in 2023, the
∗Graduate Student, University of Houston Sasakawa International Center for Space Architecture, ctesti@cougarnet.uh.edu
†Graduate Student, University of Houston Sasakawa International Center for Space Architecture, dznagy@cougarnet.uh.edu
‡Graduate Student, University of Houston Sasakawa International Center for Space Architecture, jedow@cougarnet.uh.edu
§SICSA Director, University of Houston Sasakawa International Center for Space Architecture, obannova@central.uh.edu
AIAA AVIATION FORUM AND ASCEND 2024, 29 July – 2 August 2024, Las Vegas, Nevada. DOI: 10.2514/6.2024-4842. Copyright © 2024 SICSA, University of Houston. Published by the American Institute of Aeronautics and Astronautics, Inc., with permission.
project followed an iterative development approach. XR technologies, particularly VR, were utilized as essential tools to
continuously monitor and assess design elements from a human-centered, immersive perspective.
XR technologies offer transformative capabilities across various industries by providing immersive and interactive
experiences. In the space industry, they are increasingly adopted for mission planning, astronaut training, and habitat
design [2]. VR enables rapid prototyping and real-time feedback, facilitating detailed evaluations of habitat configurations,
interior layouts, and operational scenarios. This iterative approach allows designers and testers to identify and resolve
potential human factors and usability issues early in the design phase, which is crucial for the practicalities of living and
working in a lunar environment.
The project focused on a Surface Vertical Habitat, a Node Surface Module, and two Horizontal Modules designed
by the LASERR team (Lunar Approach: Standardized, Expandable, Reusable, Relocatable). By immersing designers
and testers in a virtual lunar environment, VR provided insights into the practicalities and challenges of constructing
and maintaining lunar habitats. Commercially available VR headsets, such as Meta Quest and SteamVR, made these
advanced evaluations more accessible and cost-effective, demonstrating the potential for broader adoption of XR
technologies in space mission planning and habitat design.
This research aimed to define a comprehensive design process that includes possible surface operations scenarios
(ConOps), an evaluation methodology, and human-centered design analysis. The XR framework integrated VR into the
design process, offering a faster and more streamlined validation compared to traditional methods based on physical
mockups. The results showed that XR technologies could significantly enhance design validation, improving overall
design quality and efficiency.
The use of XR technologies in the design and evaluation process represents a significant advancement in space
architecture, providing a more comprehensive understanding of how proposed designs would function under the
demanding lunar surface conditions. The integration of XR offers new capabilities for in-depth design analysis and
aligns with current evaluation standards, potentially revolutionizing how space habitats are designed and tested [3]. This
study demonstrates the transformative impact of XR technologies on space habitat development and the implications of
these technologies on future development. The paper presents the methodologies employed, the findings from VR-based
evaluations, and outlines the next steps for advancing these methodologies in space exploration.
A. Augmented Reality (AR)
Augmented Reality (AR) overlays digital information onto the real world, enhancing the user’s perception of their
surroundings [4]. This technology allows users to interact with digital elements while still being aware of and engaged
with the physical world. AR is used in a variety of applications, such as overlaying navigational aids on a real-world
environment, enhancing educational experiences with interactive content, and assisting in complex tasks by providing
real-time data and visual instructions. In the context of space exploration, AR can be used to overlay technical data on
spacecraft components during maintenance operations or to guide astronauts during extravehicular activities (EVAs) [1].
B. Virtual Reality (VR)
Virtual Reality (VR) creates a fully immersive digital environment that isolates the user from the real world [4]. Users
wear a VR headset that tracks their head movements and adjusts the display accordingly, creating the illusion of being
present in a different environment. VR is widely used in training simulations, gaming, architectural visualization, and
education. In the space industry, VR is particularly valuable for training astronauts by simulating various mission
scenarios, such as docking procedures, surface operations on the Moon or Mars, and emergency response drills. By
providing a risk-free environment for practice, VR helps prepare astronauts for the complexities and challenges of space
missions.
C. Mixed Reality (MR)
Mixed Reality (MR) combines elements of both AR and VR, allowing real and virtual objects to interact in real
time [4]. MR enables users to manipulate and interact with both physical and digital elements, creating a more integrated
and interactive experience. MR technologies often require more advanced hardware, such as headsets with built-in
cameras and sensors that map the physical environment and overlay digital content onto it. In space habitat design and
evaluation, MR can be used to visualize and test habitat configurations by blending real-world, low-fidelity mockups with digital models, allowing designers to interact with and modify their designs dynamically. A graphic illustrating an overview of AR, VR, and MR can be seen below in Figure 1.
Figure 1 Overview of XR (AR – VR – MR). Credit: envision-is.com
III. XR Framework
Extended Reality (XR) technologies, which include augmented reality (AR), virtual reality (VR), and mixed reality
(MR), have experienced significant advancements over the past decade, driven largely by commercial developments
from companies like Meta and Apple [5]. These technologies are changing how we engage with digital information and
environments and are providing unique opportunities for advancement in various fields, including space exploration [2].
Originally utilized by space agencies for scientific and training applications, the rapid development of hardware and
reduction in costs have made advanced VR headsets and XR applications accessible to more industries, academia, and
general consumers. The broad accessibility of leading XR development platforms, Unreal Engine and Unity, has further
propelled the technology’s growth, leading to its application in fields such as tourism, education, retailing, gaming,
healthcare, and manufacturing.
At the Sasakawa International Center for Space Architecture (SICSA) at the University of Houston, ongoing research
focuses on human space exploration and the design of habitats and infrastructure for space and other planets. Supported
by The Boeing Company, SICSA has initiated comprehensive research into the feasibility of using XR as a tool for
design validation. This research aims to develop a framework to standardize the use of XR in the design process,
addressing gaps in current practices and establishing measurable indicators for evaluating XR’s impact. NASA’s System
Design Process involves four main phases: identifying stakeholder expectations, defining requirements, decomposing
these requirements logically, and developing design solutions. However, the fourth phase, design solution definition, faces several challenges, including limited resources for building physical mockups, subjective design evaluations, and a lack of strategies for implementing testing results into design iterations [6]. These challenges restrict the potential
for design iterations and typically allow for mockup testing only after the design phase is nearly complete. To address
these issues, SICSA developed an XR testing framework to incorporate into and enhance the design solution definition
phase. This framework, shown in Figure 2, was created during research and design studies sponsored by Boeing in the
2020-2021 and 2021-2022 academic years, and aims to augment the existing System Design Process rather than replace
it, as seen in Figure 3.
Figure 2 XR Framework designed at SICSA [6]
Figure 3 Design solution definition phase adaptation [7]
The next stage of the project, currently under development and also sponsored by Boeing, focuses on completing
the XR lab assembly and demonstrating the previously developed methodology. This framework seeks to provide
more efficient, cost-effective, and objective design evaluations, potentially revolutionizing the way space hardware and
habitats are tested and validated [8].
The integration of XR technologies into the design and evaluation of lunar habitats represents a significant
advancement in space architecture. These technologies provide a more comprehensive and immersive way to visualize
and interact with habitat designs, enabling rapid prototyping and iterative testing. This approach not only enhances the
design process but also provides deeper insights into the practicalities and challenges of living and working in a lunar
environment. By leveraging XR technologies, designers and engineers can create more effective and human-centered
space habitats, ultimately improving the safety and efficiency of future space missions.
IV. Assets
The VR evaluation conducted in this study focuses on a phase in lunar exploration during which four astronauts live
in a base for 14 days. This habitat was designed by the LASERR Team, and in this phase, the habitat consists of four
modules: the Vertical Surface Habitat, the Node, and the Type 1 and Type 2 Horizontal Modules. A configuration of
this lunar architecture can be seen below in Figure 4.
Figure 4 Vertical Habitat (Middle Top), Node (Middle Bottom), and Horizontal Type 1 (Left) and Type 2 (Right)
A. Vertical Habitat
The Vertical Surface Habitat serves as the central structure of the outpost. Initially compressed for transport, the
top two floors of this habitat module inflate to maximize interior space. This habitat contains all essential life-support
systems and can function independently. In the phase evaluated in this study, the Pressurized Rover is no longer docked
beneath this habitat. Instead, the Vertical Habitat now connects directly to the Node, which facilitates the base’s
expansion in two directions. The habitat, equipped with adjustable landing legs, is deployed directly onto the lunar surface. The legs, designed to elevate the habitat to a height of 4 meters, provide connection flexibility and stability while accommodating structural loads. The Vertical Habitat measures 5 meters in width and stands 11.8 meters tall when fully deployed. This module offers access to the lunar surface via an airlock integrated into the rigid part of the module.
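As a quick sanity check on the stated envelope, the habitable volume can be roughly approximated as a cylinder. The split between leg height and pressurized height below is an assumption for illustration, not a figure from the design documents:

```python
import math

def cylinder_volume(diameter_m: float, height_m: float) -> float:
    """Volume of a cylindrical section: V = pi * r^2 * h."""
    return math.pi * (diameter_m / 2) ** 2 * height_m

# Stated envelope: 5 m wide, 11.8 m tall when fully deployed.
# Assumption: the 4 m leg elevation sits below the pressure shell,
# so only the remaining height contributes pressurized volume.
total_height_m = 11.8
leg_height_m = 4.0
volume = cylinder_volume(5.0, total_height_m - leg_height_m)
print(f"Rough pressurized volume: {volume:.1f} m^3")
```

Treating the module as a simple cylinder overestimates the usable volume (it ignores floors, equipment, and the inflatable taper), but it gives an order-of-magnitude figure, roughly 150 m^3, for comparing configurations.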
B. Nodes
The Node module acts as a critical connecting hub within the outpost, attaching beneath the Vertical Surface Habitat
and providing two expansion directions. This module enables flexible branching to the Horizontal Type 1 and Type
2 modules at angles of 90, 120, or 180 degrees, offering versatile layout options. Measuring 3 meters in width, the
Node also serves as a primary radiation shelter for the outpost. Each Node can accommodate up to four crewmembers,
enhancing the safety and resilience of the habitat.
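The branching geometry lends itself to a simple automated check. The sketch below is not part of the paper's toolchain: it treats the allowed branch angles as planar headings from the Node and verifies that two attached Horizontal modules diverge enough to clear each other, using a deliberately coarse clearance model:

```python
import math

ALLOWED_ANGLES = (90, 120, 180)  # branch angles offered by the Node, degrees
MODULE_DIAMETER = 3.0            # undeployed horizontal-module diameter, metres
MODULE_LENGTH = 10.0             # deployed Type 1 length, metres

def branches_clear(angle_a_deg: float, angle_b_deg: float) -> bool:
    """True if two modules branching from the Node at the given planar
    headings diverge enough that their far ends are separated by at
    least one module diameter (a deliberately coarse clearance model)."""
    assert angle_a_deg in ALLOWED_ANGLES and angle_b_deg in ALLOWED_ANGLES
    separation = math.radians(abs(angle_a_deg - angle_b_deg))
    # Lateral distance between the two centerlines at the modules' far ends
    lateral_gap = 2 * MODULE_LENGTH * math.sin(separation / 2)
    return lateral_gap >= MODULE_DIAMETER

print(branches_clear(90, 180))   # branches 90 degrees apart: clear
print(branches_clear(120, 120))  # same heading: collision
```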
C. Horizontal Modules
The outpost features two types of horizontal habitats: Type 1 and Type 2. Both modules have an undeployed diameter
of 3 meters to fit various launch fairings. The Type 1 module is designed to be slimmer and more modular, with an
undeployed length of 5 meters that extends to 10 meters when deployed. The Type 2 module is broader, measuring 7
meters in both length and width. This phase allows the pressurized rover to dock with either of the horizontal modules.
The interior of these modules is characterized by a movable rack system designed for a better utilization of the available
volume, inspired by FLEXRack [9]. This system contains embedded rails that allow the racks to slide along the length of the module, freeing up work areas as needed. This offers a more adaptable use of space and access from multiple angles, benefiting tasks such as plant care and stowage.
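The sliding-rack concept can be expressed as a small constraint check. The sketch below is illustrative only: the 1 m rack footprint and the hatch-referenced coordinate are assumptions, while the 10 m rail span comes from the deployed Type 1 length:

```python
from dataclasses import dataclass

RAIL_SPAN = 10.0   # deployed Type 1 module length, metres
RACK_DEPTH = 1.0   # assumed rack footprint along the rail, metres

@dataclass
class Rack:
    name: str
    position: float  # front edge along the rail, metres from the hatch (assumed datum)

def slide(rack: Rack, new_position: float, racks: list) -> bool:
    """Slide a rack to new_position if it stays on the rail and does not
    overlap any other rack; return whether the move succeeded."""
    if not 0.0 <= new_position <= RAIL_SPAN - RACK_DEPTH:
        return False
    for other in racks:
        if other is not rack and abs(other.position - new_position) < RACK_DEPTH:
            return False
    rack.position = new_position
    return True

racks = [Rack("stowage", 2.0), Rack("plant care", 5.0)]
print(slide(racks[0], 4.5, racks))  # blocked: would overlap the rack at 5.0 m
print(slide(racks[0], 3.5, racks))  # allowed: clear span opens a work area
```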
V. Evaluation Process
Figure 5 Evaluation and Design Process
The evaluation process for lunar habitat design, as illustrated in Figure 5 above, follows a structured and iterative
approach. Starting with the Preliminary Design phase, VR assets are developed through preliminary 3D modeling
and integrated into a virtual lunar environment. This setup undergoes the VR Evaluation, focusing on assessing
spatial configurations and functional aspects using basic virtual interactions. The next stage in this process is the VR+
Evaluation phase, where VR assets are tested using the XR Framework in the enhanced XOSS lunar environment, incorporating biosensors and virtual interactions for a more comprehensive assessment. The MR Evaluation phase introduces a physical frame to the XOSS lunar environment, blending virtual and physical interactions to refine the
design further. Finally, in the MR+ Evaluation phase, the XR Framework validation combines both virtual and physical
interactions, utilizing advanced features like partial gravity simulations to achieve Design Validation. This progressive
approach ensures thorough evaluation and continuous improvement of the habitat design, from initial concepts to final
validation.
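The four-stage progression above can be summarized as a small data structure, which is convenient when scripting test sessions. Phase names follow the paper, while the environment labels and instrument tuples are a simplified summary, not an exhaustive list:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Phase:
    name: str
    environment: str
    physical_mockup: bool
    instruments: tuple

# Simplified summary of the evaluation pipeline described above.
PIPELINE = (
    Phase("VR",  "preliminary lunar environment", False, ("design review",)),
    Phase("VR+", "XOSS", False, ("NASA-TLX", "MSUS", "biosensors")),
    Phase("MR",  "XOSS + physical frame", True, ("NASA-TLX", "MSUS", "biosensors")),
    Phase("MR+", "XOSS + high-fidelity mockups", True,
          ("NASA-TLX", "MSUS", "biosensors", "partial-gravity crane")),
)

for phase in PIPELINE:
    print(f"{phase.name:3s} | mockup: {phase.physical_mockup} | {', '.join(phase.instruments)}")
```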
A. VR Evaluation
The first evaluation served as a preliminary validation of the early design for the modules. The modules were
modeled using 3D software such as Rhinoceros and Blender, and the 3D model was then imported into Unreal Engine
4.7. The evaluation was conducted in a lunar environment designed and modeled at SICSA, where the modeled habitat
was integrated for the VR Evaluation.
For this initial test, two evaluators used VR headsets to experience the immersive perspective offered by the VR
simulation. The primary objective was to assess whether the designed spaces were functional from a human-centered
viewpoint. The VR simulation allowed exploration and interaction with the virtual habitat, providing a realistic sense of
scale and spatial arrangement, shown below in Figure 6.
Figure 6 VR Evaluation for the preliminary design
Several critical areas requiring improvement were identified during this process, such as the usability of the internal racks and the external ladder. An iterative approach was employed to modify the design based on the feedback obtained
from the VR simulation. The updated designs were then re-evaluated using the same VR setup in Unreal Engine to
ensure that the changes effectively addressed the identified issues.
In this phase, the simulation featured limited collision settings, with the primary focus on evaluating the volume and
spatial configuration within the modules. This iterative testing cycle enabled continual refinement of the habitat design,
improving its functionality and ensuring it met the needs of the astronauts who would inhabit it.
B. VR+ Evaluation
Similar to the first phase of the evaluation process, VR headsets are utilized for the evaluation of the second phase,
providing an immersive perspective that is essential for the testing process. The XOSS preliminary version is integrated
as the new lunar environment, specifically modeling the lunar south pole. This lunar environment was integrated
after the AAVS Moonshot workshop conducted at SICSA in collaboration with the Architectural Association School
of Architecture. The simulation is further deepened by adding more interactions within the virtual world in Unreal Engine 5.4, which are designed to engage the testers and enhance the realism of the experiments. In this phase, preliminary tests using the XR Framework are also conducted [6].
Three testers participated in this phase, providing diverse feedback on the habitat design. Two critical parts of
the lunar habitat underwent repeated testing, leading to iterative corrections and improvements in the design. This
thorough evaluation process ensures continuous refinement of the design to better meet the needs and constraints of a
lunar mission.
The two specific experiments that were performed are:
1. Habitat Ingress and Egress via Airlock
This experiment begins from the perspective of an astronaut inside an EVA suit attached to one of the two suit ports
in the airlock. The tester detaches from the suit port, moves within the airlock, and interacts with the hatch to open it. The tester then exits onto the stairs, climbs down to the lunar surface, and climbs back up. The experiment concludes with the tester re-entering the airlock, closing the hatch, and reattaching to the suit port. This sequence
provides a human-centric evaluation of the airlock’s spatial and functional design, ensuring that the design supports the
necessary movements and interactions of an astronaut in an EVA suit. The VR Evaluation for this test was conducted wearing xEMU suit mock-ups designed at SICSA for a more immersive experience [10]. This experiment is shown below in Figure 7 and Figure 8.
Figure 7 VR+ Evaluation for the Airlock with the suit mock-up
Figure 8 VR+ Evaluation for the Airlock in Unreal Engine 5.4
2. Movable Racks in the Horizontal Module Type 1
This experiment starts at the entrance of the Horizontal Module Type 1. The tester can see two racks highlighted
in red to indicate interactive elements. The tester approaches the movable racks and tests their mobility along the
axis of the rails they are attached to. This evaluation assesses the practicality and ease of moving the racks during
intra-vehicular activities (IVA), ensuring that the design supports efficient use of space and accessibility of stored items.
The usability of this rack typology is inspired by the research conducted for FLEXHab [9]. This experiment can be seen below in Figure 9 and Figure 10.
Figure 9 VR+ Evaluation for the Movable Racks
Figure 10 VR+ Evaluation for the Movable Racks in Unreal Engine 5.4
After completing both experiments, the testers used the XR Framework to complete the NASA Task Load Index (TLX) [11] and the Modified Simulator Sickness Questionnaire (MSUS) [12]. Additionally, testers wore biosensors to gather data on physiological responses, helping to avoid possible bias in the human performance surveys and providing further insights into potential areas for design improvement [7].
C. MR Evaluation
After achieving an acceptable level of design, the next step in the evaluation process is the MR evaluation phase. This
phase uses the XR Framework, including biosensors, NASA Task Load Index (TLX), and Modified Simulator Sickness
Questionnaire (MSUS). Here, a single design component is focused on and brought to higher levels of fidelity for the
experiment. For example, this study would construct a spatially accurate frame for the airlock and build low-fidelity movable racks out of lightweight materials such as wood and aluminum. This provides physical collision feedback for the tester while immersed in the virtual world using VR headsets. Further integrations for different tasks can also be added.
In this phase, Mixed Reality is adopted as the primary tool for evaluation, combining real-world physical interactions
with the virtual environment. The XR Framework is implemented to ensure comprehensive and unbiased evaluation,
utilizing bio data gathered during the tests. The final version of the integrated XOSS environment allows for extensive
evaluations across multiple fields, incorporating more assets and interactions.
D. MR+ Evaluation
The MR+ Evaluation phase, as a future task, aims to enhance the immersion during experiments. Mixed Reality
continues to be the main tool, with additional features to improve realism. A crane could be adapted to this phase for
partial gravity simulations, adding a new dimension to the testing environment [8]. Additionally, higher-fidelity mockups
are expected to be introduced in this phase.
Further integration of collisions and interactions is to be included in the Unreal XOSS environment, providing a
richer virtual experience. The XR Framework is set to be fully integrated into the evaluation process, ensuring thorough
testing and validation of all design aspects. This comprehensive approach aims to facilitate a nuanced understanding of
the habitat’s functionality and allow for precise adjustments, leading to a more robust and human-centered design.
VI. Conclusion
The integration of Extended Reality (XR) technologies into the design and evaluation processes for lunar habitats
has proven to be a transformative approach. By leveraging VR, AR, and MR, significant advancements have been made
in assessing and refining habitat designs. The iterative process facilitated by these technologies allowed for continuous
improvements based on real-time feedback, enhancing the overall functionality and human-centered aspects of the
habitats.
The VR evaluations provided critical insights into spatial configurations and usability, identifying areas for
improvement that were addressed through iterative testing. The incorporation of biosensors and detailed human
performance metrics ensured that the evaluations were comprehensive and objective, minimizing potential biases.
In the MR phase, the blending of physical and virtual elements will add a new dimension to the testing environment,
enabling more realistic and interactive evaluations. The construction of physical frames and the integration of advanced
interaction mechanisms, such as partial gravity simulations, will further enhance the realism and effectiveness of the
tests.
Looking ahead, future work will focus on expanding the use of MR evaluations. As the design becomes more
defined and refined through this iterative evaluation process, the final phase will involve the use of AR. This phase will
include building a physical mock-up of the habitat, integrated with AR to minimize virtual assets and maximize physical
assets. This approach will provide a comprehensive and tangible evaluation, ensuring that the habitat design is not only
functional but also practical for real-world implementation.
The iterative use of XR technologies not only streamlined the design process but also ensured that the habitats were
robust and well-suited to the demanding conditions of the lunar surface. The methodologies developed and employed in
this project demonstrate the potential of XR to revolutionize space habitat design and testing, offering new capabilities
for in-depth analysis and validation.
As XR technologies continue to evolve, their application in space architecture will likely expand, providing even
greater opportunities for innovation and improvement. The successful implementation of these technologies in this
project highlights their potential to enhance the safety, efficiency, and overall success of future space missions. This
study lays the groundwork for further advancements in the use of XR in space exploration, paving the way for more
effective and human-centered designs in the years to come.
VII. Acknowledgements
This research continues earlier work supported by Boeing and by Buendea, which provided SICSA with the XOSS Lunar Environment for Unreal Engine. The authors would also like to thank Prof. Larry Toups and Vittorio Netti from SICSA for their valuable help.
References
[1] Rauschnabel, P. A., Felix, R., Hinsch, C., Shahab, H., and Alt, F., “What is XR? Towards a Framework for Augmented and Virtual Reality,” Computers in Human Behavior, Vol. 133, 2022, p. 107289. https://doi.org/10.1016/j.chb.2022.107289
[2] Stone, R., Panfilov, P., and Shukshunov, V., “Evolution of Aerospace Simulation: From Immersive Virtual Reality to Serious Games,” 2011. https://doi.org/10.1109/RAST.2011.5966921
[3] Nilsson, T., Rometsch, F., Bensch, L., Dufresne, F., Demedeiros, P., Guerra, E., Casini, A., Vock, A., Gaeremynck, F., and Cowley, A., “Using Virtual Reality to Shape Humanity’s Return to the Moon: Key Takeaways from a Design Study,” 2023, pp. 1–16. https://doi.org/10.1145/3544548.3580718
[4] Milgram, P., and Colquhoun, H., “A Taxonomy of Real and Virtual World Display Integration,” 2001. https://doi.org/10.1007/978-3-642-87512-0_1
[5] Anthes, C., García-Hernández, R. J., Wiedemann, M., and Kranzlmüller, D., “State of the Art of Virtual Reality Technology,” 2016 IEEE Aerospace Conference, 2016, pp. 1–19. https://doi.org/10.1109/AERO.2016.7500674
[6] Netti, V., Bannova, O., and Rajkumar, A., “XR Testing Framework for Human-System Interaction Design Validation,” 2023. URL https://www.researchgate.net/publication/372485842_XR_Testing_Framework_for_Human-System_Interaction_Design_Validation
[7] Netti, V., Guzman, L., and Rajkumar, A., “A Framework for Use of Immersive Technologies for Human-System Integration Testing of Space Hardware,” 2021. URL https://www.researchgate.net/publication/351637225_A_Framework_for_use_of_immersive_technologies_for_human-system_integration_testing_of_space_hardware
[8] Netti, V., Bannova, O., Kaur, J., and Spolzino, R., “Mixed Reality (XR) as a Validation Method for Digital Modeling of Space Habitats,” 2022. https://doi.org/10.1061/9780784484470.074
[9] Thallner, M., Häuplik-Meusburger, S., and Cowley, A., “FLEXHab Working Module – Architectural Requirements and Prototyping for a Lunar Base Analogue,” 2018. URL https://spacearchitect.org/pubs/IAC-18-E5.1.10.pdf
[10] Netti, V., and Mangili, P., “Design and Development of a Space Suit Mock-up for VR-Based EVA Research and Simulation,” 2023. https://doi.org/10.1016/j.jsse.2024.05.001
[11] Hart, S. G., and Staveland, L. E., “Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research,” Advances in Psychology, Human Mental Workload, Vol. 52, edited by P. A. Hancock and N. Meshkati, North-Holland, 1988, pp. 139–183. https://doi.org/10.1016/S0166-4115(08)62386-9
[12] Brooke, J., “SUS: A Quick and Dirty Usability Scale,” Usability Eval. Ind., Vol. 189, 1995. URL https://www.researchgate.net/publication/228593520_SUS_A_quick_and_dirty_usability_scale