Use of Virtual Reality as a Tool for Evaluating a Lunar Habitat
Corrado Testi, David Nagy, Joshua Dow
SICSA, Houston, TX, 77004
and
Dr. Olga Bannova§
SICSA, Houston, TX, 77004
This paper investigates the use of Extended Reality (XR) technologies, including Virtual
Reality (VR), Augmented Reality (AR), and Mixed Reality (MR), in the iterative design
and evaluation of lunar habitats. Conducted over a four-month period at the University
of Houston’s Sasakawa International Center for Space Architecture (SICSA) and NASA’s
Marshall Space Flight Center (MSFC), the study aimed to create a comprehensive design
process that incorporates surface operations scenarios (ConOps), evaluation methodologies,
and human-centered design analysis. The research focused on a Surface Vertical Habitat,
a Node Surface Module, and two Horizontal Modules designed by the LASERR team. By
integrating VR into the design process, the study allowed for continuous monitoring and
real-time feedback, providing detailed insights into habitat configurations, interior layouts,
and operational scenarios from an immersive human perspective. The findings demonstrate
that XR technologies can significantly enhance design validation, offering faster and more
cost-effective methods compared to traditional approaches. This paper presents the results of
the study, discusses the limitations of XR applications, and outlines future steps for advancing
XR technologies in space habitat development.
I. Nomenclature
XR = Extended Reality
AR = Augmented Reality
MR = Mixed Reality
VR = Virtual Reality
SICSA = Sasakawa International Center for Space Architecture
MSFC = Marshall Space Flight Center
TLX = Task Load Index
SUS = System Usability Scale
MSUS = Modified System Usability Scale
EVA = Extra-Vehicular Activity
IVA = Intra-Vehicular Activity
LASERR = Lunar Approach: Standardized, Expandable, Reusable, Relocatable
II. Introduction
This paper explores the integration of Extended Reality (XR) technologies, encompassing Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) [1], into the design and evaluation processes for lunar habitats. Conducted over a four-month period at the University of Houston’s Sasakawa International Center for Space Architecture (SICSA) and NASA’s Habitation System Development Office at Marshall Space Flight Center (MSFC) in 2023, the project followed an iterative development approach. XR technologies, particularly VR, were utilized as essential tools to continuously monitor and assess design elements from a human-centered, immersive perspective.

Graduate Student, University of Houston Sasakawa International Center for Space Architecture, ctesti@cougarnet.uh.edu
Graduate Student, University of Houston Sasakawa International Center for Space Architecture, dznagy@cougarnet.uh.edu
Graduate Student, University of Houston Sasakawa International Center for Space Architecture, jedow@cougarnet.uh.edu
§SICSA Director, University of Houston Sasakawa International Center for Space Architecture, obannova@central.uh.edu
XR technologies offer transformative capabilities across various industries by providing immersive and interactive
experiences. In the space industry, they are increasingly adopted for mission planning, astronaut training, and habitat
design [2]. VR enables rapid prototyping and real-time feedback, facilitating detailed evaluations of habitat configurations,
interior layouts, and operational scenarios. This iterative approach allows designers and testers to identify and resolve
potential human factors and usability issues early in the design phase, which is crucial for the practicalities of living and
working in a lunar environment.
The project focused on a Surface Vertical Habitat, a Node Surface Module, and two Horizontal Modules designed
by the LASERR team (Lunar Approach: Standardized, Expandable, Reusable, Relocatable). By immersing designers
and testers in a virtual lunar environment, VR provided insights into the practicalities and challenges of constructing
and maintaining lunar habitats. Commercially available VR hardware, such as Meta Quest and SteamVR-compatible headsets, made these
advanced evaluations more accessible and cost-effective, demonstrating the potential for broader adoption of XR
technologies in space mission planning and habitat design.
This research aimed to define a comprehensive design process that includes possible surface operations scenarios
(ConOps), an evaluation methodology, and human-centered design analysis. The XR framework integrated VR into the
design process, offering a faster and more streamlined validation compared to traditional methods based on physical
mockups. The results showed that XR technologies could significantly enhance design validation, improving overall
design quality and efficiency.
The use of XR technologies in the design and evaluation process represents a significant advancement in space
architecture, providing a more comprehensive understanding of how proposed designs would function under the
demanding lunar surface conditions. The integration of XR offers new capabilities for in-depth design analysis and
aligns with current evaluation standards, potentially revolutionizing how space habitats are designed and tested [3]. This
study demonstrates the transformative impact of XR technologies on space habitat development and their implications for future work. The paper presents the methodologies employed and the findings from VR-based evaluations, and outlines the next steps for advancing these methodologies in space exploration.
A. Augmented Reality (AR)
Augmented Reality (AR) overlays digital information onto the real world, enhancing the user’s perception of their
surroundings [4]. This technology allows users to interact with digital elements while still being aware of and engaged
with the physical world. AR is used in a variety of applications, such as overlaying navigational aids on a real-world
environment, enhancing educational experiences with interactive content, and assisting in complex tasks by providing
real-time data and visual instructions. In the context of space exploration, AR can be used to overlay technical data on
spacecraft components during maintenance operations or to guide astronauts during extravehicular activities (EVAs) [1].
B. Virtual Reality (VR)
Virtual Reality (VR) creates a fully immersive digital environment that isolates the user from the real world [4]. Users
wear a VR headset that tracks their head movements and adjusts the display accordingly, creating the illusion of being
present in a different environment. VR is widely used in training simulations, gaming, architectural visualization, and
education. In the space industry, VR is particularly valuable for training astronauts by simulating various mission
scenarios, such as docking procedures, surface operations on the Moon or Mars, and emergency response drills. By
providing a risk-free environment for practice, VR helps prepare astronauts for the complexities and challenges of space
missions.
C. Mixed Reality (MR)
Mixed Reality (MR) combines elements of both AR and VR, allowing real and virtual objects to interact in real
time [4]. MR enables users to manipulate and interact with both physical and digital elements, creating a more integrated
and interactive experience. MR technologies often require more advanced hardware, such as headsets with built-in
cameras and sensors that map the physical environment and overlay digital content onto it. In space habitat design and
evaluation, MR can be used to visualize and test habitat configurations by blending real-world, low-fidelity mockups with digital models, allowing designers to interact with and modify their designs dynamically. A graphic illustrating an overview of AR, VR, and MR can be seen below in Figure 1.
Figure 1. Overview of XR (AR, VR, MR). Credit: envision-is.com
III. XR Framework
Extended Reality (XR) technologies, which include augmented reality (AR), virtual reality (VR), and mixed reality
(MR), have experienced significant advancements over the past decade, driven largely by commercial developments
from companies like Meta and Apple [5]. These technologies are changing how we engage with digital information and
environments and are providing unique opportunities for advancement in various fields, including space exploration [2].
Originally utilized by space agencies for scientific and training applications, the rapid development of hardware and
reduction in costs have made advanced VR headsets and XR applications accessible to more industries, academia, and
general consumers. The wide accessibility of leading XR development platforms, Unreal Engine and Unity3D, has further
propelled the technology’s growth, leading to its application in fields such as tourism, education, retailing, gaming,
healthcare, and manufacturing.
At the Sasakawa International Center for Space Architecture (SICSA) at the University of Houston, ongoing research
focuses on human space exploration and the design of habitats and infrastructure for space and other planets. Supported
by The Boeing Company, SICSA has initiated comprehensive research into the feasibility of using XR as a tool for
design validation. This research aims to develop a framework to standardize the use of XR in the design process,
addressing gaps in current practices and establishing measurable indicators for evaluating XR’s impact. NASA’s System
Design Process involves four main phases: identifying stakeholder expectations, defining requirements, decomposing
these requirements logically, and developing design solutions. However, the fourth phase, design solution definition,
faces several challenges, including limited resources for building physical mockups, subjective design evaluations,
and a lack of strategies for implementing testing results into design iterations [6]. These challenges restrict the potential
for design iterations and typically allow for mockup testing only after the design phase is nearly complete. To address
these issues, SICSA developed an XR testing framework to incorporate into and enhance the design solution definition
phase. This framework, shown in Figure 2, was created during research and design studies sponsored by Boeing in the
2020-2021 and 2021-2022 academic years, and aims to augment the existing System Design Process rather than replace
it, as seen in Figure 3.
Figure 2. XR Framework designed at SICSA [6]
Figure 3. Design solution definition phase adaptation [7]
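As a concrete illustration of how such a framework augments the design solution definition phase, the following minimal sketch (hypothetical names and thresholds, not SICSA’s actual implementation) expresses the iterate-until-validated loop that Figures 2 and 3 imply: each candidate design is evaluated in XR, the findings feed the next revision, and the cycle repeats until the validation criteria are met.

```cpp
// Hypothetical sketch of the iterate-until-validated loop that the XR framework
// adds to the design solution definition phase; names and thresholds are assumptions.
#include <cstdio>
#include <string>
#include <vector>

struct EvaluationResult {
    double usabilityScore;            // e.g., a SUS-style 0-100 score
    std::vector<std::string> issues;  // findings to feed back into the design
};

// Stand-in for an XR evaluation session; a real session would involve testers,
// questionnaires, and biosensor data rather than a canned result.
EvaluationResult RunXrEvaluation(int iteration) {
    if (iteration < 2)
        return { 62.0, { "ladder reach", "rack clearance" } };
    return { 84.0, {} };
}

int main() {
    const double validationThreshold = 80.0;
    for (int iteration = 0; iteration < 10; ++iteration) {
        EvaluationResult r = RunXrEvaluation(iteration);
        std::printf("Iteration %d: score %.1f, %zu open issues\n",
                    iteration, r.usabilityScore, r.issues.size());
        if (r.usabilityScore >= validationThreshold && r.issues.empty()) {
            std::printf("Design validated after %d iterations\n", iteration + 1);
            break;  // design solution accepted; proceed to higher-fidelity testing
        }
        // Otherwise, the findings would be folded into the next design revision.
    }
    return 0;
}
```

In practice, each call to the evaluation step corresponds to a full XR test session with human participants, questionnaires, and biosensor data rather than a canned result.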
The next stage of the project, currently under development and also sponsored by Boeing, focuses on completing
the XR lab assembly and demonstrating the previously developed methodology. This framework seeks to provide
more efficient, cost-effective, and objective design evaluations, potentially revolutionizing the way space hardware and
habitats are tested and validated [8].
The integration of XR technologies into the design and evaluation of lunar habitats represents a significant
advancement in space architecture. These technologies provide a more comprehensive and immersive way to visualize
and interact with habitat designs, enabling rapid prototyping and iterative testing. This approach not only enhances the
design process but also provides deeper insights into the practicalities and challenges of living and working in a lunar
environment. By leveraging XR technologies, designers and engineers can create more effective and human-centered
space habitats, ultimately improving the safety and efficiency of future space missions.
IV. Assets
The VR evaluation conducted in this study focuses on a phase in lunar exploration during which four astronauts live
in a base for 14 days. This habitat was designed by the LASERR Team, and in this phase, the habitat consists of four
modules: the Vertical Surface Habitat, the Node, and the Type 1 and Type 2 Horizontal Modules. A configuration of
this lunar architecture can be seen below in Figure 4.
Figure 4. Vertical Habitat (middle top), Node (middle bottom), and Horizontal Type I (left) and Type II (right)
A. Vertical Habitat
The Vertical Surface Habitat serves as the central structure of the outpost. Initially compressed for transport, the
top two floors of this habitat module inflate to maximize interior space. This habitat contains all essential life-support
systems and can function independently. In the phase evaluated in this study, the Pressurized Rover is no longer docked
beneath this habitat. Instead, the Vertical Habitat now connects directly to the Node, which facilitates the base’s
expansion in two directions. The habitat, equipped with adjustable landing legs, is deployed directly onto the lunar
surface. The legs, designed to elevate the habitat to a height of 4 meters, allow for connection flexibility and stability while accommodating structural loads. The Vertical Habitat measures 5 meters in width and stands 11.8 meters tall when fully deployed. This module offers access to the lunar surface through an airlock integrated into the rigid section of the module.
B. Nodes
The Node module acts as a critical connecting hub within the outpost, attaching beneath the Vertical Surface Habitat
and providing two expansion directions. This module enables flexible branching to the Horizontal Type 1 and Type
2 modules at angles of 90, 120, or 180 degrees, offering versatile layout options. Measuring 3 meters in width, the
Node also serves as a primary radiation shelter for the outpost. Each Node can accommodate up to four crewmembers,
enhancing the safety and resilience of the habitat.
C. Horizontal Modules
The outpost features two types of horizontal habitats: Type 1 and Type 2. Both modules have an undeployed diameter
of 3 meters to fit various launch fairings. The Type 1 module is designed to be slimmer and more modular, with an
undeployed length of 5 meters that extends to 10 meters when deployed. The Type 2 module is broader, measuring 7 meters in both length and width. This phase allows the pressurized rover to dock with either of the horizontal modules.
The interior of these modules is characterized by a movable rack system designed for better utilization of the available volume, inspired by FLEXRack [9]. This system contains embedded rails that allow the racks to slide along the length of the module and to free up work areas as needed. This arrangement offers more adaptable use of space and access from multiple angles, benefiting tasks such as plant care and stowage.
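As an illustration of the rail-constrained behavior described above, the engine-agnostic C++ sketch below (an assumed scripting approach, not the LASERR team’s Unreal Engine implementation) projects a tester’s drag motion onto the rail axis and clamps the travel at the end stops.

```cpp
// Illustrative, engine-agnostic sketch: a movable rack whose position is
// constrained to a rail axis with end stops, mirroring the FLEXRack-style
// behavior described above. All names and dimensions here are assumptions.
#include <algorithm>
#include <cstdio>

struct Vec3 {
    double x, y, z;
};

struct RailRack {
    Vec3 railOrigin;      // one end of the rail, in module coordinates (m)
    Vec3 railAxis;        // unit vector along the module's length
    double travelLimit;   // usable rail length (m)
    double offset = 0.0;  // current rack position along the rail (m)

    // Project a requested displacement onto the rail axis and clamp to the end stops.
    void Slide(const Vec3& requestedDelta) {
        double along = requestedDelta.x * railAxis.x +
                       requestedDelta.y * railAxis.y +
                       requestedDelta.z * railAxis.z;
        offset = std::clamp(offset + along, 0.0, travelLimit);
    }

    Vec3 WorldPosition() const {
        return { railOrigin.x + railAxis.x * offset,
                 railOrigin.y + railAxis.y * offset,
                 railOrigin.z + railAxis.z * offset };
    }
};

int main() {
    RailRack rack{ {0.0, 0.0, 0.0}, {1.0, 0.0, 0.0}, 9.0 };
    rack.Slide({ 2.5, 0.3, 0.0 });   // only the along-rail component (2.5 m) is applied
    Vec3 p = rack.WorldPosition();
    std::printf("Rack at %.2f, %.2f, %.2f\n", p.x, p.y, p.z);
    return 0;
}
```

The same projection-and-clamp logic is what keeps a VR grab interaction from pulling a rack off its rails, regardless of the direction of the controller motion.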
V. Evaluation Process
Figure 5. Evaluation and Design Process
The evaluation process for lunar habitat design, as illustrated in Figure 5 above, follows a structured and iterative
approach. Starting with the Preliminary Design phase, VR assets are developed through preliminary 3D modeling
and integrated into a virtual lunar environment. This setup undergoes the VR Evaluation, focusing on assessing
spatial configurations and functional aspects using basic virtual interactions. The next stage in this process is the VR+
Evaluation phase, where VR assets are tested using the XR Framework in the enhanced XOSS lunar environment,
incorporating biosensors and virtual interactions for a more comprehensive assessment. The MR Evaluation phase
introduces a physical frame to the XOSS lunar environment, blending virtual and physical interactions to refine the
design further. Finally, in the MR+ Evaluation phase, the XR Framework validation combines both virtual and physical
interactions, utilizing advanced features like partial gravity simulations to achieve Design Validation. This progressive
approach ensures thorough evaluation and continuous improvement of the habitat design, from initial concepts to final
validation.
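One compact way to express this progression is as a configuration table that records what each phase adds; the sketch below uses hypothetical field names and simply restates the four phases described above.

```cpp
// Minimal sketch (hypothetical names) of the four evaluation phases and the
// capabilities each one adds, useful as a configuration table when scripting
// test sessions.
#include <cstdio>

struct EvaluationPhase {
    const char* name;
    bool usesXrFramework;     // TLX / MSUS questionnaires and biosensors
    bool usesPhysicalMockup;  // low-fidelity frames for physical collision
    bool usesPartialGravity;  // crane-based offloading
};

int main() {
    const EvaluationPhase phases[] = {
        { "VR",  false, false, false },  // preliminary volume and layout checks
        { "VR+", true,  false, false },  // XOSS environment, richer interactions
        { "MR",  true,  true,  false },  // physical frame blended with the model
        { "MR+", true,  true,  true  },  // higher-fidelity mockups, partial gravity
    };
    for (const auto& p : phases)
        std::printf("%-4s framework=%d mockup=%d partial-g=%d\n",
                    p.name, p.usesXrFramework, p.usesPhysicalMockup, p.usesPartialGravity);
    return 0;
}
```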
A. VR Evaluation
The first evaluation served as a preliminary validation of the early design for the modules. The modules were
modeled using 3D software such as Rhinoceros and Blender, and the 3D model was then imported into Unreal Engine
4.7. The evaluation was conducted in a lunar environment designed and modeled at SICSA, where the modeled habitat
was integrated for the VR Evaluation.
For this initial test, two evaluators used VR headsets to experience the immersive perspective offered by the VR
simulation. The primary objective was to assess whether the designed spaces were functional from a human-centered
viewpoint. The VR simulation allowed exploration and interaction with the virtual habitat, providing a realistic sense of
scale and spatial arrangement, shown below in Figure 6.
Figure 6. VR Evaluation for the preliminary design
Several critical areas requiring improvement were identified during this process, such as the usability of the internal racks and the external ladder. An iterative approach was employed to modify the design based on the feedback obtained
from the VR simulation. The updated designs were then re-evaluated using the same VR setup in Unreal Engine to
ensure that the changes effectively addressed the identified issues.
In this phase, the simulation featured limited collision settings, with the primary focus on evaluating the volume and
spatial configuration within the modules. This iterative testing cycle enabled continual refinement of the habitat design,
improving its functionality and ensuring it met the needs of the astronauts who would inhabit it.
B. VR+ Evaluation
Similar to the first phase of the evaluation process, VR headsets are utilized for the evaluation of the second phase,
providing an immersive perspective that is essential for the testing process. The preliminary version of XOSS is integrated as the new lunar environment, specifically modeling the lunar south pole. This environment was adopted after the AAVS Moonshot workshop conducted at SICSA in collaboration with the Architectural Association School of Architecture. The simulation is further enriched by adding more interactions within the virtual world in Unreal Engine 5.4, which are designed to engage the testers and enhance the realism of the experiments. In this phase, preliminary tests using the XR Framework are also conducted [6].
Three testers participated in this phase, providing diverse feedback on the habitat design. Two critical parts of
the lunar habitat underwent repeated testing, leading to iterative corrections and improvements in the design. This
thorough evaluation process ensures continuous refinement of the design to better meet the needs and constraints of a
lunar mission.
The two specific experiments that were performed are:
1. Habitat Ingress and Egress via Airlock
This experiment begins from the perspective of an astronaut inside an EVA suit attached to one of the two suit ports
in the airlock. The tester detaches from the suit port, moves within the airlock, and interacts with the hatch to open it. The tester then exits onto the stairs, climbs down to the lunar surface, and returns to the ladder. The experiment concludes with the tester re-entering the airlock, closing the hatch, and reattaching to the suit port. This sequence
provides a human-centric evaluation of the airlock’s spatial and functional design, ensuring that the design supports the
necessary movements and interactions of an astronaut in an EVA suit. The VR evaluation for this test was conducted wearing xEMU suit mock-ups designed at SICSA for a more immersive experience [10]. This experiment is shown below in Figure 7 and Figure 8.
Figure 7. VR+ Evaluation for the Airlock with the suit mock-up
Figure 8. VR+ Evaluation for the Airlock in Unreal Engine 5.4
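The ingress and egress sequence above lends itself to lightweight instrumentation: recording a timestamp as each step is completed lets completion times be compared across testers and design iterations. The sketch below is a hypothetical example of such logging, not the instrumentation used in the study, and the step names are assumptions.

```cpp
// Hypothetical instrumentation sketch: logging the ingress/egress steps as a
// fixed sequence with elapsed times so they can be compared across testers.
#include <chrono>
#include <cstdio>
#include <string>
#include <vector>

struct StepRecord {
    std::string step;
    double elapsedSeconds;
};

class EvaTaskLogger {
public:
    void Complete(const std::string& step) {
        using namespace std::chrono;
        double t = duration<double>(steady_clock::now() - start_).count();
        records_.push_back({ step, t });
    }
    void Report() const {
        for (const auto& r : records_)
            std::printf("%-28s %.1f s\n", r.step.c_str(), r.elapsedSeconds);
    }
private:
    std::chrono::steady_clock::time_point start_ = std::chrono::steady_clock::now();
    std::vector<StepRecord> records_;
};

int main() {
    EvaTaskLogger log;
    // In a VR session these calls would be triggered by interaction events.
    log.Complete("Detach from suit port");
    log.Complete("Open hatch");
    log.Complete("Descend ladder to surface");
    log.Complete("Return and close hatch");
    log.Complete("Re-dock to suit port");
    log.Report();
    return 0;
}
```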
2. Movable Racks in the Horizontal Module Type 1
This experiment starts at the entrance of the Horizontal Module Type 1. The tester can see two racks highlighted
in red to indicate interactive elements. The tester approaches the movable racks and tests their mobility along the
axis of the rails they are attached to. This evaluation assesses the practicality and ease of moving the racks during
intra-vehicular activities (IVA), ensuring that the design supports efficient use of space and accessibility of stored items.
The usability of this rack typology is inspired by the research conducted for FLEXhab [9]. This experiment can be seen below in Figure 9 and Figure 10.
Figure 9. VR+ Evaluation for the Movable Racks
Figure 10. VR+ Evaluation for the Movable Racks in Unreal Engine 5.4
After completing both experiments, the testers used the XR Framework to complete the NASA Task Load Index (TLX) [11] and the Modified System Usability Scale (MSUS) [12]. Additionally, testers wore biosensors to gather data on physiological responses and to mitigate possible bias in the human performance surveys, providing further insights into potential areas for design improvement [7].
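For reference, the published scoring rules behind these instruments are simple: the raw NASA-TLX score is the mean of six 0-100 subscale ratings, the weighted variant applies pairwise-comparison weights that sum to 15, and SUS maps ten 1-5 responses onto a 0-100 scale. The sketch below applies these standard formulas to placeholder responses; the modified MSUS variant used in the study may score differently.

```cpp
// Sketch of standard questionnaire scoring: raw and weighted NASA-TLX and SUS.
// The response values below are placeholders, not study data.
#include <array>
#include <cstdio>
#include <numeric>

// NASA-TLX: six subscales rated 0-100. Raw TLX is the plain mean; the weighted
// score uses pairwise-comparison weights (0-5 each, summing to 15).
double RawTlx(const std::array<double, 6>& ratings) {
    return std::accumulate(ratings.begin(), ratings.end(), 0.0) / 6.0;
}

double WeightedTlx(const std::array<double, 6>& ratings, const std::array<int, 6>& weights) {
    double sum = 0.0;
    for (int i = 0; i < 6; ++i) sum += ratings[i] * weights[i];
    return sum / 15.0;
}

// SUS: ten items rated 1-5. Odd items score (response - 1), even items (5 - response);
// the sum is multiplied by 2.5 to give a 0-100 scale.
double SusScore(const std::array<int, 10>& responses) {
    int sum = 0;
    for (int i = 0; i < 10; ++i)
        sum += (i % 2 == 0) ? responses[i] - 1 : 5 - responses[i];
    return sum * 2.5;
}

int main() {
    std::array<double, 6> tlx = { 55, 30, 40, 25, 50, 20 };   // mental, physical, temporal,
    std::array<int, 6> w      = { 4, 1, 2, 3, 4, 1 };         // performance, effort, frustration
    std::array<int, 10> sus   = { 4, 2, 4, 1, 5, 2, 4, 2, 4, 2 };
    std::printf("Raw TLX: %.1f  Weighted TLX: %.1f  SUS: %.1f\n",
                RawTlx(tlx), WeightedTlx(tlx, w), SusScore(sus));
    return 0;
}
```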
C. MR Evaluation
After achieving an acceptable level of design, the next step in the evaluation process is the MR evaluation phase. This
phase uses the XR Framework, including biosensors, the NASA Task Load Index (TLX), and the Modified System Usability Scale (MSUS). Here, a single design component is selected and brought to a higher level of fidelity for the experiment. In this study, for example, a spatially accurate frame would be constructed for the airlock, and low-fidelity movable racks would be built from lightweight materials such as wood and aluminum. These physical elements provide collision feedback for the tester while immersed in the virtual world through a VR headset. Additional integrations for different tasks can also be added.
In this phase, Mixed Reality is adopted as the primary tool for evaluation, combining real-world physical interactions
with the virtual environment. The XR Framework is implemented to ensure comprehensive and unbiased evaluation,
utilizing biometric data gathered during the tests. The final version of the integrated XOSS environment allows for extensive
evaluations across multiple fields, incorporating more assets and interactions.
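Blending a physical frame with its digital counterpart requires registering the mockup’s tracked pose with the virtual model. The engine-agnostic sketch below (a simplified planar example with assumed names, not the lab’s actual calibration routine) composes the tracker pose reported by the VR system with a fixed, pre-measured offset to place the virtual module over the physical frame.

```cpp
// Engine-agnostic sketch of an MR registration step: given the pose of a tracker
// mounted on the physical frame, place the virtual module so the digital model
// overlays the mockup. The 2D-yaw simplification and the values are assumptions.
#include <cmath>
#include <cstdio>

constexpr double kPi = 3.14159265358979323846;

struct Pose2D {
    double x, y;     // meters, on the lab floor plane
    double yawRad;   // heading
};

// Compose poses: apply 'local' in the frame of 'parent'.
Pose2D Compose(const Pose2D& parent, const Pose2D& local) {
    double c = std::cos(parent.yawRad), s = std::sin(parent.yawRad);
    return { parent.x + c * local.x - s * local.y,
             parent.y + s * local.x + c * local.y,
             parent.yawRad + local.yawRad };
}

int main() {
    Pose2D trackerInLab   = { 2.0, 1.0, kPi / 2 };  // pose reported by the VR tracking system
    Pose2D modelInTracker = { 0.5, 0.0, 0.0 };      // fixed calibration offset, measured once
    Pose2D modelInLab = Compose(trackerInLab, modelInTracker);
    std::printf("Place virtual module at (%.2f, %.2f) m, yaw %.1f deg\n",
                modelInLab.x, modelInLab.y, modelInLab.yawRad * 180.0 / kPi);
    return 0;
}
```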
D. MR+ Evaluation
The MR+ Evaluation phase, as a future task, aims to enhance the immersion during experiments. Mixed Reality
continues to be the main tool, with additional features to improve realism. A crane could be adapted to this phase for
partial gravity simulations, adding a new dimension to the testing environment [8]. Additionally, higher-fidelity mockups
are expected to be introduced in this phase.
Further integration of collisions and interactions is to be included in the Unreal XOSS environment, providing a
richer virtual experience. The XR Framework is set to be fully integrated into the evaluation process, ensuring thorough
testing and validation of all design aspects. This comprehensive approach aims to facilitate a nuanced understanding of
the habitat’s functionality and allow for precise adjustments, leading to a more robust and human-centered design.
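As a rough indication of what crane-based partial gravity involves, the crane must continuously offload the difference between Earth and lunar weight, roughly 83.5 percent of the suited tester’s Earth weight. The short sketch below works through that arithmetic with an assumed tester-plus-suit mass.

```cpp
// Back-of-the-envelope sketch of the crane-based partial-gravity idea: to make a
// suited tester feel lunar weight, the crane must continuously offload the
// difference between Earth and lunar gravity. The mass value is a placeholder.
#include <cstdio>

int main() {
    const double gEarth = 9.81;   // m/s^2
    const double gMoon  = 1.62;   // m/s^2
    double testerMassKg = 80.0;   // tester plus suit mock-up (assumed)

    double weightOnEarth = testerMassKg * gEarth;
    double targetWeight  = testerMassKg * gMoon;          // what the tester should feel
    double offloadForce  = weightOnEarth - targetWeight;  // crane must supply this
    std::printf("Offload %.0f N (%.1f%% of Earth weight) to simulate lunar gravity\n",
                offloadForce, 100.0 * offloadForce / weightOnEarth);
    return 0;
}
```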
VI. Conclusion
The integration of Extended Reality (XR) technologies into the design and evaluation processes for lunar habitats
has proven to be a transformative approach. By leveraging VR, AR, and MR, significant advancements have been made
in assessing and refining habitat designs. The iterative process facilitated by these technologies allowed for continuous
improvements based on real-time feedback, enhancing the overall functionality and human-centered aspects of the
habitats.
The VR evaluations provided critical insights into spatial configurations and usability, identifying areas for
improvement that were addressed through iterative testing. The incorporation of biosensors and detailed human
performance metrics ensured that the evaluations were comprehensive and objective, minimizing potential biases.
In the MR phase, the blending of physical and virtual elements will add a new dimension to the testing environment,
enabling more realistic and interactive evaluations. The construction of physical frames and the integration of advanced
interaction mechanisms, such as partial gravity simulations, will further enhance the realism and effectiveness of the
tests.
Looking ahead, future work will focus on expanding the use of MR evaluations. As the design becomes more
defined and refined through this iterative evaluation process, the final phase will involve the use of AR. This phase will
include building a physical mock-up of the habitat, integrated with AR to minimize virtual assets and maximize physical
assets. This approach will provide a comprehensive and tangible evaluation, ensuring that the habitat design is not only
functional but also practical for real-world implementation.
The iterative use of XR technologies not only streamlined the design process but also ensured that the habitats were
robust and well-suited to the demanding conditions of the lunar surface. The methodologies developed and employed in
this project demonstrate the potential of XR to revolutionize space habitat design and testing, offering new capabilities
for in-depth analysis and validation.
As XR technologies continue to evolve, their application in space architecture will likely expand, providing even
greater opportunities for innovation and improvement. The successful implementation of these technologies in this
project highlights their potential to enhance the safety, efficiency, and overall success of future space missions. This
study lays the groundwork for further advancements in the use of XR in space exploration, paving the way for more
effective and human-centered designs in the years to come.
VII. Acknowledgements
This research continues work supported by The Boeing Company and by Buendea, which provided SICSA with the XOSS lunar environment for Unreal Engine. The authors would also like to thank Prof. Larry Toups and Vittorio Netti from SICSA for their valuable help.
References
[1] Rauschnabel, P. A., Felix, R., Hinsch, C., Shahab, H., and Alt, F., “What is XR? Towards a Framework for Augmented and Virtual Reality,” Computers in Human Behavior, Vol. 133, 2022, p. 107289. https://doi.org/10.1016/j.chb.2022.107289, URL https://www.sciencedirect.com/science/article/pii/S074756322200111X.

[2] Stone, R., Panfilov, P., and Shukshunov, V., “Evolution of Aerospace Simulation: From Immersive Virtual Reality to Serious Games,” 2011. https://doi.org/10.1109/RAST.2011.5966921.

[3] Nilsson, T., Rometsch, F., Bensch, L., Dufresne, F., Demedeiros, P., Guerra, E., Casini, A., Vock, A., Gaeremynck, F., and Cowley, A., “Using Virtual Reality to Shape Humanity’s Return to the Moon: Key Takeaways from a Design Study,” 2023, pp. 1–16. https://doi.org/10.1145/3544548.3580718.

[4] Milgram, P., and Colquhoun, H., “A Taxonomy of Real and Virtual World Display Integration,” 2001. https://doi.org/10.1007/978-3-642-87512-0_1.

[5] Anthes, C., García-Hernández, R. J., Wiedemann, M., and Kranzlmüller, D., “State of the Art of Virtual Reality Technology,” 2016 IEEE Aerospace Conference, 2016, pp. 1–19. https://doi.org/10.1109/AERO.2016.7500674, URL https://ieeexplore.ieee.org/abstract/document/7500674.

[6] Netti, V., Bannova, O., and Rajkumar, A., “XR Testing Framework for Human-System Interaction Design Validation,” 2023. URL https://www.researchgate.net/publication/372485842_XR_Testing_Framework_for_Human-System_Interaction_Design_Validation.

[7] Netti, V., Guzman, L., and Rajkumar, A., “A Framework for Use of Immersive Technologies for Human-System Integration Testing of Space Hardware,” 2021. URL https://www.researchgate.net/publication/351637225_A_Framework_for_use_of_immersive_technologies_for_human-system_integration_testing_of_space_hardware.

[8] Netti, V., Bannova, O., Kaur, J., and Spolzino, R., “Mixed Reality (XR) as a Validation Method for Digital Modeling of Space Habitats,” 2022. https://doi.org/10.1061/9780784484470.074.

[9] Thallner, M., Häuplik-Meusburger, S., and Cowley, A., “FLEXhab Working Module - Architectural Requirements and Prototyping for a Lunar Base Analogue,” 2018. URL https://spacearchitect.org/pubs/IAC-18-E5.1.10.pdf.

[10] Netti, V., and Mangili, P., “Design and Development of a Space Suit Mock-up for VR-Based EVA Research and Simulation,” 2023. https://doi.org/10.1016/j.jsse.2024.05.001, URL https://www.sciencedirect.com/science/article/abs/pii/S2468896724000624?via%3Dihub.
[11] Hart, S. G., and Staveland, L. E., “Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research,” Advances in Psychology, Human Mental Workload, Vol. 52, edited by P. A. Hancock and N. Meshkati, North-Holland, 1988, pp. 139–183. https://doi.org/10.1016/S0166-4115(08)62386-9, URL https://www.sciencedirect.com/science/article/pii/S0166411508623869.

[12] Brooke, J., “SUS: A Quick and Dirty Usability Scale,” Usability Evaluation in Industry, Vol. 189, 1995. URL https://www.researchgate.net/publication/228593520_SUS_A_quick_and_dirty_usability_scale.