Conference Paper

Spacedesign: A mixed reality workspace for aesthetic industrial design

DIMeG, Politecnico di Bari, Italy
DOI: 10.1109/ISMAR.2002.1115077 Conference: Mixed and Augmented Reality, 2002. ISMAR 2002. Proceedings. International Symposium on
Source: DBLP

ABSTRACT Spacedesign is an innovative mixed reality (MR) application for the aesthetic design of free-form curves and surfaces. It offers a unique, comprehensive approach that uses task-specific configurations to support the design workflow from concept to mock-up evaluation and review. The first-phase conceptual design benefits from a workbench-like 3D display for free-hand sketching, surfacing and engineering visualization. Semitransparent stereo glasses augment the pre-production physical prototype with additional shapes, textures and annotations. Both workspaces share a common interface and allow collaboration and cooperation between different experts, who can configure the system for the specific task. A faster design workflow and CAD data consistency are thus naturally achieved. Tests and collaborations with designers, mainly from the automotive industry, are providing systematic feedback for this ongoing research. To the best of the authors' knowledge, no similar approach integrates the creation and editing of 3D curves and surfaces in virtual and augmented reality (VR/AR); herein lies the major contribution of the new application.

Available from: Michele Fiorentino, Sep 26, 2015
    • "A small number of works combining AR and sketching can be identified in the literature. Different AR researchers have considered possible views of the real world and the dimensionality of the physical space for sketching, such as Fiorentino et al [14], Cheok et al. [15], Yee et al. [16] or Jung Von Matt/next [17]. Seichter et al. [18] proposed the usage of sketching for the design of extruded architectural models, which can be overlaid on real architectural mock-ups. "
    Computers in Entertainment 02/2015; 12(3):1-18. DOI:10.1145/2702109.2633419
    • "One of them is to position VR at the core of development cycles by providing a common tool to all the professions associated with new products design. Immersive modeling environments, such as the one developed by Fiorentino et al. [7], allow designers to create shapes directly within the 3D space. Another example is VR-CAD environments such as the VRAD demonstrator presented by Bourdot et al. [4] which provides an immersive and multimodal user interface allowing the creation of curves, surfaces and solids. "
    ABSTRACT: This document is a draft; the final paper is about to be published in the ACM Digital Library.
    • "There have been various studies in various domains done on virtual reality and MR-based skill/task learning and training support and a number of systems have been developed, e.g., in the industry domain: constructing machine-maintenance training system (Ishii et al., 1998), metal inert gas welding training system (Chambers et al., 2012), object assembly training system (Jia et al., 2009), overhead crane training system (Dong et al., 2010), firefighting tactical training system (Yuan et al., 2012), esthetic industrial design (Fiorentino et al., 2002), job training system for casting design (Watanuki and Kojima, 2006); in the science and education domain: electrical experimental training system (Kara et al., 2010), application of geography experimental simulation (Huixian and Guangfa, 2011), collaborative learning (Jackson and Fagan, 2000); in the medicine domain: ultrasound guided needle biopsy training system (de Almeida Souza et al., 2008), baby feeding training system (Petrasova et al., 2010), endoscopic surgery simulation training system (Song et al., 2009); in the tourist domain: tourist guide training system (Minli et al., 2010); in the military domain: missile maintenance training system (Cheng et al., 2010); in the sports domain: Kung-Fu fight game (Hamalainen et al., 2005), martial arts (Chua et al., 2003; Kwon and Gross, 2005; Patel et al., 2006), physical education and athletic training (Zhang and Liu, 2012), golf swing learning system (Honjou et al., 2005); in the dance domain: dance training system (Nakamura et al., 2005; Chan et al., 2010), collaborative dancing (Yang et al., 2006); in the cooking and eating domain: augmented reality kitchen (Bonanni et al., 2005), augmented reality flavors (Narumi et al., 2011), augmented perception of satiety (Narumi et al., 2012), etc. 
Many of these systems have employed a virtual teacher to perform the physical task in front of the learner (Yang and Kim, 2002; Nakamura et al., 2003; Honjou et al., 2005; SangHack and Ruzena, 2006; Chua et al., 2003). "
    ABSTRACT: In this research, we investigated the virtual teacher's positions and orientations that lead to the optimal learning outcome in a mixed-reality environment. First, this study showed that the virtual teacher's position and orientation affect learning efficiency, as some teacher settings are more comfortable and easier to watch than others. A sequence of physical-task learning experiments was conducted using mixed-reality technology. The results suggested that a close side view of the virtual teacher is optimal for learning physical tasks that involve significant one-hand movements. However, when both hands are used, or the teacher rotates, a rotation-angle adjustment becomes necessary. We therefore proposed a software method that automatically adjusts the virtual teacher's horizontal rotation angle so that the learner can easily observe the important body motions. The proposed method proved effective for motions that gradually reposition the most important moving part. Finally, to enhance the method in the future, we conducted an experiment on the effect of the vertical view angle. The results indicate that the more rotation a motion involves, the larger the vertical view angle needed to see the whole motion clearly.
    Journal of Systems and Software 07/2013; 86(7):1738–1750. DOI:10.1016/j.jss.2012.08.060