Article

Augmented Reality in Product Development and Manufacturing


Abstract

Augmented Reality (AR) is a technology that provides an intuitive interaction experience to users by seamlessly combining the real world with computer-generated text, images, animations, etc. An augmented world, with functions designed for the intended application and for the end-user interaction experience, is rendered to the user, enabling more efficient perception and creation. AR technology has evolved from something that appeared only in science-fiction films into one applied broadly across industries. A review paper published in 2008 discussed the application of AR in industries [1]. In this chapter, research papers focused on the development of industrial applications that were published from 2006 to 2010 are reviewed. These papers are categorized according to the product development cycle shown in Fig. 30.1.


... In industrial AR, most efforts have been put into several focused areas such as assembly [5,6], maintenance [7,8], product development [9], and manufacturing layout [10] as described in Ong et al. [11]. While previous studies generated application-specific AR scenes based on non-standard data for a specific task, few studies have focused on merging AR capabilities into the model-based integrated design and inspection frameworks [12]. ...
... This is due to different units and coordinate system conventions used by software vendors. For example, Unity uses a left-handed y-up coordinate system, while another popular game engine, the Unreal Engine, uses a left-handed z-up convention. Additionally, by default, many 3D modeling programs use right-handed coordinate conventions. ...
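As a concrete illustration of the conversion problem described above, the sketch below maps a point from a right-handed z-up convention (common in CAD and 3D modeling tools) into Unity's left-handed y-up convention. The axis mapping and function name are assumptions chosen for illustration, not taken from the cited pipeline.

```python
import numpy as np

def cad_z_up_to_unity_y_up(p_cad: np.ndarray) -> np.ndarray:
    """Map a point from a right-handed z-up frame into Unity's left-handed y-up frame.

    Assumed axis mapping (one of several workable conventions):
        X_unity = X_cad
        Y_unity = Z_cad   (the up axis becomes Y)
        Z_unity = Y_cad   (swapping two axes also flips the handedness)
    """
    x, y, z = p_cad
    return np.array([x, z, y])

# A point 2 m above the CAD origin stays "above" the Unity origin.
print(cad_z_up_to_unity_y_up(np.array([0.0, 0.0, 2.0])))  # -> [0. 2. 0.]
```

Rotations and meshes need the same axis swap (plus a matching winding-order flip for triangles, since the swap is a reflection), which is one reason a shared, standards-based exchange step is attractive.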
Article
Augmented reality (AR) has already helped manufacturers realize value across a variety of domains, including assistance in maintenance, process monitoring, and product assembly. However, coordinating traditional engineering data representations into AR systems without loss of context and information remains a challenge. A major barrier is the lack of interoperability between manufacturing-specific data models and AR-capable data representations. In response, we present a pipeline for porting standards-based design and inspection data into an AR scene. As a result, product manufacturing information with three-dimensional (3D) model data and corresponding inspection results are successfully overlaid onto a physical part. We demonstrate our pipeline by interacting with annotated parts while continuously tracking their pose and orientation. We then validate the pipeline by testing against six fully toleranced design models, accompanied by idealized inspection results. Our work (1) provides insight on how to address fundamental issues related to interoperability between domain-specific models and AR systems and (2) establishes an open software pipeline from which others can implement and further develop.
... Furthermore, as new, more complex products emerge and respond to the user's context, the evaluation process conducted during the design and development phases also becomes more complex (e.g., generating large amounts of data for post-processing and analysis) [3]. Traditionally, evaluation is performed using physical mock-ups, aiming to assess whether the characteristics (e.g., physical design, usability) of a given product match the previously established requirements [21]. ...
Conference Paper
The rapid changes experienced in industrial development processes to ensure interconnectivity and customization have led to an increasing need to adopt collaborative design practices and to consider end-users' context at the earliest stages. Such contextual information generates large amounts of data that are usually oversimplified, overlooking the spatial dimension in their representation and visualization. Usually, these data are analyzed in post-processing using digital tools, which lack proper methods to analyze the information collected. This position paper proposes a vision for using data collection and Mixed Reality (MR) to support industrial product co-design. This combination can be highly flexible in providing a perspective that overlays user feedback in digital form on top of a physical artifact. As such, it may be possible to conduct co-design tasks in a more informed manner, with a higher level of awareness of the product characteristics. These and other arguments are presented and future directions are proposed.
... To overcome the shortcomings of NC code verification only in physical space or cyber space, CPS is also applied to CNC machining simulation. Some enabling technical frameworks such as AR (Ong et al., 2011; Paelke, 2014; Ragni et al., 2018; Wang et al., 2016) and Digital Twin (DT) (Redelinghuys et al., 2020; Tao & Zhang, 2017; Tong et al., 2020; Zhao et al., 2020) are used to implement CPS. As a friendly Human-Machine Interface (HMI) and cyber-physical fusion method, AR can provide a physical machining environment with cyber information. ...
Article
Full-text available
Numerical control (NC) code verification is an important issue in computer numerical control (CNC) machining simulation because incorrect NC codes can lead to workpiece scrap and collisions. NC code verification methods in both physical space and cyber space (such as a 3D computer graphics environment) have been widely investigated in recent years. However, physical verification is time-consuming and improper operations may be dangerous. On the other hand, cyber verification methods only support some types of machines and cannot reflect the actual conditions of machine tools. This study proposes a cyber-physical prototype system for NC code verification and CNC machining simulation. Based on an RGB-D camera, a depth-to-stereo model is constructed to obtain 3D information from images. Without connecting to the CNC controller, the cutting tool and workpiece coordinate system (WCS) movement information in physical space can be obtained from images captured by the RGB-D camera through a convolutional neural network (CNN). Workpiece dimensions and NC codes are imported into cyber space to render a virtual workpiece with augmented reality (AR) technology, so that the operator can directly see the virtual workpiece in the physical machining scene. The virtual workpiece is machined by the cyber-physical system according to the cutting tool movement in physical space. This research further confirms the feasibility of using computer vision (CV) methods to build a cyber-physical CNC simulation system based on an RGB-D camera. A potential application of the system is to obtain simulation results from CNC machine tools (especially those that are forbidden to connect to the controller) and transfer the machining results to the Internet of Things (IoT).
... With the advancement in digital technologies, we are now employing new digital technologies while developing products. Technologies such as Cloud [17], [18] and [19], Deep Learning [20], [21], [22] and [23], Virtual Reality [24], [25], [26], [27] and [28], Augmented Reality [29], [30], [31], [32] and [33] and Mixed reality [34], [35], [36], [37] and [38] are widely used at different stages of product development. Customer data from various sources especially those from the internet [39], [40], [41], [42], [43], [44] and [45] are extensively used in NPD. ...
... Because of its ability to combine virtual information and physical environments, AR is promising for maintenance tasks for various devices and systems. Ong et al. (2011) reported that in maintenance activities, AR has the advantage of being ubiquitous in allowing remote collaborations. Morkos et al. (2012) reported that using AR can help technicians communicate, obtain assembly information more quickly, and access audio/video information more easily. ...
Article
This article presents an augmented reality‐based instruction (ARBI) system for maintenance tasks. A traditional manual instruction method and a computer‐assisted instruction method were compared. Three maintenance instruction methods, three task difficulty levels (low, medium, and high), and the user's gender (male and female) were specified as the independent variables in the experimental design. The dependent variables included task completion time and error rate as objective measures, and system usability scale (SUS) and NASA‐task load index (NASA‐TLX) scores as subjective measures. There were 30 participants (15 males and 15 females) in the experiment. The results indicated that the instruction method and task difficulty significantly affected the task completion time, error rate, SUS, and NASA‐TLX. Among the instruction methods, the ARBI method exhibited the highest SUS score, lowest NASA‐TLX score, shortest task completion time, and minimum error rate. In conclusion, the proposed ARBI method was beneficial for assisting iPhone maintenance tasks.
... The ease with which this digital layer can be edited has attracted interest from the fields of packaging and product design, as multiple iterations of the surface design can be explored using the same physical prototype. This is especially important near the end of the design process where modifications are less centred around physical shape (Ong et al., 2011). The technology is of particular interest for the packaging and advertising industry as well as any design whose colour, material, and finish is being evaluated (Caruso et al., 2016). ...
Article
Full-text available
Spatial Augmented Reality (SAR) differs from other forms of AR by allowing the projection of digital images onto a model. This makes the AR more tangible and the interaction more realistic. The scale of the model plays a role in the realism but may be constrained by technical factors. This study attempts to understand the influence scale has on a design session by analysing the concept generation process, the ease of designing, and the design behaviour. Understanding how these factors are influenced by the model scale improves the understanding of how SAR can influence design.
... From the perspective of Industry 4.0, Augmented Reality (AR) represents a key technology that can overcome these limitations and inefficiencies. In fact, in the last decades, AR technology has been successfully and efficiently applied in many sectors of the industrial field to help workers to accomplish several tasks [1][2][3][4][5] thanks to its capability to superimpose virtual data directly on the real environment. In particular, the display technology adopted for the development of AR-based solutions typically relies on head-mounted displays (HMDs) [6,7], projectors [8,9], and mobile devices [10,11] such as smartphones and tablets. ...
Chapter
Augmented Reality (AR) is an innovation accelerator for Industry 4.0 that supports the digitalization and improves the efficiency of the industrial sector by providing powerful tools able to enhance the workers’ visual perception by combining the real world view with computer-generated data.
... To provide a credible combination of real and virtual objects it is crucial to perform a camera calibration both of the internal camera geometric and optical characteristics (intrinsic parameters) and of the position and orientation of the camera frame (extrinsic parameters) with respect to a reference system [8]. ...
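The intrinsic/extrinsic split mentioned in this snippet can be illustrated with OpenCV's standard chessboard calibration. The sketch below is a generic minimal example, assuming a folder of chessboard images with a known pattern size and square length; it is not the calibration procedure of the cited work.

```python
import glob
import cv2
import numpy as np

PATTERN = (9, 6)       # inner chessboard corners (assumed pattern)
SQUARE_SIZE = 0.025    # square edge length in metres (assumed)

# 3D corner coordinates in the chessboard's own reference frame.
board_pts = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
board_pts[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):           # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(board_pts)
        img_points.append(corners)

# K and dist describe the intrinsic parameters; each (rvec, tvec) pair is an
# extrinsic pose of the board with respect to the camera for one image.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("Intrinsic matrix K:\n", K)
```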
Article
Full-text available
Manual spot welding compares unfavourably with automated spot welding, not because of a longer execution time, but because of the inferior quality of the welded points, chiefly low repeatability. This is not a human fault. The human welder is compelled to operate without knowledge of significant process features that are available to the robot: the exact position of the welding spot, the electrical parameters to be adopted for every specific point, and the quality of the welded spot and, based on it, the possible need to repeat a defective weld.
... To provide a credible combination of real and virtual objects it is crucial to perform a camera calibration both of the internal camera geometric and optical characteristics (intrinsic parameters) and of the position and orientation of the camera frame (extrinsic parameters) with respect to a reference system [16]. ...
Article
(Stadnicka D., Antonelli D.: Implementation of augmented reality in welding processes. Technologia i Automatyzacja Montażu. Vol. 4, 2014, pp. 56-60.) Welding processes are special processes that demand particular attention with regard to quality assurance. Standard quality management systems based on ISO standards indicate the areas that require this attention in order to obtain results that meet the requirements. Thus, it is necessary to ensure adequate employee preparation, adequate control of the equipment used in the welding process, adequate work methods, and systems that support the welder in error-free execution of the welding process. This paper describes the application of augmented reality to support a spot welding process. The solution presented is simple and makes use of low-cost devices. At the same time, welder training time can be shortened through the use of the proposed solution. Tests of work performed with the proposed solution gave good results, allowing an operator to avoid mistakes while welding and simultaneously shortening the process realization time.
... Augmented Reality (AR) applications have been shown to be effective in the design review phase, when new products have been designed and require evaluation [11]. In fact, AR offers the possibility of evaluating 3D virtual models of these products, which can be easily modified, in their real context of use, without the need to produce real prototypes. ...
Article
This paper proposes a new interactive Augmented Reality (AR) system, which has been conceived to allow a user to freely interact with virtual objects integrated in a real environment without the need to wear cumbersome equipment. The AR system has been developed by integrating the Fog Screen display technology, stereoscopic visualization, and the Microsoft Kinect. The user can select and manage the position of virtual objects visualized on the Fog Screen display directly with his/her hands. A specific software application has been developed to perform evaluation testing sessions with users. The aim of the testing sessions was to verify the influence of issues related to tracking, visualization, and interaction modalities on the overall usability of the AR system. The collected experimental results demonstrate the ease of use and effectiveness of the new interactive AR system and highlight the features preferred by the users.
... AR technology has been applied widely in various areas, including medical, entertainment, military, education, etc. [3,4]. Last decade saw various AR applications in the manufacturing industries [5]. By enhancing the users' understanding and interactions with the manufacturing environment, shorter lead time and lower manufacturing costs can be achieved. ...
Article
Full-text available
CNC machining simulation has been developed to validate NC codes and optimize the machining process. Conventional simulations are usually performed in virtual environments, which has the limitation that the operator needs to transfer the knowledge from the software environment to the real machining environment. This paper proposes the ARCNC system, in which the operator can observe in-situ simulation of the real cutter machining a virtual workpiece. The system design and the research problems addressed are discussed in this paper, including the tracking and registration methods and the physical simulation approach based on an enhanced dexel model.
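A dexel model, as referenced in this abstract, represents the workpiece as a grid of material columns that are trimmed wherever the cutter sweeps. The sketch below is a minimal illustration assuming a z-axis dexel grid and a flat-end cutter; it is only meant to show the data structure, not the enhanced formulation used in the ARCNC system.

```python
import numpy as np

class DexelGrid:
    """Workpiece stored as one material interval (bottom, top) per (x, y) cell."""

    def __init__(self, nx, ny, cell, height):
        self.cell = cell                               # cell size in mm
        self.top = np.full((nx, ny), float(height))    # upper end of each column
        self.bottom = np.zeros((nx, ny))               # lower end of each column

    def cut_flat_end(self, cx, cy, radius, z_tip):
        """Remove material above z_tip inside a circular cutter footprint."""
        nx, ny = self.top.shape
        xs = (np.arange(nx) + 0.5) * self.cell
        ys = (np.arange(ny) + 0.5) * self.cell
        X, Y = np.meshgrid(xs, ys, indexing="ij")
        inside = (X - cx) ** 2 + (Y - cy) ** 2 <= radius ** 2
        self.top[inside] = np.minimum(self.top[inside], z_tip)

# One plunge of a 5 mm radius cutter down to z = 8 mm on a 20 mm tall blank.
grid = DexelGrid(nx=100, ny=100, cell=1.0, height=20.0)
grid.cut_flat_end(cx=50.0, cy=50.0, radius=5.0, z_tip=8.0)
print("remaining column heights:", grid.top.min(), "to", grid.top.max())
```

Rendering such a grid as a height field registered to the live camera image is one simple way an in-situ simulation can overlay the virtual workpiece on the real machine table.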
... No haptic feedback and trajectories are used as a support. Two surveys concerning the use of AR in the industrial field (Ong et al. 2008, 2011) describe a series of applications developed to support assembly and maintenance activities, demonstrating the effectiveness of the AR approach. Tang et al. (2003) have demonstrated that overlaying 3D assembly instructions on the actual workpieces can reduce the error rate of an assembly task by 82%. ...
Article
The paper describes an application based on Virtual and Augmented Reality technologies specifically developed to support maintenance operations of industrial products. Two scenarios have been proposed. In the first an operator learns how to perform a maintenance operation in a multimodal Virtual Reality environment that mixes a traditional instruction manual with the simulation, based on visual and haptic technologies, of the maintenance training task. In the second scenario, a skilled user operating in a multimodal Virtual Reality environment can remotely train another operator who sees the instructions about how the operations should be correctly performed, which are superimposed onto the real product. The paper presents the development of the application as well as its testing with users. Furthermore limits and potentialities of the use of Virtual and Augmented Reality technologies for training operators for product maintenance are discussed.
Conference Paper
Full-text available
The rapid proliferation of Augmented Reality (AR) technologies over the past 30 years has introduced new capabilities and opportunities to further support design activities. It is therefore not surprising that there is an increasing body of knowledge on the application of AR within design. However, little work has been performed on consolidating this knowledge to enable the identification of general trends and gaps in the successful application of AR across design activities. This is critical both for design research, to ensure that the field provides comprehensive coverage of the potential of AR in design, and for industry, which is looking to AR to enhance the productivity of its design processes. To meet this need, the paper presents a review of the design literature relating to AR and maps this research in relation to the type of AR technology and the stage in the design process. From this review, the paper identifies the AR technologies that show greatest promise in supporting design activity and the areas in design that have had little to no research with regard to AR. Through this investigation it was possible to determine that, while there are currently some AR technologies aimed at supporting design, not all forms of AR technology are currently being investigated, with SAR and HHDs having more commercially available platforms than HMDs. More importantly, this review found that not every stage of the design process is currently supported by AR technologies. It appears that the initial and final stages of the design process are the areas that lack the most support. Indeed, the Task, Design Specification, and Product Documentation stages (the first, second, and final stages in the design process, respectively) are the areas where little to no research has been undertaken. These stages in the design process are thus identified as potential areas for further development of AR technologies to support the growth and acceptance of AR in design.
Conference Paper
Full-text available
Augmented Reality (AR) has been a focus of research in the field of production for many years with the goal of improving, for example, monitoring and maintenance processes. Although the benefits have been well researched, AR technology is not yet widespread in industry. This article discusses reasons why existing approaches have not become established outside the research context. Following this, solutions are examined for three identified problem areas. First, this article presents a theoretical method to create use cases for AR applications in the product design process based on the Double Diamond design process model. The second main part discusses which device properties are relevant for a production environment; a catalogue of criteria is presented, weighting properties according to their importance. Third, data integration concepts, problems, and solutions for AR design and production use cases are discussed.
Article
Full-text available
This study presents a modular implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or be combined to create a complete system able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control (CNC) milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to the CNC machine, utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis-aligned bounding box collision detection, as sketched below. The case study revealed that, for the given situation, a semi-circular arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced maximum deviations of 3.83 mm and 5.8 mm, respectively.
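The axis-aligned bounding box collision detection mentioned for the CNC and pick-and-place modules reduces to an interval-overlap check on each axis. The helper below is a generic sketch of that test, not code from the cited system.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class AABB:
    min_pt: Tuple[float, float, float]   # lower corner (x, y, z)
    max_pt: Tuple[float, float, float]   # upper corner (x, y, z)

def aabb_overlap(a: AABB, b: AABB) -> bool:
    """Two axis-aligned boxes collide iff their intervals overlap on every axis."""
    return all(a.min_pt[i] <= b.max_pt[i] and b.min_pt[i] <= a.max_pt[i]
               for i in range(3))

# Example: the bounding box of a tool grazing the bounding box of a fixture.
tool = AABB((0.0, 0.0, 0.0), (10.0, 10.0, 10.0))
fixture = AABB((9.0, 9.0, 9.0), (20.0, 20.0, 20.0))
print(aabb_overlap(tool, fixture))  # True: the boxes share a corner region
```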
Article
Full-text available
The application of augmented reality (AR) technology for assembly guidance is a novel approach in the traditional manufacturing domain. In this paper, we propose an AR approach for assembly guidance using an intuitive and easy-to-use virtual interactive tool. This tool, termed the virtual interaction panel (VirIP), can be used to interactively control AR systems. The VirIP is composed of virtual buttons, carrying meaningful assembly information, that can be activated by an interaction pen during the assembly process. The interaction pen can be any general pen-like object with a certain colour distribution. It is tracked using a restricted coulomb energy (RCE) network in real time and used to trigger the relevant buttons in the VirIPs for assembly guidance. Meanwhile, a visual assembly tree structure (VATS) is used for information management and assembly instruction retrieval in this AR environment. The VATS is a hierarchical tree structure that can be easily maintained via a visual interface. It can be directly integrated into the AR system, or it can act as an independent central control station on a remote computer to control the flow of assembly information. This paper describes a typical scenario for assembly guidance using VirIP and VATS. The main characteristic of the proposed AR system is the intuitive way in which an assembly operator can easily step through a pre-defined assembly plan/sequence without the need for any sensor schemes or markers attached to the assembly components. Several experiments were conducted to validate the performance of the proposed AR-based method using a monitor and a head-mounted display. The results show that the AR-based method can provide an efficient way for assembly guidance.
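A hierarchical tree of assembly steps, such as the VATS described above, can be represented very compactly. The sketch below is one plausible minimal representation; the field names and traversal are assumptions for illustration and are not taken from the paper.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AssemblyStep:
    """One node of an assembly-instruction tree (hypothetical field names)."""
    name: str
    instruction: str = ""
    overlay_model: str = ""                 # 3D model shown in AR for this step
    children: List["AssemblyStep"] = field(default_factory=list)

    def walk(self, depth: int = 0):
        """Depth-first traversal following the pre-defined assembly sequence."""
        yield depth, self
        for child in self.children:
            yield from child.walk(depth + 1)

root = AssemblyStep("assembly", children=[
    AssemblyStep("mount base", "Place the base on the fixture", "base.obj"),
    AssemblyStep("insert shaft", "Slide the shaft into the base", "shaft.obj"),
])

for depth, step in root.walk():
    print("  " * depth + step.name)
```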
Article
Full-text available
A computer code was developed to simulate radiographs of complex casting products in a CAVE-like environment. The simulation is based on deterministic algorithms and ray tracing techniques. The aim of this study is to examine CAD/CAE/CAM models at the design stage, in order to optimize the design and inspect predicted defective regions quickly, accurately, and at low computational cost. The present work discusses the algorithms for the radiography simulation of the CAD/CAM model and proposes algorithmic solutions adapted from the ray-box intersection algorithm and the octree data structure specifically for radiographic simulation of the CAE model. The stereoscopic visualization of the full-size product in the immersive casting simulation environment, together with the virtual X-ray images of the castings, provides an effective tool for design and evaluation of foundry processes by engineers and metallurgists.
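The ray-box intersection mentioned in this abstract is typically implemented with the slab method, and the traversed chord length inside each box is what drives the simulated X-ray attenuation. The function below is a generic sketch of that test, under the assumption that the ray direction has no zero components; it does not reproduce the paper's octree traversal.

```python
import numpy as np

def ray_aabb(origin, direction, box_min, box_max):
    """Slab test: return (hit, t_near, t_far) for a ray against an axis-aligned box."""
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)   # assumed non-zero on every axis
    inv = 1.0 / direction
    t1 = (np.asarray(box_min, dtype=float) - origin) * inv
    t2 = (np.asarray(box_max, dtype=float) - origin) * inv
    t_near = np.max(np.minimum(t1, t2))
    t_far = np.min(np.maximum(t1, t2))
    return t_far >= max(t_near, 0.0), t_near, t_far

# An X-ray travelling mostly along +z through a unit voxel.
hit, t_near, t_far = ray_aabb([0.5, 0.5, -1.0], [0.05, 0.05, 1.0], [0, 0, 0], [1, 1, 1])
if hit:
    chord = (t_far - t_near) * np.linalg.norm([0.05, 0.05, 1.0])
    print("path length through the box:", chord)   # proportional to attenuation
```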
Article
Full-text available
In this work we present how Augmented Reality (AR) can be used to create an intimate integration of process data with the workspace of an industrial CNC (computer numerical control) machine. AR allows us to combine interactive computer graphics with real objects in a physical environment - in this case, the workspace of an industrial lathe. ASTOR is an autostereoscopic optical see-through spatial AR system, which provides real-time 3D visual feedback without the need for user-worn equipment, such as head-mounted displays or sensors for tracking. The use of a transparent holographic optical element, overlaid onto the safety glass, allows the system to simultaneously provide bright imagery and clear visibility of the tool and workpiece. The system makes it possible to enhance visibility of occluded tools as well as to visualize real-time data from the process in the 3D space. The graphics are geometrically registered with the workspace and provide an intuitive representation of the process, amplifying the user's understanding and simplifying machine operation.
Article
Full-text available
Ecce Homology, a physically interactive new-media work, visualizes genetic data as calligraphic forms. A novel computer-vision-based user interface allows multiple participants, through their movement in the installation space, to select genes ...
Article
Full-text available
Industrial robots play an important role in industry due to their flexibility. Many applications (almost all of those requiring human intervention) can be performed advantageously by robots. Nevertheless, set-up operations, necessary when changing production models, are still tricky and time-consuming. It is common to have detailed data on working pieces in computer-aided design (CAD) files, resulting from product design and project work. This information is used unsatisfactorily, or not at all, for robot programming. In this paper, we propose a solution capable of extracting robot motion information from the CAD data.
Article
Full-text available
Products fabricated by metal casting are affected by casting design and pouring. In order to design products efficiently, engineers must consider the shape of the pouring gate and the fluidity of metal in the runner. Engineers must envision the shape of the product and the fluidity of the metal in three dimensions. However, this task is difficult, especially for beginners. Therefore, we developed an augmented reality-based training system for metal casting that allows visualization of fluidity. The proposed training system uses simple physical operations to compute and display fluidity in real time. Visualization of three-dimensional images as well as training on this system is simple. Keywords: Augmented reality, Skill transfer, Metal casting, Interactive interface
Article
Full-text available
We present a novel approach, called tangible digital master (TaDiMa), based on flexible access to a Product Lifecycle Management (PLM) database using a tangible/graphical user interface and augmented reality. We developed and tested a technical drawing template with automatically embedded markers for augmentation. We embedded the Mozilla Firefox engine into our annotation system for the integration of Web 2.0 applications in the AR environment. We also extended the concept of web feeds to PLM for notification of technical contents. We implemented dynamic label management with view-driven filtering and placement. We also present tangible management of layer visualization and user access via marked tokens. We validated the TaDiMa approach in a selection of possible scenarios for different PLM applications. The main benefit of the TaDiMa system is its easy and low-cost integration into the product lifecycle process.
Conference Paper
Full-text available
Augmented Reality (AR) is a technology which has become very popular in recent years. In this context, the idea of using AR for training applications has also become very important. AR offers large potential for training only if the training is well focused on the skills that have to be trained and if the training protocol is well designed. On the other hand, the generation of the training content to be transferred via AR is a comprehensive problem that is addressed in this paper. Thus, this paper tries to describe the whole chain of implementations and general aspects involved in the creation of AR training applications, including examples of the multimodal devices used. This chain starts with the capturing of expert actions to be held in the "digital representation of skill". The digital representation of skill is transferred to the training protocol that specifies the storyboard of the AR training session. The paper includes two different implementations of AR training systems and describes the general idea of informational abstraction from low-level data up to interaction and from design to application.
Conference Paper
Full-text available
In this paper, we describe the demonstration system in which a wooden 3D puzzle is assembled using an augmented reality system. The 3D puzzle describes a simplified assembly task in a factory. The 3D puzzle is a means to study how to implement an augmented assembly system to a real setting in a factory and explore the advantages and disadvantages of augmented reality techniques. With this demonstration system we can also test, demonstrate and evaluate different input modalities for augmented assembly setup. Furthermore, a demonstration system helps us discuss with industry and get invaluable feedback.
Conference Paper
Full-text available
3D object design has many applications, including flexible 3D sketch input in CAD, computer games, webpage content design, image-based object modeling, and 3D object retrieval. Most current 3D object design tools work on a 2D drawing plane such as a computer screen or tablet, which is often inflexible with one dimension lost. On the other hand, virtual reality based methods have the drawbacks that awkward devices must be worn by the user and that virtual environment systems are expensive. In this paper, we propose a novel vision-based approach to 3D object design. Our system consists of a PC, a camera, and a mirror. We use the camera and mirror to track a wand so that the user can design 3D objects by sketching in 3D free space directly without having to wear any cumbersome devices. A number of new techniques are developed for working in this system, including input of object wireframes, gestures for editing and drawing objects, and optimization-based planar and curved surface generation. Our system provides designers a new user interface for designing 3D objects conveniently.
Conference Paper
Full-text available
For simulating hands-on tasks, the ease of enabling two-handed interaction with virtual objects gives Mixed Reality (MR) an expected advantage over Virtual Reality (VR). A user study examined whether two-handed interaction is critical for simulating hands-on tasks in MR. The study explored the effect of one- and two-handed interaction on task performance in a MR assembly task. When presented with a MR system, most users chose to interact with two hands. This choice was not affected by a user's past VR experience or the quantity and complexity of the real objects with which users interacted. Although two-handed interaction did not yield a significant performance improvement, two hands allowed subjects to perform the virtual assembly task similarly to the real-world task. Subjects using only one hand performed the task fundamentally differently, showing that affording two-handed interaction is critical for training systems.
Conference Paper
Full-text available
This paper presents a new user interface methodology for Spatial Augmented Reality systems. The methodology is based on a set of physical tools that are overloaded with logical functions. Visual feedback presents the logical mode of the tool to the user by projecting graphics onto the physical tools. This approach makes the tools malleable in their functionality, with this change conveyed to the user by changing the projected information. Our prototype application implements a two-handed technique allowing an industrial designer to digitally airbrush onto an augmented physical model, masking the paint using a virtualized stencil.
Conference Paper
Full-text available
Collaborative Augmented Reality (AR) setups are becoming increasingly popular. We have developed a collaborative tabletop environment that is designed for brainstorming and discussion meetings. Using a digital pen, participants can annotate not only virtual paper, but also real printouts. By integrating both forms of physical and digital paper, we combine virtual and real 2d drawings, and digital data which are overlaid into a single information space. In this paper, we describe why we have integrated these devices together in a unique way and how they can be used efficiently during a design process.
Conference Paper
Full-text available
This paper presents an Augmented Reality system for aiding a pump assembly process at Grundfos, one of the leading pump producers. Stable pose estimation of the pump is required in order to augment the graphics correctly. This is achieved by matching image edges with edges synthesized from CAD models. To ensure that the system operates at interactive rates, the CAD models are pruned off-line and a two-step matching strategy is introduced. On-line, the visible edges of the current synthesized model are extracted and compared with the image edges using chamfer matching together with a truncated L2 norm. A dynamic visualization of the augmented graphics provides the user with guidance. Usability tests show that the accuracy of the system is sufficient for assembling the pump.
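Chamfer matching with a truncated L2 norm, as used here for pose scoring, compares a set of synthesized edge points against the distance transform of the image edge map. The snippet below is a hedged sketch of that scoring step using OpenCV; it does not reproduce the cited two-step matching strategy, and the projection of CAD edges into the image is assumed to be available.

```python
import cv2
import numpy as np

def chamfer_score(image_edges: np.ndarray, model_edge_pts: np.ndarray,
                  truncation: float = 20.0) -> float:
    """Mean truncated distance from projected model edge points to image edges.

    image_edges    : binary edge map of the camera frame (e.g. from cv2.Canny), uint8
    model_edge_pts : Nx2 array of (x, y) pixel positions of synthesized CAD edges
    """
    # The distance transform measures the distance to the nearest zero pixel,
    # so invert the edge map to make edge pixels zero.
    dist = cv2.distanceTransform(255 - image_edges, cv2.DIST_L2, 3)
    xs = model_edge_pts[:, 0].astype(int).clip(0, dist.shape[1] - 1)
    ys = model_edge_pts[:, 1].astype(int).clip(0, dist.shape[0] - 1)
    d = np.minimum(dist[ys, xs], truncation)   # truncated L2 distances
    return float(d.mean())                      # lower score = better pose hypothesis

# Usage sketch (project() and candidate_poses are hypothetical):
# edges = cv2.Canny(frame_gray, 80, 160)
# best_pose = min(candidate_poses, key=lambda T: chamfer_score(edges, project(cad_edges, T)))
```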
Conference Paper
Full-text available
The use of Augmented Reality (AR) in training or assisting operators during an assembly task can be considered an innovative and efficient method in terms of time saving, error reduction, and accuracy improvement. Nevertheless, the implementation of an AR-based application is quite difficult and requires several factors to be taken into account. This paper provides a general procedure to follow for a correct implementation, from an assessment of the assembly task through to the practical implementation. To assess the procedure, it has been applied to the training of unskilled operators during the assembly of a planetary gearbox, with the help of a hand-held device.
Article
Full-text available
In developing new personal electronic products, development times have shortened to a few months due to intense competition in the market. It has become very hard to meet time-to-market targets when products are evaluated by making physical prototypes. To overcome this problem, we propose an immersive modeling system (IMMS) that enables us to interact with a digital product model in an augmented virtual environment using a multimodal interface. The IMMS allows the user to evaluate the product model realistically using visual, auditory, and tactile/force senses. The architecture and main modules of the system are described in detail. The integration problems encountered during the development of the test bed are discussed. Application examples for personal electronic products are also included.
Article
Full-text available
Assembly planning and evaluation is an important component of the product design process in which details about how parts of a new product will be put together are formalized. A well designed assembly process should take into account various factors such as optimum assembly time and sequence, tooling and fixture requirements, ergonomics, operator safety, and accessibility, among others. Existing computer-based tools to support virtual assembly either concentrate solely on representation of the geometry of parts and fixtures and evaluation of clearances and tolerances or use simulated human mannequins to approximate human interaction in the assembly process. Virtual reality technology has the potential to support integration of natural human motions into the computer aided assembly planning environment (Ritchie et al. in Proc I MECH E Part B J Eng 213(5):461–474, 1999). This would allow evaluations of an assembler’s ability to manipulate and assemble parts and result in reduced time and cost for product design. This paper provides a review of the research in virtual assembly and categorizes the different approaches. Finally, critical requirements and directions for future research are presented.
Article
Full-text available
Modeling tools typically have their own interaction methods for combining virtual objects. For realistic composition in 3D space, many researchers from the fields of virtual and augmented reality have been trying to develop intuitive interactive techniques using novel interfaces. However, many modeling applications require a long learning time for novice users because of unmanageable interfaces. In this paper, we propose two-handed tangible augmented reality interaction techniques that provide an easy-to-learn and natural combination method using simple augmented blocks. We have designed a novel interface called the cubical user interface, which has two tangible cubes that are tracked by marker tracking. Using the interface, we suggest two types of interactions based on familiar metaphors from real object assembly. The first, the screw-driving method, recognizes the user’s rotation gestures and allows them to screw virtual objects together. The second, the block-assembly method, adds objects based on their direction and position relative to predefined structures. We evaluate the proposed methods in detail with a user experiment that compares the different methods.
Article
Full-text available
Industrial designers make sketches and physical models to start and develop ideas and concept designs. Such representations have advantages that they support fast, intuitive, rich, sensory exploration of solutions. Although existing tools and techniques provide adequate support where the shape of the product is concerned, the exploration of surface qualities such as material and printed graphics is supported to a much lesser extent. Moreover, there are no tools that have the fluency of sketching that allow combined exploration of shape, material, and their interactions. This paper evaluates Skin, an augmented reality tool designed to solve these two shortcomings. By projecting computer-generated images onto the shape model Skin allows for a tangible interaction where designers can explore surface qualities on a three-dimensional physical shape model. The tool was evaluated in three design situations in the domain of ceramics design. In each case, we found that the joint exploration of shape and surface provided creative benefits in the form of new solutions; in addition, a gain in efficiency was found in at least one case. The results show that joint exploration of shape and surface can be effectively supported with tangible augmented reality techniques and suggest that this can be put to practical use in industry today.
Article
Providing assistance in remote maintenance has been one of the major application domains of Augmented Reality (AR) technology. However, creating AR-based maintenance instructions remains an issue in many of these applications. In this paper, an AR online authoring system, Online Authoring for Augmented Reality Remote Maintenance (OAR2M), is proposed to enable an experienced operator to create intuitive AR-based instructions in an environment with no a-priori knowledge conveniently and efficiently. In addition, the Double Window User Interface (DWUI) is proposed to broaden the experienced operator’s view and improve the robustness of the authoring process. Finally, a case study is designed and carried out to test the performance of the proposed authoring system.
Article
Augmented reality (AR) is a novel human–machine interaction that overlays virtual computer-generated information on a real world environment. It has found good potential applications in many fields, such as military training, surgery, entertainment, maintenance, assembly, product design and other manufacturing operations in the last ten years. This paper provides a comprehensive survey of developed and demonstrated AR applications in manufacturing activities. The intention of this survey is to provide researchers, students, and engineers, who use or plan to use AR as a tool in manufacturing research, a useful insight on the state-of-the-art AR applications and developments.
Conference Paper
The product development process for industrial products includes a phase dedicated to design review, a crucial phase in which various experts cooperate in selecting the optimal product shape. Although computer graphics allows us to create very realistic virtual representations of products, it is not uncommon for designers to decide to build physical mock-ups of their newly conceived products, because they need to physically interact with the prototype and to evaluate the product within a plurality of real contexts. This paper describes the hardware and software development of our Augmented Reality design review system, which allows us to overcome some issues related to 3D visualization and to the interaction with virtual objects. Our system is composed of a Video See-Through Head-Mounted Display, which improves the 3D visualization by controlling the convergence of the video cameras automatically, and a wireless control system, which allows us to create metaphors for interacting with the virtual objects. During the development of the system, in order to define and tune the algorithms, we performed some testing sessions. We then performed further tests in order to verify the effectiveness of the system and to collect additional data and comments about usability and ergonomic aspects.
Article
RFID technology provides an invisible 'visibility' to the end user for tracking and monitoring any objects that have been tagged. Research on the application of RFID in assembly lines for overall production monitoring and control has been reported recently. This paper presents novel research on implementing RFID technology in the application of assembly guidance in an augmented reality environment. Aiming at providing just-in-time information rendering and intuitive information navigation, methodologies for applying RFID, infrared-enhanced computer vision, and inertial sensors are discussed in this paper. A prototype system is established, and two case studies are presented to validate the feasibility of the proposed system.
Article
The application background and practical requirements of a virtual simulation system for a tunnel boring machine are discussed in this paper. The MultiGen Creator and Vega software packages are used to build the tunnel boring machine model. The application program of the virtual-reality-based simulation system is developed in C++. The authors also study some key technologies for constructing the virtual simulation system, such as multi-cylinder synchronous simulation in the virtual environment. The results show that the work station can be simulated well, and that the virtual simulation system can be extended to virtual training, manipulation, virtual construction, and so on.
Article
In the product development process, designability and operability are evaluated using physical mock-ups, which require high costs and a long time to build. In this paper, virtual mock-ups (3D digital models) are used instead of physical mock-ups to solve these problems of cost and time. Given that designers want to touch and operate the designed product during evaluation, and that virtual mock-ups alone do not allow this, this paper proposes a system for evaluating the designability and operability of products under development using mixed reality technology. With the proposed method, a physical mock-up covered with a designed virtual mock-up can be touched and operated, allowing the design under development to be evaluated in the mixed space of virtual and real from the viewpoints of designability and operability.
Article
This paper presents an implementation of machining simulation in a real machining environment applying Augmented Reality (AR) technology. This in-situ machining simulation system allows a machinist to analyze the simulation process, adjust the machining parameters, and observe the results in real time in a real machining environment. Such a system is useful for machinists and trainees during the trial and learning stages, allowing them to experiment with different machining parameters on a real machine without having to worry about possible machine and tool breakages. Three research aspects, namely, the tracking and registration module, the physical simulation module, and the implementation and performance of the proposed methodologies, are discussed in detail in this paper. Experiments were conducted on a real 3-axis CNC machine to validate and evaluate the performance of the system, and the feedback from a survey carried out with the experiments is very positive. Copyright © 2009 John Wiley & Sons, Ltd.
Article
Context-aware, ubiquitous computing is a vision of our future computing lifestyle in which computer systems seamlessly integrate into our everyday lives, providing services and information in an anywhere, anytime fashion. Augmented reality (AR) can naturally complement ubiquitous computing by providing an intuitive and collaborative visualization and simulation interface to a three-dimensional information space embedded within physical reality. This paper presents a service framework and its applications for providing context-aware and adaptable 3D visualization and collaboration services for ubiquitous cars (U-cars) using augmented reality, which can support a rich set of ubiquitous car services and collaboration services for distributed maintenance and repair. It utilizes augmented reality to provide visual interactions by superimposing virtual models of car components or sub-assemblies onto real cars, realizing bi-augmentation between physical and virtual models. It also offers a context processing module to acquire, interpret, and disseminate context information. In particular, the context processing module considers the user's preferences in order to provide context-adaptable services to the customer. The prototype system has been implemented to support 3D animation, text-to-speech (TTS), augmented manuals, and pre- and post-augmentation services in ubiquitous car service environments.
Article
In this paper, a computer-aided methodology based on augmented reality to manipulate and interactively assemble virtual objects is presented. The developed system, based on both hardware and software implementation, allows the user to be immersed in an augmented scene (real world and virtual objects) by means of a head-mounted display and to interact with the virtual components using a glove with sensors. The authors present the implementation of the system, addressing the problems of interpreting the user's intent to grasp and move objects, and of computer-aided assembly based on a novel approach involving dynamic recognition of active features. An example implementation is reported and discussed.
Article
Handheld information devices such as digital cameras and smartphones are products that unify hardware and software. The body of such products is compact, yet it carries various functions. Usability is one of the most important factors affecting their commercial competitiveness. However, it is difficult to estimate usability precisely during the design process, because usability is strongly affected by the shape of the body, the types and arrangement of the switches, etc. In order to accurately evaluate the usability of these products, users must operate them directly; however, it is difficult to manufacture prototypes that are operable. Therefore, we have adopted an augmented reality technique to solve this problem. This paper introduces a touch-sensitive augmented reality system and describes the construction of a touch sensor made of pressure-sensitive conductive rubber, a fingertip-tracking method using an LED marker, and the augmented reality system. Finally, we report basic system evaluations.
Article
Five-axis milling offers many advantages over the conventional three-axis milling process. However, because of the potentially complex motions, it is difficult for the machine tool operator to anticipate the actual movement based on the NC program. In this paper a software system for the NC path validation and manipulation during the milling process is introduced. This system is meant to expand the information given to the machine tool operator by enriching the view with data of a concurrently running milling simulation. The simulation is synchronized with the real-world machine tool movement by detecting the position of a marker that is mounted on the head stock. With this combination of real-world view and computer-generated data, which is called Augmented Reality, the machine tool operator is able to detect critical situations—like collisions between tool holder and workpiece or excessive forces—and may adjust the NC program accordingly.
Article
In mixed reality (MR) and augmented reality (AR) environments, several previous works have dealt with user interfaces for manipulating and interacting with virtual objects, aimed at improving the immersive feeling and natural interaction. However, much progress is still needed in supporting human-behavior-like interactions that provide control efficiency and a natural feeling in MR/AR environments. This paper proposes a tangible interaction method that combines the advantages of soft interactions, such as hand gestures and MR, with hard interactions, such as vibro-tactile feedback. One of the main goals is to provide more natural interaction interfaces, similar to manipulation tasks in the real world, by utilizing hand-gesture-based tangible interactions. It also provides multimodal interfaces by adopting vibro-tactile feedback and tangible interaction for virtual object manipulation. Thus, users can obtain a more immersive and natural feeling when manipulating and interacting with virtual objects. Furthermore, it provides an alternative instruction guideline based on analysis of the previous interaction while manipulating virtual objects, which helps the user minimize manipulation errors during the interaction phase and the learning process by guiding the user in the right direction. We show the effectiveness and advantages of the proposed approach by demonstrating several implementation results. Keywords: Augmented reality, Hand gesture, Human-computer interface, Mixed reality, Pinch glove, Tangible interface, Vibro-tactile feedback
Conference Paper
Gaining and improving welding skill is important for welders. The advent of virtual reality (VR) technology provides a new kind of medium for skill training. In this paper, a haptic arc welding training method is proposed based on VR and haptic guidance. The training method is designed to emulate the presence of a human tutor by feeding forces back to the welder to show the proper force/position relation along pre-defined trajectories, for attaining hand-mind-eye coordination skills in a virtual environment. Three basic welding operation skills, namely maintaining the proper arc length, maintaining the proper electrode angle, and maintaining the proper traverse speed, are selected for training. The Phantom haptic device is used as the haptic interface. The haptic guidance is realized with proportional-plus-derivative (PD) feedback control of the error between the current and ideal trajectory. The proposed method is low-cost, effective, and environmentally friendly for the training of both novice and skilled welders.
Article
An augmented reality (AR) cueing method designed to improve teleoperator performance under conditions of display-control misalignment is investigated. The teleoperation task was designed to mimic the operation of space robot arms, which are manipulated using hand controllers (HCs) to orient and translate the body-fixed coordinates at the end effector (EE). Cameras provide visual feedback. However, the pose of the EE seen through the camera hinders operator performance due to misalignments between the displayed EE axes and the HC axes. In this paper, the coordinate system of the EE is graphically overlaid in three dimensions on the video views with uniquely colored axes using AR. The same color scheme is used to label the corresponding axes on the HCs. Operators use these color cues to map each axis on the HCs to the corresponding colored axis of the augmented coordinates at the EE to obtain EE movement in the desired direction. Between-groups and within-participant experiments comparing EE trajectory distance, deviation from path, navigation errors, and HC axis usage were used to determine the effectiveness of the augmented coordinates over conventional teleoperation without augmented coordinates. Significant reductions in EE trajectory distance, deviation from path, navigation errors, and single-axis HC usage were observed when participants manipulated the remote robot with augmented coordinates. The results demonstrate that the use of simple AR cues in remote robot arm teleoperation is beneficial to the operator.
Article
Opportunistic Controls are a class of user interaction techniques that we have developed for augmented reality (AR) applications to support gesturing on, and receiving feedback from, otherwise unused affordances already present in the domain environment. By leveraging characteristics of these affordances to provide passive haptics that ease gesture input, Opportunistic Controls simplify gesture recognition, and provide tangible feedback to the user. In this approach, 3D widgets are tightly coupled with affordances to provide visual feedback and hints about the functionality of the control. For example, a set of buttons can be mapped to existing tactile features on domain objects. We describe examples of Opportunistic Controls that we have designed and implemented using optical marker tracking, combined with appearance-based gesture recognition. We present the results of two user studies. In the first, participants performed a simulated maintenance inspection of an aircraft engine using a set of virtual buttons implemented both as Opportunistic Controls and using simpler passive haptics. Opportunistic Controls allowed participants to complete their tasks significantly faster and were preferred over the baseline technique. In the second, participants proposed and demonstrated user interfaces incorporating Opportunistic Controls for two domains, allowing us to gain additional insights into how user interfaces featuring Opportunistic Controls might be designed.
Article
Layout problems are found in several types of manufacturing systems. Typically, layout problems are related to the location of facilities (e.g., machines, departments) in a plant. They are known to greatly impact system performance. Most of these problems are NP-hard. Numerous research works related to facility layout have been published. A few literature reviews exist, but they are not recent or are restricted to certain specific aspects of these problems. The literature analysis given here is recent and not restricted to specific considerations about layout design. We suggest a general framework to analyze the literature and present existing works using criteria such as: the manufacturing system features, static/dynamic considerations, continuous/discrete representation, problem formulation, and resolution approach. Several research directions are pointed out and discussed in our conclusion.
Article
To survive the cut-throat competition in the manufacturing industry, many companies have introduced digital manufacturing technology. Digital manufacturing technology not only shortens product development cycle times but also improves the precision of engineering simulation. However, building the virtual objects needed for a digital manufacturing environment requires skilled human resources; it is also costly and time-consuming. A high-precision simulation likewise requires a high-precision environment built with similar resources. In this paper, we propose a method of constructing a mixed reality-based digital manufacturing environment. The method integrates real objects, such as real images, with the virtual objects of a virtual manufacturing system. This type of integration minimizes the cost of implementing virtual objects and enhances the user's sense of reality. We studied several methods and derived a general framework for the system. Finally, we developed our idea into a virtual factory layout planning system. To assign the pose and position of real objects in virtual space, we applied a circle-based tracking method that uses a safety sign instead of the planar square marker generally used for registration. Furthermore, we extended the framework to encapsulate simulation data from legacy systems and to process the data for mixed reality-based visualization.
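As a simplified stand-in for the circle-based registration step (the paper recovers the full pose of the sign, which is not shown here), the sketch below detects a circular safety sign with a Hough transform and estimates its distance from the camera under a pinhole model; the sign diameter and focal length are assumed values.

```python
import cv2
import numpy as np

SIGN_DIAMETER_M = 0.30   # assumed physical diameter of the circular safety sign
FOCAL_LENGTH_PX = 800.0  # assumed camera focal length in pixels

def find_safety_sign(gray_image):
    """Detect the most prominent circle and estimate its distance from the camera.

    Returns (u, v, distance_m) or None. This is only the detection step; a
    registration method would additionally recover orientation for placing
    virtual objects.
    """
    blurred = cv2.medianBlur(gray_image, 5)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
                               param1=100, param2=40, minRadius=10, maxRadius=200)
    if circles is None:
        return None
    u, v, r_px = circles[0][0]
    # Pinhole model: distance = focal_length * real_radius / image_radius.
    distance = FOCAL_LENGTH_PX * (SIGN_DIAMETER_M / 2.0) / r_px
    return float(u), float(v), float(distance)
```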
Article
The authors present a task-oriented programming system for remote laser welding (RLW) with conventional optics, which considers real workpiece data and process limits in the calculation of robot paths. This approach relies on an augmented reality-based user interface for effective 3D interaction and task definition. A powerful task and motion planning system has been developed that transfers task descriptions into optimized, executable robot operations. The resulting system offers fast and efficient spatial interaction methods that allow even novice users to quickly define operations at the task level. The approach simplifies and accelerates the programming process, consequently reducing the cycle time for an RLW task by more than 30%.
Article
This paper presents a mixed reality (MR) environment for collaborative product design and development. This MR collaborative environment is based on a client/server architecture in which multiple users can create and modify product features in a 3D physical space simultaneously. A tri-layer model representation scheme is designed to facilitate product creation and visualization. Intuitive feature manipulation methods and grid-and-snap modes have been designed to support solid modelling in the MR environment. Bi-directional communication between the MR environment and the CAD system ensures that any modifications made by one user are propagated to the views of the other users, maintaining design data consistency.
Article
This paper discusses the benefits of applying Augmented Reality (AR) to facilitate intuitive robot programming, and presents a novel methodology for planning collision-free paths for an n degree-of-freedom (DOF) manipulator in an unknown environment. The targeted applications are those where the end-effector is constrained to move along a visible 3D path/curve, whose position is unknown, at a particular orientation with respect to the path, such as arc welding and laser cutting. The methodology is interactive, as the human is involved in obtaining the 3D data points of the desired curve to be followed through a number of demonstrations, defining the free space relevant to the task, and planning the orientations of the end-effector along the curve. A Piecewise Linear Parameterization (PLP) algorithm is used to parameterize the data points using an interactively generated piecewise linear approximation of the desired curve. A curve learning method based on Bayesian neural networks and reparameterization is used to learn and generate 3D parametric curves from the parameterized data points. Finally, the orientation of the end-effector along the learnt curve is planned with the aid of AR. Two case studies are presented and discussed.
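A minimal numpy sketch of the parameterization step in the spirit of PLP (not the authors' code): each demonstrated 3D point is projected onto its nearest segment of the approximating polyline and assigned the normalized arc length at that projection.

```python
import numpy as np

def plp_parameterize(points, polyline):
    """Assign each 3D data point a parameter in [0, 1] along a piecewise
    linear approximation of the desired curve.

    points:   (N, 3) demonstrated data points.
    polyline: (M, 3) ordered vertices of the approximating polyline.
    Returns an (N,) array of normalized arc-length parameters.
    """
    seg_vecs = np.diff(polyline, axis=0)                 # (M-1, 3) segment vectors
    seg_lens = np.linalg.norm(seg_vecs, axis=1)
    cum_len = np.concatenate([[0.0], np.cumsum(seg_lens)])
    total = cum_len[-1]

    params = np.empty(len(points))
    for i, p in enumerate(points):
        best_dist, best_s = np.inf, 0.0
        for j, (a, v, L) in enumerate(zip(polyline[:-1], seg_vecs, seg_lens)):
            # Projection of p onto segment a -> a + v, clamped to the segment.
            t = np.clip(np.dot(p - a, v) / (L * L), 0.0, 1.0)
            proj = a + t * v
            d = np.linalg.norm(p - proj)
            if d < best_dist:
                best_dist, best_s = d, cum_len[j] + t * L
        params[i] = best_s / total
    return params
```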
Article
In this work we integrate augmented reality technology into a product development process, using real technical drawings as a tangible interface for design review. We present an original collaborative framework for Augmented Design Review Over Network (ADRON). It provides the following features: augmented technical drawings, interactive FEM simulation, multimodal annotation and chat tools, web content integration, and a collaborative client/server architecture. Our framework is intended to run on common hardware instead of expensive and complex virtual or augmented reality facilities. We designed the interface specifically for users with little or no augmented reality expertise, providing tangible interfaces for data review and visual editing for all functions and configurations. Two case studies are presented and discussed: a real-time “touch and see” stress/strain simulation and a collaborative distributed design review session of an industrial component.
Conference Paper
Although there has been much speculation about the potential of Augmented Reality (AR), there are very few empirical studies of its effectiveness. This paper describes an experiment that tested the relative effectiveness of AR instructions in an assembly task. Task information was displayed in the user's field of view and registered with the workspace as 3D objects to explicitly demonstrate the exact execution of a procedure step. Three instructional media were compared with the AR system: a printed manual, computer-assisted instruction (CAI) using a monitor-based display, and CAI utilizing a head-mounted display. Results indicate that overlaying 3D instructions on the actual work pieces reduced the error rate for the assembly task by 82%, particularly diminishing cumulative errors, i.e., errors caused by previous assembly mistakes. Measurement of mental effort indicated decreased mental effort in the AR condition, suggesting that some of the mental calculation of the assembly task is offloaded to the system.
Conference Paper
This paper presents a new technique for modeling simultaneously in the physical and virtual worlds. The intended application domain for this technique is industrial design. A designer physically sculpts a 3D model from foam using a hand-held hot wire foam cutter. Both the foam and the cutting tool are tracked, allowing the system to digitally replicate the sculpting process and produce a matching 3D virtual model. Spatial Augmented Reality is used to project visualizations onto the foam. Inspired by the needs of industrial designers, we have developed two visualizations for sculpting specific models: Target, which shows where foam needs to be removed to produce a model, and Cut Animation, which projects the paths of the cuts to be made to reproduce a previous artifact. A third visualization projects the wireframe of the generated model onto the foam for verification. The final visualization employs 3D procedural textures, such as a wood grain texture, to simulate volumetric rendering, giving the projection onto the foam a more natural look. Once the object has been modeled physically and virtually, the designer is able to annotate and paint the finished model. The system has been evaluated through a user study conducted with students from the School of Industrial Design at the University of South Australia.
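One simple way such a 3D wood-grain procedural texture can be evaluated per surface point is sketched below; the ring frequency, noise term, and constants are illustrative and are not the system's actual shader.

```python
import math

def wood_grain(x, y, z, ring_freq=8.0, turbulence=0.4):
    """Return a grain intensity in [0, 1] for a point in the model's 3D space.

    Concentric rings around the z-axis are perturbed by a cheap pseudo-noise
    term so the grain looks irregular. Because the texture is defined over the
    full 3D volume, it can be evaluated directly at surface points, so the
    pattern stays consistent when projected onto the foam from any side.
    """
    r = math.sqrt(x * x + y * y)
    noise = math.sin(x * 5.0) * math.sin(y * 7.0) * math.sin(z * 3.0)  # placeholder noise
    rings = r * ring_freq + turbulence * noise
    return rings - math.floor(rings)   # fractional part -> repeating ring bands
```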
Conference Paper
Human-robot interaction issues, especially for industrial robots, have largely been confined to finding better ways to reconfigure or program the robots. In this paper, an Augmented Reality based Robot Programming (RPAR-II) system is proposed. A virtual robot, which is a replica of the real robot, is used in the real environment to perform and simulate the robot trajectory planning process. The RPAR-II system assists users during the robot programming process from task planning to execution. Stereo vision-based methodologies for virtual object registration and interactive device position tracking are employed in the system. Practical issues concerning the system implementation are discussed.
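A minimal sketch of the stereo vision registration idea, assuming calibrated 3x4 projection matrices for the two cameras and a matched image point of the tracked feature in each view; it uses the standard OpenCV triangulation call and is not the RPAR-II code.

```python
import cv2
import numpy as np

def triangulate_point(P_left, P_right, pt_left, pt_right):
    """Recover the 3D position of a tracked feature from a calibrated stereo pair.

    P_left, P_right: 3x4 camera projection matrices from stereo calibration.
    pt_left, pt_right: matching (u, v) pixel coordinates of the feature.
    Returns a 3-vector in the calibration (world) frame.
    """
    pl = np.array(pt_left, dtype=np.float64).reshape(2, 1)
    pr = np.array(pt_right, dtype=np.float64).reshape(2, 1)
    X_h = cv2.triangulatePoints(P_left, P_right, pl, pr)   # 4x1 homogeneous point
    return (X_h[:3] / X_h[3]).ravel()
```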
Conference Paper
Augmented Reality (AR) is an increasingly important application domain for computer graphics and user interface design. In this paper, the authors propose an AR interface for assembly design and evaluation. The methodologies for establishing an AR assembly environment and the architecture of the proposed system are presented. A key novelty of this research is the use of bare hands for human-computer interaction in the AR assembly environment. An effective hand segmentation method based on a restricted coulomb energy (RCE) neural network is developed. Experimental results show that the proposed hand segmentation method is effective and robust, and that the finger tracking algorithm runs in real time under a wide range of lighting conditions. An experiment using two fingertips to control virtual objects to simulate assembly in a 2D space is conducted.
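The two steps can be sketched as below, with a plain skin-color threshold standing in for the RCE neural-network classifier described in the paper; the YCrCb bounds and area threshold are rough assumptions, and the OpenCV 4 contour-return convention is assumed.

```python
import cv2
import numpy as np

# Rough skin-color bounds in YCrCb space (assumed values; the paper instead
# trains a restricted coulomb energy network for this classification step).
SKIN_LOW = np.array([0, 133, 77], dtype=np.uint8)
SKIN_HIGH = np.array([255, 173, 127], dtype=np.uint8)

def find_fingertip(frame_bgr):
    """Segment the hand and return the topmost contour point as a crude fingertip."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOW, SKIN_HIGH)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 1000:          # ignore small skin-colored blobs
        return None
    tip = min(hand.reshape(-1, 2), key=lambda p: p[1])   # smallest y = topmost pixel
    return int(tip[0]), int(tip[1])
```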
Conference Paper
This paper describes Napkin Sketch, a 3D sketching interface that attempts to support sketch-based artistic expression in 3D, mimicking some of the qualities of conventional sketching media and tools in terms of both physical properties and interaction experience. A portable tablet PC is used as the sketching platform, and handheld mixed reality techniques are employed to allow 3D sketches to be created on top of a physical napkin. Intuitive manipulation and navigation within the 3D design space is achieved by visually tracking the tablet PC with a camera and mixed reality markers. For artistic expression using sketch input, we improve upon the projective 3D sketching approach with a one-stroke sketch plane definition technique. This, coupled with the hardware setup, produces a natural and fluid sketching experience.
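One common way a sketch plane can be derived from a single stroke is a least-squares plane fit to the stroke's 3D points, sketched below with numpy; this is a generic fit and not necessarily the paper's exact one-stroke technique.

```python
import numpy as np

def plane_from_stroke(stroke_points):
    """Fit a sketch plane to the 3D points of one stroke.

    stroke_points: (N, 3) points sampled along the stroke (N >= 3).
    Returns (centroid, normal): a point on the plane and its unit normal.
    """
    pts = np.asarray(stroke_points, dtype=float)
    centroid = pts.mean(axis=0)
    # The right singular vector with the smallest singular value is the
    # direction of least variance, i.e. the normal of the best-fit plane.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)
```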
Conference Paper
A large number of augmented reality (AR) applications have been developed for handheld platforms built around ultra-mobile computers. Unfortunately, these platforms, created for research, are not appropriate for demonstrations and user testing because their ad hoc nature affects the user experience. An AR platform designed for the average user will provide more accurate data on software usability. This paper presents such a platform, designed with user input to meet users' expectations. The result is a two-handed device that a randomly selected group of users would want to have in their workplace. This platform is ideal for testing and commercializing industrial augmented reality applications.
Conference Paper
Augmented reality (AR) based human-machine interaction (HMI) provides a seamless interface between the user and the application environment, but occlusion handling remains a difficult problem in AR applications. Most existing AR occlusion handling algorithms target general environments; more specialized research on occlusion handling for industrial assembly applications, such as assembly training and assembly task guidance, is required. An occlusion handling method for video see-through AR-based assembly systems is presented, based on an analysis of the occlusion relationships among virtual parts, visual assembly features, navigation information, physical parts, and the physical assembly environment within the combined virtual/physical assembly workspace. The method was implemented in a prototype AR assembly system and proved effective in handling virtual/physical occlusion in the augmented assembly scene.
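A common depth-based way to resolve such virtual/physical occlusion in video see-through AR is sketched below per pixel with numpy; it assumes a depth map of the physical scene (e.g., from a registered model of the physical part) and a rendered virtual layer with its own depth, and is not necessarily the paper's exact method.

```python
import numpy as np

def composite_with_occlusion(real_rgb, real_depth, virtual_rgb, virtual_depth, virtual_mask):
    """Overlay virtual content on a video frame, hiding it where physical parts are closer.

    real_rgb:      (H, W, 3) camera image.
    real_depth:    (H, W) depth of the physical scene (e.g. from a depth sensor or a
                   registered "phantom" model of the physical part).
    virtual_rgb:   (H, W, 3) rendered virtual parts / navigation graphics.
    virtual_depth: (H, W) depth of the rendered virtual content.
    virtual_mask:  (H, W) boolean, True where virtual content was rendered.
    """
    # Virtual pixels are shown only where they lie nearer than the physical scene.
    visible = virtual_mask & (virtual_depth < real_depth)
    out = real_rgb.copy()
    out[visible] = virtual_rgb[visible]
    return out
```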
Conference Paper
Mixed reality (MR) based human-machine interaction provides a seamless interface between the user and the application environment, combining the convenient interaction of virtual reality with the strong realism of augmented reality. In this paper, MR is applied in the context of the industrial assembly process, and an MR-based assembly verification and training platform is proposed. In the MR-based assembly environment, virtual models, real images, and augmented information are jointly displayed in the assembly scene, complemented by multiple video display windows at different viewing angles for browsing the real assembly scene. Additionally, constraint proxies reconstruct the parts' constraint relationships in the MR environment and avoid the complex computation of constraint matching. By using a constraint-guided virtual hand to perform the assembly, the system provides the user with an effective and realistic assembly experience.
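A toy sketch of the constraint-proxy idea: rather than solving full constraint matching, a grasped part is snapped onto a predefined mating axis once it comes within position and angle tolerances; the function name, thresholds, and frame conventions are assumptions, not the authors' implementation.

```python
import numpy as np

def try_snap_to_axis(part_pos, part_axis, proxy_point, proxy_axis,
                     pos_tol=0.01, ang_tol_deg=10.0):
    """Snap a part onto an axial constraint proxy when it is close enough.

    part_pos / part_axis:     current position and unit axis of the grasped part.
    proxy_point / proxy_axis: a point on, and the unit direction of, the mating axis.
    Returns (snapped, new_pos, new_axis).
    """
    part_axis = part_axis / np.linalg.norm(part_axis)
    proxy_axis = proxy_axis / np.linalg.norm(proxy_axis)

    # Distance from the part to the proxy axis (point-to-line distance).
    offset = part_pos - proxy_point
    dist = np.linalg.norm(offset - np.dot(offset, proxy_axis) * proxy_axis)
    # Angular misalignment between the part axis and the proxy axis (sign-agnostic).
    angle = np.degrees(np.arccos(np.clip(abs(np.dot(part_axis, proxy_axis)), -1.0, 1.0)))

    if dist > pos_tol or angle > ang_tol_deg:
        return False, part_pos, part_axis

    # Project the part position onto the axis and align its orientation with it.
    snapped_pos = proxy_point + np.dot(offset, proxy_axis) * proxy_axis
    return True, snapped_pos, proxy_axis
```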
Conference Paper
Market trends show that, in most cases, the aesthetic impact of a product is an important aspect that makes the difference between competing products. The product shape is generally created and represented during the conceptual phase of product development, and recent trends show that the use of haptic devices allows users to interact with 3D models more naturally and effectively. Nevertheless, the shape needs to satisfy engineering requirements, and its aesthetic and functional analysis requires the collaboration and synchronization of activities performed by various experts with different competences and roles. This paper describes an environment named PUODARSI that allows designers to modify the shape of a product and engineers to evaluate in real time the impact of these changes on the structural and fluid dynamic properties of the product, covering the choice of software tools, the implementation, and some usability tests.