
Artificial intelligence in engineering: evolution of virtual product development in the context of medical device industry

Authors:
  • :em engineering methods AG

Abstract

In this paper, a framework is introduced to elicit the requirements that Artificial Intelligence (AI) places on the System Model in order to support engineers in Virtual Product Development (VPD). The framework supports shaping the evolution necessary in system modelling to provide the right data in the right quality for AI. Depending on the business benefit that a company wants to realize, AI can provide capabilities and solutions to be implemented in VPD. Therefore, differing requirements need to be fulfilled for each business benefit that a company pursues. This framework is applied in a case study in the Medical Device Industry, where a market leader wants to improve its capability to create innovations by automatically increasing its market knowledge. To this end, a Natural Language Processing system is applied to automatically enhance the company knowledge base with an external source. This is realized in an initial prototype by analyzing Tender Documents and automatically connecting the newly generated knowledge to the company-internal knowledge in the system model. This paper is part of research activities within the Research and Development department of a global Medical Device Company. The objective of these research activities is to explore the use of Artificial Intelligence to analyze and support the Virtual Development Process.
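The paper's prototype is not reproduced here. Purely as an illustration of the described idea (mining Tender Documents with NLP and linking the extracted terms to company-internal knowledge in the system model), the following minimal Python sketch assumes spaCy and a toy, dictionary-backed knowledge base; all terms and element IDs are hypothetical.

```python
# Illustrative sketch only (not the paper's prototype): find company-internal concepts
# mentioned in a tender document and link them to system-model element IDs.
# Assumes spaCy with the small English model (python -m spacy download en_core_web_sm).
import spacy
from spacy.matcher import PhraseMatcher

nlp = spacy.load("en_core_web_sm")

# Hypothetical internal knowledge base: concept -> system-model element ID
internal_kb = {
    "infusion pump": "SYS-0107",
    "alarm management": "REQ-0311",
    "patient monitor": "SYS-0042",
}

matcher = PhraseMatcher(nlp.vocab, attr="LOWER")
matcher.add("KB", [nlp.make_doc(term) for term in internal_kb])

def link_tender_terms(tender_text: str) -> dict:
    """Map knowledge-base concepts found in the tender text to system-model element IDs."""
    doc = nlp(tender_text)
    links = {}
    for _, start, end in matcher(doc):
        term = doc[start:end].text.lower()
        links[term] = internal_kb[term]
    return links

sample = "The tender requires an infusion pump with integrated alarm management."
print(link_tender_terms(sample))
# {'infusion pump': 'SYS-0107', 'alarm management': 'REQ-0311'}
```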
Article
Current processes for highly interdisciplinary and complex smart products rely on the support of a wide variety of IT systems. At the lowest level, authoring systems are used to create digital models and engineering data objects. A multitude of data management systems, which in turn have to be integrated into comprehensive Product Lifecycle Management (PLM) and Enterprise Resource Planning (ERP) solutions, enables the management of this product information and of engineering processes (e.g. engineering change and release management) along the entire product lifecycle. Engineers use these information management approaches for their daily work processes, such as engineering change or release management. In industrial environments, several heterogeneous IT systems coexist but cannot easily be connected and thus pose critical barriers to engineering collaboration. Being able to flexibly access the required engineering information from these IT systems, supported by a platform that provides an integrated metadata repository, could dramatically improve engineering workflows such as change management. The approach introduced in this paper describes a Metadata Repository for Semantic Product Lifecycle Management (SP²IDER), which provides an additional information management layer based on an IT architecture with a minimalistic core to view and access data from a multitude of IT source systems. Instead of storing data from these source systems in SP²IDER, open web technologies such as Linked Data principles and JSON-LD allow real-time access to the source systems.
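As an illustration of the Linked Data and JSON-LD principles mentioned above (not the SP²IDER implementation itself), the following sketch parses a JSON-LD description of an engineering data object into an RDF graph and queries it. It assumes rdflib 6 or later (which parses JSON-LD natively); all URIs are made up.

```python
# Minimal Linked Data sketch (illustrative only): parse a JSON-LD description of an
# engineering data object into an RDF graph and query it with SPARQL. Requires rdflib >= 6.
from rdflib import Graph

jsonld_doc = """
{
  "@context": {"name": "http://schema.org/name",
               "partOf": {"@id": "http://example.org/partOf", "@type": "@id"}},
  "@id": "http://example.org/items/eco-123",
  "name": "Engineering Change Order 123",
  "partOf": "http://example.org/products/device-7"
}
"""

g = Graph()
g.parse(data=jsonld_doc, format="json-ld")

# Ask which product the change order belongs to.
query = """
SELECT ?product WHERE {
  <http://example.org/items/eco-123> <http://example.org/partOf> ?product .
}
"""
for row in g.query(query):
    print(row.product)   # http://example.org/products/device-7
```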
Conference Paper
Machine learning (ML) has demonstrated practical impact in a variety of application domains. Software engineering is a fertile domain where ML is helping to automate different tasks. In this paper, our focus is the intersection of software requirements engineering (RE) and ML. To obtain an overview of how ML is helping RE and of the research trends in this area, we have surveyed a large number of research articles. We found that the impact of ML can be observed in requirements elicitation, analysis and specification, validation, and management. Furthermore, within these categories, we discuss the specific problems solved by ML, the features and ML algorithms used, and the datasets, when available. We outline lessons learned and envision possible future directions for the domain.
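As a hedged illustration of one typical task in this survey area (not taken from any surveyed paper), the following sketch classifies requirement sentences as functional or non-functional with TF-IDF features and logistic regression in scikit-learn; the tiny training set is invented.

```python
# Toy sketch (illustrative only): classify requirement sentences as functional (F)
# or non-functional (NF) using TF-IDF features and logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

requirements = [
    "The system shall log every user action.",                 # F
    "The device shall measure blood pressure every minute.",   # F
    "The user interface shall respond within 200 ms.",         # NF (performance)
    "The software shall be available 99.9% of the time.",      # NF (availability)
]
labels = ["F", "F", "NF", "NF"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(requirements, labels)

print(clf.predict(["The system shall encrypt data at rest."]))
```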
Conference Paper
The enterprise-level software application that supports the strategic product-centric, lifecycle-oriented and information-driven Product Lifecycle Management business approach should enable engineers to develop and manage requirements within a Functional Digital Mock-Up. The integrated, model-based product design environment ENOVIA/CATIA V6 RFLP makes it possible to use parametric modelling across requirements, functions, logical units and physical organs. Simulation can therefore be used to verify that the design artefacts comply with the requirements. Nevertheless, when dealing with document-based specifications, the definition of the knowledge parameters for each requirement is a labour-intensive task. Indeed, analysts have no alternative but to go through the voluminous specifications to identify the values of the performance requirements and design constraints, and to translate them into knowledge parameters. We propose to use natural language processing techniques to automatically generate Parametric Property-Based Requirements from unstructured and semi-structured specifications. We illustrate our approach through the design of a mechanical ring.
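The authors' pipeline is not available here. As a deliberately simplistic stand-in for the described idea, the following sketch extracts an attribute, comparison operator, value and unit from a textual performance requirement with a regular expression and returns them as a knowledge-parameter-like dictionary; the pattern and example sentence are invented.

```python
# Simplistic sketch (not the authors' pipeline): turn a textual performance requirement
# into a parametric "knowledge parameter" by extracting attribute, comparison, value, unit.
import re

PATTERN = re.compile(
    r"(?P<attribute>[\w\s]+?)\s+shall (?:be|not exceed|be at least)?\s*"
    r"(?P<comparison>less than|greater than|at most|at least)?\s*"
    r"(?P<value>\d+(?:\.\d+)?)\s*(?P<unit>mm|kg|N|ms|°C)",
    re.IGNORECASE,
)

def extract_parameter(requirement: str) -> dict | None:
    """Return a knowledge-parameter dictionary, or None if no value/unit is found."""
    m = PATTERN.search(requirement)
    if not m:
        return None
    return {
        "attribute": m.group("attribute").strip(),
        "comparison": (m.group("comparison") or "equal to").strip(),
        "value": float(m.group("value")),
        "unit": m.group("unit"),
    }

print(extract_parameter("The outer diameter of the ring shall be less than 25.4 mm"))
# {'attribute': 'The outer diameter of the ring', 'comparison': 'less than',
#  'value': 25.4, 'unit': 'mm'}
```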
Article
By applying Artificial Intelligence (AI), product and operating data can be used to improve product development. A prerequisite is that data are available in the right quality and in sufficient quantity. Meeting this prerequisite ("AI readiness") often requires considerable effort within a company: processes, methods and IT systems have to be harmonized globally. This article describes the evolutionary steps towards "AI readiness" in product development.
Article
Natural language is ubiquitous in the workflow of medical imaging. Radiologists create and consume free text in their daily work, some of which can be amenable to enhancements through automatic processing. Recent advancements in deep learning and "artificial intelligence" have had a significant positive impact on natural language processing (NLP). This article discusses the history of how researchers have extracted data and encoded natural language information for analytical processing, starting from NLP's humble origins in hand-curated, linguistic rules. The evolution of medical NLP including vectorization, word embedding, classification, as well as its use in automated speech recognition, are also explored. Finally, the article will discuss the role of machine learning and neural networks in the context of significant, if incremental, improvements in NLP.
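To make the contrast between classic vectorization and word embeddings concrete (an illustration only, unrelated to any specific radiology system), the following sketch builds bag-of-words vectors with scikit-learn and trains a tiny Word2Vec model with gensim on two invented report snippets.

```python
# Illustrative sketch of the two text representations mentioned above:
# sparse bag-of-words vectorization vs. dense word embeddings.
from sklearn.feature_extraction.text import CountVectorizer
from gensim.models import Word2Vec

reports = [
    "no acute intracranial hemorrhage",
    "small right pleural effusion noted",
]

# 1) Classic vectorization: each report becomes a sparse word-count vector.
vectorizer = CountVectorizer()
bow = vectorizer.fit_transform(reports)
print(bow.toarray())           # one row per report, one column per vocabulary word

# 2) Word embeddings: each word becomes a dense vector learned from context.
tokens = [r.split() for r in reports]
w2v = Word2Vec(sentences=tokens, vector_size=16, min_count=1, epochs=10)
print(w2v.wv["effusion"][:4])  # first 4 dimensions of the learned embedding
```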
Article
Complex, cyber-physical systems must be founded on a digital blueprint that provides the most accurate representation of the system by federating information from engineering models across multiple enterprise repositories. This blueprint would serve as the digital surrogate of the system and evolve as the actual system matures across its lifecycle, from conception and design to production and operations. This paper presents a graph-based approach for realizing the digital blueprint, which we refer to as the Total System Model. The paper is divided into five parts. Part 1 provides an introduction to use cases for model-based systems engineering. Part 2 introduces graph concepts for the Total System Model. Part 3 provides a demonstration of the graph-based approach using Syndeia software as a representative application. Part 4 provides a summary of this paper, and Part 5 lays out potential directions for future work.
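Syndeia's actual API is not shown here. As a generic illustration of the graph-based idea, the following sketch uses networkx to federate elements from different (hypothetical) repositories as nodes and their cross-model connections as edges, then traces a requirement through the graph.

```python
# Illustrative graph sketch (not Syndeia's API): federate elements from different
# engineering repositories as nodes and their inter-model connections as edges.
import networkx as nx

tsm = nx.DiGraph()

# Nodes carry the repository they come from (requirements tool, PLM, simulation).
tsm.add_node("REQ-12", repo="DOORS", kind="requirement")
tsm.add_node("PRT-7", repo="PLM", kind="part")
tsm.add_node("SIM-3", repo="Simulink", kind="model")

# Edges represent cross-repository relations of the digital blueprint.
tsm.add_edge("REQ-12", "PRT-7", relation="satisfied_by")
tsm.add_edge("PRT-7", "SIM-3", relation="verified_by")

# Trace a requirement through the total system model.
for source, target in nx.dfs_edges(tsm, "REQ-12"):
    print(source, "->", tsm.edges[source, target]["relation"], "->", target)
```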
Article
Neural networks are a family of powerful machine learning models. This book focuses on the application of neural network models to natural language data. The first half of the book (Parts I and II) covers the basics of supervised machine learning and feed-forward neural networks, the basics of working with machine learning over language data, and the use of vector-based rather than symbolic representations for words. It also covers the computation-graph abstraction, which makes it possible to easily define and train arbitrary neural networks and is the basis behind the design of contemporary neural network software libraries. The second part of the book (Parts III and IV) introduces more specialized neural network architectures, including 1D convolutional neural networks, recurrent neural networks, conditioned-generation models, and attention-based models. These architectures and techniques are the driving force behind state-of-the-art algorithms for machine translation, syntactic parsing, and many other applications.
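As a minimal illustration of the topics in the book's first half (not an example from the book itself), the following sketch averages word embeddings and feeds them through a one-hidden-layer network, using PyTorch as the computation-graph library; the vocabulary and sentence are invented.

```python
# Minimal feed-forward sketch (illustrative only): average word embeddings, then pass
# the sentence vector through a one-hidden-layer network built on a computation graph.
import torch
import torch.nn as nn

VOCAB = {"device": 0, "alarm": 1, "failed": 2, "works": 3}

class BagOfEmbeddings(nn.Module):
    def __init__(self, vocab_size=4, embed_dim=8, hidden=16, classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.ff = nn.Sequential(nn.Linear(embed_dim, hidden), nn.ReLU(),
                                nn.Linear(hidden, classes))

    def forward(self, token_ids):
        vectors = self.embed(token_ids)    # (n_tokens, embed_dim)
        sentence = vectors.mean(dim=0)     # continuous bag-of-words representation
        return self.ff(sentence)           # unnormalized class scores

model = BagOfEmbeddings()
ids = torch.tensor([VOCAB[w] for w in "device alarm failed".split()])
print(model(ids))   # scores for 2 classes
```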
Chapter
Engineering is currently undergoing a massive transformation: smart systems and technologies, cybertronic products, big data and cloud computing in the context of the Internet of Things and Services, as well as Industrie 4.0. The media are full of reports about the new, fourth industrial revolution. The American concept of the "Industrial Internet", however, describes this (r)evolution far better than the narrower, strongly German-coined term Industrie 4.0: the Industrial Internet considers the entire product lifecycle and addresses consumer and capital goods as well as services. This chapter examines this forward-looking trend topic and offers well-founded insights into the connected engineering world of tomorrow, into its design methods and processes, and into the corresponding IT solutions.
Article
The idea of lean product development (LPD), with Toyota serving as the main case in point demonstrating its capabilities, has gained attention at the managerial levels of companies dealing with product development. The main gains claimed for LPD are a high rate of successful projects in terms of cost and quality, along with shorter lead times and fewer overruns in time and budget [1, 2]. This paper investigates the LPD concept in comparison with established models in the current product development paradigm in order to map out the main differences. It also compares LPD to the way product development is carried out in practice, using the example of two European automotive companies. The results show that the main differences, among others, can be found first in the way knowledge is honored and managed, and second in how and when decisions are made along the process. From the discussion of the results, conclusions are drawn for potential improvements to traditional product development models.