Information Quality in PLM:
A production process perspective
Thorsten Wuest¹, Stefan Wellsandt², Klaus-Dieter Thoben²,³

¹ Industrial and Management Systems Engineering, Benjamin M. Statler College of Engineering and Mineral Resources, West Virginia University, Morgantown, USA
thwuest@mail.wvu.edu
² BIBA - Bremer Institut für Produktion und Logistik GmbH, Bremen, Germany
³ Faculty of Production Engineering, University of Bremen, Germany
{wel, tho}@biba.uni-bremen.de
Abstract. Recent approaches for Product Lifecycle Management (PLM) aim for the efficient utilization of the available product information. A reason for this is that the amount of information is growing, due to the increasing complexity of products and the concurrent, collaborative processes along the lifecycle. Additional information flows are continuously explored by industry and academia; a recent example is the backflow of information from the usage phase. The large amount of information that companies have to handle nowadays, and even more so in the future, makes it important to separate the “fitting” from the “unfitting” information. A way to distinguish both is to explore the quality of the information, in order to identify the information that is “fit for purpose” (information quality). Since the amount of information is so large, and the processes along the lifecycle are diverse in terms of their expectations about the information, the problem is similar to finding a needle in a haystack.
This paper is one of two papers aiming to address this problem by giving examples of why information quality matters in PLM. It focuses on one particular lifecycle process, in this case production. An existing approach, describing information quality by 15 dimensions, is applied to the selected production process.
Keywords: Product Lifecycle Management; quality management; manufactur-
ing; production; production planning and control; data quality
1 Introduction and problem description
Closing the information loops along the product lifecycle is a recent effort undertaken by research projects [1], [2]. One of the reasons why closing information loops is so important is the expectation that activities like design, production, sales and maintenance will be able to realize products and services with a better ratio between expected and delivered characteristics (i.e. higher quality). As product quality is directly influenced by the quality of the production processes [3, 4], an increased availability of information will benefit production planning and control activities. Additional information can help to understand complex problems and take the most suitable decisions to address them in a timely manner; common examples for these benefits are Concurrent Engineering [5] and agile software development [6].

Post-print version. The final publication is available at Springer via http://link.springer.com/chapter/10.1007/978-3-319-33111-9_75
Technical capabilities for the collection and analysis of information, as well as a
sound business case are important prerequisites to increase the availability of infor-
mation in decision-making. However, the growing amount and heterogeneity of in-
formation caused by, e.g. the industrial Internet and the Internet of Things, foster the
need to identify information that is fit-for-purpose (i.e. focus on information quality).
Recent literature about PLM puts little emphasis on this aspect.
This paper will discuss the importance of information quality in PLM from the per-
spective of production. The same issue but from a product design perspective is de-
scribed in a sister paper (see [7]). Section two of this paper outlines information flows
in PLM and an exemplary approach to describe information quality.
2 Related work
2.1 Information flows in PLM
Handling product data and information along the complete product lifecycle is referred to as PLM [8]. A product’s lifecycle can be structured into three subsequent phases, referred to as ‘beginning of life’ (BOL), ‘middle of life’ (MOL) and ‘end of life’ (EOL). The initial concept of PLM was extended during the EU-funded large-scale research project PROMISE, which demonstrated the possibilities of closing information loops among different processes of the lifecycle [9]. The recent concept of PLM is illustrated in Figure 1. Internal information flows within the phases are not covered in the illustration.
Fig. 1. A product lifecycle model and its major information flows [10]
Among the three lifecycle phases, at least two types of information flows can be established. The forward-directed flows are typically mandatory to design, produce, service and dismiss the product. Backward-directed flows are typically optional and allow the optimization of processes and activities.
2.2 Information Quality (IQ)
Seamless decision-making processes largely depend on high-quality information. Decision-makers notice issues with information quality when their expectations about the information are not met. Examples of potential problems caused by low information quality are summarized in Table 1.
Table 1. Examples of problems related to flawed information [11]

Information that ...
- is not based on fact
- consists of inconsistent meanings
- is irrelevant to the work
- is of doubtful credibility
- is incomplete
- is hard to manipulate
- presents an impartial view
- is hard to understand
From a general perspective, the quality of information can be defined as the degree to which the characteristics of specific information meet the requirements of the information user (derived from ISO 9000:2005 [12]). Since the topic has been discussed intensely for at least two decades, several sophisticated definitions of ‘information quality’ exist. As the purpose of this paper is not to discuss these fundamental concepts, a thoroughly discussed definition is selected. Rohweder et al. propose a framework for information quality that extends the work of Wang and Strong [13]; the framework contains 15 information quality dimensions that are assigned to four categories, as summarized in Table 2.
Table 2. Dimensions of information quality [14]

Quality category   | Scope
Inherent           | Content
Representation     | Appearance
Purpose-dependent  | Use
System support     | System
The selected definition of information quality contains four categories of dimensions, each related to a specific scope. Each category characterizes information by two to five dimensions (15 in total). A brief description of some dimensions is provided in Table 3.
Table 3. Excerpt of quality dimensions and their description (based on [14])

Quality dimension         | Description
Reputation                | Credibility of information from the information user’s perspective
Free of error             | Not erroneous; consistent with reality
Objectivity               | Based on fact; without judgment
Believability             | Follows quality standards; significant effort for collection and processing
Understandability         | Meaning can be derived easily by information user
Interpretability          | No ambiguity concerning the actual meaning; wording and terminology
Concise representation    | Clear representation; only relevant information; suitable format
Consistent representation | Same way of representing different information items
Accessibility             | Simple tools and methods to reach information
Ease of manipulation      | Easy to modify; reusable in other contexts
In order to obtain a specific statement about the actual quality of an information item, the as-is characteristics of the item must be compared with the required characteristics (preferably using all of the quality dimensions). The better the match, the higher the information quality is considered to be.
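To make this comparison concrete, the matching of as-is against required characteristics can be sketched in code. The dimension names, the normalized 0.0–1.0 values and the simple fraction-of-requirements-met score below are illustrative assumptions; the framework itself prescribes no concrete scoring formula.

```python
# Illustrative sketch: compare the as-is characteristics of an information
# item with the required characteristics, per IQ dimension. All dimension
# values (0.0-1.0) and the scoring rule are hypothetical assumptions.

REQUIRED = {"free_of_error": 0.9, "timeliness": 0.8, "completeness": 0.7}

def iq_score(as_is: dict, required: dict = REQUIRED) -> float:
    """Fraction of required dimensions whose as-is value meets the requirement."""
    met = sum(1 for dim, req in required.items() if as_is.get(dim, 0.0) >= req)
    return met / len(required)

# A hypothetical sensor record: timeliness falls short of the requirement.
sensor_record = {"free_of_error": 0.95, "timeliness": 0.6, "completeness": 0.9}
score = iq_score(sensor_record)
```

The better the as-is values match the requirements, the closer the score moves to 1.0, mirroring the verbal rule stated above.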
3 Approach
Production is the process of realizing products according to the specifications originating from product development. In this paper, production includes production planning, manufacturing and assembly processes. During production, several characteristics of the later product and its behavior during usage are defined, e.g. by the chosen materials, machines and machine parameters. The decision about which materials, machines and parameters will be used is taken during the production planning phase and the preceding product development phase. The product development phase is not the focus of this paper.
During production, information is exchanged between different manufacturing processes. This exchanged information is highly important to ensure the final product quality [15]. In manufacturing, especially in the area of process monitoring and control, information quality can play a decisive role in whether an analysis and the subsequent action are successful or not. In order to apply the selected quality dimensions to production, the relevant information flows are divided into three categories, as illustrated in Fig. 2.
Fig. 2. Types of information flows in production
Information flows within production (internal). In production, information quality is generally of very high relevance, as it often has a direct impact on key figures of a company or a production network. Information in manufacturing often serves as input for machines with a low level of robustness against, e.g., missing values. Today, production involves multiple processes exchanging not only physical goods but also information. These process chains can become rather complex and can be considered dynamic. Looking at manufacturing at a more granular level, each process and product has to be considered individually due to, e.g., deviations in its materials.
Fig. 3. Internal information flows of a production process chain
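The low robustness of machines against missing values mentioned above can be illustrated by a simple input gate that rejects incomplete records before they reach a machine controller. The field names are hypothetical and only serve the illustration.

```python
# Illustrative gate for internal information flows: reject records with
# missing values before they are fed to a machine with low robustness.
# The required field names are hypothetical assumptions.

REQUIRED_FIELDS = ("temperature", "pressure", "spindle_speed")

def is_machine_ready(record: dict) -> bool:
    """True only if every required field is present and not None."""
    return all(record.get(f) is not None for f in REQUIRED_FIELDS)

ok = is_machine_ready({"temperature": 182.5, "pressure": 4.1, "spindle_speed": 3000})
bad = is_machine_ready({"temperature": 182.5, "pressure": None, "spindle_speed": 3000})
```

In practice such a gate would sit between the data acquisition layer and the machine control, so that incomplete information triggers a fallback instead of a failure.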
Through the backflow of information about the individual product, collected earlier in the same process or in previous processes, individual adjustment of process parameters becomes possible. These adaptations of the process may lead to significantly improved performance and/or the avoidance of significant problems. Today, many decisions regarding value-adding production processes are taken based on the available information. Control loops, scheduling decisions and program planning are just some examples which strongly depend on information quality. Information in this context can include real-time sensor information, e.g., for monitoring and control purposes, as well as quality measures for subsequent process adjustments. A practical example is information about the individual chemical composition of the steel used during heat treatment. This information is vital for reaching the quality goal.
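The heat-treatment example can be sketched as a parameter adjustment driven by backflowing information about the individual part. The carbon thresholds and temperature offsets below are hypothetical placeholders, not metallurgical recommendations.

```python
# Sketch of the heat-treatment example: a process parameter is adjusted
# based on backflowing information about the individual product (here the
# measured carbon content of the steel). All values are hypothetical.

BASE_TEMP_C = 850.0

def holding_temperature(carbon_pct: float) -> float:
    """Lower the holding temperature for higher measured carbon content."""
    if carbon_pct > 0.6:
        return BASE_TEMP_C - 30.0
    if carbon_pct > 0.4:
        return BASE_TEMP_C - 15.0
    return BASE_TEMP_C

# Carbon content measured upstream flows back and tunes this process step.
temp = holding_temperature(0.45)
```

If the composition information is missing or of low quality, the process falls back to a generic parameter set and loses the individual adjustment described above.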
Information flows towards production (inbound). When information from lifecycle phases other than production is used to support production processes in a similar way, certain differences become apparent and impose specific requirements on information quality. The two main inbound information sources are depicted in Fig. 4: 1) information from the product design phase and 2) information from the usage and maintenance phase.
Fig. 4. Inbound information flows towards production
The product design phase is essential for production. In design, the main properties of the later product are set, and the production processes and process parameters are chosen according to the information received from design. For information from usage, there are two possibilities: either the information is directly transferred and utilized, or the usage information is utilized indirectly via the design phase. An example of relevant information from usage/maintenance is the surface quality of a product, which depends on environmental factors during usage. A product’s surface characteristics can be influenced to some extent during the production process (e.g., heat treatment).
Information flows from production (outbound). In production, information is not only utilized but also produced in large quantities; machinery and tools are equipped with sensors continuously producing information. Process monitoring and advanced systems, like Manufacturing Execution Systems, also contribute to the increased information generation. This information may be a valuable source for stakeholders outside of production. In Fig. 5, outbound information flows to three other lifecycle phases are depicted: 1) recycling and disposal, 2) usage and maintenance and 3) product design and development. Example cases for these three information flows are:
1. Information about potentially hazardous materials of the product introduced during
production (e.g. heavy metals).
2. Information about lubricants used during production which could influence the ar-
eas of application of the product (e.g. regulation in food processing industry).
3. Information that can be directly utilized for future design improvements, leading to a variety of benefits, e.g. in quality, efficiency or safety.
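The three outbound flows above can be pictured as records tagged with the lifecycle phase they are intended for, so each stakeholder receives only the relevant subset. The record structure and field names are hypothetical.

```python
# Hypothetical representation of outbound information from production,
# tagged with its target lifecycle phase (cf. the three example cases).
from dataclasses import dataclass

@dataclass
class OutboundInfo:
    target_phase: str  # "recycling", "usage" or "design"
    topic: str
    payload: dict

records = [
    OutboundInfo("recycling", "hazardous_materials", {"substance": "lead"}),
    OutboundInfo("usage", "lubricants", {"food_safe": False}),
    OutboundInfo("design", "process_feedback", {"scrap_rate": 0.02}),
]

# A design stakeholder only pulls the records addressed to the design phase.
for_design = [r for r in records if r.target_phase == "design"]
```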
From the perspective of production, example number three can be considered the most important outbound information flow. Popular approaches like ‘Design for Manufacturing’ rely on exactly such outbound information from production.
Fig. 5. Outbound information flows from production
Across all these possible information flows, an important aspect to consider is information quality. Whereas the right information in the right quality may lead to significant improvements, flawed information may have an even worse impact than no information at all. In the next section, this is discussed along the previously introduced IQ dimensions.
4 Discussion
In this section, the applicability of the different IQ dimensions in manufacturing and production planning is discussed. The overall structure follows the one depicted in Table 2, with sub-sections resembling the four ‘scope’ categories. As mentioned, production is very diverse in the applied processes and also individual for each product type. Therefore, the examples given to emphasize certain quality aspects are not meant to be comprehensive; there are, most likely, multiple other influences and aspects that are not covered in this paper. For each scope of the selected IQ framework, the three information flows introduced in the previous section (i.e. internal, inbound and outbound) are briefly discussed.
4.1 Content scope
The content scope, comprising the IQ dimensions ‘reputation’, ‘free of error’, ‘objectivity’ and ‘believability’, is very relevant in production. For information flows within production (internal), ‘free of error’ is very important, as the information is often directly utilized by technical systems. Given that process chains are often distributed between different locations and companies, ‘reputation’ and ‘believability’ may also be relevant. However, ‘objectivity’ may be considered less relevant in this area, as measuring and sensor data can be considered rather objective by nature. For outbound and inbound information, however, all four IQ dimensions are highly relevant. From a production perspective, these IQ dimensions matter most for inbound information. However, for other stakeholders within the product lifecycle, the importance of the information quality of outbound information can be considered equally high. Here, ‘objectivity’ is also relevant, as these information items may contain human-authored feedback with its characteristic subjectivity, also referred to as response bias [15].
4.2 Appearance scope
The relevant IQ dimensions of this scope are ‘understandability’, ‘interpretability’, ‘concise representation’ and ‘consistent representation’. For information flows within production (internal), all four IQ dimensions are important. In highly automated production environments, the appearance of information is mostly predefined, due to standardization or the design of the production system itself. If standards are not met or the communication rules of the automated system are not followed, the system will fail in most cases. Thus, these IQ dimensions are hard requirements which have to be fulfilled. In production processes with more manual work, and thus more human-based decision-making, the appearance of information is less regulated and must therefore be controlled more closely. For inbound and outbound information flows, these IQ dimensions cannot be assumed to be fulfilled through standardization. There, the information is more diverse and the possibility of different systems and/or requirements is rather high. Thus, these IQ dimensions have to be carefully considered prior to establishing collaboration along the product lifecycle.
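The "hard requirement" character of representation rules in automated systems can be illustrated by a receiver that rejects any message not following the agreed format. The message schema and the accepted unit vocabulary below are hypothetical assumptions.

```python
# Illustration of representation rules as hard requirements: an automated
# receiver rejects messages that violate the agreed communication format.
# The schema and unit vocabulary are hypothetical.

def validate_message(msg: dict) -> bool:
    """Accept only messages that follow the agreed communication rules."""
    if set(msg) != {"machine_id", "timestamp", "value", "unit"}:
        return False                          # consistent representation
    if not isinstance(msg["value"], (int, float)):
        return False                          # interpretability of the value
    return msg["unit"] in {"mm", "C", "bar"}  # agreed, unambiguous vocabulary

good = validate_message({"machine_id": "M1", "timestamp": 0, "value": 1.2, "unit": "mm"})
bad = validate_message({"machine_id": "M1", "value": 1.2, "unit": "inch"})
```

A message failing any of these checks would make the automated system "fail" in the sense described above, whereas a human reader might still interpret it correctly.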
4.3 Use scope
Regarding the use scope, the IQ dimensions ‘timeliness’, ‘value-added’, ‘completeness’, ‘appropriate amount of data’ and ‘relevancy’ are in focus. Within production (internal), it can be observed that timeliness, completeness, appropriate amount of data, and relevancy are highly important. The often automated use of information by machinery and monitoring tools relies on information fulfilling these requirements. For instance, even though today’s computing power and algorithms can handle large amounts of data rather well, it is still important to evaluate what data is really relevant with the goal in mind. For inbound and outbound information flows in production, these factors are also relevant; however, the potential use is broader and thus a wider variety of quality requirements may be acceptable. For all information flows in production, the IQ dimension ‘value-added’ is very important, as production is, after all, a business operation.
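Two of the use-scope checks, timeliness and relevancy/completeness, can be sketched as a filter in front of a monitoring tool. The maximum age and the set of relevant fields are hypothetical thresholds.

```python
# Sketch of use-scope checks on incoming records: timeliness (maximum age)
# and completeness with respect to the fields the task actually needs.
# Threshold and field names are hypothetical assumptions.

MAX_AGE_S = 5.0
RELEVANT = ("temperature", "pressure")

def usable(record: dict, now: float) -> bool:
    timely = (now - record["t"]) <= MAX_AGE_S        # timeliness
    complete = all(f in record for f in RELEVANT)    # completeness/relevancy
    return timely and complete

fresh = usable({"t": 98.0, "temperature": 181.0, "pressure": 4.0}, now=100.0)
stale = usable({"t": 10.0, "temperature": 181.0, "pressure": 4.0}, now=100.0)
```

Dropping fields outside `RELEVANT` also addresses the ‘appropriate amount of data’ dimension: only the data needed for the goal at hand is passed on.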
4.4 System scope
From a system perspective, ‘accessibility’ and ‘ease of manipulation’ are the relevant IQ dimensions. Within production (internal), accessibility is critical, especially in distributed production environments. Assuming that information in production is mostly based on sensor or other non-human-authored data, access mostly depends on a) the available communication means (technical) and b) the access rights. Ease of manipulation, on the other hand, is not considered critical within production. Regarding inbound and outbound information flows, accessibility is again highly critical, with access rights being rather complicated to manage. Ease of manipulation is more important here, as it might be necessary to reformat or pre-process information for different purposes.
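The two accessibility factors named above, a) technical communication means and b) access rights, can be combined into a single check. The source identifiers, roles and availability flags are hypothetical.

```python
# Illustration of the two accessibility factors: a source is accessible only
# if the technical channel is up AND the requesting role holds a read right.
# Source names, roles and flags are hypothetical assumptions.

CHANNEL_UP = {"press_01": True, "oven_02": False}                 # a) technical means
READ_RIGHTS = {"planner": {"press_01", "oven_02"},
               "supplier": {"press_01"}}                          # b) access rights

def accessible(source: str, role: str) -> bool:
    return CHANNEL_UP.get(source, False) and source in READ_RIGHTS.get(role, set())

planner_ok = accessible("press_01", "planner")  # channel up, right granted
oven_down = accessible("oven_02", "planner")    # right granted, channel down
```

In inter-company flows, the `READ_RIGHTS` side of this check is the part that becomes "rather complicated to manage", as noted above.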
5 Conclusion and Outlook
This paper discusses the importance of information quality in PLM from a production process perspective. From the literature, a framework with 15 IQ dimensions is selected. Three categories of information flows are then defined to structure the discussion. These flows concern the use of information within production (internal), information coming from production and used elsewhere (outbound), and information coming towards production from other phases (inbound). The importance of information quality in production is then discussed by mapping the IQ dimensions to the three types of information flows.
While the depth of the investigation conducted in this paper remains rather low (e.g. few examples and no consistent use case), it aims to substantiate a debate about the importance of information quality in PLM. This topic is of major importance, as the amount, heterogeneity and velocity of available information are growing and the selection of relevant information becomes more difficult. The definition of three types of information flows (i.e. internal, inbound and outbound) can be applied to other processes along the product lifecycle, in order to obtain examples for all major lifecycle phases. In future work, a combined paper is envisaged for that purpose.
Acknowledgement
This project has received funding from the European Union’s Horizon 2020 research
and innovation programme under grant agreement no. 636951 (Manutelligence). The
authors wish to acknowledge the Commission and all the project partners for a fruitful
collaboration. Finally, the authors would like to thank the reviewers for their com-
ments that helped to improve the manuscript.
References
1. Manutelligence Consortium. Manutelligence project. Available: www.manutelligence.eu. Accessed: 29.05.2015.
2. Falcon Consortium. Falcon project. EU-funded research and innovation project. 2015.
3. Brinksmeier, E. (1991). Prozeß- und Werkstückqualität in der Feinbearbeitung. Fortschritt-Berichte VDI, Reihe 2: Fertigungstechnik. Düsseldorf: VDI-Verlag.
4. Jacob, J., & Petrick, K. (2007). Qualitätsmanagement und Normung. In R. Schmitt & T. Pfeifer (Eds.), Masing Handbuch Qualitätsmanagement (pp. 101-121). München: Carl Hanser Verlag.
5. Fohn, D.S.M., Greef, D.A., Young, P.R.E., O'Grady, P.P. (1995). Concurrent engineering. In: Adelsberger, H.H., Lažanský, J., Mařík, V. (Eds.), Information Management in Computer Integrated Manufacturing, Lecture Notes in Computer Science. Springer Berlin Heidelberg, pp. 493-505.
6. Highsmith, J., Cockburn, A. (2001). Agile software development: the business of innovation. Computer 34, pp. 120-127. doi:10.1109/2.947100
7. Wellsandt, S., Wuest, T., Hribernik, K., Thoben, K.-D. Information Quality in PLM: A Product Design Perspective. In: APMS 2015 International Conference, Tokyo, Japan, September 5-9, 2015, submitted paper.
8. Kiritsis, D. Closed-loop PLM for intelligent products in the era of the Internet of things. Computer-Aided Design, vol. 43, no. 5, May 2011, pp. 479-501.
9. Promise Integrated Project. promise-innovation.com. Available: http://promise-innovation.com/components. Accessed: 26.03.2015.
10. Wellsandt, S., Hribernik, K., Thoben, K.-D. Sources and characteristics of information about product use. In: 25th CIRP Design Conference, Haifa, Israel, 2015.
11. Ge, M., Helfert, M. A Review of Information Quality Research - Develop a Research Agenda. Proceedings of the 12th International Conference on Information Quality, MIT, Cambridge, MA, USA, November 9-11, 2007.
12. ISO 9000:2005. Quality management systems - Fundamentals and vocabulary. International Organization for Standardization.
13. Wang, R.Y., Strong, D.M. Beyond Accuracy: What Data Quality Means to Data Consumers. In: Journal of Management Information Systems, 1996, Vol. 12, Issue 4, pp. 5-33.
14. Rohweder, J., Kasten, G., Malzahn, D., Piro, A., Schmid, J. (2008). Informationsqualität - Definitionen, Dimensionen und Begriffe. In: Hildebrand, K., Gebauer, M., Hinrichs, H., Mielke, M.: Daten- und Informationsqualität. Springer, 2008, pp. 25-45.
15. Wuest, T. (2015). Identifying Product and Process State Drivers in Manufacturing Systems Using Supervised Machine Learning. Heidelberg Berlin: Springer International Publishing.
... The integration of the customer and supplier at each stage supports the development of a PSS platform that will provide opportunities for value cocreation during the MOL phase (Fig. 4). Information flows between the different phases of the lifecycle were described by Wuest et al. (2016), but there is a need to orchestrate these flows (in effect, create feedback) between the actors and between the phases, allowing both tacit and explicit information to be integrated by the actors into a solution and then lead to value co-creation. The framework must also support productivity from the cocreation through joint experiences or working, and it must have the ability to provide information that is suitable for the actor receiving it, in a form that enables the recipient to use it to make decisions and to build their knowledge base. ...
... Where are the data and informa on bo lenecks? Fig. 4 Framework for supporting value co-creation within the context of digitallyenabled PSS (based on Grönroos, 2011;Terzi et al., 2010;Wuest et al., 2016) mainly from the middle of life phase within the perspective of PSS. The Framework in Fig. 4 identifies the important role of a so-called "moderator" or "resource integrator" to orchestrate the value co-creation and to govern the overall process. ...
... Grönroos (2011) stated that the value co-creation in the BOL phase is driven by the supplier, nevertheless the BOL in a PSS provides the platform for value-in-use in the MOL phase. Therefore, the integration of information that different actors have is important and in line with the backward and forward information flows identified by Wuest et al. (2016). ...
Chapter
This book chapter describes a conceptual framework that can support the identification of value co-creation within the context of digitally-enabled Product-Service Systems (PSS). The framework was developed based on the understanding of how and where value co-creation takes place along the first two phases of the product lifecycle. It does this by understanding how and where co-creation occurs, and it also considers the translation of data into information that can become knowledge within the context of the digitally-enabled PSS. The framework glues the different aspects together with an underlying orchestration and governance that focusses on supporting value co-creation based on the integration of information with data. The starting assumption is that the framework could be applied to existing PSS with their underlying value propositions and business models.
... Fig. 1 Average frequency (per paper) of "outcome" and "performance" terms by field 137 Fig. 2 The annual balance between the use of the terms "Outcome" and "Performance" A Conceptual Guideline to Support Servitization Strategy Through Individual Actions Fig. 1 Interrelationships between the external environment and wider ecosystem, strategy of the firm and the actions necessary to support changes and overcome barriers (Based on, , Kindström et al. [2013], and on Hou and Neely [2013]) 313 Fig. 2 A framework to create a roadmap to overcome service challenges 322 Terzi et al., 2010;Wuest et al., 2016) 408 Fig. 4 Framework for supporting value co-creation within the context of digitally-enabled PSS (based on Terzi et al., 2010;Wuest et al., 2016) 411 ...
... Fig. 1 Average frequency (per paper) of "outcome" and "performance" terms by field 137 Fig. 2 The annual balance between the use of the terms "Outcome" and "Performance" A Conceptual Guideline to Support Servitization Strategy Through Individual Actions Fig. 1 Interrelationships between the external environment and wider ecosystem, strategy of the firm and the actions necessary to support changes and overcome barriers (Based on, , Kindström et al. [2013], and on Hou and Neely [2013]) 313 Fig. 2 A framework to create a roadmap to overcome service challenges 322 Terzi et al., 2010;Wuest et al., 2016) 408 Fig. 4 Framework for supporting value co-creation within the context of digitally-enabled PSS (based on Terzi et al., 2010;Wuest et al., 2016) 411 ...
... Furthermore, Bertoni et al. (2013) use visualization of the value in a PSS in the design phase (BOL) as part of an attempt to link BOL and MOL value creation. Journey mapping can be used as a tool to identify value co-creation opportunities as it describes the journey Terzi et al., 2010;Wuest et al., 2016) of customers visually, clearly showing their touchpoints and actor interrelationships (Lemon & Verhoef, 2016). Journey mapping in combination with personas can provide a deeper understanding of value co-creation opportunities by identifying the actors involved along the lifecycle, along with the resource and knowledge that they may possess (West et al., 2020). ...
... A paradox is developing that hinders exploitation of digitally-enabled solutions in PSS [7]; it is due to the transformational aspects of digitally-enabled PSS and servitization [8]. A lifecycle perspective is useful when considering data and information flows and how they can assist value creation [9]. Value co-creation has been identified as a complex process in this context and requires further investigation. ...
... The framework applies feedback between the parties, supporting productivity, and adaptability from integration design, to aid build knowledge (within and between different life cycle phases). The integration of the actors at each phase of the lifecycle and between each phase, needs orchestration to ensure two-way flows [9]. Each touchpoint or transaction between actors provides an opportunity for value co-creation. ...
Chapter
This paper describes the development of a conceptual framework to support the identification of value co-creation within the context of digitally-enabled Product-Service Systems (PSS). The framework was developed based on five themes. It considers how and where value co-creation occurs and also the translation of data into information that can become knowledge for individuals and organizations within the digitally-enabled PSS context. The model brings together the different actors and beneficiaries with a governance process that focuses on supporting value co-creation by integrating the information with data. The framework supports new innovation and improvements to existing PSS.
... • Technological Issues, such as standards and interfaces, data analytics, data security, data quality, sensors and actuators (Wuest, Wellsandt, & Thoben, 2015;Zhang, Ren, Liu, & Si, 2017). ...
... However, this needs to be studied in detail and is not in the focus of this study. More information regarding the information flows between different phases themselves can be found in Wellsandt et al. 24,44 The demand-side (end user) openness requirements at the interface between BOL and MOL are expected to be high. Information access over lifecycle phase borders is essential for many applications. ...
Article
Industrial Internet platforms have the ability to access, manage and control product-related data, information and knowledge across all the lifecycle phases (beginning of life, middle of life and end of life). Traditional product lifecycle management/product data management software have many limitations when it comes to solving product lifecycle management challenges, like interoperability for instance. Industrial Internet platforms can provide real-time management of data and information along all the phases of a product’s lifecycle. Platform openness in combination with the above-mentioned industrial internet platform characteristics helps solve the product lifecycle management challenges. This article describes the product lifecycle management challenges in detail from the existing literature and presents solutions using industrial internet platform openness and related dimensions as well as sub-dimensions. A wide pool of platforms is narrowed down to specific platforms that can solve the documented product lifecycle management challenges and allow the manufacturing companies to collaborate as well as enhance their business. We also present in detail managerial implications toward long-term and sustainable selection of industrial internet platform.
Experiment Findings
Full-text available
Any business begins with value creation, as it is the purpose of an organisation. If the delivery of value via tangibles and intangibles is efficient enough, it will generate profit after cost. Too often in business we consider value in its monetary form; this is a reductive approach and misses the complexities, ambiguities and paradoxes associated with value co-creation and value co-delivery in our interconnected worlds. Lean thinking brings with it the concept of waste (i.e. reducing cost). This is the negative form of value. Lean also allows us to consider performance management, allowing us to use proxies for value creation in the form of metrics (both KPIs and APIs). With management metrics, we often focus on the value for “us” rather than considering the value for the “system”. In firms we also talk of “risk transfer” as a way to create value by transferring risk (again an intangible aspect of value) to other actors who are better able to manage it; this is in effect the use of real option contracts or call contracts (e.g., insurance). Service science, particularly Service-Dominant logic, provides a useful set of axioms and foundational premises that can help support value co-creation within digitally enabled ecosystems, particularly when digital can be used to orchestrate or enable value co-creation through new institutional arrangements. It is nevertheless not the only way to understand value and value co-creation in today's emerging digitally enabled ecosystems, and other disciplines, from engineering to social science, provide approaches that allow us to explore and measure value co-creation. We want to explore methodologies, managerial frameworks, approaches, case studies, and results from value creation efforts when moving towards digitally connected ecosystems. A mix of theoretical studies and applied cases would allow us to move forward in establishing research-based best practices for value co-creation.
In particular, we are looking for papers that consider value co-creation: methods, managerial frameworks, indicator systems, industrial case studies, and best practices.
Conference Paper
Artificial Intelligence has come into focus anew in the context of digitization and global competition. So has the tension between human ethics, regulation, and the potential gains this technology field offers for economic and societal progress. This paper is intended to contribute to the ongoing debate about opportunities and uncertainties, in particular with respect to the use of private data in AI. We discuss the status of AI outcomes in terms of their validity, and of AI input in terms of the quality of data. As a first-order approach, we suggest distinguishing between the commercial, public, industrial, and scientific data spheres of AI systems. We summarize the ethical and regulatory approaches to the utilization and protection of massive private data for AI. Regarding the currently favoured ways of organizing the collection and protection of data, we refer to the respective rulings and identify distributed ledger systems and open data spaces as functional means. We conclude by arguing that governing data privacy and quality by distinguishing different AI data spheres will enable a reasonable balance of these two aspects.
Article
Full-text available
Knowledge about the activities happening beyond the point of sale is valuable for product and product-service design. In the product design community, the importance of this knowledge has been accepted for several years, for instance through concepts like participatory design, as well as the living lab movement. An extensive involvement of users, in order to gain the desired knowledge, may prove time-consuming and thus too expensive. Therefore, it appears expedient to utilize the existing information that is generated beyond the point of sale as effectively as possible. In order to support research in this field, this paper provides an overview of the different types of currently existing product information originating from the so-called middle-of-life phase. The overview is based on application cases that belong to different research and innovation projects, as well as practical examples from internet-based services. It briefly covers, for instance, data from embedded information devices, maintenance information, and user-generated contents such as videos and product reviews. In the subsequent discussion, some characteristics of middle-of-life information are highlighted. These characteristics relate to the different appearances of information and concern, e.g., differences between measured and articulated information, as well as the relation between instance- and class-based product information.
Conference Paper
Full-text available
Recent approaches for Product Lifecycle Management (PLM) aim for the efficient utilization of the available product information. A reason for this is that the amount of information is growing, due to the increasing complexity of products and the concurrent, collaborative processes along the lifecycle. Additional information flows are continuously explored by industry and academia; a recent example is the backflow of information from the usage phase. The large amount of information that has to be handled by companies nowadays, and even more in the future, makes it important to separate “fitting” from “unfitting” information. A way to distinguish both is to explore the characteristics of the information, in order to find the information that is “fit for purpose” (information quality). Since the amount of information is so large, and the processes along the lifecycle are diverse in terms of their expectations about the information, the problem is similar to finding a needle in a haystack. This paper is one of two papers aiming to address this problem by giving examples of why information quality matters in PLM. It focuses on one particular lifecycle process, in this case product design. An existing approach, describing information quality by 15 dimensions, is applied to the selected design process.
Book
Full-text available
Pre-print available here: http://elib.suub.uni-bremen.de/edocs/00104199-1.pdf The book reports on a novel approach for holistically identifying the relevant state drivers of complex, multi-stage manufacturing systems. This approach is able to utilize complex, diverse and high-dimensional data sets, which often occur in manufacturing applications, and to integrate the important process intra- and interrelations. The approach has been evaluated using three scenarios from different manufacturing domains (aviation, chemical and semiconductor). The results, which are reported in detail in this book, confirmed that it is possible to incorporate implicit process intra- and interrelations on both a process and programme level by applying SVM-based feature ranking. In practice, this method can be used to identify the most important process parameters and state characteristics, the so-called state drivers, of a manufacturing system. Given the increasing availability of data and information, this selection support can be directly utilized in, e.g., quality monitoring and advanced process control. Importantly, the method is neither limited to specific products, manufacturing processes or systems, nor by specific quality concepts.
Conference Paper
Full-text available
Recognizing the substantial development of information quality research, this review article analyzes three major aspects of information quality research: information quality assessment, information quality management, and contextual information quality. Information quality assessment is analyzed in terms of three components: information quality problems, dimensions, and assessment methodology. Information quality management is analyzed from three perspectives: quality management, information management, and knowledge management. Following an overview of contextual information quality, this article analyzes information quality research in the context of information systems and decision making. The analysis reveals potential research streams and current limitations of information quality research. Aiming to bridge the research gaps, we conclude by providing research issues for future information quality research and implications for empirical applications.
Chapter
The improvement and assurance of information quality is being recognized in more and more companies as a highly important management task in its own right. IQ management is an elementary building block in system integration projects. But even in existing processes with heterogeneous data sources and information users, high information quality is the basic prerequisite for functioning business operations. A prerequisite for effective IQ management is the assessment of information quality [Lee et al. 2006, pp. 13 and 27]. In many companies, information quality is merely a perceived value. Most users express a certain distrust of the data, but without specifying the type and frequency of the errors. Not infrequently, costly projects are launched to improve information quality without first gaining an accurate picture of the actual problems through an analysis prior to an IQ measure. Only on the basis of a comprehensive assessment of information quality can the necessary resource decisions be made, goals be set, and the success of IQ management be evaluated.
Chapter
In this chapter, the philosophy behind Concurrent Engineering is presented, as well as current approaches used to implement it. Concurrent Engineering design encourages the simultaneous consideration of all aspects of a product's life-cycle at the design stage. It has been shown to be successful in reducing product development time and costs by avoiding the typical problems associated with sequential design. Companies competing in today's global and volatile marketplace cannot afford long development lead times or high costs. Success stories in Concurrent Engineering have primarily relied on the design team approach, a collaboration of people from different departments representing different life-cycle perspectives. However, the design team approach and other approaches suffer from an inability to manage (i.e., store, access, update, etc.) the immense amount of data and information required to perform Concurrent Engineering.
Article
With the advent of information and related emerging technologies, such as RFID, small-size sensors and sensor networks or, more generally, product embedded information devices (PEID), a new generation of products called smart or intelligent products is available in the market. Although various definitions of intelligent products have been proposed, we introduce a new definition of the notion of Intelligent Product inspired by what happens in nature with us as human beings and the way we develop intelligence and knowledge. We see an intelligent product as a product system which contains sensing, memory, data processing, reasoning and communication capabilities at four intelligence levels. These future generations of Intelligent Products will need new Product Data Technologies allowing the seamless interoperability of systems and the exchange not only of Static but also of Dynamic Product Data. Current standards for PDT cover only the lowest intelligence levels of today’s products. In this context, we try to shape the actual state and a possible future of the Product Data Technologies from a Closed-Loop Product Lifecycle Management (C-L PLM) perspective. Our approach is founded on recent findings of the FP6 IP 507100 project PROMISE and follow-up research work. Standards of the STEP family, covering the product lifecycle to a certain extent (PLCS), as well as MIMOSA and ISO 15926, are discussed together with more recent technologies for the management of ID and sensor data, such as EPCglobal, OGC-SWE, and relevant PROMISE propositions for standards. Finally, the first efforts towards ontology-based semantic standards for product lifecycle management and associated knowledge management and sharing are presented and discussed.
Article
Poor data quality (DQ) can have substantial social and economic impacts. Although firms are improving data quality with practical approaches and tools, their improvement efforts tend to focus narrowly on accuracy. We believe that data consumers have a much broader data quality conceptualization than IS professionals realize. The purpose of this paper is to develop a framework that captures the aspects of data quality that are important to data consumers. A two-stage survey and a two-phase sorting study were conducted to develop a hierarchical framework for organizing data quality dimensions. This framework captures dimensions of data quality that are important to data consumers. Intrinsic DQ denotes that data have quality in their own right. Contextual DQ highlights the requirement that data quality must be considered within the context of the task at hand. Representational DQ and accessibility DQ emphasize the importance of the role of systems. These findings are consistent with our understanding that high-quality data should be intrinsically good, contextually appropriate for the task, clearly represented, and accessible to the data consumer. Our framework has been used effectively in industry and government. Using this framework, IS managers were able to better understand and meet their data consumers' data quality needs. The salient feature of this research study is that quality attributes of data are collected from data consumers instead of being defined theoretically or based on researchers' experience. Although exploratory, this research provides a basis for future studies that measure data quality along the dimensions of this framework.
Article
The rise and fall of the dotcom-driven Internet economy shouldn't distract us from seeing that the business environment continues to change at a dramatically increasing pace. To thrive in this turbulent environment, we must confront the business need for relentless innovation and forge the future workforce culture. Agile software development approaches, such as extreme programming, Crystal methods, lean development, Scrum, adaptive software development (ASD) and others, view change from a perspective that mirrors today's turbulent business and technology environment.
Manutelligence project
Manutelligence Consortium
Manutelligence Consortium. Manutelligence project. Available: www.manutelligence.eu. Accessed: 29.05.2015.