Analyzing the Value of Product Lifecycle Management from Grounded
Theory & Models
Abram Walton1, Michael Grieves2, Darrel Sandall3, Matthew Breault4
1Director of Center for Lifecycle and Innovative Management, Florida Institute of Technology, USA,
2Executive Director of Center for Advanced Manufacturing and Innovative Design, Florida Institute of
Technology, USA, email@example.com
3College of Business, Florida Institute of Technology, USA, firstname.lastname@example.org
4Researcher, Florida Institute of Technology, USA, email@example.com
Abstract. Despite the understanding of the enormous economic value inherent in next-generation products, business enterprises repeatedly describe the value metrics and Return On Investment (ROI) of Product Lifecycle Management (PLM) as difficult to assess, confounding their ability to make decisions quickly and strategically. Because PLM initiatives are sizable, interdisciplinary investments, executives struggle to agree on value metrics to track throughout a project and thus to arrive at an accurate measure of the value created. Little known ROI information is therefore available to make the case for financing a PLM project; moreover, unique industry and organizational variables make it unfeasible for companies to easily benchmark competitors and predetermine the values to be gained. A literature review reveals a void of information on quantifiable value metrics and the ROI of PLM. This article proposes a framework that uses the Grounded Theory research methodology to capture value metrics and formulate models, contributing to future organizational decisions on PLM implementations.
Keywords: Product Lifecycle Management; PLM; ROI; Value; PLM 3.0; Grounded Theory; Internet of Things

1 Introduction
Product lifecycle management (PLM) is often cited as one of the most important strategic capital investments product-based companies can make, resulting in a boosted and sustained competitive advantage (e.g. decreasing product time-to-market, increasing reuse during new product development, and increasing the capacity for change control) when implemented holistically. Yet, due to the size of the investment and the interdisciplinary impact of PLM, executives struggle to make the case for financing a PLM project, as there is a shortage of known Return On Investment (ROI) or similar value metric information available. Unique industry and organizational variables make it unfeasible for companies to easily benchmark competitors and predetermine the values gained. Issues related to how firms measure performance are not isolated to technologically driven firms. Rather, nearly every organization faces increasing pressure to justify Information Technology (IT) decisions and to quantify how a new system will change human-computer interactions and ultimately affect the bottom line. An organization’s ability to innovate depends not only on the overarching system and IT infrastructure, but on how humans employ the system to maximize innovation. Advancements in PLM and their implications for an ever-evolving workforce require that organizations embrace and employ effective decision-making methodologies.
This article explains the challenges executive management faces in understanding the value of PLM. A literature review reveals the lack of value metrics available to companies attempting to determine the value created by their PLM investments. Reviewing previously published value metrics for IT systems creates a baseline for our research on the value of PLM. Future exhaustive research using Grounded Theory will allow for the creation of effective models and organic theories from data gathered on proposed value metrics of PLM. The specific value metrics captured and the components analyzed are set forth in this article.
2 Review of Literature
An extensive search for publications directly relating to the value metrics of PLM produced no results, exposing a clear gap in the field of knowledge. Metrics for understanding the value created by PLM have not been documented, and executives are struggling to gather information that supports their PLM investments and mitigates the risks of future investments. This research proposes to fill that void. Publications concerning the value metrics and models of IT systems can serve as a starting point for value metrics for PLM. These publications were chosen based on their relevance to the scope of this research and on citation counts reflecting their applicability in industry.
2.1 Value Metrics of IT Systems
In the Harvard Business Review article “Six IT Decisions Your IT People Shouldn’t Make,” Ross and Weill wrote that the most frequent complaints they heard from top executives concerned struggling to calculate payback on IT, realizing business value from high-priced technology, and justifying ongoing increases in IT spending. Their research reveals that companies that successfully manage IT investments achieve returns as much as 40% higher than those of their competitors. Given continuing advancements in IT systems and PLM solutions, this delta created by successful management will determine companies’ ability to survive in the future.
Weill and Olson  analyzed a diverse group of case studies relating to IT investments such as Supply
Chain Management (SCM), Customer Relationship Management (CRM), and Enterprise Resource Planning (ERP)
systems. They established connections between the IT investment objectives and the performance measures that
should be tracked per each investment. Three main connections were established: revenue growth rates should be a
performance measure for strategic IT investments, return on assets  should be a performance measure for
informational IT investments, and indirect labor should be a performance measure for transactional IT investments.
Nine years later, Weill and Broadbent expanded on this work to include a fourth IT investment area, infrastructure. They subsequently created a new model containing the four types of IT investments and their corresponding value-added areas, as shown in Figure 1: IT Investment Area Model.
Figure 1. IT Investment Area Model. Value added per investment area: infrastructure (business integration, business flexibility, reduced marginal cost of business units’ IT, reduced IT costs); transactional (cut costs, increased throughput); strategic (increased sales, competitive advantage, competitive necessity, market positioning); informational (increased control, better information, better integration, improved quality, faster cycle time).
During the IT system investment justification process, ROI is often a discussion point since it is a widely
used metric. Parker  details a method to calculate ROI for IT projects in which he places tangible benefits into
five major categories as shown in Figure 2: Tangible Benefits of IT Investment.
Figure 2. Tangible Benefits of IT Investment
Further outlined in Figure 3: Non-tangible Benefits of IT Investment are the non-tangible, yet highly important, benefits. Parker (2012) suggests that these should not be placed in ROI calculations due to the difficulty of quantifying them.
Figure 3. Non-tangible Benefits of IT Investment
Beyond the benefits factored into the ROI calculation, three other considerations need to be made: timeframe, consistency, and precision. Timeframe is the period over which the benefits are calculated; Parker recommends around five years for IT systems. Consistency refers to keeping assumptions such as inflation and taxation uniform across all IT system project calculations to maintain equal evaluations. Precision means describing all dollar values with a balance of certainty and accuracy and maintaining regularity across all IT investment decisions.
Wen, Yen, and Lin reviewed the most commonly accepted methodologies for measuring IT investment payoffs. The evaluation methods include ROI, Cost-Benefit Analysis (CBA), Return On Management (ROM), and Information Economics (IE). The tangible benefits cited above by Broadbent, Olson, Parker, and Weill can be defined as the profit or return and then divided by the required investment for the ROI calculation. Time-value functions can be applied to provide a deeper analytical framework: Net Present Value (NPV) and Discounted Cash Flow (DCF) are additional ROI methods that depend on an interest rate to perform the calculation. These methods are predominantly used for tangible, quantitative benefits; the intangible benefits are better suited to CBA, ROM, and IE.
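The relationship among these measures can be illustrated with a short sketch. The functions below are a minimal, hypothetical rendering of ROI, NPV, and ROM as just described; the cash-flow and cost figures are invented for illustration and are not drawn from the article.

```python
# Illustrative sketch of the payoff measures discussed above: ROI, NPV, ROM.
# All dollar figures are hypothetical.

def roi(total_benefit, investment):
    """Simple ROI: net return divided by the investment."""
    return (total_benefit - investment) / investment

def npv(cash_flows, rate):
    """Net Present Value of yearly cash flows at a given discount rate.
    cash_flows[0] is year 0 (typically the negative initial investment)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def rom(output_value, management_input):
    """Return On Management: Strassmann's simple output/input ratio."""
    return output_value / management_input

# A five-year benefit horizon, per Parker's recommendation for IT systems.
flows = [-500_000, 150_000, 175_000, 200_000, 200_000, 225_000]
print(round(roi(sum(flows[1:]), 500_000), 2))  # 0.9
print(round(npv(flows, 0.08), 2))
print(round(rom(1_200_000, 400_000), 2))       # 3.0
```

Note that NPV and DCF discount the same nominal benefits, so the choice of interest rate materially changes the computed payoff, which is one reason consistency across project calculations matters.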
The CBA approach addresses two main complications: quantifying the value of benefits that do not flow back to the investor, and identifying the market value of costs and benefits arising from intangible factors. CBA requires agreement on the measures of value for intangible benefits; if there is disagreement on the appropriate values, one of the following methods should be used. ROM uses a simple ratio of productivity, output/input. Strassmann defines the output as the delta between direct operating costs and the value added by direct labor. Simons and Dávila express it as productive organizational energy released divided by management time and attention invested. The advantage of the ROM method is that it can focus on the contributions of IT to the management process. The IE method is analogous to the CBA method with the addition of a technique for ranking and scoring the intangibles and risk factors associated with the IT investment. Such justifications have been made for systems such as ERP, CRM, and SCM; however, the cross-functional impact of PLM remains problematic for executives making the case for financing a PLM project (Walton et al., 2013).
2.2 PLM Then and Now
PLM is not just a better product data management (PDM) system, but rather, it becomes an innovation
enabler . By definition PLM is “an integrated, information-driven approach comprised of people,
processes/practices, and technology to all aspects of a product’s life and its environment, from its design through
manufacture, deployment and maintenance – culminating in the product’s removal from service and final disposal”
. The essence of PLM is to transform how enterprises handle product lifecycle phases and the associated data
with each phase, including engineering development, manufacturing, operations, support, disposal, and recycling [3,
12]. Grieves further subdivides the four main phases of the product lifecycle in Figure 4: Product Lifecycle Phases.
Figure 4. Product Lifecycle Phases
The version of product lifecycle management that is modelled around the four contemporary stages of
design, build, service, and dispose/recycle is not a new construct. Grieves  states “one of the intrinsic appeals of
PLM and a source of confidence in its effectiveness is that the concept has worked in the past” . PLM as a
standalone construct in the literature, and separate from PDM or earlier Lifecycle Management (LM) constructs,
only dates back to around 2004. It continues to gain awareness and garner adoption due to advances in information technology. The ability to create smart, connected devices allows companies to capitalize on and monetize previously undervalued data. Enterprises can now obtain in-use information on their products. This opens new value streams in the service and disposal phases and increases a company’s ability to employ ‘as-used’ data to enhance the next generation of its product lines. Figure 5: PLM Progressive Stage Model depicts PLM now in its third surge, driven by technological advances over time.
Figure 5. PLM Progressive Stage Model
The original PLM implementations were structured around design engineers as the end users. Reported
examples of this are shown across various industries using different PLM solution providers. Within the computer
manufacturing industry, Hewlett Packard implemented a solution that defined engineering change order workflows
to cut their revision and approval times in half . In the aerospace industry, General Electric’s Aviation division
implemented a PLM system that achieved an 82% reduction in the amount of data entered into the Engineering and
Manufacturing Bill Of Materials (EBOM and MBOM) by linking the inputs to automatically update each other
wherever changes were made. Another documented use case described Grand Soleil, a plastics manufacturer, achieving a 10% reduction in technical design time and a 25% reduction in time for design change control. After this first wave of PLM, companies utilizing PLM systems retained engineers as end users but were also able to move into the manufacturing stage of the lifecycle, evolving their efforts in virtual modeling, computer-aided manufacturing, and global collaboration.
Through advancements in three-dimensional printing and modeling, further strategic PLM implementations utilized Model Based Engineering (MBE) and rapid prototyping to integrate PLM into the create, build, and service phases of the product’s lifecycle. PLM 2.0 originated with Dassault during the release of its new V6 platform in 2008. The company’s Chief Executive Officer (CEO), Bernard Charlès, used the analogy of Web 2.0 versus the original Web to differentiate PLM 2.0 from the PLM concepts up until that time. The phenomena that pushed this new concept were the increased availability of three-dimensional modeling, ‘test driving’ virtual products, and collaborative global networks able to harness collective intelligence from online communities. Joy Global, a manufacturing and servicing company in the mining
machinery industry, began using a model based visualization program that increased the information to
manufacturing, training, and service personnel, resulting in a reduction in costs of training manuals, service
manuals, and simultaneously improving the overall user experience for both training and operations. This resulted
in tripling their revenues over a five-year period for their training and services department . American Axle &
Manufacturing (AAM) began using mechatronics during the development of their products. Mechatronics was a
fast emerging field of engineering that combined traditional mechanical and electrical design elements with the
integration of embedded controls. By capturing engineering’s design inputs into their PLM system while using mechatronics, AAM reduced by 80% the effort required to complete integration testing and rework on new products. In another instance, from the household products industry, Procter & Gamble used a platform that created a centralized corporate standards system, allowing it to qualify suppliers 50% faster. The magnitude of this accomplishment can be seen in the company’s globalized landscape of 1.2 million different
specifications and manufacturing occurring in over 80 countries. As Joy Global, AAM, and Procter & Gamble operated their PLM 2.0 systems, their ability to access and reuse information added to their competitive advantage within their respective industries.
[Figure 5 content: PLM (early 2000s): inclusion of PDM, workflows for engineering change control and design; PLM 2.0 (2008 to 2014): Model Based Engineering (MBE), Computer Aided Manufacturing; PLM 3.0 (2015): capturing a product’s in-use data, products with Internet of Things connectivity, the Digital Twin.]
The third and most disruptive resurgence of PLM is owed to Moore’s Law. IT capability has grown exponentially over the past forty years, creating new capabilities for products, and promises to continue as the economy moves into the era of the Internet of Things (IoT). The IoT is changing the value structure of products over their lifecycles by breaking normal industry barriers, making products smart and connected, and creating new PLM use cases.
3 The Importance of PLM 3.0
The world renowned strategist and author of the seminal work on the Five Forces of Competition, Michael
Porter, and the President of PTC Inc., James Heppelmann, jointly published an article in Harvard Business Review
titled “How Smart, Connected Products Are Transforming Competition” . Centered around the IoT, these smart,
connected products allow organizations to manufacture and service an altogether original and unique combination of
technology infrastructure and sensor-containing products that have the capability to monitor, control, optimize, and
autonomously operate in new and competitive ways . These smart, connected products of the new age
amalgamate to form the IoT commonly defined by Vermesan and Friess  as “a dynamic global network
infrastructure with self-configuring capabilities based on standard and interoperable communication protocols where
physical and virtual ‘things’ have identities, physical attributes, and virtual personalities and use intelligent
interfaces, and are seamlessly integrated into the information network”. With a projected potential economic impact of $11.1 trillion per year by 2025, the IoT revolution is receiving legitimate hype and has already drastically expanded prospective areas for value creation using data associated with a product’s lifecycle. By gaining in-use data on their products, companies are reshaping their business models and are able to follow a product through its entire lifecycle; the customary barriers between industries will be broken, and new industries, services, and roles will be created. These changes in technology are raising both the stakes and the opportunities of successfully implementing PLM.
Mauborgne and Kim  elaborated on the strategic descriptions of the “Blue Ocean Strategy” in Harvard
Business Review. Their description of the variances between the competitive red ocean markets versus the blue
ocean market is a pivotal paradigm shift on how industries are labelled and how companies will compete. Red
oceans are constrained and conditioned industries seen as limited to competition within the boundaries of their
industry and are at the mercy of economic circumstances. In contrast to this customary approach, the blue oceans
are where market constraints are no longer barriers and the industry lines are drawn by the actions and beliefs of the
competitors, which Mauborgne and Kim referred to as the Reconstructionist view. The keynote example of
reconstructing these competitive boundaries is a company acquired by Google, Nest, for $3.2 billion. Beyond the
typical functions of a thermostat, Nest is breaking barriers and creating connectivity into energy, home security,
household appliance, consumer electronic, home audio, and wearable technology industries and sectors. The evolution of products similar to Nest is expanding into broader systems, radically reshaping companies and competition. Data is at the foundation of what is reshaping the value chain. However, even with an understanding of the prospective and enormous economic value inherent in next-generation products, business enterprises repeatedly describe the ROI of PLM as difficult to assess [2, 12, 26], confounding their ability to quickly and strategically decide which platforms to pursue.
4 Why is PLM Value Hard to Calculate?
Research by Walton, Tomovic, and Grieves explains that “the benefits of PLM are difficult to assess because the same benefit can be expressed as a function of time, cost, quality, or a combination thereof”. PLM projects, like many
other cross-functional implementations with interwoven components and processes, require combined and
synchronous attention to all phases. Implementations that lack these combined foci stem from misguided
perceptions about the intricacies and interdependencies in foundational PLM modules, which creates negative
business outcomes in myriad areas involving people, practices, processes, and technology . Subsequently,
having expended millions of dollars for technological infrastructure implementation, education, and service, senior-
level executives are eager to see trustworthy and unbiased data supporting their investments in PLM. Furthermore,
without knowing the bottom-line impact of PLM on cost-savings and revenue-generation, executives are unable to
precisely estimate the level of risk associated with future PLM investments .
These imprecise estimates stem from the lack of baseline data, which creates the struggle of determining
ROI during, throughout, and after the project roll out. Additionally, the absence of baseline data and the inability to
calculate post implementation ROI contributes to the overall deficiencies of market data attempting to place a value
of PLM, which only serve to exacerbate the challenge of obtaining buy-in for a PLM implementation project from
the corporation’s most senior executives . Resultantly, numerous enterprises scramble to collect the data during
the implementation and after rollout because, over the lifecycle of the project, the priority is generally on issue and
risk mitigation rather than on gathering data to justify the platform-decision that has already been made. After the
rollout, teams resort to focusing on employing the full measure of the new platform and the temporary resources
assigned during the implementation are moved on to other projects ; thus, data collection continues to be an area
lacking, both internally to corporations and even more so in the industry at large. Although there are no widely cited failures of PLM implementations, there remains a dearth of data showing PLM’s value. This exacerbates the struggle companies face in quantifying PLM’s benefits, and thus the adage ‘you cannot manage what you do not measure’ encapsulates an executive board’s difficulty in justifying its decision to implement PLM. Relatedly, because there are no widely cited failures, there is also a lack of focus on optimization: implementations are repeatedly seen as successful, so sufficient urgency for implementation optimization has never been created.
In order to resolve these challenges, enormous amounts of resources have been expended by companies,
vendors, and academicians alike, with an aim of creating maturity models, benchmarking systems, and
implementation indexes [2, 32, 33]. Nonetheless, there are still myriad reports and complaints from industry
executives who express distress about the gap between published models for ROI and the models' inapplicability to
their companies [34-36]. Worse yet, perceptions regarding reported benefits from vendors or consultants are often
viewed with skepticism and bias [37-39], and executives argue that due to the enormous monetary costs of a PLM
implementation, a company’s key decisions cannot be left up to what could be viewed as biased information.
Despite the published benefits, the true value or ROI for any particular organization cannot be predetermined, which
hinders the decision-making and selection process. Moreover, since the benefits of PLM are often organizationally-
specific, they do not easily correlate from one enterprise to another due to copious organizational variables (e.g.
industry, company size, market segment, and business process maturity). This further hinders executives attempting
to triangulate the value. This research aims to bridge the gap between the benefits of PLM and the creation of value
and ROI for organizations implementing PLM.
5 Research Plan
As highlighted earlier, many quantitative research studies have been performed for the valuing of IT
systems (Parker, 2012; Weill & Broadbent, 1998; Wen et al., 1998). In contrast, our PLM value metric research will
follow a qualitative research methodology. Silverman explains that “the papers in qualitative journals do not
routinely begin with a hypothesis, the ‘cases’ studied are usually far fewer in number and the authors’ interpretation
is carried on throughout the writing” . Sociologists Glaser and Strauss created grounded theory when they
explicated the qualitative research strategies that they had used in their studies of how staff organized care of dying
patients in hospitals. Since 1967, the method has proved valuable across different industries; it is an inductive, iterative, interactive, and comparative method geared toward theory construction. This approach produces a theory representative of the data that has been gathered and systematically analyzed during the research process.
Our data collection will consist of publicly released information concerning product-producing companies within Fortune’s list of the 1,000 largest U.S. companies by 2014 revenues. This information is generally published by four entities: the company itself, the company’s PLM vendor, a PLM consulting firm with experience with the company, or conference presentations and proceedings documenting the company’s PLM information. Unavoidably, there are biases in the information released by these sources. This bias will remain within the research data collection pool; however, throughout the initial and focused coding periods we will filter out as much of it as possible.
The analysis for determining the value metrics of PLM will consist of coding data in two phases: initial coding and focused coding. Glaser states that initial coding, sometimes known as open coding, asks these questions of the data: What is actually happening in the data? What are these data a study of? What category does this case, segment, or statement of data indicate? These questions will be answered by reviewing the data word-by-word,
line-by-line, paragraph-by-paragraph, or incident-by-incident. Focused coding will be more exclusive and
conceptual than initial coding. Performing focused coding amalgamates and explains larger segments of data. This
will normally create categories for the grounded theory to be constructed upon. Charmaz suggests making the
following comparisons during focused coding: comparing different people, comparing data from the same
individuals at different points in time, comparing specific data with the criteria for the category, and comparing
categories in the analysis with other categories . Coding will create impressions for ideas, thoughts, or
connections, which then leads to theoretical sampling.
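As a purely hypothetical sketch of how line-by-line initial codes might be rolled up into focused categories, consider the following; the codes and the category map are invented for illustration and do not come from the study’s data.

```python
# Sketch: initial (open) codes are assigned line-by-line, then focused coding
# groups them under broader conceptual categories. All codes are invented.
from collections import Counter

initial_codes = [
    "reduced design time", "faster change approval", "reduced design time",
    "fewer BOM errors", "faster supplier qualification",
]

# Focused coding: map fine-grained codes to larger conceptual categories.
category_map = {
    "reduced design time": "time savings",
    "faster change approval": "time savings",
    "faster supplier qualification": "time savings",
    "fewer BOM errors": "quality improvement",
}

focused = Counter(category_map[c] for c in initial_codes)
print(focused)  # Counter({'time savings': 4, 'quality improvement': 1})
```

The resulting category counts are the kind of impression that, in grounded theory, prompts the comparisons Charmaz describes and points toward where theoretical sampling should go next.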
Glaser and Strauss define theoretical sampling as “the process of data collection for generating theory
whereby the analyst jointly collects, codes, and analyzes his data and decides what data to collect next and where to
find them” . This iterative process of gathering data, coding, and theoretical sampling will be performed until
there is a saturation of data and all new information stops yielding any new theories. Since grounded theories are
derived from data, they can be safe guides for the operation by the establishment of a deeper understanding and
insight. The research methodology for the discovery of value metrics for PLM will be the Grounded Theory approach as described by Silverman, Glaser and Strauss, Takhar-Lail, and Charmaz.
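The iterative gather-code-sample loop ending at saturation can be sketched as follows; the category sets are hypothetical stand-ins for successive coded batches of company data, not actual study results.

```python
# Sketch of the theoretical-sampling loop: data collection and coding continue
# until a new batch yields no new categories (saturation). Batches are invented.

def reached_saturation(batches):
    """Iterate over batches of coded categories; report whether some batch
    added nothing new to what had already been seen."""
    seen = set()
    for batch in batches:
        new = set(batch) - seen
        if not new:      # no new categories emerged: saturation reached
            return True
        seen |= new      # otherwise sample more data and keep coding
    return False

batches = [
    {"time savings", "cost reduction"},
    {"cost reduction", "quality improvement"},
    {"time savings", "quality improvement"},  # nothing new here
]
print(reached_saturation(batches))  # True
```

In practice saturation is a researcher’s judgment about theoretical categories, not a mechanical set comparison; the loop only illustrates the stopping criterion described above.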
The goal of this research is to bridge the gap between the various value metric frameworks proposed in a
variety of industry and academically generated literature and the current struggles that industry executives face when
making strategic investments for PLM implementations. It provides a novel process to view the value of PLM and
takes the research to a degree of understanding, in which, meaning can be given to the collected raw data. The
research will gather benefits and costs of PLM implementations from numerous industries and markets. Due to the
disparate nature of all the benefits and costs from different companies, after being collected, the data will undergo
coding and analysis based on specific organizational variables to determine the value of each instance. These values
will be put into a framework to help companies not only in their investment justification processes, but also in their
post-project value realization process. The dimensionality of the analysis may exceed typical geometric representation (e.g. Figure 6: Example of PLM Value / Impact Model) due to the sheer number of variables.
Figure 6. Example of PLM Value / Impact Model (Implementation Maturity Level vs. Industry vs. Use Case)
The analysis will generate multidimensional models (i.e. possibly utilizing more than three dimensions) analyzing factors such as industry, company size, PLM use case, solution provider, business case, and value of
each instance. This research stands to provide companies a tool and a valuable road map to determine the value of
their PLM strategies. Furthermore, by having the ability to extrapolate data within the model, companies will be
able to better pinpoint PLM use-cases that are optimized for their industry and company size, allowing them to make
well-informed decisions on their investment directions.
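A minimal sketch of how collected value instances might be grouped by organizational variables is shown below; the records, variable names, and values are hypothetical placeholders for the data the proposed research would gather.

```python
# Hypothetical sketch of the multidimensional analysis: group collected PLM
# value instances by organizational variables and average the reported value.
from collections import defaultdict

records = [
    {"industry": "aerospace", "size": "large", "use_case": "change control", "value": 0.82},
    {"industry": "plastics",  "size": "mid",   "use_case": "design reuse",   "value": 0.10},
    {"industry": "aerospace", "size": "large", "use_case": "change control", "value": 0.50},
]

groups = defaultdict(list)
for r in records:
    groups[(r["industry"], r["size"], r["use_case"])].append(r["value"])

averages = {key: sum(vals) / len(vals) for key, vals in groups.items()}
print(round(averages[("aerospace", "large", "change control")], 2))  # 0.66
```

A company could then look up the cell matching its own industry, size, and intended use case to triangulate an expected value, which is the extrapolation the framework is meant to support.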
The data collected for ongoing and future research outlined in this paper are subject to natural biases arising from the sources of the information. It is in the publishing bodies’ best interest to represent benefits and accomplishments as triumphantly as possible without misrepresenting or being deceitful. The rhetoric and the
reality of the information gathered will be coded carefully in an effort to eliminate bias; however, we cannot know with certainty the extent of the biases represented.
The investment size and interdisciplinary impact of PLM create a need for a higher probability of success for executives making the case for financing a PLM project. PLM implementations on the whole suffer from a shortage of known ROI information, and academic research to date has not addressed the challenges that industry executives are facing. The vast collection of PLM instances gathered by the proposed future research will allow for the extrapolation of unique industry and organizational variables, making it feasible for companies to benchmark competitors and predetermine the values to be gained.
1. Grieves, M., Product Lifecycle Management: Driving the Next Generation of Lean
Thinking. 2006: McGraw Hill Professional.
2. Walton, A.L.J., C.L. Tomovic, and M.W. Grieves, Product Lifecycle Management:
Measuring what is important - product lifecycle implementation maturity model, in
Product Lifecycle Management for Society 10th IFIP WG 5.1 International Conference.
2013: Nantes, France.
3. Tomovic, C., et al., Measuring the impact of product lifecycle management: Process
plan, waste reduction and innovations conceptual frameworks, and logic model for
developing metrics, in Product Realization. 2009, Springer. p. 1-14.
4. Ross, J. and P. Weill, Six IT Decisions Your IT People Shouldn't Make. Harvard Business
Review, 2002(November): p. 84-91.
5. Weill, P. and M.H. Olson, Managing Investment in Information Technology: Mini Case
Examples and Implications. MIS Quarterly, 1989(March 1989): p. 3-17.
6. Weill, P. and M. Broadbent, Leveraging the New Infrastructure: How market leaders
capitalize on IT. Harvard Business Press, 1998.
7. Parker, J. Calculating ROI on Information Technology Projects. 2012 [cited 2015
8. Wen, H.J., D.D. Yen, and B. Lin, Methods for measuring information investment payoff.
Human Systems Management, 1998. 17(2): p. 145-163.
9. Strassmann, P.A., Information Payoff: The transformation of work in the electronic age.
1985: Strassmann Inc.
10. Simons, R. and A. Dávila, How High Is Your Return on Management? (cover story). Harvard Business Review, 1998. 76(1): p. 70-80.
11. Grieves, M., Virtually Perfect : Driving Innovative and Lean Products Through Product
Lifecycle Management. 2011, Cocoa Beach, FL: Space Coast Press.
12. CIMdata, Tata Technologies PLM Analytics Maturity Assessment Program: Measuring
Organizations' Readiness to Adopt PLM. 2014, CIMdata Inc.
13. Grieves, M., Back to the Future: Product Lifecycle Management and the Virtualization of
Product Information. 2009, Springer Science+Business Media, LLC. p. 44.
14. Foy-Babbage, K. Agile PLM Customer Success by Value Delivered. in PLM World. 2008.
15. IBM, PLM Challenges and Benefits. 2004, Emea Marketing and Publishing Services
(EMAPS): Normandy House, Basingstoke, United Kingdom. p. 16.
16. IBM, V6 brings PLM 2.0 to Life. 2009: IBM.com/software/plm. p. 16.
17. SAP. Mining The Data for Riches. 2014 [cited 2015 9/21]; Available from:
18. Moradshahi, P. Model-Based Development: Realizing Fully Integrated Algorithm &
Software Development for Production Automotive Electronic Control Units. in
Mathwork's Automotive Conference. 2008. American Axle & Manufacturing.
19. Hearne, B., Procter & Gamble: Reinventing innovation processes with Dassault
Systèmes. 2011, Dassault: 3ds.com.
20. Seel, N.M., Encyclopedia of the Sciences of Learning. 2012: Springer Science &
Business Media.
21. Porter, M. and J. Heppelmann, How Smart, Connected Products Are Transforming
Competition. Harvard Business Review, 2014(November).
22. Vermesan, O. and P. Friess, Internet of Things-From Research and Innovation to Market
Deployment. 2014: River Publishers.
23. McKinsey & Company, The Internet of Things: Mapping the Value Beyond the Hype. 2015,
McKinsey Global Institute. p. 23.
24. Porter, M. and J. Heppelmann, How Smart, Connected Products Are Transforming
Companies. Harvard Business Review, 2015(October).
25. Mauborgne, R. and C. Kim, Blue Ocean Strategy. 2005, Boston: Harvard Business
School Press.
26. Grealou, L., PLM Maturity and Capability Maturity Model Integration (CMMI), in Tata
Technologies. 2014: http://www.tatatechnologies.com/plm-maturity-and-cmmi/.
27. Marien, E.J., Meeting the product lifecycle challenge. 2006. p. 50+.
28. Tomovic, C., et al., Development of Product Lifecycle Management metrics: measuring
the impact of PLM. International Journal of Manufacturing Technology and Management,
2010. 19(3/4): p. 167-179.
29. Voskuil, J. Getting Buy-in-ROI. in PI Congress. 2013. Chicago: PI Congress.
30. Graeb, R., How to Measure the Value of PLM Solutions. 2013: http://blogs.ptc.com/.
31. Deming, W.E., Out of the crisis, Massachusetts Institute of Technology. Center for
advanced engineering study, Cambridge, MA, 1986. 510.
32. Batenburg, R., R. Helms, and J. Versendaal, PLM roadmap: stepwise PLM
implementation based on the concepts of maturity and alignment. International Journal of
Product Lifecycle Management, 2006. 1(4): p. 333-351.
33. Savino, M.M., A. Mazza, and Y. Ouzrout, PLM maturity model: a multi-criteria
assessment in southern Italy companies. International Journal of Operations and
Quantitative Management, 2012. 18(3): p. 159-180.
34. Sevenler, K. PLM Roadmap: Xerox Development and Deployment. in PI Congress. 2013.
35. Davin, J. Deckers Brands' Business Transformation Journey. in PI Congress. 2013.
36. Stark, J., Product Lifecycle Management–Volume 1: 21st Century Paradigm for Product
Realisation. 2015, Springer, Cham.
37. Aras, Engineering Definition and Supply Chain BOM Management for Complex
Configurations: GE Aviation, G. Aviation, Editor. 2013: Aras.com.
38. Oracle, Oracle Agile PLM for the Medical Device Industry, in Oracle Executive Brief.
39. Siemens, Shanghai Yanfeng Johnson Controls Seating: Reaping the rewards of
worldwide concurrent design. 2012, Siemens: Siemens.com/teamcenter.
40. Silverman, D., Interpreting Qualitative Data. 2015: Sage.
41. Glaser, B. and A.L. Strauss, The Discovery of Grounded Theory. 1967, New York,
NY: Aldine De Gruyter.
42. Charmaz, K., Constructing grounded theory: A practical guide through qualitative
analysis (Introducing Qualitative Methods Series). 2006: Sage.
43. Charmaz, K. and J. Smith, Grounded theory. Qualitative psychology: A practical guide to
research methods, 2003: p. 81-110.
44. Takhar-Lail, A., Market Research Methodologies: Multi-Method and Qualitative
Approaches: Multi-Method and Qualitative Approaches. 2014: IGI Global.
45. Gorbach, I., A. Berger, and E. Melomed, Describing Multidimensional Space, in
Microsoft SQL Server 2008 Analysis Services Unleashed. 2008, Sams.