Procedia CIRP 00 (2016) 000–000
2212-8271 © 2016 The Authors. Published by Elsevier B.V.
Peer-review under responsibility of the scientific committee of the 49th CIRP Conference on Manufacturing Systems (CIRP-CMS 2016).
49th CIRP Conference on Manufacturing Systems (CIRP-CMS 2016)
A telemetry-driven approach to simulate data-intensive manufacturing processes
Gianfranco E. Modoni a,*, Marco Sacco b, Walter Terkaj b
a ITIA-CNR, Institute of Industrial Technologies and Automation, National Research Council, Bari, Italy
b ITIA-CNR, Institute of Industrial Technologies and Automation, National Research Council, Milano, Italy
* Corresponding author. Tel.: +39-080-5481265; fax: +39-080-5482533. E-mail address:
Abstract

Telemetry enables the collection of data from remote points to support monitoring, analysis, and visualization. It is widely adopted in Formula One car racing, where streams of live data collected from hundreds of sensors installed on car components are transmitted to the pitwall and used as input for real-time car performance simulations. The aim of this paper is to evaluate the potential of a telemetry-driven approach in a manufacturing environment, where researchers are still looking for efficient methods to perform valuable simulations of production processes on the basis of real data coming from the factory. Telemetry could contribute to the implementation of a virtual image of the real factory, which in turn could be used to simulate factory performance, making it possible to predict failures, investigate problems, and reduce costly downtime. In particular, this study addresses the effort to combine and adapt methods and techniques borrowed from the field of Formula One car racing. Moreover, the investigation of the exploitation possibilities of factory telemetry is paired with the design of a software application supporting this technology, starting from the elicitation and specification of its functional requirements.
Keywords: Factory Telemetry; Factory Image; Cyber-Physical Systems; Simulation
1. Introduction
One of the major issues affecting manufacturing companies today is the lack of full bidirectional synchronization between the physical world at the shop floor and its digital counterpart (the so-called Factory Image) [1] [2]. When the latter is constantly connected to the production system, it represents a true reflection of the real factory that can be used to monitor and simulate factory performance, making it possible to adjust and optimize processes, anticipate failures, and investigate problems, thus increasing efficiency and reducing costly downtime. The current state of the art shows that some potential solutions are available in the literature for transferring digitally rendered designs to the shop floor in order to build the right product [3]. However, to the best of our knowledge, none of the proposed solutions implements the reverse information flow from the real to the virtual world.
In order to close this loop and provide the full range of capabilities that synchronization between the real and digital factory has to offer, a wide variety of technologies is needed. A key enabler is Cyber-Physical System (CPS) technology [4], a network of interacting collaborative elements in constant connection with the surrounding physical world and its on-going processes. The use of these smart devices within the manufacturing execution phase makes it possible to generate a factory telemetry in the form of a large amount of data-intensive, multi-source streams, which can then be routed in real time towards the various enabled devices connected to the company network. The major obstacle in the implementation of such a telemetry approach is the difficulty of handling the enormous amount of data coming from the real plant in order to derive aggregate values suitable for analysis [5].
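The derivation of aggregate values from a raw sensor stream can be sketched as follows; this is a minimal illustration, not part of the original work, and all names (`WindowAggregator`, the sample readings) are hypothetical:

```python
from collections import deque
from statistics import mean

class WindowAggregator:
    """Condense a raw sensor stream into fixed-size window averages,
    reducing the volume of data routed towards analysis tools."""

    def __init__(self, window_size):
        self.window_size = window_size
        self.buffer = deque()       # samples of the current window
        self.aggregates = []        # one average per completed window

    def push(self, value):
        self.buffer.append(value)
        if len(self.buffer) == self.window_size:
            self.aggregates.append(mean(self.buffer))
            self.buffer.clear()

# Hypothetical usage: eight raw readings collapse into two aggregates.
agg = WindowAggregator(window_size=4)
for reading in [10, 12, 11, 13, 50, 52, 51, 49]:
    agg.push(reading)
print(agg.aggregates)  # [11.5, 50.5]
```

In a real deployment the window would typically be time-based rather than count-based, but the principle of aggregating close to the source is the same.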
This paper aims to identify new strategies to capture, compare, and view disparate sets of data coming from the various elements of a CPS in order to extract relevant knowledge and provide better insights into the status of the resources at the shop floor. In this regard, a new vision for processing the factory telemetry is proposed, combining and adapting methods and techniques borrowed from the field of Formula One (F1) car racing, where telemetry is widely adopted to collect data from remote points in order to support real-time car performance simulations. The idea behind this study is that the envisioned approach makes it possible to generate an integrated and aggregated view of the factory telemetry that dynamically augments and enhances the data-driven simulation applications supporting the manufacturing execution phase. Finally, the investigation of the exploitation possibilities of the factory telemetry is paired with the design of a software application supporting this technology, starting from the elicitation and specification of its functional requirements.

The remainder of this paper is structured as follows. Section 2 illustrates the analogies in the use of telemetry between the worlds of racing cars and factories. Section 3 introduces an overview of a software application supporting factory telemetry in industrial scenarios and illustrates the major challenges in realizing it. Finally, Section 4 draws the conclusions, summarizing the main outcomes.
2. Towards a factory telemetry: similarities with the
racing car
Analogies can be a valid way of analyzing the performance of industrial processes in order to identify potential improvements. A significant example is represented by the several parallels between biological and manufacturing systems that have been drawn in the literature to solve a series of problems of modern manufacturing through the study of the structure, control mechanisms, and functions of biological systems [6] [7] [8].

The idea behind this study is that various analogies can also be observed between the worlds of F1 racing cars and modern factories. In fact, like an F1 car, a manufacturing environment comprises a set of processes to be monitored in near real time, huge information flows (and corresponding software applications) on which critical decisions must be taken in limited time, and a team of people tasked with developing, maintaining, measuring, and adjusting the system under changing conditions [9]. On the basis of these analogies (Fig. 1), it is interesting to experiment with transferring methods and tools from one field to the other. In this regard, this section focuses on a set of relevant features of telemetry, a proven F1 technology in which important measurements are made on board the cars for data recording and monitoring (Fig. 2). These features are then analyzed in order to investigate the potential of factory telemetry in the manufacturing world. Moreover, it must be emphasized that F1 represents a relevant reference case, since it is always at the cutting edge of technological development.
Fig. 1. Similarities between the worlds of F1 racing cars and modern factories
Fig. 2. Telemetry of an F1 car showing speed, gear, and other channels (SOURCE: Caterham F1 Team/Renault Sport F1)
2.1. Accurate monitoring of the assets for critical decisions
Telemetry is a proven F1 technology through which a deluge of data is transmitted from the car to the pitwall, allowing a team of engineers to accurately and constantly monitor several parameters of car systems such as the suspension, engine, transmission, and wheels [10]. In this way the engineers can watch over the racing car's performance and optimize the vehicle setup, suggesting that drivers change one of these parameters. Moreover, they can use the telemetry to analyze tactics and strategies, investigating in which corners the car could go faster.
The accurate monitoring supported by a similar factory telemetry would be relevant for any manufacturing company where data provides the basis for critical decision making. In particular, there are two major areas of associated benefit: the management of the allocated resources, and the continuous improvement between design, development, and manufacture of the products (enabling a kind of loop between the three stages). Along the whole factory life cycle, the sensors connected to the real factory components can provide detailed information about the performance of the various processes, ensuring better visibility and control of the used resources and more reliable forecasting. Moreover, a proper integration of the data coming from telemetry and from enterprise systems such as MES or ERP could help operations managers analyze the dynamics of the manufacturing processes and identify potential improvement actions (e.g. reconfigurations of the input parameters, changes in the management of maintenance activities, etc.). In this way it is also possible to identify and address any bottleneck and ensure a smart utilization of expensive machines that maximizes throughput. Finally, the analysis of the gathered data also makes it possible to check whether the product "as built" is compliant with the specifications and requirements of the designers, helping the company adjust and optimize processes between the design and production stages. Specifically, by merging the designer's specifications about how the product is to be manufactured with the information about how the product is actually being manufactured, it is possible to build an instantaneous perspective on how well the manufactured product is meeting its design specification goals.
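The "as built" versus "as designed" comparison described above can be sketched as a simple tolerance check; this is an illustrative example only, with hypothetical feature names and tolerances:

```python
def check_compliance(measurements, specs):
    """Compare 'as built' measurements against designer tolerances.
    `specs` maps each feature to a (nominal, tolerance) pair; the
    result says whether each measured feature is within tolerance."""
    report = {}
    for feature, value in measurements.items():
        nominal, tol = specs[feature]
        report[feature] = abs(value - nominal) <= tol
    return report

# Hypothetical design specification and telemetry-derived measurements.
specs = {"bore_mm": (25.0, 0.05), "depth_mm": (40.0, 0.10)}
measured = {"bore_mm": 25.03, "depth_mm": 40.20}
print(check_compliance(measured, specs))
# {'bore_mm': True, 'depth_mm': False}
```

A non-compliant feature would then feed back into the design-production loop, triggering a process adjustment.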
2.2. Feedback from virtual to real to apply corrective actions
F1 two-way telemetry is a bidirectional data flow that allows engineers to make real-time adjustments remotely on the car, even while the latter is running on the track. In this way it is possible to align the setup of the car with the needs of the driver, also taking external conditions into account. Since the 2003 season, two-way telemetry has been banned by the FIA (Fédération Internationale de l'Automobile), with the exception of the system for the activation of the DRS (Drag Reduction System), which allows the driver to adjust the rear wing in order to reduce drag and increase top speed. This system is automatically enabled only in certain circumstances, on the basis of data coming from the car's telemetry [11].
Similarly, within the factory, a two-way telemetry would allow project managers and designers to accurately monitor the progress of manufacturing processes in real time, enabling them at the same time to detect problems early (e.g. breakdowns) and apply corrective decisions based on the information they receive and analyze. Once these decisions are final, they would be applied to the real factory, thus implementing the closed loop between the virtual and real factory [2].
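A minimal sketch of such a corrective decision step, with hypothetical channel names and operating bands, could look like the following: each telemetry channel that leaves its allowed band yields a signed correction toward the center of the band.

```python
def corrective_action(telemetry, limits):
    """Derive corrective decisions from live telemetry.
    `limits` maps each channel to its allowed (low, high) band; the
    result maps out-of-band channels to the signed adjustment that
    would bring them back to the band center."""
    actions = {}
    for channel, value in telemetry.items():
        low, high = limits[channel]
        if not (low <= value <= high):
            target = (low + high) / 2
            actions[channel] = target - value  # correction to apply
    return actions

# Hypothetical operating limits and a live telemetry sample.
limits = {"spindle_rpm": (1800, 2200), "coolant_c": (15, 35)}
live = {"spindle_rpm": 2500, "coolant_c": 25}
print(corrective_action(live, limits))
# {'spindle_rpm': -500.0}
```

In practice the corrections would of course pass through validation (and possibly a human decision) before being applied to the real factory, as the closed loop described above requires.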
2.3. Integration with Advanced Simulation and Forecasting
Telemetry not only allows F1 teams to collect and monitor information in real time but also to use it to properly simulate the car in order to maximize its performance. These simulation models have become so advanced that the potential lap time of the car can be calculated, and this time is what the driver is expected to meet. Moreover, between one race and the next, F1 teams perform a series of analyses through which they are able to build predictive models of how the car will perform with different setups and on different tracks under changing ambient conditions, on the basis of the collected historical data [12].
If a telemetry-based simulation is used on the factory floor next to the machines that it models, it can give operators a digital representation that looks and acts exactly like the machine itself. In this way it offers the capability to execute operations in a simulation environment where the various product components can be inserted and tested in different configurations across the entire production chain. Under these conditions, operators can optimize and validate new processes on a state-of-the-art machine without taking the latter out of production. In order to realize this approach, the telemetry data should be fully integrated with discrete or continuous simulators, which make it possible to model the complex dynamics of a manufacturing system. The latter can refer to the processes of a single cell, a production line, an entire factory, or several companies interconnected with warehouses through a network. Another key success factor of the approach is the capability to initialize the simulation models with a snapshot of the real system [5].
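Initializing a discrete-event model from a snapshot of the real system can be illustrated with a deliberately minimal sketch (all machine names, states, and cycle times are hypothetical): each machine starts from its telemetry-reported remaining processing time rather than from an idealized cold start.

```python
import heapq

def simulate(snapshot, horizon):
    """Minimal discrete-event sketch: start from a snapshot of machine
    states (remaining processing time per machine) and count part
    completions within the horizon, assuming each machine restarts
    immediately on a fixed cycle time after finishing."""
    events = []  # priority queue of (completion_time, machine)
    for machine, state in snapshot.items():
        heapq.heappush(events, (state["remaining"], machine))
    completed = 0
    while events:
        t, machine = heapq.heappop(events)
        if t > horizon:
            break
        completed += 1
        heapq.heappush(events, (t + snapshot[machine]["cycle"], machine))
    return completed

# Hypothetical snapshot taken from live telemetry.
snapshot = {"mill":  {"remaining": 2.0, "cycle": 5.0},
            "lathe": {"remaining": 4.0, "cycle": 6.0}}
print(simulate(snapshot, horizon=12.0))  # 5 completions
```

A production-grade simulator would model queues, failures, and material flow, but the snapshot-based initialization shown here is the key success factor the text refers to.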
2.4. Digital continuity of telemetry historical data
The simulation-based analysis of F1 car performance mentioned in the previous subsection can be effectively exploited only if the digital continuity of the telemetry historical data is guaranteed. Indeed, it must be ensured that data can be played back and passed as input to the simulation tools in order to perform forecasts against which the behavior of the actual real-time systems can be compared. Digital continuity is also important from a reliability point of view, since statistics based on historical data make sure that installed components do not exceed their recommended lifetime ranges [12]. Finally, digital continuity plays an essential role in case of an accident, since the FIA can determine driver errors as a possible cause on the basis of the driver inputs that have been recorded.
Similarly, digital continuity of the historical data of the factory telemetry makes it possible to create numerous simulated data streams that are semantically interoperable with real operational data. Such data emulation offers a realistic environment to train personnel, where for example control room operators can directly interact with the system and receive real feedback [13]. Specific analytics have to be performed on the gathered information to extract better insight into the progress and status of each single machine. These analytics can provide comparisons between machine performances. Moreover, historical information can be analyzed to predict the future behavior of the allocated resources.

In order to guarantee the digital continuity of historical data, Terkaj et al. [14] proposed to use a history model of factory objects. In this way, historical data can be collected and stored in a distributed way, while keeping an overall coherence thanks to a common virtual factory model.
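The playback of historical telemetry as a simulated stream can be sketched as follows; this is an illustrative example (the record layout and channel names are hypothetical), in which recorded samples are re-emitted with their original inter-arrival timing, optionally accelerated:

```python
def replay(history, speedup=1.0):
    """Turn recorded telemetry records (timestamp, channel, value) into
    a simulated stream: yields each sample together with the delay a
    consumer should wait before processing it, so historical data can
    feed the same consumers as live data."""
    previous = None
    for timestamp, channel, value in sorted(history):
        delay = 0.0 if previous is None else (timestamp - previous) / speedup
        previous = timestamp
        yield delay, channel, value

# Hypothetical recorded data, replayed at twice the original speed.
history = [(0.0, "temp", 21.0), (2.0, "temp", 22.5), (5.0, "temp", 23.1)]
stream = list(replay(history, speedup=2.0))
print(stream)
# [(0.0, 'temp', 21.0), (1.0, 'temp', 22.5), (1.5, 'temp', 23.1)]
```

Because the replayed stream has the same shape as live data, it can feed the training scenarios and comparative forecasts described above without any change on the consumer side.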
2.5. In situ simulations
The seamless integration of simulation tools with the real environment of the factory paves the way to in situ simulation approaches, which take place in the working environment and involve those who work there. In situ simulation is distinct from center-based simulation, which is performed in a context separated from the work environment [14].
A similar philosophy can be found in F1 behind the driving simulator, a car cockpit that gives drivers a true feel of the real environment and direct feedback on their actions. The driving simulator replicates real race track conditions and is used to test different aspects that affect the performance of the car, such as wing and brake settings. The high fidelity of the simulator allows the driver to feel the difference that modifications applied to the car setup can produce, without the high acceleration of a real test drive. As the new FIA regulations limit the number of test days on the track and also wind tunnel time in order to reduce costs and level the playing field, the driving simulator plays a key role in driver training, saving both time and money while respecting the new regulations. Moreover, the driving simulator can be used to test future car designs and to train new drivers on different circuits.
3. Factory Dynamic Simulator: a dedicated software application for telemetry analysis
A typical infrastructure supporting the telemetry data flow from the real to the virtual factory should include three main components (Fig. 3). The first is an embedded controller unit enabling sensor data collection and logging; it corresponds to the first level of the 5C Architecture for CPS introduced by Lee et al. in [15]. The second component is the communication module, which converts the dynamic data read from real components into a real-time transmission stream. The third component is a software application that receives, interprets, persists, integrates, and analyzes the collected data. This section focuses in particular on the requirements elicitation for the third component. During this activity, a valid starting point is the evaluation of existing F1 telemetry systems, such as Atlas [16], the de facto standard system, or Wintax [17].
The following list highlights the major features that a
software application supporting factory telemetry should
provide: capability to maintain the links between factory
configurations/layouts and telemetry data; simulation of the
effects of different input parameter values on a given factory
process; and direct comparison of simulated results with real
telemetry data or with other simulations.
Data visualization is an essential task of the envisioned software tool. XY charts, waveform and scatter plots, statistics, and animations make it possible to display the telemetry data acquired from the sensors connected to the real factory under different views. In this way, a particular aspect of the factory can be studied accurately. Among the most significant graphical features, the new environment should include functionalities to filter and select a part of the collected data stream in order to provide it as input to a new simulation. Using a multiscale model as reference, the envisioned software application should also include capabilities to zoom in and out of the selected data in order to drill down into specific data subsets. Moreover, the Graphical User Interface should provide facilities to change the factory setup, which comprises the different input parameter values for the proper configuration of the factory processes; the setup can be stored in a database in order to be used as input for a subsequent simulation. Each newly created setup should be compatible with the previously saved setups, thus guaranteeing their digital continuity, as discussed in the previous section.
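The filter-and-select requirement can be sketched with a minimal example (the stream layout and channel names are hypothetical): a slice of the collected stream, restricted to one channel and one time window, becomes the input of a new simulation run.

```python
def select_slice(stream, channel, t_start, t_end):
    """Filter the collected telemetry stream, made of
    (timestamp, channel, value) records, down to one channel and one
    time window, so the selection can feed a new simulation run."""
    return [(t, v) for t, c, v in stream
            if c == channel and t_start <= t <= t_end]

# Hypothetical collected stream mixing two channels.
stream = [(0.0, "rpm", 1500), (1.0, "temp", 20.0),
          (2.0, "rpm", 1550), (3.0, "rpm", 1600)]
print(select_slice(stream, "rpm", 1.0, 3.0))
# [(2.0, 1550), (3.0, 1600)]
```

Zooming in, in the sense used above, then amounts to re-invoking the same selection with a narrower window over the already selected subset.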
A typical issue of data coming from sensors is noise. As a smooth curve is better for analyzing factory performance, removing high-frequency noise and spikes is a necessary feature of the envisioned software application. In this regard it is essential to use various noise removal techniques, such as filtering and smoothing [18]. Also, end-users should have the possibility to introduce a sensor offset/gain or implement a sensor correction. By combining the digital versions of the telemetry signals with mathematical, logical, filtering, and statistical functions, also through the integration with external commercial tools such as Excel, Matlab, and Simulink, it is possible to create so-called virtual channels, which represent a method to abstract and remap the original telemetry channels (for example to create alarms). A proper API (Application Programming Interface) should guarantee access to the telemetry data, enabling data analysis in external tools (e.g. Matlab). Finally, a multicast transmission of the data over the factory network would allow the software application to receive the telemetry regardless of the PC on which it runs, as long as the latter is connected to the network.
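The combination of gain/offset correction, smoothing, and alarm generation into a virtual channel can be illustrated with a minimal sketch; the function names, the moving-average filter choice, and the alarm threshold are all hypothetical:

```python
def moving_average(signal, width=3):
    """Simple centered moving average to suppress high-frequency
    noise and spikes (window shrinks at the edges)."""
    half = width // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

def virtual_channel(raw, gain=1.0, offset=0.0, alarm_above=None):
    """Derive a virtual channel from a raw telemetry channel: apply a
    sensor gain/offset correction, smooth the corrected signal, and
    optionally flag the indices of alarm samples."""
    corrected = moving_average([v * gain + offset for v in raw])
    if alarm_above is None:
        return corrected, []
    alarms = [i for i, v in enumerate(corrected) if v > alarm_above]
    return corrected, alarms

# Hypothetical raw channel with one spike at index 2.
raw = [10.0, 10.2, 30.0, 10.1, 10.3]
smooth, alarms = virtual_channel(raw, gain=1.0, offset=0.0, alarm_above=15.0)
print(alarms)  # indices where the smoothed value exceeds the threshold
```

More elaborate filters (e.g. low-pass or median filtering, as covered in [18]) would slot into the same structure in place of the moving average.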
Fig. 3. The components of a typical infrastructure supporting telemetry
4. Conclusions
This paper has highlighted various analogies between the worlds of F1 racing cars and modern factories. On the basis of these analogies, the paper has analyzed the benefits of transferring telemetry technology, proven in F1, to the manufacturing field. In particular, it has shown that the exploitation of factory telemetry could offer various methods to perform valuable simulations of the production processes, using as input the data coming from the real factory. Moreover, the requirements of a software application supporting factory telemetry have been elicited. Further developments of this study will address the difficulty of integrating the telemetry with existing simulation packages, the semantic interoperability of the data coming from heterogeneous sensors, and the need for more efficient and scalable databases for Big Data storage [19].
Acknowledgements

The research reported in this paper has been funded by the European Union 7th Framework Programme (FP7/2007-2013) under grant agreement No. 314156, Engineering Apps for advanced Manufacturing Engineering (Apps4aME).
References

[1] Grieves, M. (2014). Digital twin: manufacturing excellence through virtual factory replication. [Online]. [retrieved: Jan, 2016].
[2] Kádár B, Terkaj W, Sacco M (2013) Semantic Virtual Factory supporting
interoperable modelling and evaluation of production systems. CIRP
Annals Manufacturing Technology, 62(1):443-446.
[3] Ben Khedher, A., Henry, S., & Bouras, A. (2011). Integration between
mes and product lifecycle management. In Emerging Technologies &
Factory Automation (ETFA), 2011 IEEE 16th Conference on (pp. 1-8).
[4] Monostori, L. (2014). Cyber-physical production systems: Roots,
expectations and R&D challenges. Procedia CIRP, 17, 9-13.
[5] Kádár, B., Lengyel, A., Monostori, L., Suginishi, Y., Pfeiffer, A., &
Nonaka, Y. (2010). Enhanced control of complex production structures by
tight coupling of the digital and the physical worlds. CIRP Annals-
Manufacturing Technology 59.1 (2010): 437-440.
[6] Christo, C., & Cardeira, C. (2007, June). Trends in intelligent
manufacturing systems. In Industrial Electronics, 2007. ISIE 2007. IEEE
International Symposium on (pp. 3209-3214). IEEE.
[7] AlGeddawy, T., & ElMaraghy, H. (2010). Co-evolution hypotheses and
model for manufacturing planning. CIRP Annals-Manufacturing
Technology, 59(1), 445-448.
[8] Ueda, K., Vaario, J., & Ohkura, K. (1997). Modelling of biological
manufacturing systems for dynamic reconfiguration. CIRP Annals-
Manufacturing Technology, 46(1), 343-346.
[9] McCullen, P., Saw, R., Christopher, M., & Towill, D. (2006, June). The F1 supply chain: adapting the car to the circuit, the supply chain to the market. In Supply Chain Forum: An International Journal (Vol. 7, No. 1, pp. 14-23). KEDGE Business School.
[10] Cocco, L., and P. Daponte (2008). Metrology and formula one car.
Instrumentation and Measurement Technology Conference Proceedings,
2008. IMTC 2008. IEEE.
[11] Fédération Internationale de l'Automobile, 2011 Formula One technical regulations (2010). Tech. Rep., Section 3.18. [Online]. [retrieved: Jan, 2016].
[12] Waldo, J. (2005). Embedded computing and Formula One racing.
Pervasive Computing, IEEE 4.3: 18-21.
[13] Capozzi, F., Lorizzo, V., Modoni, G., & Sacco, M. (2014). Lightweight Augmented Reality Tools for Lean Procedures in Future Factories. In Augmented and Virtual Reality (pp. 232-246). Springer International Publishing.
[14] Terkaj, W., Tolio, T., & Urgo, M. (2015). A virtual factory approach for
in situ simulation to support production and maintenance planning. CIRP
Annals-Manufacturing Technology, 64(1):451-454.
[15] Lee, J., Bagheri, B., & Kao, H. A. (2015). A cyber-physical systems
architecture for industry 4.0-based manufacturing systems. Manufacturing
Letters, 3, 18-23.
[16] Advanced Telemetry Linked Acquisition System [Online]. [retrieved: Jan, 2016].
[17] Wintax4 [Online]. [retrieved: Jan, 2016].
[18] Vaseghi, S. V. (2013). Advanced signal processing and digital noise
reduction. Springer-Verlag.
[19] Modoni, G. E., Sacco, M., & Terkaj, W. (2014, June). A survey of RDF
store solutions. In Engineering, Technology and Innovation (ICE), 2014
International ICE Conference on (pp. 1-7). IEEE.
... Now the physical and digital worlds have become interconnected, which was the motive towards the correlation between the two worlds through telemetry supported by simulation. For exam-ple, in F1 car racing, the live stream collected from hundreds of sensors installed on the car and transmitted to the pit wall serving as a data source for simulating the real-time car performance [5]. As a result, engineers can make real-time remote adjustments to the vehicle running on the track. ...
... Especially in stateful stream processing, where each data point can play a critical role in the system. 5 6 ...
Full-text available
Digital twins of processes and devices use information from sensors to synchronize their state withthe entities of the physical world. The concept of stream computing enables effective processing of events gen-erated by such sensors. However, the need to track the state of an instance of the object leads to the impossi-bility of organizing instances of digital twins as stateless services. Another feature of digital twins is that severaltasks implemented on their basis require the ability to respond to incoming events at near-real-time speed.In this case, the use of cloud computing becomes unacceptable due to high latency. Fog computing managesthis problem by moving some computational tasks closer to the data sources. One of the recent solutions pro-viding the development of loosely coupled distributed systems is a Microservice approach, which implies theorganization of the distributed system as a set of coherent and independent services interacting with eachother using messages. The microservice is most often isolated by utilizing containers to overcome the highoverheads of using virtual machines. The main problem is that microservices and containers together arestateless by nature. The container technology still does not fully support live container migration betweenphysical hosts without data loss. It causes challenges in ensuring the uninterrupted operation of services in fogcomputing environments. Thus, an essential challenge is to create a containerized stateful stream processingbased microservice to support digital twins in the fog computing environment. Within the scope of this article,we study live stateful stream processing migration and how to redistribute computational activity across cloudand fog nodes using Kaf ka middleware and its Stream DSL API.
... The DT consists in a faithful digital representation of a system (or part of it), of a single product or of a process and this representation is realized by including properties and behaviour of the physical objects within virtual models (Schleich et al., 2017). In particular, the DT becomes a real replica of its physical twin when it is fully synchronized with it through a circular process between real and digital world (Modoni et al., 2016). Indeed, in this case, the DT becomes "an integrated multi-physics, multiscale, probabilistic simulation of an as-built system" that allows to mirror the real operating conditions of its corresponding physical counterpart (Shafto et al., 2012). ...
n recent years, the interest in miniaturization of devices easier to wear or even to insert into the body and in general the interest in building lightweight and compact systems are greatly increased. For this reason, the field of micro manufacturing is becoming more and more strategic for modern industry. One critical aspect for the adoption of micro manufacturing technologies on large scale is the quality validation and metrology of components or devices with conventional technologies, such as vision systems or tactile profilometer. Indeed, these technologies hardly fit to be used on the micro devices due to their extremely reduced dimensions which cause a high risk of damaging the micro structured surface. For these reasons, one of the challenges in the micro manufacturing field is the study of methods and tools for the continuous monitoring of the micro production process, with the final aim to improve reliability. In this regard, this paper focuses on a typical micro manufacturing technology, such as micro injection moulding, and presents a methodological approach that can help companies to adopt a solution for optimizing their process leveraging the corresponding Digital Twin. The latter represents a mirror of the physical process that allows to monitor in-line its parameters, to compare them with any analytic model, and to supply specific variations of parameters to keep them always in optimal conditions. The paper also presents a proof of concept of the proposed approach that has been validated to prove the correctness and its capability to scale to a real case study.
... .]'. This synchronization and integration of the digital thread into the object is an eminent aspect of the DT, similar to the usage of telemetry-data in other fields (Modoni, Sacco, and Terkaj 2016). Kritzinger et al. (2018) propose a classification and review of DT applications based on automatic or manual data flow, aggregating this to the classification of Digital Models (DM, both manual), Digital Shadows (DS, automatic update of virtual object), and Digital Twins (automatic bi-directional). ...
Full-text available
After its introduction around 20 years ago, the Digital Twin (DT) approach has recently attracted much interest in shaping the next generation of manufacturing. In the last years, many definitions and descriptions of the DT have been published, examining different aspects of its implementation. This paper is the first to present an analysis on the integration and interaction of human and DT in smart manufacturing systems in form of a scoping review following the PRISMA-ScR methodology. It presents the current state of the art of DT-based human-machine interaction (HMI), its implications, and future research directions. Filtering from 278 publications over the last decade, the analysis includes 23 publications, all published from 2016 to 2020. The results show the predominant scenarios and applications of DT-based HMI and identify the current division of labor between human and DT. The paper concludes with an integration of these findings into a human-centered classification of DTs as well as future research directions.
... Unlike traditional simulation, the virtual representation in a DT is continually updated with the state of maintenance and performance throughout the physical asset's life cycle [3]. For example, in car racing, the data stream from sensors on the car is transmitted to the pit wall to simulate car performance and enable real-time remote adjustment [4]. The virtual representation model may be predefined or learned from the data streams, which is a complex process that may require machine learning and identification techniques to specify the parameters for each specific class of model [5] [6]. ...
Full-text available
Abstract—Smart industry systems are based on integrating historical and current data from sensors with physical and digital systems to control product states. For example, a Digital Twin (DT) system predicts the future state of physical assets using live simulation and controls the current state through real-time feedback. These systems rely on the ability to process big data streams to provide real-time responses. For example, it is estimated that one autonomous vehicle (AV) could produce 30 terabytes of data per day. AVs will not be on the road until there is an effective way to manage this big data and solve latency challenges. Cloud computing fails the latency challenge, while Fog computing addresses it by moving parts of the computation from the Cloud to the edge of the network, near the asset, to reduce latency. This work studies the challenges in data stream processing for DT in a fog environment. The challenges include the fog architecture, the necessity of a loosely-coupled design, the use of virtual machines versus containers, stateful versus stateless operations, the stream processing tools, and live migration between fog nodes. The work also proposes a fog computing architecture and provides a vision of the prerequisites to meet these challenges.
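The stateful-versus-stateless distinction mentioned above can be illustrated with a minimal sketch (names and thresholds are hypothetical, not taken from the paper): a stateless operator's output depends only on the current sensor reading, while a stateful operator keeps a sliding window whose contents would have to be transferred during live migration between fog nodes.

```python
from collections import deque

def stateless_filter(reading, threshold=100.0):
    """Stateless: output depends only on the current reading,
    so the operator can be restarted anywhere with no state transfer."""
    return reading if reading <= threshold else None

class SlidingAverage:
    """Stateful: keeps a window of recent readings, so migrating this
    operator between fog nodes requires moving its window as well."""
    def __init__(self, size=3):
        self.window = deque(maxlen=size)

    def update(self, reading):
        self.window.append(reading)
        return sum(self.window) / len(self.window)

op = SlidingAverage(size=3)
averages = [op.update(r) for r in (10.0, 20.0, 30.0, 40.0)]
# last average covers the window (20, 30, 40) -> 30.0
```

The sketch only illustrates why stateless operators are the easier case for live migration; a real DT pipeline would run such operators inside a stream processing engine rather than plain Python.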
... With new generations of engineers moving into the manufacturing world, companies across numerous practices have already started to adopt various DMT. In addition, experiments on connectivity between the real and virtual worlds have been widely introduced and tested by Modoni (2016) and Kuts et al. (2018) in order to feed Digital Twin replicas with data from real IoT sensors and improve the accuracy of the simulation. Digital Twin technology is one of the major pillars of smart manufacturing, as it allows us to analyze the past and predict the future. ...
Full-text available
The new paradigm of digital manufacturing and the concept of Industry 4.0 have led to the integration of recent manufacturing advances with modern information and communication technologies. Therefore, digital simulation tools fused into production systems can improve time and cost-effectiveness and enable faster, more flexible, and more efficient processes to produce higher-quality goods. The advancement of digital simulation with sensory data may support the credibility of production systems and improve the efficiency of production planning and execution processes. In this paper, an approach is proposed to develop a Digital Twin of production systems in order to optimize the planning and commissioning process. The proposed virtual cell interacts with the physical system with the help of different Digital Manufacturing Tools (DMT), which allows various programs to be tested in different scenarios to check for any shortcomings before they are implemented on the physical system. Case studies from different production systems are demonstrated to show the feasibility of the proposed approach.
... The increasing interest in micro manufacturing, especially in areas such as aerospace (structure monitoring, sensors for landing systems and for engines and reactors, etc.), automotive (pressure sensors, engine management, air and gas quality control, etc.) and above all biomedical, is linked to the results obtained by the scientific community, which have demonstrated that these micro technologies are ready for the industrial field. In the biomedical sector, in particular, the increasing possibility of combining micro-technologies to produce complete miniaturized devices is transforming micromanufacturing into a concrete option for the factories of the future, where scenarios like the digital twin [1] can be applied to improve the reliability of the process chain. Of all the micro processes, the one driving the development towards mass production, in particular of biomedical devices, is definitely micro injection moulding. ...
Full-text available
The study of the rheological behaviour of the polymer in micro cavities is one of the aspects of micro injection moulding (μIM) technology that remains substantially unresolved. Even today, there are no databases on the rheological characteristics of materials specific to μIM that take into account the important differences compared to conventional injection moulding. In this paper, the rheological behaviour of the polymer melt in a thin plate cavity with variable thickness has been studied. The use of a micro injection moulding machine, on which a prototype of a sensorized mould with pressure and temperature sensors has been mounted, allowed the rheological study of the material under high shear rate conditions. After preliminary tests on different thicknesses, the viscosity of the polymer melt was studied for a 400 μm thickness. The viscosity reduction observed matches the characteristics of a pseudoplastic fluid subject to shear thinning, and wall slip seems to play an important role in the apparent reduction of viscosity. The results suggest increasing the injection speed, and consequently the injection pressures, so that the reduced viscosity can help the melt flow overcome the extreme conditions due to the aspect ratio and obtain greater efficiency from the filling phase against the high cooling rate typical of micro injection moulding.
... A DAQ system for telemetry that is more cost-effective than current modules, with a high data rate, is developed for automotive applications considering cost, size and performance [4]. Data collection for Formula One car racing is discussed, where live data collected from different sensors installed on the components of the car are transmitted to check real-time performance and run simulations [5]. ...
... Application of the inbound logistics tool in different disruption events ... tier of the entire system, which aims at the integration of all the factory's tools' data. In particular, it contains two macro-modules (Event Dispatcher and Digital Twin) that allow the data generated by the Real Factory layer to be integrated in the form of data streams (Factory Telemetry) [14]. Specifically, the Digital Twin is a virtual model representing a faithful mirror of the Real Factory, persisted in two structured databases: the Event Disrupt Database (containing the logic to raise the events) and the Synchro Factory Database (containing the information related to supply chain management) [15]. ...
The critical success factor of the supply chain management process in a modern manufacturing company lies in the company's capability to exploit the data produced by a growing number of different sources. The latter include a network of collaborative sensors, digital tools, and services made available to suppliers and other involved supply chain actors by recent advancements in digitalization. The collected data can be processed and analyzed in near real time to extract significant information that helps the company take relevant decisions. However, these data are typically produced in heterogeneous formats, as they arrive from different types of sources. This is why the real challenge is finding valid solutions that support data integration. In this regard, this paper investigates the potential of a data integration solution that supports a set of interacting decision-support tools within the inbound logistics of automotive manufacturing. This solution is based on a message-oriented middleware which enables a collaborative approach where suppliers, trucks, dock managers and production plants can share information about their own status for the optimization of the overall system.
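The message-oriented middleware described above can be sketched as a topic-based publish/subscribe broker; the minimal in-process version below is illustrative only (topic names and message fields are hypothetical), since a real deployment would rely on an established middleware such as an MQTT or AMQP broker.

```python
from collections import defaultdict

class Broker:
    """Minimal topic-based publish/subscribe broker: producers and
    consumers are decoupled, which is what lets heterogeneous supply
    chain actors share status updates without direct coupling."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every subscriber of the topic.
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()
received = []
# A production plant subscribes to truck status updates.
broker.subscribe("truck/status", received.append)
# A truck publishes its status; the message schema is an assumption.
broker.publish("truck/status", {"truck_id": "T-42", "eta_min": 15})
```

The design choice worth noting is that the broker, not the producer, knows who the consumers are: adding a dock manager as a new subscriber requires no change to the truck's publishing code.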
Full-text available
The digital twin (DT) concept has a key role in the future of the smart manufacturing industry. This review paper aims to investigate the development of the digital twin concept, its maturity and its vital role in the fourth industrial revolution. Having identified its potential functionalities for the digitalisation of the manufacturing industry, the digital twin concept, its origin and perspectives from both the academic and industrial sectors are presented. The identified research gaps, trends and technical limitations hampering the implementation of digital twins are also discussed. In particular, this review attempts to address the research question on how the digital twin concept can support the realisation of an integrated, flexible and collaborative manufacturing environment which is one of the goals projected by the fourth industrial revolution. To address this, a conceptual framework supporting an integrated product-process digital twin for application in digitised manufacturing is proposed. The application and benefits of the proposed framework are presented in three case studies.
Full-text available
Structured methodologies and tools for the tailored design of factories are more and more adopted by suppliers of manufacturing systems but usually discontinued after the design phase. The use of an ontology-based virtual factory, continuously synchronized with the real plant, is proposed to guarantee digital continuity and enable in situ simulation during the operating phase of a factory. This digital counterpart of the system can be used for integrated shop-floor simulations to assess future impact of production and maintenance planning decisions. An industrial application is provided in the context of roll shops, i.e., systems devoted to the grinding of cylinders for rolling mills.
Full-text available
This paper introduces the concept of a "Digital Twin" as a virtual representation of what has been produced. Comparing a Digital Twin to its engineering design makes it possible to better understand what was produced versus what was designed, tightening the loop between design and execution.
Full-text available
Recent advances in the manufacturing industry have paved the way for a systematical deployment of Cyber-Physical Systems (CPS), within which information from all related perspectives is closely monitored and synchronized between the physical factory floor and the cyber computational space. Moreover, by utilizing advanced information analytics, networked machines will be able to perform more efficiently, collaboratively and resiliently. This trend is transforming the manufacturing industry into its next generation, namely Industry 4.0. At this early development phase, there is an urgent need for a clear definition of CPS. In this paper, a unified 5-level architecture is proposed as a guideline for the implementation of CPS.
Conference Paper
Full-text available
The aim of this paper is to introduce the main outcomes of the application of Augmented Reality (AR) features to manufacturing and industrial scenarios under a new perspective. While the demand for industrial mixed reality technologies is continuously growing, the research community is still facing the crucial challenge of giving a convenient answer to such needs. The problem of developing adaptable and inexpensive AR solutions is herein addressed by proposing a new approach for the application of augmented reality technology to lean-based visual communication transfer and exchange. This work starts from the concept of the virtual factory, a place where the real production of future factories becomes fully merged with virtual reality features and utilities. Augmented reality applications may then be reinterpreted as lightweight tools that continuously interact with the virtual factory to support manufacturing and management tasks, providing just-in-time and adaptive augmented information to users. As a case study, several AR tools designed following these principles to support a real production process are presented.
Conference Paper
Full-text available
This paper analyzes the potential of Semantic Web technologies to support innovation in industrial scenarios. The study focuses in particular on RDF stores, the software components dedicated to the storage, representation and retrieval of semantic information. Starting from a literature review, a qualitative analysis is carried out on a set of these systems. RDF stores are evaluated to find the implementations best suited to play the role of backbone in a software architecture sharing information between the software tools adopted by the various stakeholders. This can be achieved if the architecture overcomes the problems deriving from the lack of integration between the involved software applications, thus providing an integrated view of relevant engineering knowledge.
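The core abstraction behind the RDF stores evaluated above is a set of subject-predicate-object triples queried by pattern matching. The toy in-memory store below illustrates just that idea (the resource names are hypothetical); real RDF stores of the kind surveyed in the paper add persistence, SPARQL querying, and inference on top of it.

```python
class TripleStore:
    """Minimal in-memory triple store: facts are (subject, predicate,
    object) triples, and queries are patterns with wildcards."""
    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def match(self, s=None, p=None, o=None):
        """Return triples matching the pattern; None acts as a wildcard,
        mirroring the variables of a basic SPARQL triple pattern."""
        return [t for t in self.triples
                if (s is None or t[0] == s)
                and (p is None or t[1] == p)
                and (o is None or t[2] == o)]

store = TripleStore()
# Example engineering knowledge; the prefixes/names are assumptions.
store.add("ex:Machine1", "rdf:type", "ex:MillingMachine")
store.add("ex:Machine1", "ex:hasStatus", "ex:Running")
statuses = store.match(s="ex:Machine1", p="ex:hasStatus")
```

Because every tool writes and reads the same triple format, such a store can serve as the shared backbone the paper describes, regardless of each tool's internal data model.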
Full-text available
One of the most significant directions in the development of computer science and information and communication technologies is represented by Cyber-Physical Systems (CPSs): systems of collaborating computational entities in intensive connection with the surrounding physical world and its ongoing processes, providing and using, at the same time, data-accessing and data-processing services available on the Internet. Cyber-Physical Production Systems (CPPSs), relying on the newest and foreseeable further developments of computer science, information and communication technologies on the one hand, and of manufacturing science and technology on the other, may lead to the 4th Industrial Revolution, frequently referred to as Industry 4.0. The keynote will underline that there are significant roots, generally and particularly in the CIRP community, which point towards CPPSs. Expectations and the related new R&D challenges will be outlined.
Full-text available
Analogies can be a powerful way of understanding and improving industrial performance. This paper exploits the similarities between Formula 1 competition and racing car design and supply chain competition and supply chain design, as a way of understanding and managing change. Just as the performance characteristics of a racing car must be aligned to the requirements of the track, so the performance of a supply chain must be aligned to the requirements of its market(s) and product(s). We present a framework for graduated re-alignment of the supply chain, extending through agility, adaptability and transformation. The possibilities for aligning and re-aligning supply chains are reviewed in terms of seven operational dimensions. The approach is illustrated using examples drawn from the Mechanical Precision, Paper Distribution and Chemical market sectors. The extent to which a supply chain is agile and adaptable, together with the recognition of the need for periodic transformation, is an important conclusion for supply chain designers.
Full-text available
Today, within the global Product Lifecycle Management (PLM) approach, the success of design, industrialization and production activities depends on the ability to improve interaction between the information systems that handle such activities. Enterprises mainly deploy a PLM system, an Enterprise Resource Planning (ERP) system and a Manufacturing Execution System (MES) in order to manage sufficient product-related information and deliver better products to customers. This paper proposes a methodological approach to integrate product data generated during product design, industrialization and production. This involves PLM and MES integration. Thus, the proposed approach aims to overcome the problem of data heterogeneity by proposing a mediation system resolving syntactic and semantic conflicts. Keywords: PLM, MES, integration, mediation system, web service architecture.
One of the most fundamental properties of CPS is the use of services in global networks. For production systems, this means setting up automated systems without knowing the concrete devices. The encapsulation of mechatronic process functions and their semantic annotation offers the basis for an abstract description of field device functionalities. It enables the invocation of services across system boundaries. This paper describes a method for the dynamic orchestration of automated processes based on service-oriented architectures and semantic technologies.
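The orchestration idea described above, invoking services by their annotated capability rather than by concrete device, can be sketched as a capability-keyed registry. This is a minimal illustration with hypothetical capability names; the paper's actual method relies on semantic descriptions of field device functions, not plain strings.

```python
class ServiceRegistry:
    """Minimal capability-based service registry: the orchestrator
    selects services by what they can do, not by which device they are."""
    def __init__(self):
        self.services = {}

    def register(self, capability, service):
        self.services.setdefault(capability, []).append(service)

    def orchestrate(self, process_steps, part):
        """Run the process by invoking, for each required capability,
        the first registered service, without knowing concrete devices."""
        for capability in process_steps:
            service = self.services[capability][0]
            part = service(part)
        return part

registry = ServiceRegistry()
# Two field devices advertise their encapsulated process functions.
registry.register("drill", lambda part: part + ["drilled"])
registry.register("polish", lambda part: part + ["polished"])
result = registry.orchestrate(["drill", "polish"], [])
```

Swapping a drilling machine for another one only changes what is registered under "drill"; the orchestrated process definition stays untouched, which is the point of the abstract capability description.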