Digital Twin is a complex of microservices
Maxim Pysin 1,* and Alexey Lobanov 1
1 Mendeleev University of Chemical Technology, Information Computer Technologies Department, Moscow, Russia
* Corresponding author: aprogrom@gmail.com
Abstract. Industry 4.0 is an actively developing concept, and the concept of the Digital Twin is becoming part of it. The digital twin is a complex cyber-physical system that consists of many components. One of the main tasks in constructing a twin is to organize the interaction of the parts of the twin with each other. Previously, an approach called the enterprise service bus was used, but over the years of its use it became clear that it is not suitable for constantly evolving and growing systems. The digital twin is precisely such a system, and it therefore requires a different approach: the microservice one. If the parts of the twin are represented as a set of microservices, it becomes possible to create a system suited to constant evolution and replacement of its parts. This approach was used to solve the problem of building a prototype digital twin of methanol production. The solution of this problem demonstrated the feasibility of the microservice approach.
1 Introduction
Throughout history, mankind has been developing its technological knowledge and practical skills. In antiquity and the Middle Ages, skills related to manual work and the emerging crafts formed the basis for development; however, after the beginning of the Renaissance and the active development of guilds (associations of people by occupation), new milestones in technological development became possible. Such leaps came to be called scientific and technological revolutions. These rapid technological breakthroughs were accompanied by significant changes in social, cultural and economic life. At the moment, the historical process of technology development is in the stage of a new scientific and technological revolution called Industry 4.0. This concept is essentially a set of methods, approaches and technologies that involve the digital transformation of industry and the consumer market, and the development of smart factories, smart supply chains and smart production chains [1, 2].
Technological transformation requires industry to introduce new technologies and approaches that allow broad control and predictability of the production cycle. One of the basic technologies that can help in building smart industries, in their current understanding, is the technology of digital twins. The digital twin has recently been actively adopted and developed as part of the concept of Industry 4.0 [3]. According to Pires et al. [4]: “The digital twin is a set of adaptive models that mimic the behaviour of a
physical system in a virtual system that receives real-time data to update throughout its life
cycle. The digital twin replicates the physical system to predict failures and opportunities for
change, prescribe real-time actions to optimize and/or mitigate unforeseen events by
observing and evaluating the operating profile of the system.” This rather broad definition
gives a clear understanding of the tasks a digital twin is expected to perform. Other scientific publications on this topic [5] also highlight the listed tasks. We can therefore say that this concept is designed to ensure continuous processing of data collected both from the physical object and from its digital representation, with the aim of predicting and optimizing the processes of the real object. In summary, the digital twin makes it possible to:
- Accelerate and reduce the cost of testing hypotheses when optimizing processes in plants and enterprises.
- Reduce pollution and injuries by predicting exceptional situations and problems.
- Increase the efficiency of both workers and installations through fine-tuning based on predictive results.
- Reduce the cost of training new employees by documenting production and providing a full-fledged digital copy on which training can be conducted.
- Centralize the collection, exchange and processing of data generated by various components of production by combining all enterprise systems [6].
Visualization deserves separate mention as a significant task set for digital twins. About 50% of all publications are devoted to this task, from which we can conclude that it is significant [7]. Visualization can be either classical or three-dimensional; in the three-dimensional case, one can speak of VR (virtual reality) and AR (augmented reality) technologies, which can increase the effectiveness of training conducted on such a twin.
2 Materials
Based on everything written above and on a number of scientific papers, we can conclude that the digital twin is not a single monolithic system and implies the presence of many parts [6-8]. Each of these parts must communicate with the others and ensure data transfer while performing the task assigned to it. Such a system should grow and develop together with the physical object and incorporate ever new subsystems used for a more comprehensive and complete description of the original object. Building such systems is a complex task with various solutions: an approach called the enterprise service bus is often used [9].
However, this approach has a number of disadvantages rooted in the architecture itself:
- Performance: the architecture is built around a single point of concentration of data flows, the bus itself, through which all communication takes place. This center comes under severe load when many different systems are connected to it.
- Delayed effects: the architecture assumes that the service bus performs not only a transport function but also carries some business logic. Although many systems built this way do not initially place such a load on the bus, over time it becomes the point where the logic uniting the various subsystems concentrates. Gradually it turns into a tangle of intricate connections that is extremely difficult to untangle.
- Documentability: a disadvantage closely related to the previous one; since the business logic algorithms are transferred to the bus, documenting such a bus becomes a task comparable in complexity to building it.
- Blurring of responsibility: similar to the previous problem, it grows out of placing logic in the bus itself. At some point during the development of the next subsystem, part of its logic is implemented on the bus for the sake of simplicity and speed of development, and the subsystem loses responsibility for, and control over, the data it operates on.
- Fault tolerance: a single point of concentration creates problems with fault tolerance, since it becomes a single point of failure whose outages entail guaranteed disruption of the operation of all the other parts.
- Entanglement: a single bus turns a system that is distributed by design into a common monolithic tangle of relationships and dependencies.
All the problems described above relate not so much to the concept of the service bus as an architectural idea as to the forms it takes during implementation and long-term operation. Herein lies the fundamental problem: such an architecture is not evolutionary in the full sense, and it cannot grow together with an object that implies gradual expansion and change. Building such a system requires a more evolvable architecture, and the microservice architecture copes with this task [10].
Microservice architecture is used by many Internet companies, such as Google, Apple, Amazon and Netflix, which collect and process huge amounts of data. This architecture rules out the construction of single points where all functions concentrate and distributes tasks and logic into separate parts. At the same time, it is important to note that the microservice architecture also has harmful and incorrect implementations, which split a single piece of business logic into parts according to the entities involved in that logic; this turns the operation of the whole system into a set of separate independent parts, each of which must constantly interact with the others to solve its task. Unlike the gradual growth of a service bus into a monolith, however, this problem appears at the initial stage and in most cases harms the system immediately, forcing developers to abandon the option and never use it again. If the microservice system is combined with the concept of domain-driven design [11], the result is a complex system that maintains a balance between the independence of individual parts and the connectedness of the entire system. As a result, the microservice architecture provides the following properties:
- Scalability: the ability of the system to quickly increase its performance in exactly those segments that have become its bottleneck.
- Flexibility: parts of the system may be implemented on any technological basis, be it different programming languages, different execution environments or different data transmission channels, as long as the common format of interaction is preserved.
- Integrability: determined by the almost complete independence of the architecture from particular technologies; the only rule is a single interaction protocol, and any part able to adapt to it can be connected.
- Fault tolerance: provided by the ability of systems built on this architecture to run on distributed computing power, store their data in distributed storage and use distributed databases.
- Decentralization: ensured by the fact that the architecture does not by default assume a single center for the concentration of functions or data.
The properties described above are not exhaustive and are not strictly exclusive to the chosen approach, but their combination is certainly an important achievement of the microservice approach. It is important to note that in order to achieve these properties, not only must the architecture have them at its core, but its individual parts must provide them as well. That is, any microservice should be designed and developed so as to:
- Be capable of horizontal scaling of the resources allocated to it;
- Implement the base interoperability standards defined in the original architecture design;
- Be easy to deploy to new capacities;
- Guarantee versioning and the ability to use previous versions to quickly fix inoperability;
- Be able to perform its own functions with a minimum amount of information taken from other integrated systems.
Ensuring these properties in the parts of the system guarantees the described properties of the system as a whole, which is the key advantage of this architecture over alternative options when working with the really large volumes of data that a production digital twin must operate on.
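As an illustration of these requirements, the sketch below shows a minimal service skeleton in Python (the service name, endpoint paths and environment variable are our own illustrative assumptions, not part of the described system): it reports its version, exposes a health check that deployment automation can poll, and reads its listening port from the environment so that additional copies can be launched on new capacities.

# Minimal microservice skeleton (illustrative sketch, not the authors' implementation).
# It exposes /health and /version over HTTP and takes its port from the environment,
# so that extra instances can be launched for horizontal scaling.
import json
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

SERVICE_NAME = "example-service"   # hypothetical name
SERVICE_VERSION = "1.0.0"          # versioning lets consumers pin a known-good release

class Handler(BaseHTTPRequestHandler):
    def _send_json(self, payload, status=200):
        body = json.dumps(payload).encode("utf-8")
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_GET(self):
        if self.path == "/health":
            self._send_json({"status": "ok"})
        elif self.path == "/version":
            self._send_json({"service": SERVICE_NAME, "version": SERVICE_VERSION})
        else:
            self._send_json({"error": "unknown path"}, status=404)

if __name__ == "__main__":
    port = int(os.environ.get("SERVICE_PORT", "8080"))  # configurable for easy redeployment
    HTTPServer(("0.0.0.0", port), Handler).serve_forever()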
3 Results
Using the architecture described earlier, it is possible to build a digital twin system that provides and guarantees room for constant expansion and evolution. The essence of the idea is to represent each part of the twin as a separate domain microservice that deals exclusively with its own tasks. The separate parts will then be:
- The real object and, in the future and if necessary, its individual parts;
- A program for modelling the real object or its parts;
- An object monitoring system;
- A predictive analytics system;
- An object management system;
- A model management system;
- An operator assistance and support system;
- A visual representation of the object.
Each of these parts acts as a provider or a consumer of data that will be used by the other parts. In a situation where all systems both generate and receive data, it is reasonable to introduce a center for collecting and providing the data as a separate unit. At this point we seem to run into a contradiction with the approach described above, and it is logical to point out that such a system is very similar to a service bus, but there are important differences. The server for aggregating and providing data is not the only link: other systems can, when necessary, interact with each other directly. We note separately that the server implies no additional logic beyond storing the data sent by one participant and providing it to the others. The server does not implement signals or reactions to data changes; it simply concentrates the data and passes it further along the chain. Thus, such a data concentration server is not a point of failure, since it can be run as a separate unit in any quantity, it does not modify or otherwise process the data, and its whole task is to save quickly and provide quickly. If one such server fails, the consumer and the supplier can always switch to another whose data will be up to date. The only duty that can be assigned to such a server, in addition to data transmission, is replication of the data among all its peers.
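A minimal sketch of such a data concentration server is given below (Python standard library; the port and key scheme are our own illustrative assumptions). It only stores the last JSON document sent for a given key and returns it on request, with no business logic and no data transformation.

# Sketch of a data aggregation server (illustrative, not the authors' implementation).
# It only saves JSON documents sent by providers and returns them to consumers;
# it applies no business logic and performs no transformation of the data.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

STORE = {}  # key (e.g. "/reactor/temperature") -> last JSON document received

class AggregationHandler(BaseHTTPRequestHandler):
    def do_PUT(self):
        # A provider publishes the latest value for a key.
        length = int(self.headers.get("Content-Length", 0))
        STORE[self.path] = json.loads(self.rfile.read(length))
        self.send_response(204)
        self.end_headers()

    def do_GET(self):
        # A consumer requests the latest value for a key.
        document = STORE.get(self.path)
        body = json.dumps(document if document is not None else {"error": "no data"}).encode("utf-8")
        self.send_response(200 if document is not None else 404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), AggregationHandler).serve_forever()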
As a result, in the system whose conceptual representation is shown in Figure 1, there will be only three kinds of entities that differ significantly from each other:
- Data providers: the parts that generate the data coming into the system.
- Data consumers: the parts that receive and use data from the system.
- Data senders: the parts that act as aggregators, layers whose only task is to receive data from a provider and pass it on to a consumer on request.
Fig. 1 Conceptual representation of the described digital twin system.
Separately, it should be noted that building a system on such an architecture requires choosing a single data exchange protocol and a format for presenting the data. Since a microservice architecture is used, it is logical to adopt the JSON data format and exchange over the HTTPS protocol, which have proven themselves and are the de facto standard in such systems. The main advantage of this choice is the ability to use existing libraries for writing code and organizing the interaction of all parts of the system. It is also worth noting that the format is convenient for humans to read, which simplifies the task of debugging the system being built.
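With this choice, the exchange itself reduces to a few lines of client code. The sketch below (Python standard library; the server address, key and payload structure are our own illustrative assumptions) shows how a provider could publish a value to the aggregation server and how a consumer could read it back as JSON over HTTP(S).

# Sketch of provider/consumer exchange with the aggregation server (illustrative assumptions:
# the server address, the key "/reactor/temperature" and the payload structure are our own).
import json
import urllib.request

SERVER = "http://localhost:8000"  # in production this would be an HTTPS endpoint behind a balancer

def publish(key, payload):
    # Provider side: send the latest value as JSON.
    request = urllib.request.Request(
        SERVER + key,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    urllib.request.urlopen(request).close()

def fetch(key):
    # Consumer side: read the latest value back.
    with urllib.request.urlopen(SERVER + key) as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    publish("/reactor/temperature", {"value": 521.3, "unit": "K", "timestamp": "2023-01-01T00:00:00Z"})
    print(fetch("/reactor/temperature"))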
Fig. 2 Actual structure of the conceptual implementation of the digital twin.
A similar conceptual structure was implemented in the prototype digital twin of the joint production of methanol and ammonia, shown in Figure 2. The implementation is a prototype of a digital twin because there is no real object constantly communicating with the model. This is due to the initial formulation of the problem, according to which only a digital model was provided, implemented in the UniSim Design software package and used in the simulator already existing at the enterprise. The constructed system serves as an illustration of the viability of the idea of organizing a digital twin as a set of microservices.
4 Discussion
While implementing this task, a difficulty associated with the proprietary software on which the production model was implemented came to light. This program was created at a time when technological software was built around dependence on other programs from the same vendor, and therefore UniSim Design offered no convenient means of integration. Moreover, the method of data extraction that was found did not by itself allow integration into a system built on web technologies, i.e. the HTTPS protocol could not be used directly. To solve this problem, a separate entity called the data driver was added to the overall architectural design. This program connects to a data source or consumer that cannot be integrated into the system on its own and, using the interaction mechanisms that the source itself provides, transmits data to or receives data from the aggregation server. In practice, the program runs in parallel with the simulation program and either controls it directly, interrupting it to obtain data, or extracts data during the calculation in any of the available ways.
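Such a data driver essentially reduces to a polling loop. The sketch below illustrates only the pattern: the function that reads values from the simulator is left as a placeholder, because the concrete access mechanism (for UniSim Design, its automation interface) is specific to the installation and is not detailed in the paper; the tag names and server address are likewise our own assumptions.

# Sketch of a data driver (illustrative pattern only).
# read_from_simulator() is a placeholder: in the real driver it would use whatever
# interaction mechanism the modelling package exposes (for UniSim Design this was
# an automation interface); the tag names and server address are hypothetical.
import json
import time
import urllib.request

AGGREGATION_SERVER = "http://localhost:8000"   # assumed address of the data concentration server
POLL_PERIOD_S = 5                              # assumed polling period

def read_from_simulator():
    # Placeholder for the package-specific call that extracts current values
    # from the running simulation (automation interface, file export, etc.).
    return {"methanol_flow": 0.0, "reactor_temperature": 0.0}

def push(key, payload):
    # Forward one value to the aggregation server as JSON.
    request = urllib.request.Request(
        AGGREGATION_SERVER + key,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    urllib.request.urlopen(request).close()

if __name__ == "__main__":
    while True:
        values = read_from_simulator()
        for tag, value in values.items():
            push("/" + tag, {"value": value, "timestamp": time.time()})
        time.sleep(POLL_PERIOD_S)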
Also, in the course of building another digital representation, it became necessary to change the modelling program to Aspen HYSYS. This program has the same roots as UniSim Design but has its own distinctive features. When switching to the other modelling program, a data driver again had to be created for its integration into the system, but no other changes were made to the developed digital representation, which clearly demonstrates the flexibility to change achieved with this implementation.
In the process of implementing the prototype of the digital twin, it was not possible to load-test the proposed architecture; for this reason its resistance to load can be judged only by the presence of such stability, as well as scalability, in the architecture itself. To address scalability, it is planned to use automatic build systems and deployment of the project to servers, so that the capacities in use can be scaled horizontally by launching additional servers when necessary. Traffic redirection and balancing are to be provided by Nginx using a random server selection mechanism for processing requests. The deployment of the processing servers itself is to rely on CI/CD mechanisms and containerization technology in order to scale up capacity rapidly.
5 Conclusion
The presented work can be considered proof of the viability and applicability of the idea of building systems similar to a digital twin as a set of microservices. Such a structure allows the system to evolve together with the physical object and adapt to improvements in the modelling process. Such an architecture allows third-party software from any vendor to be integrated into the digital twin, i.e. connected with the management and analytics systems already existing at enterprises. The microservice approach enables convenient management of the entire enterprise ecosystem and the gradual introduction of the concepts of the smart enterprise, smart supply chains and smart products. The approach makes it possible to gradually increase computing power and the amount of data collected, which ensures a smooth increase in the cost of introducing new approaches and technologies for analysis and modelling. The use of a modern evolutionary architecture at a technological enterprise to build data collection systems will simplify the process of integrating third-party software vendors, which in turn can reduce the final cost of the entire system. It is worth highlighting separately that this technology relies on commonly used web interaction standards, which means it can be implemented by a wider range of specialists and will ensure an influx of fresh qualified personnel into production. The microservice architecture is not without drawbacks and may be replaced by a more advanced one in the future, but today it can give industry a new impetus in the development of production systems.
References
1. A. Schroeder, et al., Production Planning & Control 30, 16, 1305-1321 (2019)
2. H.S. Kang, et al., International Journal of Precision Engineering and Manufacturing-Green Technology 3, 111-128 (2016)
3. M. Ghobakhloo, Journal of Cleaner Production 252, 119869 (2020)
4. F. Pires, et al., Digital twin in Industry 4.0: Technologies, applications and challenges, 2019 IEEE 17th International Conference on Industrial Informatics (INDIN), Vol. 1, 721-726 (2019)
5. M. Liu, et al., Journal of Manufacturing Systems 58, 346-361 (2021)
6. M. Singh, et al., Applied System Innovation 4, 2, 36 (2021)
7. F. Tao, et al., Journal of Manufacturing Systems 64, 372-389 (2022)
8. Q. Qi, et al., Journal of Manufacturing Systems 58, 3-21 (2021)
9. F. Menge, Free and Open Source Software Conference 2 (2007)
10. N. Dragoni, et al., Microservices: yesterday, today, and tomorrow, Present and Ulterior Software Engineering, pp. 195-216 (2017)
11. E. Evans, R. Szpoton, Domain-Driven Design, Helion (2015)
Today, the manufacturing industry is aiming to improve competitiveness through the convergence with cutting-edge ICT technologies in order to secure a new growth engine. Smart Manufacturing, which is the fourth revolution in the manufacturing industry and is also considered as a new paradigm, is the collection of cutting-edge technologies that support effective and accurate engineering decision-making in real time through the introduction of various ICT technologies and the convergence with the existing manufacturing technologies. This paper surveyed and analyzed various articles related to Smart Manufacturing, identified the past and present levels, and predicted the future. For these purposes, 1) the major key technologies related to Smart Manufacturing were identified through the analysis of the policies and technology roadmaps of Germany, the U.S., and Korea that have government-driven leading movements for Smart Manufacturing, 2) the related articles on the overall Smart Manufacturing concept, the key system structure, or each key technology were investigated, and, finally, 3) the Smart Manufacturing-related trends were identified and the future was predicted by conducting various analyses on the application areas and technology development levels that have been addressed in each article.