Academic Editor: Biswanath Samanta
Received: 13 January 2025; Revised: 18 February 2025; Accepted: 22 February 2025; Published: 25 February 2025
Citation: Cristoiu, C.; Ivan, A.M. Integration of Real Signals Acquired Through External Sensors into RoboDK Simulation of Robotic Industrial Applications. Sensors 2025, 25, 1395. https://doi.org/10.3390/s25051395
Copyright: © 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Article
Integration of Real Signals Acquired Through External Sensors
into RoboDK Simulation of Robotic Industrial Applications
Cozmin Cristoiu * and Andrei Mario Ivan
Faculty of Industrial Engineering and Robotics, National University of Science and Technology Politehnica București, Splaiul Independenței No. 313, 060042 Bucharest, Romania; andrei_mario.ivan@upb.ro
* Correspondence: cozmin.cristoiu@upb.ro
Abstract: Ensuring synchronization between real-world sensor data and industrial robotic
simulations remains a critical challenge in digital twin and virtual commissioning appli-
cations. This study proposes an innovative method for integrating real sensor signals
into RoboDK simulations, bridging the gap between virtual models and real-world dy-
namics. The proposed system utilizes an Arduino-based data acquisition module and a
custom Python script to establish real-time communication between physical sensors and
RoboDK’s simulation environment. Unlike traditional simulations that rely on predefined
simulated signals or manually triggered virtual inputs, our approach enables dynamic
real-time interactions based on live sensor data. The system supports both analog and
digital signals and is validated through latency measurements, demonstrating an average
end-to-end delay of 23.97 ms. These results confirm the feasibility of real sensor integration
into RoboDK, making the system adaptable to various industrial applications. This frame-
work provides a scalable foundation for researchers and engineers to develop enhanced
simulation environments that more accurately reflect real industrial conditions.
Keywords: offline simulation; virtual commissioning; industrial robotics; RoboDK; sensors;
analog signals; digital signals
1. Introduction
The integration of digital twin technology and virtual commissioning has become
pivotal in advancing industrial automation, particularly within the framework of Industry
4.0. Digital twins serve as virtual counterparts of physical systems, enabling real-time
monitoring, simulation, and optimization throughout a product’s lifecycle. This approach
enhances predictive maintenance, operational efficiency, and system design [1]. Dedicated solutions are employed for programming, testing, and optimizing robotic tasks before integrating them into real production systems [2]. Software packages such as Process Simulate, Robotmaster, and RoboDK offer tools for simulating, testing, and optimizing industrial robotic processes, while also providing offline programming (OLP) functionality. Virtual commissioning allows for the testing and validation of control systems in a simulated environment before deployment, significantly reducing commissioning time and costs [3].
However, a significant challenge remains in accurately replicating dynamic interactions
between robots and their environments, especially concerning real-time sensor data inte-
gration. Traditional simulation tools often rely on predefined or manually triggered signals,
which may not capture the complexities of actual operational conditions [4].
Despite their powerful capabilities, these tools primarily simulate robotic behavior
using predefined signals rather than real-time sensor data, limiting their ability to accurately
reflect real-world conditions. Addressing this gap is crucial for developing simulations
that closely mirror real-world scenarios, thereby improving the reliability and performance
of automated systems. This study proposes an innovative approach to incorporating real-
time sensor data into robotic simulations, bridging the divide between virtual models and
physical operations.
Using simulation and offline programming software has become an important aspect
of industrial robotic automation. Being one of the leading solutions on the market, RoboDK
allows both integrators and researchers to simulate and study robotic industrial applications
by providing a virtual 3D environment [5]. The software allows the kinematic modeling and integration of various robot types, along with analyzing a wide range of applications. However, these simulations lack the machine-to-machine dynamics that come with the data provided by the integrated sensors. These data are a very important aspect in all industrial robotic applications. Their management, which forms the station logic, is essential for the functionality and optimization of the application [6–9]. In the virtual environment provided by RoboDK (similar to other software solutions), these signals are simulated instead of being acquired from real sensors. Because the efficiency and relevance of a study developed using one of these software solutions depend on how capable that software is of emulating the real working environment and conditions, the simulation of the input signals represents a weak spot. Real sensors collect data from a working environment that has dynamic characteristics, especially when working with unstructured applications [10–12]. This leads to uncertainties that cannot be modeled in a virtual environment.
Another important aspect of using process simulation solutions is virtual commission-
ing. This is a method of testing and optimizing production systems that are developed
in virtual environments before real-world integration. The essential element of virtual
commissioning is testing the application logic, which refers to the management of I/O
signals and the behavior of the integrated equipment with respect to those signals [13]. Because there are events in industrial applications that cannot be properly modeled in virtual environments, these must be emulated through certain elements and functions (such as “attach” or “duplicate object” options, which work differently from the real-world corresponding events) [14]. For these reasons, in most cases, the logic of the production
system cannot be fully validated using the virtual environment alone. Most of these issues
can be addressed by developing and using true digital twin models of real equipment.
However, the digital twin concept has been developed mostly in recent years and has yet
to be adopted on a large scale in the industrial field. On the other hand, even if a proper
digital twin of the real system is used, virtual commissioning is still required in order to
reduce the integration time of the system, and to check for errors in the control software.
Thus, virtual commissioning, which is implemented by connecting the simulated system to
a virtual or real PLC, is required to properly test the programming of the robotic industrial
application [15].
There are certain applications that rely heavily on information acquired through external sensors [15]. For example, in assembly applications, force/torque sensors are used to identify the gripping force for delicate parts or to identify the placing position of the components [16,17]. Collaborative applications also use various sensors to ensure safe interaction between robots and human operators [18]. While some robotic applications
operate in structured environments with minimal sensor dependency, others require con-
tinuous real-time sensor data for accurate process execution. Digital twins and virtual
commissioning play a crucial role in simulating such applications, yet most existing re-
search relies on predefined or simulated signals rather than real-time sensor integration.
The work performed in [2] emphasizes the importance of real-time process optimization in cyber–physical systems (CPS), while [3] classifies digital twin implementations and identifies real-time data synchronization as a major challenge. The effectiveness of virtual commissioning in reducing commissioning time and improving simulation accuracy is demonstrated in [4]. However, despite these advancements, most simulation tools still lack the capability to dynamically integrate real-world sensor data, limiting their ability to accurately model real-time industrial processes.
A significant challenge remains in bridging the gap between digital and physical
systems by allowing real-time sensor-driven decision making in industrial robotic simula-
tions. Many offline programming (OLP) software solutions, including Process Simulate,
Robotmaster, and RoboDK, provide powerful tools for simulating, testing, and optimiz-
ing robotic operations but primarily rely on predefined non-dynamic input signals. This
limitation reduces adaptability in scenarios where sensor feedback is critical for process
validation. Addressing this shortcoming requires an approach that not only enhances
sensor data integration but also maintains system flexibility, accuracy, and usability in an
industrial setting.
This study introduces a software-driven framework for integrating real-time sensor
data into RoboDK simulations, ensuring live sensor-driven robotic interactions. Unlike
conventional simulation methods, which rely on predefined control sequences, our ap-
proach establishes a live communication link between real-world sensors and the virtual
robotic environment. This is achieved through a custom Python-based interface, enabling
real-time data exchange between external sensors and RoboDK software (version 5.7.4). By
dynamically updating robotic behaviors based on real sensor inputs, the proposed method
significantly improves simulation realism, adaptability, and predictive accuracy.
The core innovation of this study is the development of a customizable Python-based
software interface that allows seamless sensor data transmission into RoboDK, eliminating
the need for simulated approximations. The software is designed to be scalable, modular,
and adaptable, supporting various industrial sensors and data acquisition methods. The
use of Python scripting provides high flexibility for integrating additional features such as
real-time data filtering, advanced control algorithms, and multi-sensor fusion techniques.
Unlike previous approaches that rely on proprietary hardware or expensive industrial automation platforms, our solution remains cost-effective and open for further development.
To validate our approach, a framework was implemented and tested using external
sensors interfaced with RoboDK via a Python-controlled communication pipeline. The
integration allows for dynamic robotic interactions and enables process optimization in
a more realistic and responsive digital twin environment. This contribution represents a
significant step forward in making robotic simulations more representative of real-world
industrial conditions, addressing a crucial gap in real-time digital twin applications for
robotic automation.
2. State of the Art
The current stage of industrial development, designated Industry 4.0, is based on a hybrid approach to robotic application design. Eguti et al. [19] underline that the concept
of virtual commissioning integrates aspects that are linked to digital manufacturing and
simulation with hardware-in-the-loop elements. The research performs a comparative
analysis between the conventional design of an automation system and the same design
process based on a virtual commissioning approach. It is shown that the main objective of
virtual commissioning is to reduce the time required for application design and integration,
along with the product’s time-to-market. This is achieved through the possibility of
simultaneously running several development stages, as shown in Figure 1.
Figure 1. Implementation stages of a robotic industrial application: up—standard process design
approach; down—virtual commissioning approach (as presented by Eguti et al. [19]).
However, various research works have shown that virtual commissioning has not yet been established as a standard procedure for automated application development [20–23]. While it is used on a large scale by big companies, smaller enterprises are not typically implementing the concept, mainly due to complexity and the high level of resources involved, including costs [24].
Virtual commissioning is, at its core, a testing method for application logic. In the literature, two approaches are used: software-in-the-loop and hardware-in-the-loop [25–28]. The software-in-the-loop simulation tests the application programming (the station logic) together with the digital model/digital twin of the system, while using either a virtual or real PLC. The hardware-in-the-loop approach tests the real application programming running on a real PLC that communicates with the digital twin of the system through PROFINET, PROFIBUS, or other real communication standards [29]. Software-in-the-loop and hardware-in-the-loop concepts are presented in Figure 2.
Noga et al. [30] developed a concept of hybrid virtual commissioning. The study builds upon the concept of hardware-in-the-loop and aims to reduce the complexity and requirements of the simulation by using the real available equipment while simulating the rest of the system. Since one of the main advantages of simulation and virtual commissioning is the possibility of validating an application before purchasing any of the required equipment, a hybrid approach remains compatible with that argument, since only readily available equipment is integrated with the simulation. This reduces the need to model those elements and ensures more accurate testing, since part of the setup is based on real components. The concept is validated using a real 3D vision camera and a SIMATIC ET 200SP I/O module connected to a real PLC. The camera scans the shape and position of objects placed in an unstructured configuration (random order and posture) and transfers the data into a virtual workspace. There, virtual commissioning for a pick-and-place application, including a delta robot, is performed.
Figure 2. Software-in-the-loop and hardware-in-the-loop concepts, together with the model-in-the-
loop approach, as presented by Ullrich et al. [29].
3. System Design and Implementation
The proposed system enables real-time interaction between physical sensors and a
virtual robotic simulation in RoboDK. This allows the digital twin environment to dynami-
cally adjust based on live sensor input, improving the accuracy and adaptability of robotic
simulations. The system architecture consists of three main components:
1. Sensor acquisition layer.
2. Simulation layer.
3. Python script.
To maintain clarity and conciseness, only the essential elements are presented in this paper.
However, the entire project, including all files, virtual station, the full Python script
and Arduino code, and detailed implementation instructions, is publicly available on
GitHub: https://github.com/Cozmin90/rdk_signal_box01 (accessed on 10 January 2025).
3.1. Sensor Acquisition Layer
To interface physical sensors with the simulation, an Arduino microcontroller is used as a sensor acquisition module. The choice of Arduino is based on its flexibility, low cost, and ease of integration. The microcontroller collects sensor data and transmits them to the PC via serial communication. The system is designed to accommodate both types of signals: digital and analog. For testing purposes, two digital sensors are connected to the Arduino's digital I/O pins (high/low states) and one analog sensor is connected to an analog input, providing values between 0 and 1023. The Arduino program continuously polls the sensor values, processes them with debouncing and threshold filtering, and formats them into structured messages that are sent via serial communication. Serial communication is initialized at 115,200 baud and, to prevent rapid fluctuations in digital sensor readings, a debouncing delay of 20 ms is added. The debouncing mechanism in the Arduino code prevents false readings caused by the
mechanical nature of buttons, which tend to “bounce” when pressed or released. This is
achieved by ensuring that each sensor state change is only registered if it remains stable for
at least debounceDelay milliseconds. The millis() function is used to track the elapsed time
since the last valid state change, filtering out unintended fluctuations. This enhances signal
stability, ensuring that the Python (version 3.12.6) program only processes genuine button
presses, preventing false triggers caused by mechanical noise. Also, to ensure that only
significant analog value changes are transmitted, a threshold value of 20 units is added,
thus reducing unnecessary data transmission. The sensor data are structured in a string
as follows:
SENSOR_IDENTIFIER_VALUE_TIMESTAMP
Here are some examples of data output:
IO1_1_123450
IO2_0_123460
PD_350_123478
PD_372_123500
The numbers at the end of the strings represent the timestamps of each event, and
their role is to be used by the Python script in order to compute the transmission delay.
The logic of the Arduino code is straightforward, as illustrated in Figure 3 and in the sketch that follows.
Figure 3. Arduino code logic.
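To make this concrete, the following is a minimal sketch of the acquisition loop described above (pin assignments and variable layout are illustrative assumptions, and the debouncing is a simplified version of the millis()-based mechanism described in the text; the complete code is available in the GitHub repository):

const int IO1_PIN = 2;                    // first push button (assumed pin)
const int IO2_PIN = 3;                    // second push button (assumed pin)
const int PD_PIN = A0;                    // photodiode (analog input)
const unsigned long debounceDelay = 20;   // ms, as described above
const int analogThreshold = 20;           // minimum analog change worth sending

int lastIO1 = LOW, lastIO2 = LOW, lastPD = 0;
unsigned long lastChange1 = 0, lastChange2 = 0;

void setup() {
  Serial.begin(115200);                   // baud rate expected by the Python script
  pinMode(IO1_PIN, INPUT);
  pinMode(IO2_PIN, INPUT);
}

void sendMessage(const char* id, int value) {
  // Message format: SENSOR_IDENTIFIER_VALUE_TIMESTAMP, e.g., "IO1_1_123450"
  Serial.print(id); Serial.print('_');
  Serial.print(value); Serial.print('_');
  Serial.println(millis());
}

void loop() {
  unsigned long now = millis();
  // Digital inputs: ignore state changes within debounceDelay ms of the last one
  int io1 = digitalRead(IO1_PIN);
  if (io1 != lastIO1 && now - lastChange1 > debounceDelay) {
    lastIO1 = io1; lastChange1 = now;
    sendMessage("IO1", io1);
  }
  int io2 = digitalRead(IO2_PIN);
  if (io2 != lastIO2 && now - lastChange2 > debounceDelay) {
    lastIO2 = io2; lastChange2 = now;
    sendMessage("IO2", io2);
  }
  // Analog input: transmit only changes larger than the threshold
  int pd = analogRead(PD_PIN);
  if (abs(pd - lastPD) >= analogThreshold) {
    lastPD = pd;
    sendMessage("PD", pd);
  }
}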
3.2. Simulation Layer
The RoboDK software is used as a simulation tool for robotic applications. To perform
a simulation in RoboDK, a station has to be created. The station represents the study of the robot application in a virtual environment and consists of a set of elements that includes virtual models for all physical components, target points and the corresponding robot paths, operations performed on the processed parts, coordinate systems, mechanisms and their kinematic configuration, tools, configuration files, programming elements, scripts, etc. These elements are organized in a logical structure, in most cases in the form of hierarchical trees shown on the left side of the application window. For this study, two stations are created: a demo station (no real application) used only for demonstration and testing purposes, and a station for a robotic palletizing application that demonstrates functionality in a realistic scenario.
3.2.1. Case Study 1: Test Demo Application
The first station includes two robots, target points and the corresponding paths, and seven program files containing robot movement instructions and signal management instructions, together with the Python script. This virtual model for validation includes
two articulated arm industrial robots. The models are the IRB 120 and the IRB 140 produced
by ABB, each having six axes. A lamp is placed in front of each robot to provide visual
feedback for digital signals. Also, to provide display data for the status of digital signals
and for the value read by the photodiode (analog signal), a display is placed between the robots. The real photodiode (the common GM5539 model used in many Arduino projects) is wired to the analog input pin (A0) of the microcontroller. It is connected to the system through an analog-to-digital converter (ADC) providing values between 0 and 1023 that are directly influenced by its changing resistance, which ranges from ~5 kΩ (bright light) to 200 kΩ (dark conditions). Being in the classroom, the photodiode is pointed at a screen that displays randomly changing images and colors.
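As a rough illustration of the expected readings, assuming the photodiode forms a voltage divider with a 10 kΩ fixed resistor (the divider configuration is an assumption, not specified above), the ADC reading is approximately 1023 × R_fixed/(R_fixed + R_sensor): about 1023 × 10/(10 + 5) ≈ 682 counts in bright light (R_sensor ≈ 5 kΩ) and about 1023 × 10/(10 + 200) ≈ 49 counts in dark conditions (R_sensor ≈ 200 kΩ).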
The setup of the application in RoboDK is displayed in Figure 4.
Figure 4. Validation simulation environment in RoboDK, including two industrial robots, two
feedback lamps, a virtual display that shows real-time sensor readings, and the project tree, including
objects, robot targets, and movement programs and Python scripts.
For each robot, four target points are defined. The points are used to create a closed path, roughly rectangular in shape. Both paths are simulated with a signal-based approach. The programming routines include instructions that wait for a certain change in the I/O signals to trigger the robot's movement in a loop along the path. The simple graphical user interface allows the user to quickly check, by manually turning the signals on or off, whether the lamp color changes in correspondence with each of the two digital signals, whether the correct states and values of the signals are shown on the virtual display, and whether the corresponding robot executes its path-following program. In Figure 5, signal IO_1 is set to high, so the lamp in front of the white robot turns green; on the display, its value turns to 1 and the robot has already moved to the next point on the path.
At the same time, the Python script responsible for receiving signal values from
serial communication and transforming them into specific commands that trigger different actions in the RoboDK virtual station is executed. The script is configured so that it can be
loaded into the virtual station and launched at any time. It also implements a control panel
in the application. The small GUI also allows the user, through a toggle button, to switch on
“Automatic” mode. In this mode, digital signals cannot be changed from the GUI buttons
(buttons become greyed out), but the actions in the stations are now triggered by the state
of physical sensors (in this case, two push buttons). In Figure 6, the color of the second
lamp indicates that the second button has been pressed, and the orange robot starts the
execution of the movement program.
Figure 5. Visual feedback on the simulation environment in RoboDK at the state change of the
first sensor: text and color feedback on the GUI, color feedback of the left-side lamp, and virtual
display feedback.
Figure 6. Image capturing the start of the movement routine of the orange robot triggered at the state
change of the second sensor.
In this case, based on the status of digital signals (received through serial communica-
tion from the push buttons connected to the Arduino module), one or both robots execute a
task that is configured in the simulation environment. The signal can be associated with
other events in the simulation, such as object hide/show, color change, modification of a
variable value, executing a script, etc. For a real system, the change of an I/O signal can trigger events such as lighting an indicator, moving a mechanism, activating a certain function of a piece of equipment, interrupting a program, etc. The data flow from sensors to the virtual
system in RoboDK, going through the Arduino module, is illustrated in Figure 7.
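As an illustration, such simulation-side events can be triggered through the RoboDK Python API; the calls below are a minimal sketch (the item names are hypothetical):

from robodk.robolink import Robolink   # RoboDK API

RDK = Robolink()                       # link to the running RoboDK instance
box = RDK.Item('Box')                  # hypothetical object in the station tree
lamp = RDK.Item('Lamp1')               # hypothetical lamp object

box.setVisible(False)                  # hide/show an object
lamp.setColor([0, 1, 0, 1])            # change its color (RGBA components in 0..1)
RDK.Item('MoveRobot1').RunProgram()    # start a program configured in the station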
The setup presented above is developed for testing. In this case, the analog sensor does not have any role and does not trigger any action in the simulation environment. Thus, a second simulation station is created in order to also use the analog signal and to validate the concept in a useful and realistic application.
Figure 7. The hardware and software setup, including sensors and buttons, Arduino board, and a
computer running RoboDK and the Python script.
3.2.2. Case Study 2: Robotic Palletizing Application
For the second station, a simple robotic palletizing application is configured. For
visual feedback, the lamp and the virtual display are kept in the station. In Figure 8, the
layout can be observed.
Figure 8. Validation simulation environment in RoboDK imitating a real-world automated palletizing
operation. The robotic arm interacts with a conveyor system and a sensor-based feedback loop,
displaying real-time distance and object detection data on a virtual screen.
The logic of the signals used in this station is the following:
- The first digital sensor signal is used to start the palletizing routine of the robot (it has the role of a start button).
- The second digital sensor signal is used to immediately interrupt the robot action (it has the role of an emergency stop button).
- The robot has to close its gripper and grab a box only when there is one exactly under it. So, the analog input signal is used to measure the distance from the gripper to the box and to trigger the grabbing action only when a box is in the corresponding range.
The correlation of signals is presented in Table 1.
Table 1. Signal correlation.

I/O Signal   Sensor Type   Description                Mapped RDK Action
IO1          digital       real push button           start robot movement routine
IO2          digital       real push button           stop every process (E-stop)
Distance     analog        real distance sensor       toggle IO3 on or off
IO3          digital       simulated digital signal   close gripper
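For illustration, the mapping of the analog distance value onto the simulated IO3 signal can be sketched as a small helper in the Python script (the parameter and program names are hypothetical; the threshold of 50 corresponds to the value mentioned later in this section):

DIST_THRESHOLD = 50  # distance value below which a box is considered in range

def update_box_in_range(distance_value, RDK):
    # Toggle the simulated IO3 signal based on the measured distance
    in_range = 1 if distance_value < DIST_THRESHOLD else 0
    RDK.setParam('IO3', in_range)              # expose IO3 as a station parameter
    if in_range:
        RDK.Item('CloseGripper').RunProgram()  # hypothetical gripper-closing program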
This time, an HC-SR04 ultrasonic sensor, also connected directly to the Arduino board, is used to measure distances. This ultrasonic sensor model has one emitter and one receiver and determines distance by measuring the time delay until the echo bounces off the object and returns to the receiver. To vary the measured distance during testing, the sensor is manually pointed at nearby objects in the classroom.
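For reference, the HC-SR04 converts the echo delay into distance using the standard time-of-flight relation distance = (t_echo × v_sound)/2 ≈ t_echo [µs] × 0.0343 [cm/µs]/2, so an echo delay of about 2900 µs corresponds to roughly 50 cm; assuming the transmitted value is expressed in centimeters, this is the order of magnitude of the gripping threshold used in this station.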
In the virtual environment, the distance sensor is attached to the bottom of the robot
gripper, as shown in Figure 9.
Figure 9. Placement of the virtual sensor on the gripper. The sensor is placed under the gripper in
order to measure the distance to the box and close the gripper when the box is close enough.
While the real start button is not pressed, everything is at rest. The lamp indicates the
OFF status by the red color, both of the digital signals indicate the value zero on the display,
the analog sensor displays its current value, and, since there is nothing close enough to the
gripper, the “box in range” status is also zero. These elements can be observed in Figure 10.
After the automatic mode is turned on and the start button has been pressed, the robot moves to the location designated for grabbing the boxes. The lamp becomes green, and the first digital sensor displays a value of 1. At this moment, the robot gripper is still open, and we can observe that, even though the distance value has decreased, the box-in-range status is still 0. This is because it took a little time to place something in front of the real sensor and for it to detect and adjust the distance value. As soon as the distance value drops under the specified threshold (a value of 50), the robot closes its gripper, then moves the box toward the palletizing location. These two sequences are presented in Figures 11 and 12.
To test the functionality of the emergency stop, the second button is pressed while
the robot is in the position of grabbing another box, after confirmation of the “Box in
range”. Every process immediately stops, and the visual indicators change accordingly.
The moment of stopping the application at the signal of the second sensor is captured in
Figure 13.
Figure 10. Station at rest with the robot waiting in its “home” position and the push of the first button
(start button) to start the palletizing routine.
Figure 11. The robot approaches the first box until the distance value becomes smaller than the
threshold and triggers the closing action of the clamps.
Figure 12. Measured distance reaches the threshold and triggers the “box in range” signal. The
gripper clamps close and the robot continues its palletizing routine.
Figure 13. Program halts at the push of the second button (emergency stop). The status lamp becomes
red, and the robot stops moving, even if the “box in range” signal is active.
The application described above is developed with the goal of providing a framework for testing the concepts approached in this study. It is deliberately kept as simple as possible to keep the testing results relevant for any robotic task, regardless of its particularities. The application has all the characteristics of a standard simulation, including the I/O signals that are the core subject of this study. This study aims to address the signal simulation issue and provide a solution for integrating more realistic signals. For this purpose, a Python script is developed, and its functionality is detailed in the next section.
3.3. Python Script
The script acts as a connection between real sensors and the virtual signals included
in the application, automatically calling the signal changing routines when necessary. The
following libraries are included:
• Tkinter—for graphical interface/control panel elements.
• Serial—for receiving data from the Arduino through serial communication.
• Threading—for starting a thread used for reading data received from the Arduino.
• Robolink—for creating a link to RoboDK.
The logical diagram of the script is illustrated in Figure 14.
Serial communication is carried out through the port to which the Arduino module is connected (in this case, port 4) using an adjustable baud rate of 115,200 bits per second (as shown in Figure 15). In order to use the information configured in the graphical user interface and to process the data, the following functions are created:
• A function for changing signal values that also changes the color of the signal indicators (which show whether a signal is active or not).
• A function that reads the data received through serial communication from the Arduino board. These data are received in the form of character strings, such as “IO1_0”, “IO2_0”, “IO1_1”, “IO2_1”, and “PD_value”. These are associated with the buttons connected to the digital inputs of the Arduino board and with the analog sensor connected to one of the analog inputs. This function processes the data and then calls the required routines inside RoboDK or changes certain signal values in the simulation.
• A function that changes the signal management approach for the simulation. This function switches between manual and automatic mode. In manual mode, the signal values are changed through the push buttons provided by the control panel. In automatic mode, the signal values are determined by the data received through serial communication.
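As an example of the third function, the manual/automatic switch can be implemented with a few Tkinter widgets (a minimal sketch; the widget names and layout are illustrative):

import tkinter as tk

root = tk.Tk()
root.title('RoboDK signal control panel')
automatic_mode = tk.BooleanVar(value=False)

def toggle_mode():
    # In automatic mode the manual buttons are greyed out and the signal
    # values are taken from the serial data instead
    state = tk.DISABLED if automatic_mode.get() else tk.NORMAL
    btn_io1.config(state=state)
    btn_io2.config(state=state)

btn_io1 = tk.Button(root, text='IO1')  # manual signal buttons (callbacks omitted)
btn_io2 = tk.Button(root, text='IO2')
tk.Checkbutton(root, text='Automatic mode', variable=automatic_mode,
               command=toggle_mode).pack()
btn_io1.pack()
btn_io2.pack()
root.mainloop()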
Figure 14. Logical diagram of the Python script.
Figure 15. (a) Some Python script functions corresponding to the UI for selecting the operating mode
(manual/automatic) and executing RoboDK programs in correspondence with received signal values.
(b) Python script function responsible for continuously reading data from Arduino and triggering
RoboDK actions based on received signals.
The read_arduino_data function is responsible for receiving sensor signals from Arduino and processing them in real-time within the RoboDK simulation. To ensure that only the most recent data are processed, the serial buffer is flushed before reading new inputs. This prevents outdated sensor values from being used, particularly when switching between manual and automatic modes. Additionally, a debouncing mechanism is implemented at the Arduino level to prevent rapid unwanted fluctuations caused by mechanical button noise. This enhances the reliability of digital signal readings, ensuring stable interactions between the physical and virtual environments. Finally, the function also records end-to-end latency measurements, which are crucial for evaluating the real-time performance of the system. Key features in accordance with the code structure are presented in Figure 15b; a simplified sketch follows the list below.
(a) The function continuously listens for incoming data from the Arduino and determines
whether the received signal should trigger an action in the RoboDK simulation.
(b) Flushing the serial buffer (ser.reset_input_buffer()): ensures that old unread data do not cause delayed or outdated readings. Prevents buffer overflow when Arduino sends frequent sensor updates. Useful when switching between manual and automatic modes to avoid processing outdated data.
(c) End-to-end latency measurement: measures the time between when data are received from Arduino and when they trigger an event in RoboDK. Helps evaluate system responsiveness and optimize real-time performance.
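A minimal sketch of a read_arduino_data-style loop is shown below, assuming pyserial on COM4 at 115,200 baud and RoboDK programs named after the signals; all identifiers are illustrative rather than the exact code of the published script, and the latency timer covers the script-side portion described in (c).

import threading
import time
import serial
from robodk import robolink

RDK = robolink.Robolink()
ser = serial.Serial("COM4", 115200, timeout=1)   # Port 4 at 115,200 baud, as configured

def read_arduino_data():
    """Background loop: read "NAME_value" strings and trigger RoboDK actions."""
    ser.reset_input_buffer()                     # (b) discard stale, unread data
    while True:
        raw = ser.readline().decode("ascii", errors="ignore").strip()
        if "_" not in raw:
            continue                             # ignore malformed messages
        t0 = time.perf_counter()                 # (c) start the end-to-end timer
        name, value = raw.rsplit("_", 1)         # split at the last "_" character
        prog = RDK.Item(name, robolink.ITEM_TYPE_PROGRAM)
        if prog.Valid() and value == "1":
            prog.RunProgram()                    # (a) run the routine mapped to this signal
        else:
            RDK.setParam(name, value)            # e.g., store the PD analog reading
        latency_ms = (time.perf_counter() - t0) * 1000.0
        with open("latency_log.txt", "a") as log:
            log.write(f"{time.strftime('%H:%M:%S')}\t{latency_ms:.2f}\n")

threading.Thread(target=read_arduino_data, daemon=True).start()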
The developed script has a simple structure, comprising 112 lines of code together with the libraries mentioned above. Its purpose is to interact with the elements of a simulation configured in RoboDK, such as commands, routines, or events, and to link various external signals to these elements. Thus, the script allows the signals provided by external equipment and sensors to alter the elements of a simulation. In order to implement this concept, the following conditions are required:
• The station must be completely configured in RoboDK, including programs, modules, and routines. The various modules and routines should be readily available to be called when needed.
• The Python script must be integrated into the station. The Arduino module must be connected to the communication port defined in the script.
• The external sensors that are required in the application must be connected to the module. Also, the communication rate must be set to the same value both between the Arduino module and the computer and between the Arduino module and the Python script.
• The variable names that store the sensor values must be identical to the ones used for the name format in the Python script. For example, the "IO1_<value>" name format identifies the value of the IO1 signal. Thus, the variable used in the Arduino program that refers to the corresponding sensor signal must be named IO1. For the photodiode, the PD identifier is used. The "_<value>" component specifies the actual sensor value using integer data types. This naming format is important, as the script searches specifically for these elements. The received data take the form of a character string, and the name of the signal together with its value are extracted based on the location of the "_" character (a minimal parsing sketch is given after this list).
• The monitoring of the serial communication (serial monitor) from Arduino IDE must be closed, so that it does not interfere with the communication between the module and the script.
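As an illustration of this convention, the following minimal sketch recovers the signal name and its integer value from the position of the "_" character; the accepted identifiers mirror the hardware described here, and the function name is hypothetical.

EXPECTED = {"IO1", "IO2", "PD"}   # two digital inputs and the photodiode

def parse_signal(msg: str):
    """Return (name, value) for e.g. "IO1_1" or "PD_512", else None."""
    name, sep, raw = msg.rpartition("_")
    if sep and name in EXPECTED and raw.isdigit():
        return name, int(raw)     # the "_<value>" component carries an integer payload
    return None                   # malformed or unknown signal

print(parse_signal("IO1_1"))      # -> ('IO1', 1)
print(parse_signal("PD_512"))     # -> ('PD', 512)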
4. Results and Discussion
The primary objective of this study is to develop a practical and user-friendly solution
that enables the integration of real-world signals into RoboDK simulations in a simple and
efficient manner. The result is a compact, well-structured, and easy-to-implement script that
introduces a crucial functionality into a simulation environment that previously lacked this
capability. Our approach is not merely theoretical but is aimed at developing a method that
can be immediately applied in real-world scenarios, providing researchers and engineers
with a flexible and adaptable tool for testing industrial processes based on real sensor data.
This study thus demonstrates the feasibility and practical utility of the proposed solution,
contributing to the expansion of simulation and optimization capabilities in industrial
applications. Ultimately, the decisive criterion when evaluating the efficiency and performance of the developed solution is the measurement of system latency. While the implementation is compact, easy to use, and introduces a crucial functionality into the simulation environment, the true value of this method is validated by measuring its reaction time. These measurements are essential to demonstrate that the system can operate under conditions close to reality, ensuring a sufficiently fast response for the targeted industrial applications. Based on this, we conduct a series of rigorous tests to assess the total system latency.
The latency of the interface is defined as the time that passes between transmitting the data from the Arduino board and the execution of the code sequence triggered by the corresponding signal in RoboDK. This time period has two main elements: the time that reflects the communication speed between Arduino and the computer, and the time required for the triggered actions to be executed in RoboDK. This total time is called end-to-end latency. It reflects the capacity of the system (formed by the Arduino board, the Python script, and RoboDK) to complete certain actions based on sensor inputs. This duration can be influenced by various factors, such as the configured communication speed (baud rate), microprocessor specifications, algorithm efficiency, etc. Considering that the Arduino board used is the UNO model, the baud rate is set to the maximum value of 115,200.
To evaluate the response of the system, the application runs for 60 min. The timestamp and end-to-end latency are recorded at each signal value change. The latency_log.txt file can be found in the repository, along with the other relevant files. The total recording count is 4126. By calculating the average of these times, a mean value of 23.97 ms is determined. Figure 16 illustrates the latency evolution over the 60 min measurement period.
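The averaging and the distribution plot can be reproduced from the log file with a short script such as the sketch below, which assumes (this is an assumption, not the documented format) that each line of latency_log.txt contains a timestamp followed by the latency in milliseconds.

import matplotlib.pyplot as plt

def load_latencies(path="latency_log.txt"):
    """Read one latency value (last field, in ms) per log line."""
    values = []
    with open(path) as f:
        for line in f:
            fields = line.split()
            if fields:
                values.append(float(fields[-1]))
    return values

lat = load_latencies()
print(f"{len(lat)} records, mean = {sum(lat) / len(lat):.2f} ms")  # 4126 records, ~23.97 ms reported
plt.hist(lat, bins=50)                       # delay distribution (cf. Figure 17)
plt.xlabel("End-to-end latency [ms]")
plt.ylabel("Count")
plt.show()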
Figure 16. Graphical representation of the end-to-end latency evolution (recorded latency [ms] versus timestamp) during 60 min of continuous operation.
Looking at the graphical representation of the latency evolution, the communication delay may sometimes appear to vary at random. The reason is that the analog sensor is placed in front of a screen that continuously displays a randomly changing image. The transition between colors and images is not always abrupt enough to exceed the detection threshold quickly, so the moment at which the sensor detects the change and sends the signal can vary. The delay distribution is presented in Figure 17.
Figure 17. Histogram of latency measurements during the 60 min test period.
While the proposed framework successfully demonstrates the feasibility of real-time
sensor integration into RoboDK simulations, certain limitations should be acknowledged.
(a) Hardware Dependency:
• The current implementation relies on an Arduino-based platform for sensor acquisition, which may not be representative of higher-end industrial PLCs or edge computing systems.
• While the approach is modular, testing with more advanced industrial controllers could provide additional validation.
(b) Communication Latency and Real-Time Constraints:
• The system achieves an average latency of 23.97 ms, which is acceptable for most robotic applications but may not be sufficient for high-speed industrial processes requiring near-zero latency.
• Future improvements could explore faster communication protocols (e.g., TCP/IP, UDP, or real-time fieldbuses like EtherCAT) to reduce response time.
(c) Limited Sensor Modalities Tested:
• Future work should explore multi-sensor fusion techniques to enhance simulation realism and decision-making capabilities.
(d) Simulation vs. Real-World Performance:
• While real sensor data are used, RoboDK remains a simulation environment, and the actual performance of the system in a physical industrial setting has not been tested.
• Future work should focus on deploying the proposed approach in a real industrial setup, comparing simulated vs. real execution outcomes.
(e) Scalability and Complex Systems:
• The current study demonstrates a single robot and a limited number of sensors. Expanding this to multi-robot systems with complex sensor networks introduces additional challenges, such as data synchronization, computational load, and system architecture complexity.
By recognizing these limitations, this study sets the foundation for future enhance-
ments, ensuring that the proposed approach remains adaptable to industrial automation
trends and evolving robotic systems.
The direct integration of robotic systems without prior virtual commissioning and real-signal-based simulation poses significant operational risks, such as unexpected failures, prolonged downtime, and suboptimal system performance [31,32]. To mitigate these
challenges, the proposed real-time robotic simulation framework enhances sustainability
in industrial automation by minimizing material waste, optimizing production efficiency,
and enabling precise virtual commissioning. This approach accelerates deployment time and reduces energy use and material losses by identifying and addressing potential issues before physical implementation [33].
5. Conclusions
This study presents the development of an innovative and practical framework for
integrating real-world sensor signals into industrial robotic simulations. Unlike conven-
tional simulation environments that rely on predefined or artificially generated signals,
the proposed system enables real-time sensor-driven interactions within RoboDK. This is
achieved through a lightweight and efficient Python-based interface that links an Arduino-
powered hardware module to the simulation environment. The Arduino board is respon-
sible for acquiring sensor data, which are then transmitted via serial communication and
processed by the Python script, effectively translating physical sensor states into virtual
simulation triggers.
The proposed system distinguishes itself from existing solutions by offering a highly
practical, adaptable, and efficient approach to real-time robotic simulation enhancements.
Its main characteristics include ease of implementation and use, as the framework is
designed to be quickly and seamlessly integrated into any RoboDK application without re-
quiring significant modifications. It has minimal resource requirements, being a lightweight
solution consisting of only 112 lines of code, occupying just 7 KB of memory, and requiring
negligible computational power. Additionally, the system ensures high responsiveness
and low latency, with a measured total latency of 23.97 ms, making it suitable for real-time
industrial applications. By incorporating real sensor feedback, the method enhances the
realism of digital twins, ensuring that environmental dynamics and data variations are
accurately represented, thus improving simulation reliability. Moreover, the approach
supports a broad range of sensors, and its modular architecture allows for easy expansion
to accommodate multiple input sources and various sensor types, making it highly flexible
and scalable.
Compared with existing studies that focus on static simulations, the proposed ap-
proach introduces a real-time data-driven methodology for robotic process validation.
Traditional offline programming (OLP) solutions and similar platforms lack native support
for real-time external sensor integration. This study fills that gap by demonstrating a
working prototype that effectively bridges the virtual and physical domains. The measured
low-latency performance confirms that the system is not just a theoretical concept but a
practical and deployable solution for industrial robotic research and automation.
The current implementation lays the foundation for further advancements in
sensor-driven robotic simulations. Future work will focus on developing a dedicated
microcontroller-based module to replace the current Arduino-based solution, improving
hardware compatibility and integration with industrial controllers. Additionally, efforts
will be made to enhance RoboDK integration by transitioning from a Python script to a
dedicated RoboDK plugin, offering a graphical interface for real-time sensor interaction.
Another key improvement will be the implementation of real-time sensor calibration, al-
lowing users to define analog reference values, ensuring compatibility with a wider variety
of industrial sensors. Finally, alternative communication protocols, such as UDP, MQTT, or
real-time fieldbuses (e.g., EtherCAT, PROFINET), will be explored to further reduce latency
and improve synchronization with industrial automation systems.
This research represents a significant advancement in the domain of real-time digi-
tal twins and virtual commissioning, enabling more precise, responsive, and adaptable
industrial robotic simulations.
Author Contributions: Methodology, A.M.I.; Software, C.C.; Validation, A.M.I.; Investigation, C.C.;
Writing—original draft, C.C.; Writing—review & editing, A.M.I. All authors have read and agreed to
the published version of the manuscript.
Funding: This work was supported by the grant “GNAC ARUT 2023” contract no. 116/4/12/2023
financed by the National University of Science and Technology “Politehnica” Bucharest.
Data Availability Statement: To promote further use and development of this framework, the entire
project has been made available as open source. All the files are available at: https://github.com/
Cozmin90/rdk_signal_box01.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Choi, H.; Crump, C.; Duriez, C.; Elmquist, A.; Hager, G.; Han, D.; Hearl, F.; Hodgins, J.; Jain, A.; Leve, F.; et al. On the use of simulation in robotics: Opportunities, challenges, and suggestions for moving forward. Proc. Natl. Acad. Sci. USA 2021, 118, e1907856118. [CrossRef] [PubMed]
2. Negri, E.; Fumagalli, L.; Macchi, M. A Review of the Roles of Digital Twin in CPS-based Production Systems. Procedia Manuf. 2017, 11, 939–948. [CrossRef]
3. Ružarovský, R.; Holubek, R.; Sobrino, D.R.D.; Velíšek, K. A Case Study of Robotic Simulations Using Virtual Commissioning Supported by the Use of Virtual Reality. MATEC Web Conf. 2019, 299, 02006. [CrossRef]
4. Kritzinger, W.; Karner, M.; Traar, G.; Henjes, J.; Sihn, W. Digital Twin in Manufacturing: A Categorical Literature Review and Classification. IFAC-PapersOnLine 2018, 51, 1016–1022. [CrossRef]
5. Garbev, A.; Atanassov, A. Comparative Analysis of RoboDK and Robot Operating System for Solving Diagnostics Tasks in Off-Line Programming. In Proceedings of the 2020 International Conference Automatics and Informatics (ICAI), Varna, Bulgaria, 1–3 October 2020; pp. 1–5.
6. Zhao, W.; Queralta, J.P.; Westerlund, T. Sim-to-Real Transfer in Deep Reinforcement Learning for Robotics: A Survey. In Proceedings of the 2020 IEEE Symposium Series on Computational Intelligence (SSCI), Canberra, ACT, Australia, 1–4 December 2020. [CrossRef]
7. Pitkevich, A.; Makarov, I. A Survey on Sim-to-Real Transfer Methods for Robotic Manipulation. In Proceedings of the 2024 IEEE 22nd Jubilee International Symposium on Intelligent Systems and Informatics (SISY), Pula, Croatia, 19–21 September 2024; pp. 000259–000266.
8. Zaeh, M.; Schnoes, F.; Obst, B.; Hartmann, D. Combined offline simulation and online adaptation approach for the accuracy improvement of milling robots. CIRP Ann. 2020, 69, 337–340. [CrossRef]
9. Kadian, A.; Truong, J.; Gokaslan, A.; Clegg, A.; Wijmans, E.; Lee, S.; Savva, M.; Chernova, S.; Batra, D. Sim2Real Predictivity: Does Evaluation in Simulation Predict Real-World Performance? IEEE Robot. Autom. Lett. 2020, 5, 6670–6677. [CrossRef]
10. Saez, M.; Maturana, F.P.; Barton, K.; Tilbury, D.M. Real-Time Manufacturing Machine and System Performance Monitoring Using Internet of Things. IEEE Trans. Autom. Sci. Eng. 2018, 15, 1735–1748. [CrossRef]
11. Drăgoi, M.-V.; Nisipeanu, I.; Frimu, A.; Tălîngă, A.-M.; Hadăr, A.; Dobrescu, T.G.; Suciu, C.P.; Manea, A.R. Real-Time Home Automation System Using BCI Technology. Biomimetics 2024, 9, 594. [CrossRef]
12. Dragoi, M.-V.; Hadar, A.; Goga, N.; Grigorie, L.-S.; Stefan, A.; Ali, H.A. Design and implementation of an EEG-based BCI prosthetic lower limb using Raspberry Pi 4. UPB Sci. Bull. Ser. C 2023, 85, 353–366.
13. Lechler, T.; Fischer, E.; Metzner, M.; Mayr, A.; Franke, J. Virtual Commissioning–Scientific review and exploratory use cases in advanced production systems. Procedia CIRP 2019, 81, 1125–1130. [CrossRef]
14. Hofmann, W.; Langer, S.; Reggelin, T. Integrating Virtual Commissioning Based on High Level Emulation into Logistics Education. Procedia Eng. 2017, 178, 24–32. [CrossRef]
15. Schamp, M.; Hoedt, S.; Claeys, A.; Aghezzaf, E.; Cottyn, J. Impact of a virtual twin on commissioning time and quality. IFAC-PapersOnLine 2018, 51, 1047–1052. [CrossRef]
16. Li, P.; Liu, X. Common Sensors in Industrial Robots: A Review. J. Phys. Conf. Ser. 2019, 1267, 012036. [CrossRef]
17. Li, R.; Qiao, H.; Knoll, A. A Survey of Methods and Strategies for High-Precision Robotic Grasping and Assembly Tasks—Some New Trends. IEEE/ASME Trans. Mechatron. 2019, 24, 2718–2732. [CrossRef]
18. Vicentini, F. Collaborative Robotics: A Survey. J. Mech. Des. 2021, 143, 1–29. [CrossRef]
19. Eguti, C.C.A.; Trabasso, L.G. The virtual commissioning technology applied in the design process of a flexible automation system. J. Braz. Soc. Mech. Sci. Eng. 2018, 40, 396. [CrossRef]
20. Ugarte, M.; Etxeberria, L.; Unamuno, G.; Bellanco, J.L.; Ugalde, E. Implementation of Digital Twin-based Virtual Commissioning in Machine Tool Manufacturing. Procedia Comput. Sci. 2022, 200, 527–536. [CrossRef]
21. Schamp, M.; Van De Ginste, L.; Hoedt, S.; Claeys, A.; Aghezzaf, E.; Cottyn, J. Virtual Commissioning of Industrial Control Systems—A 3D Digital Model Approach. Procedia Manuf. 2019, 39, 66–73. [CrossRef]
22. Zhang, L.; Cai, Z.Q.; Ghee, L.J. Virtual Commissioning and Machine Learning of a Reconfigurable Assembly System. In Proceedings of the 2020 2nd International Conference on Industrial Artificial Intelligence (IAI), Shenyang, China, 23–25 October 2020; pp. 1–6.
23. Martinez, G.S.; Sierla, S.; Karhela, T.; Vyatkin, V. Automatic Generation of a Simulation-Based Digital Twin of an Industrial Process Plant. In Proceedings of the IECON 2018-44th Annual Conference of the IEEE Industrial Electronics Society, Washington, DC, USA, 21–23 October 2018; pp. 3084–3089.
24. Striffler, N.; Voigt, T. Concepts and trends of virtual commissioning—A comprehensive review. J. Manuf. Syst. 2023, 71, 664–680. [CrossRef]
25. Mihalič, F.; Truntič, M.; Hren, A. Hardware-in-the-Loop Simulations: A Historical Overview of Engineering Challenges. Electronics 2022, 11, 2462. [CrossRef]
26. Korpai, R.; Szántó, N.; Csapó, Á.B. A Framework for Effective Virtual Commissioning: Guiding Principles for Seamless System Integration. J. Manuf. Mater. Process. 2024, 8, 165. [CrossRef]
27. Gasiyarov, V.R.; Bovshik, P.A.; Loginov, B.M.; Karandaev, A.S.; Khramshin, V.R.; Radionov, A.A. Substantiating and Implementing Concept of Digital Twins for Virtual Commissioning of Industrial Mechatronic Complexes Exemplified by Rolling Mill Coilers. Machines 2023, 11, 276. [CrossRef]
28. Konstantinov, S.; Assad, F.; Ahmad, B.; Vera, D.A.; Harrison, R. Virtual Engineering and Commissioning to Support the Lifecycle of a Manufacturing Assembly System. Machines 2022, 10, 939. [CrossRef]
29. Ullrich, M.; Thalappully, R.; Heieck, F.; Lüdemann-Ravit, B. Virtual Commissioning of Linked Cells Using Digital Models in an Industrial Metaverse. Automation 2024, 5, 1–12. [CrossRef]
30. Noga, M.; Juhás, M.; Gulan, M. Hybrid Virtual Commissioning of a Robotic Manipulator with Machine Vision Using a Single Controller. Sensors 2022, 22, 1621. [CrossRef] [PubMed]
31. Grecu, I.; Belu, N.; Rachieru, N. Risk Measurement and Prioritization Using Fuzzy FMEA Approach—A Study of Process in Automotive Industry. In Proceedings of the International Conference on Management and Industrial Engineering, Bucharest, Romania, 1 November 2019; pp. 325–335.
32. Nechita, R.; Ulerich, O.; Radoi, E. Risk management in research projects. FAIMA Bus. Manag. J. 2024, 12, 15–23.
33. Silvestru, C.I.; Lupescu, M.-E.; Ifrim, A.-M.; Silvestru, R.; Icociu, C.-V. The Impact of Sustainability on the Labour Market and Employability in the Construction Industry. Sustainability 2024, 16, 10284. [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.