Slyusar Vadym, DSc, Prof
MOD of Ukraine
The role of Artificial Intelligence in cross-platform tailoring of AR data
Because Artificial Intelligence (AI) will form the basis of future control
networks, the incorporation of AI is an important trend in the development of
battlefield and weapons control systems.
NATO experts use several alternative definitions of AI, but in the context of AR
the definition from NIAG Study Group SG-238 can be recommended: “AI refers to
systems designed by humans that, given a complex goal, act in the physical or
digital world by perceiving their environment, interpreting the collected structured
or unstructured data, reasoning on the knowledge derived from this data and
deciding the best action(s) to take (according to pre-defined parameters) to achieve
the given goal. AI systems can also be designed to learn to adapt their behavior by
analyzing how the environment is affected by their previous actions”.
AI is useful in particular for making heterogeneous AR systems work together;
improving AR data exchange and target allocation (including between nations);
operating with fewer AR data resources; and coordinating sensors and effectors,
threat detection and identification, and semi-autonomous weapon allocation. AI is
a means of improving response speed (against fast, pop-up, or numerous threats),
derivation of intent, situational awareness, and debriefing with the help of AR. On
the other hand, AR is a communication bridge and feedback mechanism from AI to
the human in support of decision making.
The main benefits of AI and ML are to enhance C2 (with AR),
communications, sensors, integration, and interoperability. On the basis of AI and
ML trained on Microsoft Common Objects in Context (MS-COCO), or other
technologies, AR symbols can be synthesized (such as outline symbols of targets),
as illustrated in the sketch below. This enables joint target acquisition and the
targeting of moving targets (single or swarm), and supports coordination and
deconfliction of distributed Joint Fires between networked combat vehicles, tanks,
helicopters, ships, etc., including within Manned-Unmanned Teaming (MUM-T).
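A minimal sketch of how outline AR symbols of targets could be derived from a single camera frame, assuming a COCO-pretrained instance-segmentation model (torchvision Mask R-CNN, recent torchvision) as the ML back end; the score threshold and the helper extract_outlines() are illustrative assumptions, not part of any fielded system.

```python
import cv2
import numpy as np
import torch
import torchvision

# COCO-pretrained instance-segmentation model (assumes torchvision >= 0.13).
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def extract_outlines(frame_bgr, score_threshold=0.7):
    """Return one outline polygon (Nx2 pixel array) per confident detection."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        pred = model([tensor])[0]
    outlines = []
    for mask, score in zip(pred["masks"], pred["scores"]):
        if score < score_threshold:
            continue
        binary = (mask[0].numpy() > 0.5).astype(np.uint8)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            # Keep the largest contour of the mask as the target's outline symbol.
            outlines.append(max(contours, key=cv2.contourArea).squeeze(1))
    return outlines
```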
AI can be used to visually identify objects and targets on the battlefield. For
this task a cloud-based or multi-platform cooperative AI algorithm can be used,
distributed among several vehicles, to create joint three-dimensional outline AR
symbols for the common operational picture.
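A minimal sketch of the cooperative part of this idea: several vehicles report their own 3-D outline detections in a shared world frame and a simple proximity rule merges reports of the same physical target into one joint symbol for the common operational picture. The 5 m association radius and the "keep the most detailed outline" rule are assumptions made only for illustration.

```python
import numpy as np

def fuse_detections(per_vehicle_outlines, radius_m=5.0):
    """per_vehicle_outlines: list (one entry per vehicle) of lists of Nx3
    arrays in shared world coordinates. Returns one outline per fused target."""
    fused = []  # list of (centroid, [outlines from all vehicles])
    for outlines in per_vehicle_outlines:
        for outline in outlines:
            centroid = outline.mean(axis=0)
            for entry in fused:
                # Associate with an existing target if the centroids are close.
                if np.linalg.norm(entry[0] - centroid) < radius_m:
                    entry[1].append(outline)
                    break
            else:
                fused.append((centroid, [outline]))
    # Keep the most detailed reported outline as the joint AR symbol per target.
    return [max(group, key=len) for _, group in fused]
```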
AI algorithms can not only build outline AR symbols of targets but also
synthesize their AR vulnerability models, similar to the German VEMAG model,
the Swiss RUAG model, the US MILES LEAR model, and the French GDI model,
which are currently used for modeling and simulation.
These visualized vulnerability models decompose an enemy object into
several sides and those sides into hit areas, allowing a more precise and/or
effective relation between the point of impact and the specific damage effect.
Information about such hit areas can be distributed as AR symbols among
networked combat vehicles within a unit for the coordinated destruction of a
difficult target. The level of decomposition of the AR shape symbols can be
changed depending on the distance to the target, and the state of that
decomposition may itself serve as additional information about the current
distance (see the sketch below). This concept will increase the effectiveness of
combat engagement if unmanned platforms are used as forward observers to
produce vulnerability-area symbols of targets.
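A sketch of distance-dependent decomposition of a target's AR vulnerability shape. The side and area names, the distance thresholds, and the three-level scheme are hypothetical placeholders, not values taken from any of the models named above; the point is only that the rendered level of detail grows as the shooter closes, and that the displayed level implicitly encodes the range bracket.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class VulnerabilityArea:
    name: str        # e.g. "engine deck", "turret ring" (illustrative)
    outline_3d: list # 3-D outline points in the target's local frame

@dataclass
class TargetSide:
    name: str                      # "front", "left", "rear", ...
    areas: List[VulnerabilityArea] # finer hit areas within this side

def decomposition_level(distance_m: float) -> int:
    """0 = whole-target outline only, 1 = per-side decomposition,
    2 = individual hit areas. Thresholds are assumed, not doctrinal."""
    if distance_m > 3000:
        return 0
    if distance_m > 1200:
        return 1
    return 2

def symbols_to_render(sides: List[TargetSide], distance_m: float):
    level = decomposition_level(distance_m)
    if level == 0:
        return ["target outline"]
    if level == 1:
        return [side.name for side in sides]
    return [area.name for side in sides for area in side.areas]
```

Because the mapping from distance to level is shared, a crew seeing per-side symbols rather than individual hit areas can infer the approximate range bracket without an explicit readout.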
The color of the target’s outline symbol obtained from other vehicles via the
BMS can be updated using an AI algorithm to solve occlusion problems and
optimize its visual perception against the background of the scene.
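A minimal sketch of re-coloring a shared outline symbol so that it stays legible against the local scene; the simple brightness-contrast rule and the two candidate colors stand in for the AI-based perceptual optimization described above and are assumptions for illustration only.

```python
import cv2
import numpy as np

def pick_outline_color(frame_bgr, outline_pts):
    """Choose a dark or bright outline color depending on the mean brightness
    of the pixels the outline will be drawn over."""
    mask = np.zeros(frame_bgr.shape[:2], dtype=np.uint8)
    cv2.polylines(mask, [outline_pts.astype(np.int32)], True, 255, thickness=5)
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    mean_brightness = cv2.mean(gray, mask=mask)[0]
    # Dark outline over bright background, bright (yellow) outline otherwise.
    return (0, 0, 0) if mean_brightness > 128 else (0, 255, 255)  # BGR
```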
Neither the vulnerability models nor the associated vulnerability-calculation
mechanisms need to be standardized in order to achieve technical interoperability,
although a standardized vulnerability model could be used to create a reference
model. Still, the input to the vulnerability model and the output of the AR model
should be standardized to achieve technical compatibility, while the mechanism
for calculating the underlying vulnerability model using AI can be considered a
“black box”.
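A sketch of this interoperability idea: only the entry (a target observation) and the exit (AR vulnerability symbols) are standardized, while each nation's calculation mechanism remains a black box behind the same interface. The field names below are illustrative assumptions, not an existing BMS or NATO standard.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List

@dataclass
class TargetObservation:      # standardized entry
    target_id: str
    target_class: str
    pose: tuple               # position and orientation in a shared frame
    distance_m: float

@dataclass
class ARVulnerabilitySymbol:  # standardized exit
    target_id: str
    area_name: str
    outline_3d: list          # points to render in the AR view

class VulnerabilityModel(ABC):
    """Black box: any national model (AI-based or not) can implement this."""
    @abstractmethod
    def compute(self, obs: TargetObservation) -> List[ARVulnerabilitySymbol]:
        ...
```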
In the future, as the autonomy of robotic UGVs increases and analogues of
human perception, such as physical vision, are integrated into robotics, part of the
AR data from the BMS should be transferred to the UGV for use by its autopilot
for orientation and mission support. To this end, the necessary volumes of AR data
can be preloaded before the start of the UGV mission and quickly updated on
board the UGV during execution of the task.
In the case of a remotely controlled UGV, the overlaying of preloaded AR
symbols onto the video stream from the UGV’s on-board cameras should be
performed in the UGV equipment itself, with subsequent transmission of the fully
composed combination of preloaded AR and video stream to the UGV operator (a
minimal compositing sketch follows below). Such a solution decreases navigation
errors and excludes additional mistakes, such as those caused by the operator’s
localization when placing AR symbols in the correct position on the terrain image.
This improves the accuracy of target acquisition and situational awareness.
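A minimal sketch of on-board compositing for a remotely controlled UGV: preloaded AR symbols are drawn onto each camera frame before transmission, so the operator receives a single, already-registered image. The project_to_image callable, which maps world coordinates to pixels using the UGV's own pose estimate, is an assumed placeholder for the platform's localization and camera model.

```python
import cv2
import numpy as np

def overlay_preloaded_symbols(frame_bgr, symbols, project_to_image):
    """symbols: iterable of (3-D polyline, BGR color) pairs preloaded before
    the mission; project_to_image maps world points to pixel coordinates."""
    out = frame_bgr.copy()
    for polyline_world, color in symbols:
        pts = project_to_image(polyline_world).astype(np.int32)
        cv2.polylines(out, [pts], isClosed=True, color=color, thickness=2)
    return out

# The composited frame, rather than raw video plus separately placed symbols,
# is what gets streamed to the operator station.
```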
At the same time, AR outline symbols of targets will be synthesized as AR
data on the basis of the point cloud from the UGV’s on-board vision sensors using
AI algorithms. AI can also perform the following functions: warning about the
possibility of capsizing; determining a safe path; detecting suddenly emerging
threats that impede movement; visual warning to mark areas requiring special
attention; analysis of hyperspectral images of the soil to identify changes in its
surface, which is a sign of artificial camouflage of improvised explosive devices or
mines; and camouflage identification against the backdrop of a natural landscape.
All results of such identification will be presented as AR symbols. Such
synthesized AR symbols can be sent to the operator at the command post or to
another vehicle within the MUM-T without the video stream, to minimize traffic
(see the sketch below), or incorporated into the full video stream in combination
with preloaded AR symbols. In this case, it is necessary to solve the problem of
integrating the on-board AR data generation tools with the UGV architecture, as
well as to find a compromise in the level of centralization of their connection to
the BMS; this is also very important within the MUM-T.
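A sketch of sending only the synthesized AR symbols (not video) across the MUM-T network to save bandwidth; the message layout, the JSON encoding, and the zlib compression are hypothetical choices made for illustration, not a BMS or STANAG message format.

```python
import json
import zlib

def encode_ar_symbols(symbols):
    """symbols: list of dicts such as
    {"id": "...", "class": "...", "outline": [[x, y, z], ...]}.
    Returns a small compressed payload suitable for a low-bandwidth link."""
    return zlib.compress(json.dumps(symbols, separators=(",", ":")).encode())

def decode_ar_symbols(payload):
    """Inverse of encode_ar_symbols, run at the command post or receiving vehicle."""
    return json.loads(zlib.decompress(payload).decode())
```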