Received: 27 September 2022 Revised: 15 December 2022 Accepted: 3 January 2023 IET Science, Measurement & Technology
DOI: 10.1049/smt2.12141
ORIGINAL RESEARCH
Unmanned ground vehicle-unmanned aerial vehicle relative
navigation robust adaptive localization algorithm
Jun Dai1,2, Songlin Liu1, Xiangyang Hao1, Zongbin Ren1, Xiao Yang3, Yunzhu Lv1
1Institute of Geospatial Information, Information
Engineering University, Zhengzhou, China
2School of Aerospace Engineering, Zhengzhou
University of Aeronautics, Zhengzhou, China
3Dengzhou Water Conservancy Bureau, Dengzhou,
China
Correspondence
Xiangyang Hao, Institute of Geospatial Information,
Information Engineering University, Zhengzhou
450001, China.
Email: xiangyanghao2004@163.com
Funding information
The Science and Technology Department of Henan
Province, Grant/Award Numbers: 222102110029,
222102220095; Information Engineering University,
Grant/Award Number: F4211
Abstract
The unmanned aerial vehicle (UAV) and the unmanned ground vehicle (UGV) can complete complex tasks through information sharing, which ensures the mission execution capability of multiple unmanned platforms. At the same time, cooperative navigation can exploit the information interaction between multi-platform sensors to improve the relative navigation and positioning accuracy of the entire system. To address the deviation of the system model caused by gross errors in sensor measurements or strong manoeuvres in complex environments, a robust adaptive UGV-UAV relative navigation and positioning algorithm is proposed. In this paper, the relative navigation and positioning of UGV-UAV is studied based on an inertial measurement unit (IMU)/vision combination. After analysing the relative kinematics model and the sensor measurement models, a leader (UGV)-follower (UAV) relative navigation model is established. In the implementation of the relative navigation and positioning algorithm, the robust adaptive algorithm is combined with the non-linear Kalman filter (extended Kalman filter [EKF]) to dynamically adjust the system state parameters. Finally, a mathematical simulation of the accompanying and landing process in the UGV-UAV cooperative scene is carried out. The relative position, velocity, and attitude errors computed by the EKF, Robust-EKF, and Robust-Adaptive-EKF algorithms are compared for two cases, in which the noise obeys a Gaussian distribution and a non-Gaussian distribution, respectively. The results show that under non-Gaussian conditions the accuracy of the Robust-Adaptive-EKF algorithm is improved by about two to three times compared with the EKF and Robust-EKF, and almost reaches the relative navigation accuracy obtained under Gaussian conditions. The robust adaptive relative navigation and positioning algorithm proposed in this paper adapts well to the uncertainty and deviation of system modelling in complex environments, with higher accuracy and stronger robustness.
1 INTRODUCTION
With the continuous improvement of unmanned, intelligent, and autonomous requirements, unmanned aerial vehicles (UAVs) and unmanned ground vehicles (UGVs) have shown excellent characteristics in high-risk and complex tasks [1, 2]. Therefore, they have become a research hotspot and are developing rapidly.
When a UAV and a UGV cooperate with each other to complete a certain task, collaborative navigation technology can utilize the information interaction between multi-platform sensors to achieve multi-carrier information collaboration and ensure the task execution capability of multiple carriers [3–5].
This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
© 2023 The Authors. IET Science, Measurement & Technology published by John Wiley & Sons Ltd on behalf of The Institution of Engineering and Technology.
In cooperative and relative positioning, the focus of attention
is on the relative position, relative velocity, and relative attitude
calculations between platforms. However, due to the time and
space inconsistency between different platforms and the differ-
ent accuracy of the sensors carried by each platform, the relative
calculation of the navigation information of each platform has
IET Sci. Meas. Technol. 2023;17:183–194. wileyonlinelibrary.com/iet-smt 183
a large error. Relative navigation and positioning algorithms can directly determine the relative position, velocity, and attitude of different moving bodies, which is more intuitive and accurate for coordinated control between the carriers. High-precision, high-reliability relative navigation and positioning guarantees the overall efficiency of UGV-UAV cooperative tasks and is the core technology for realizing their coordinated flight and landing [6–9].
At present, the sensors used for relative navigation and posi-
tioning mainly include inertial measurement devices and visual
sensors. Since the inertial measurement unit has constant drift,
and the visual sensor has the problem of image blur in the case
of scale uncertainty and fast motion, the combination of the two
can achieve complementary advantages, which can improve the
relative navigation and positioning accuracy [10–13].
Yang Dongchun et al. [14] proposed a relative navigation sys-
tem based on laser ranging/inertial navigation for the problem
of long-distance rendezvous and docking of non-cooperative
targets and used the Kalman filter algorithm to obtain the
relative position and relative velocity errors and correct the
recursive results. The simulation results show that the relative
position error obtained by this method is less than 1 m, and
the relative velocity error is less than 0.05 m/s. Kim et al. [15]
proposed a relative navigation system based on inertial naviga-
tion system (INS)/VisNav, which was applied to the problem
of autonomous rendezvous and docking of spacecraft. The
method adopts the form of data link to transmit the navigation
data between spacecraft and uses the relative line of sight vector
information to correct the relative motion parameter error, and
finally the accurate relative navigation data is obtained. Wang
Dong et al. [16] proposed a relative navigation method based on
INS/GPS for the uncertainty of the inertial array of the tracker
in the fly-around mode and designed the corresponding naviga-
tion filter combined with the robust Kalman filter algorithm. A
relative position accuracy of 0.1 m and a relative attitude accu-
racy of 0.001 degrees are obtained. Zhang et al. [17] proposed
a method to calibrate the external parameters in the process
of estimating the relative state using visual/inertial navigation.
In this method, the external parameters of vision/inertial nav-
igation are regarded as state variables, which together with the
relative orbit kinematics equation, relative attitude equation, and
inertial navigation model are combined into the system state
equation. The extended Kalman filter is designed to estimate the
relative pose, inertial navigation bias, and external parameters
of vision/inertial navigation by using the state equation and the
observation of monocular vision, and the validity of the method
is verified by mathematical simulation. Fosbury et al. [18] pro-
posed a relative navigation method based on INS/VisNav
and applied it to UAV cooperative formation flight. The
relative position, attitude, and velocity information between
UAVs are effectively estimated by the extended Kalman filter
algorithm.
For multi-source fusion relative navigation, Qu et al. [19]
proposed a fault-tolerant relative navigation method for UAVs
based on inertial navigation/GPS/vision for the coordinated
formation flight of UAVs, which effectively improved the rel-
ative navigation accuracy and the fault tolerance of the system.
Benjamin et al. [20] proposed an unscented Kalman filter-based
navigation algorithm for the actual shipboard autonomous
take-off and landing problem. It combines GPS/INS/vision
three-part measurement information and completes the esti-
mation process of relative motion state under the condition
of considering the mathematical model of sea surface motion.
Chen [21] proposed a relative navigation algorithm based on robust high-order volume filtering for the autonomous landing of a small unmanned helicopter relative to a mobile platform. Extended Kalman filtering (EKF), higher-order volume Kalman filtering (HEKF), Huber-based filtering (HF), and robust higher-order volumetric filtering (HHCF) were compared in estimating relative position, relative velocity, and relative attitude. The results demonstrate the effectiveness of the HHCF algorithm under non-linear and non-Gaussian conditions and its robustness to non-Gaussian noise.
However, most previous research covers only the basic implementation of relative navigation, or uses robust algorithms to handle gross errors and outliers in the data, without considering the mismatch with reality caused by deviation of the system model. In general, neither the kinetic model nor the random error model on which Kalman filtering is based can be expected to be accurate. Especially when the carrier manoeuvres strongly, such as turning, climbing, or accelerating, the actual system is no longer consistent with the assumed model, and the random error no longer obeys a Gaussian distribution. In this case, relying too heavily on the original system model leads to error accumulation and even filter divergence. Based on this, we propose a
robust adaptive relative navigation and localization algorithm,
which combines both the robust and adaptive algorithms into
non-linear Kalman filtering for state estimation. Specifically, we
make the following contributions.
We analyze the relative kinematics model and sensor mea-
surement model and build a leader (UGV)-follower (UAV) relative
navigation model based on the inertial measurement unit
(IMU)/Vision sensor combination.
We propose a robust adaptive relative navigation and positioning algorithm, which combines the robust adaptive algorithm with the non-linear Kalman filter (EKF) to adjust the system state parameters dynamically and adaptively.
We simulated two cases of Gaussian and non-Gaussian dis-
tribution for mathematical simulation experiments, in which
we simulated the non-Gaussian distribution noise by estab-
lishing the contamination distribution, and compared and
analyzed the relative position, velocity, and attitude accu-
racy calculated by EKF, Robust-EKF, and Robust-Adaptive-
EKF algorithm. Finally, the accuracy and robustness of the
proposed Robust-Adaptive-EKF algorithm in complex envi-
ronments are verified. At the same time, the real-time per-
formance of the Robust-Adaptive-EKF algorithm was also
tested.
The rest of this paper is organized as follows: The definition of the relevant coordinate systems is given in Section 2. In
Section 3, a relative navigation model based on IMU/vision
is established. In Section 4, the relative navigation filtering
algorithms are proposed, including traditional extended
Kalman, robust algorithm, and adaptive algorithm. In Section 5,
through mathematical simulation experiments, the accuracy
performances of the three algorithms are discussed and ana-
lyzed. Finally, conclusions and further research arrangements
are drawn in Section 6.
2 THE RELEVANT COORDINATE SYSTEM DEFINITION
Inertial coordinate system (I) [22]: The origin is located at the centre of the earth, the x-axis points to the vernal equinox, the z-axis is aligned with the earth's polar axis, and the y-axis completes the right-handed coordinate system with the z- and x-axes.
UGV body coordinate system (G): The origin is located at the centre of mass of the body, the y-axis is along the longitudinal axis of the body, the z-axis is perpendicular to the body plane, and the x-axis completes the right-handed coordinate system with the y- and z-axes.
UAV body coordinate system (A): The origin is located at the centre of mass of the body, and the x-, y-, and z-axes are the roll, pitch, and yaw axes, forming a right-handed coordinate system.
Camera coordinate system (C): The origin is at the centre of the camera's projection, the z-axis points towards the image plane along the optical axis, and the x- and y-axes are parallel to the corresponding axes of the A coordinate system.
Image coordinate system (p): The origin is the intersection of the camera optical axis and the image plane, and the u- and v-axes are parallel to the image rows and columns, respectively.
3 RELATIVE NAVIGATION MODEL
Here, as is shown in Figure 1, it is assumed that the naviga-
tion parameters such as UGV’s position, speed, acceleration,
attitude, and body angular rate are provided by the navigation
system of the UGV, and they are transmitted to the UAV in real time through an interconnected data transmission system. The navigation information of the UAV itself is output by its inertial navigation system, and a relative navigation solution containing errors is obtained by differencing the navigation information of the two carriers. State estimation corrects the relative navigation error of the INS by using the output of the visual navigation system as the observation. The relative attitude, relative position, relative velocity, and the measurement errors of the gyroscope and accelerometer in the UAV inertial navigation system form the state vector of the filter.
3.1 Kinematic model
3.1.1 Relative attitude motion equation
The relative attitude motion equation [23] in this paper can be expressed as (1), with the relative rotational angular velocity defined as $\omega^A_{A/G} = \omega_A - R^A_G\,\omega_G$.

FIGURE 1 UGV-UAV relative navigation model. UAV, unmanned aerial vehicle; UGV, unmanned ground vehicle.

$$\dot{q}^A_{A/G} = \tfrac{1}{2}\, q^A_{A/G} \otimes \omega^A_{A/G} \qquad (1)$$
where $q^A_{A/G}$ represents the relative quaternion of the UAV with respect to the UGV, $\omega^A_{A/G}$ represents the relative attitude angular velocity, and $R^A_G$ represents the rotation matrix from the UGV body frame to the UAV body frame. $\omega_A = [\omega_{Ax}, \omega_{Ay}, \omega_{Az}]^T$ represents the attitude angular velocity of the UAV, and $\omega_G = [\omega_{Gx}, \omega_{Gy}, \omega_{Gz}]^T$ represents the attitude angular velocity of the UGV. So (1) can also be written as (2):

$$\dot{q}^A_{A/G} = \tfrac{1}{2}\, q^A_{A/G} \otimes \left(\omega_A - R^A_G\,\omega_G\right) = \tfrac{1}{2}\left[\Omega(\omega_A) - \Gamma(\omega_G)\right] q^A_{A/G} \qquad (2)$$
where

$$\Omega(\omega_A) = \begin{bmatrix} 0 & \omega_{Az} & -\omega_{Ay} & \omega_{Ax} \\ -\omega_{Az} & 0 & \omega_{Ax} & \omega_{Ay} \\ \omega_{Ay} & -\omega_{Ax} & 0 & \omega_{Az} \\ -\omega_{Ax} & -\omega_{Ay} & -\omega_{Az} & 0 \end{bmatrix}, \qquad \Gamma(\omega_G) = \begin{bmatrix} 0 & -\omega_{Gz} & \omega_{Gy} & \omega_{Gx} \\ \omega_{Gz} & 0 & -\omega_{Gx} & \omega_{Gy} \\ -\omega_{Gy} & \omega_{Gx} & 0 & \omega_{Gz} \\ -\omega_{Gx} & -\omega_{Gy} & -\omega_{Gz} & 0 \end{bmatrix}$$
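As a concrete illustration of eqs (1) and (2), the relative quaternion can be propagated numerically. The sketch below is illustrative only (it is not the authors' implementation): it assumes a scalar-last quaternion convention matching the $\Omega$, $\Gamma$ matrices above and uses simple Euler integration with renormalization, then checks that identical angular rates on both platforms leave the relative attitude unchanged.

```python
import numpy as np

def Omega(w):
    """Quaternion-rate matrix Omega(w) for w = [wx, wy, wz] (scalar-last q)."""
    wx, wy, wz = w
    return np.array([[0.0,  wz, -wy,  wx],
                     [-wz, 0.0,  wx,  wy],
                     [ wy, -wx, 0.0,  wz],
                     [-wx, -wy, -wz, 0.0]])

def Gamma(w):
    """Quaternion-rate matrix Gamma(w) for the UGV angular rate."""
    wx, wy, wz = w
    return np.array([[0.0, -wz,  wy,  wx],
                     [ wz, 0.0, -wx,  wy],
                     [-wy,  wx, 0.0,  wz],
                     [-wx, -wy, -wz, 0.0]])

def propagate_relative_quaternion(q, w_A, w_G, dt):
    """One Euler step of q_dot = 0.5 * (Omega(w_A) - Gamma(w_G)) q, eq. (2)."""
    q = q + 0.5 * (Omega(w_A) - Gamma(w_G)) @ q * dt
    return q / np.linalg.norm(q)  # keep unit norm

q = np.array([0.0, 0.0, 0.0, 1.0])   # identity relative attitude
w = np.array([0.01, -0.02, 0.03])    # rad/s, identical on both platforms
for _ in range(1000):
    q = propagate_relative_quaternion(q, w, w, 0.01)
# equal angular rates leave the relative attitude at identity
```

When `w_A` and `w_G` differ, the relative quaternion drifts away from identity, which is the quantity the filter later estimates.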
3.1.2 Relative centroid motion equation
The relative position vector of the two carriers is defined as follows (3):

$$r^A_{A/G} = \left(R^A_G\right)^T\!\left(r^A_{A/I} - r^G_{G/I}\right) = R^G_I\left(r^I_{A/I} - r^I_{G/I}\right) \qquad (3)$$

where $r^A_{A/G}$ represents the component of the position vector of the UAV relative to the UGV in the UAV body coordinate system, $r^A_{A/I}$ and $r^G_{G/I}$ are the position vectors of the UAV and UGV relative to the inertial frame, expressed in their respective body coordinate systems, and $r^I_{A/I}$ and $r^I_{G/I}$ represent the components of the position vectors of the UAV and UGV in the inertial coordinate system. $R^G_I$ represents the rotation matrix describing the attitude of the UGV with respect to the inertial coordinate system.
The relative velocity vector and relative acceleration vector of the two carriers are expressed as follows (4) and (5):

$$\dot{r}^A_{A/G} = \dot{R}^G_I\left(r^I_{A/I} - r^I_{G/I}\right) + R^G_I\left(\dot{r}^I_{A/I} - \dot{r}^I_{G/I}\right) = -\left[(\omega_G)\times\right] r^A_{A/G} + R^G_I\,\dot{r}^I_{A/G} \qquad (4)$$

$$\begin{aligned} \ddot{r}^A_{A/G} &= \frac{d^2}{dt^2}\, r^A_{A/G} \\ &= \frac{d^2 R^G_I}{dt^2}\left(r^I_{A/I} - r^I_{G/I}\right) + 2\,\frac{dR^G_I}{dt}\,\frac{d}{dt}\left(r^I_{A/I} - r^I_{G/I}\right) + R^G_I\,\frac{d^2}{dt^2}\left(r^I_{A/I} - r^I_{G/I}\right) \\ &= -\left[(\dot{\omega}_G)\times\right] r^A_{A/G} - \left[(\omega_G)\times\right]\left[(\omega_G)\times\right] r^A_{A/G} - 2\left[(\omega_G)\times\right]\dot{r}^A_{A/G} + \left(R^A_G\right)^T a_A - a_G \qquad (5) \end{aligned}$$
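The bracket notation $[(\omega)\times]$ used throughout eqs (4) and (5) denotes the skew-symmetric cross-product matrix. A minimal helper (illustrative, not from the paper) makes the centripetal term of eq. (5) concrete:

```python
import numpy as np

def skew(v):
    """Cross-product (skew-symmetric) matrix [(v)x], so skew(a) @ b == a x b."""
    x, y, z = v
    return np.array([[0.0, -z,   y],
                     [ z,  0.0, -x],
                     [-y,  x,  0.0]])

a = np.array([1.0, 2.0, 3.0])
b = np.array([-0.5, 0.4, 2.0])
assert np.allclose(skew(a) @ b, np.cross(a, b))

# centripetal term of eq. (5): [(w)x][(w)x] r, for a 0.1 rad/s yaw rate
w = np.array([0.0, 0.0, 0.1])
r = np.array([10.0, 0.0, 0.0])
centripetal = skew(w) @ skew(w) @ r   # points back towards the rotation axis
```

The same `skew` helper is what an implementation would reuse for the Coriolis term $-2[(\omega_G)\times]\dot{r}^A_{A/G}$.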
3.1.3 Sensor measurement models
IMU measurement model
In this paper, it is assumed that the IMU of the UGV is of
high precision, and only the IMU measurement error model of
the UAV is discussed here. The following gyroscope model and
the accelerometer measurement model are all developed for the
UAV.
The UAV gyroscope measurement model is as follows (6):

$$\omega_A = \omega_m - b_g - n_g, \qquad \dot{b}_g = n_{gw} \qquad (6)$$
FIGURE 2 Relative measurement coordinate relationship
where $b_g = [b_{gx}, b_{gy}, b_{gz}]^T$ represents the gyroscope bias, $n_g$ represents the measurement noise of the gyroscope, which is Gaussian white noise, $n_{gw}$ represents the gyroscope random walk, which is also Gaussian white noise, and $\omega_m$ is the actual gyroscope measurement value.
The UAV accelerometer measurement model is as follows (7):

$$a_A = a_m - b_a - n_a, \qquad \dot{b}_a = n_{aw} \qquad (7)$$

where $b_a = [b_{ax}, b_{ay}, b_{az}]^T$ is the accelerometer bias, $n_a$ is the measurement noise of the accelerometer, which is Gaussian white noise, $n_{aw}$ is the accelerometer random walk, which is also Gaussian white noise, and $a_m$ is the actual accelerometer measurement value.
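Equations (6) and (7) can be exercised directly by generating a synthetic IMU stream: measurement = truth + bias + white noise, with the bias driven by a random walk. The sketch below is a minimal illustration; the noise magnitudes are placeholders, not the values of Table 1.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_imu(true_w, true_a, dt, sigma_g=1e-4, sigma_a=1e-3,
                 sigma_gw=1e-6, sigma_aw=1e-5):
    """Synthetic IMU per eqs (6)-(7): white noise plus a random-walk bias.
    Noise magnitudes here are illustrative only."""
    n = len(true_w)
    bg = np.zeros(3)
    ba = np.zeros(3)
    w_m = np.empty((n, 3))
    a_m = np.empty((n, 3))
    for k in range(n):
        bg = bg + sigma_gw * np.sqrt(dt) * rng.standard_normal(3)  # b_g_dot = n_gw
        ba = ba + sigma_aw * np.sqrt(dt) * rng.standard_normal(3)  # b_a_dot = n_aw
        w_m[k] = true_w[k] + bg + sigma_g * rng.standard_normal(3)  # eq. (6)
        a_m[k] = true_a[k] + ba + sigma_a * rng.standard_normal(3)  # eq. (7)
    return w_m, a_m

true_w = np.zeros((1000, 3))                       # stationary carrier
true_a = np.tile([0.0, 0.0, 9.8], (1000, 1))       # gravity only
w_m, a_m = simulate_imu(true_w, true_a, dt=0.01)
```

Averaging the stream recovers the true signal up to the slowly growing bias, which is exactly why $b_g$ and $b_a$ are carried in the filter state.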
Camera measurement model
As shown in Figure 2, the components of the $i$th feature point in the camera coordinate system and the vehicle body coordinate system are represented by $r^C_i(x^C_i, y^C_i, z^C_i)$ and $r^G_i(x^G_i, y^G_i, z^G_i)$ respectively, where it is assumed that the camera coordinate system coincides with the UAV body coordinate system. It can be seen from Figure 2 that $r^C_i$ and $r^G_i$ are related as follows:

$$r^C_i = \begin{bmatrix} x^C_i \\ y^C_i \\ z^C_i \end{bmatrix} = R^A_G\, r^G_i - r^A_{A/G} \qquad (8)$$
According to the camera imaging mapping model, the observations are as follows (9):

$$z_i = \begin{bmatrix} u_i \\ v_i \end{bmatrix} = \frac{f}{z^C_i}\begin{bmatrix} x^C_i \\ y^C_i \end{bmatrix} + \eta_i = \frac{f\,\Pi\, r^C_i}{[0\ \ 0\ \ 1]\, r^C_i} + \eta_i \qquad (9)$$
where $f$ is the focal length of the camera, $\Pi = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}$, $u_i$ and $v_i$ respectively represent the coordinates of the $i$th feature point in the image coordinate system, and $\eta_i$ represents the measurement noise of the feature points, which is Gaussian white noise.
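The measurement chain of eqs (8) and (9) can be sketched in a few lines. The example below is illustrative (aligned frames and a hand-picked relative position are assumed, and $f = 50$ mm as in Section 5.1.3); it maps a feature known in the UGV frame into the camera frame and projects it, noise-free.

```python
import numpy as np

def project_feature(R_AG, r_rel, r_G_i, f=0.05):
    """Eqs (8)-(9): feature in the UGV frame -> camera frame -> image plane.
    f is the focal length in metres (50 mm). Noise term eta_i omitted."""
    r_C = R_AG @ r_G_i - r_rel                       # eq. (8), camera frame = A frame
    u = f * r_C[0] / r_C[2]                          # eq. (9)
    v = f * r_C[1] / r_C[2]
    return np.array([u, v]), r_C

R_AG = np.eye(3)                       # aligned frames, for the sketch only
r_rel = np.array([0.0, 0.0, -10.0])    # assumed relative position (UAV 10 m above)
feature = np.array([0.357, 0.366, 0.0])  # a Table 2 feature point, in metres
uv, r_C = project_feature(R_AG, r_rel, feature)
```

Inverting this mapping for several feature points is what gives the filter its observability of relative position and attitude.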
3.2 Relative navigation model
3.2.1 Relative navigation state propagation
equation
Combining the relative attitude motion equation and the relative mass centre motion equation, and defining the state variable as $x = \left[(q^A_{A/G})^T, (r^A_{A/G})^T, (\dot{r}^A_{A/G})^T, (b_a)^T, (b_g)^T\right]^T$, the relative navigation state equation is as follows (10):

$$\dot{x} = f(x) + g\!\left(q^A_{A/G}, a_A\right), \qquad \dot{q}^A_{A/G} = \tfrac{1}{2}\, q^A_{A/G} \otimes \omega^A_{A/G}, \qquad \dot{b}_a = 0_{3\times1}, \qquad \dot{b}_g = 0_{3\times1} \qquad (10)$$
(10)
3.2.2 Relative navigation error propagation
equation
Here, the error state is the state change, and the error state variable is set as $\delta x = \left[(\delta\theta^A_{A/G})^T, (\delta r^A_{A/G})^T, (\delta\dot{r}^A_{A/G})^T, (\delta b_a)^T, (\delta b_g)^T\right]^T$, where $\delta\theta^A_{A/G}$ is the relative attitude angle error, $\delta r^A_{A/G}$ is the relative position error, $\delta\dot{r}^A_{A/G}$ is the relative velocity error, $\delta b_a$ is the accelerometer drift error, and $\delta b_g$ is the gyroscope drift error. The relative navigation error equations are as the following (11), (12), and (13):

$$\delta\dot{\theta}^A_{A/G} = -\left[(\omega_m - \hat{b}_g)\times\right]\delta\theta^A_{A/G} + \delta\omega_A = -\left[(\omega_m - \hat{b}_g)\times\right]\delta\theta^A_{A/G} - \delta b_g - n_g \qquad (11)$$

$$\delta\dot{r}^A_{A/G} = \dot{r}^A_{A/G} - \hat{\dot{r}}^A_{A/G} = \frac{d}{dt}\,\delta r^A_{A/G} \qquad (12)$$

$$\begin{aligned} \delta\ddot{r}^A_{A/G} &= \ddot{r}^A_{A/G} - \hat{\ddot{r}}^A_{A/G} \\ &= -\left(R^A_G\right)^T\left[(a_m - \hat{b}_a)\times\right]\delta\theta^A_{A/G} - \left(\left[(\dot{\omega}_G)\times\right] + \left[(\omega_G)\times\right]\left[(\omega_G)\times\right]\right)\delta r^A_{A/G} \\ &\quad - 2\left[(\omega_G)\times\right]\delta\dot{r}^A_{A/G} - \left(R^A_G\right)^T n_a \qquad (13) \end{aligned}$$
4 RELATIVE NAVIGATION FILTERING ALGORITHM
The relative navigation and positioning can be realized by the
Kalman filter algorithm, but since the camera measurement
equation is non-linear, the non-linear Kalman filter algorithm
needs to be used. The traditional Kalman filter is sensitive to
noise that deviates from the true probability distribution, which
may cause unstable filter performance or affect the estimation
results. Therefore, to improve the filtering performance and
reduce the possibility of filtering divergence when the noise
distribution characteristics are unknown, robust and adaptive
techniques are introduced to estimate noise statistics in real
time.
4.1 EKF filtering algorithm
4.1.1 Relative navigation state equation
The equation of state is as the following (14):

$$\delta\dot{x}(t) = F(t)\,\delta x(t) + G(t)\,w(t) \qquad (14)$$

where, for the error state ordered as $\delta x = [\delta\theta^T, \delta r^T, \delta\dot{r}^T, \delta b_a^T, \delta b_g^T]^T$,

$$F(t) = \begin{bmatrix} -\left[(\omega_m - \hat{b}_g)\times\right] & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & -I_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & I_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ -\left(R^A_G\right)^T\left[(a_m - \hat{b}_a)\times\right] & -\left(\left[(\dot{\omega}_G)\times\right] + \left[(\omega_G)\times\right]\left[(\omega_G)\times\right]\right) & -2\left[(\omega_G)\times\right] & -\left(R^A_G\right)^T & 0_{3\times3} \\ & & 0_{6\times15} & & \end{bmatrix}$$

$$G(t) = \begin{bmatrix} -I_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & -\left(R^A_G\right)^T & 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & I_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & I_{3\times3} & 0_{3\times3} \end{bmatrix}, \qquad w(t) = \begin{bmatrix} n_g \\ n_a \\ n_{gw} \\ n_{aw} \end{bmatrix}$$

At the sampling interval $[t_k, t_k + \Delta t]$, (14) is discretized:

$$X_{k+1} = \Phi_{k+1/k} X_k + \Gamma_k W_k \qquad (15)$$

where $\Phi_{k+1/k}$ and $\Gamma_k$ are the parameters of the state transition equation obtained by discretizing $F(t)$ and $G(t)$, as follows: $\Phi_{k+1/k} \approx e^{F(t_k)\Delta t} \approx I + F(t_k)\Delta t$, $\Gamma_k \approx G(t_k)\Delta t$.
4.1.2 Relative navigation measurement equation
The estimate $\hat{z}_i$ of $z_i$ is expressed as follows (16):

$$\hat{z}_i = \begin{bmatrix} \hat{u}_i \\ \hat{v}_i \end{bmatrix} = \frac{f}{\hat{z}^C_i}\begin{bmatrix} \hat{x}^C_i \\ \hat{y}^C_i \end{bmatrix} = \frac{f\,\Pi\,\hat{r}^C_i}{[0\ \ 0\ \ 1]\,\hat{r}^C_i} \qquad (16)$$

FIGURE 3 UGV-UAV relative motion trajectory figure (red indicates the UAV motion trajectory, green indicates the UGV motion trajectory, and the asterisk indicates the starting point)

where

$$\hat{r}^C_i = \begin{bmatrix} \hat{x}^C_i \\ \hat{y}^C_i \\ \hat{z}^C_i \end{bmatrix} = \hat{R}^A_G\, r^G_i - \hat{r}^A_{A/G}$$
The observation equation is as the following (17):

$$\delta z_i = z_i - \hat{z}_i = H_i\,\delta x + \eta_i \qquad (17)$$

where

$$H_i = \begin{bmatrix} \dfrac{\partial z_i}{\partial \delta\theta^A_{A/G}} & \dfrac{\partial z_i}{\partial \delta r^A_{A/G}} & 0 & 0 & 0 \end{bmatrix}$$

$$\frac{\partial z_i}{\partial \delta\theta^A_{A/G}} = \frac{f}{\hat{z}^C_i}\begin{bmatrix} 1 & 0 & -\hat{x}^C_i/\hat{z}^C_i \\ 0 & 1 & -\hat{y}^C_i/\hat{z}^C_i \end{bmatrix}\left[\left(\hat{R}^A_G\, r^G_i - \hat{r}^A_{A/G}\right)\times\right], \qquad \frac{\partial z_i}{\partial \delta r^A_{A/G}} = -\frac{f}{\hat{z}^C_i}\begin{bmatrix} 1 & 0 & -\hat{x}^C_i/\hat{z}^C_i \\ 0 & 1 & -\hat{y}^C_i/\hat{z}^C_i \end{bmatrix}\hat{R}^A_G$$
4.2 Robust filtering algorithm
The system state residual is jointly determined by the model error and the observation error [23, 24]. Let $s_k = Z_k - H_k X_{k/k-1}$ denote the filter residual at time $k$; its covariance matrix is $W_k = H_k P_{k/k-1} H_k^T + R_k$, in which $P_{k/k-1}$ represents the state prediction covariance matrix at time $k$ and $R_k$ represents the observation noise covariance matrix at time $k$.
TABLE 1 UGV and UAV inertial parameter settings

Parameter                                    UGV         UAV
Gyro error (x-, y-, z-)       Bias          0.01 °/h    0.1 °/h
                              Random walk   0.008 °/h   0.08 °/h
Accelerometer error           Bias          50 μg       200 μg
(x-, y-, z-)                  Random walk   10 μg/h     50 μg/h
Frequency                                   200 Hz      100 Hz
TABLE 2 Feature point distribution

Feature point   Xi (mm)   Yi (mm)   Zi (mm)
1               0         0         0
2               357       0         0
3               597       0         0
4               954       0         0
5               357       366       0
6               597       366       0
7               357       732       0
8               597       732       0
9               0         1098      0
10              357       1098      0
11              597       1098      0
12              954       1098      0
The filter residual is normalized as the following (18):

$$v_k = s_k^T\, W_k^{-1}\, s_k \qquad (18)$$

The IGG3 weight function [25–27] is introduced, and the residual gain matrix is adjusted by the normalized residual:

$$\mu_k = \begin{cases} 1 & v_k \le k_0 \\ \dfrac{k_0}{v_k}\, d_k^2 & k_0 < v_k \le k_1 \\ 0 & v_k > k_1 \end{cases}, \qquad d_k = \frac{k_1 - v_k}{k_1 - k_0} \qquad (19)$$

When there is no gross error in the observations, the normalized residual $v_k$ follows the standard normal distribution, $v_k \sim N(0,1)$. Robust processing is performed on the observations with a confidence level of 95%, where $k_0$ is set to 1 and $k_1$ is set to 3.
After the observations are robustly processed, the measurement update is as the following (20):

$$\hat{X}_k = X_{k/k-1} + \mu_k K_k\left(Z_k - H_k X_{k/k-1}\right), \qquad P_k = \left(I - \mu_k K_k H_k\right) P_{k/k-1}, \quad i = 1 \cdots N \qquad (20)$$
DAI ET AL.189
FIGURE 4 Comparison of relative navigation position estimation
accuracy of three algorithms under Gaussian distribution
4.3 Adaptive filtering algorithm
In a complex environment, due to the existence of errors or
even gross errors in the measured values of random dynamic
systems, the statistical characteristics of noise will change, which
will reduce the accuracy of Kalman filtering and even cause
divergence. To solve the problem of inaccurate noise estima-
tion caused by the change of the statistical characteristics of
the measurement noise, the measurement noise can be adjusted
adaptively [23, 28, 29, 31].
$$\rho_k = s_k s_k^T - H_k P_{k/k-1} H_k^T \qquad (21)$$

where $\rho_k$ is used to adjust the measurement noise adaptively, so as to satisfy the online real-time estimation of the measurement noise $R_k$, which is expressed as the following (22):

$$\hat{R}_k = \begin{cases} R_{\min} & \rho_k < R_{\min} \\ R_{\max} & \rho_k > R_{\max} \\ (1 - \gamma_k)\hat{R}_{k-1} + \gamma_k\,\rho_k & \text{otherwise} \end{cases} \qquad (22)$$

where the adaptive adjustment coefficient is $\gamma_k = \dfrac{\gamma_{k-1}}{\gamma_{k-1} + b}$, with initial value $\gamma_0 = 1$. $b$ is the fading factor, which is generally taken as 0.9 to 0.99 [30]; here, it is taken as 0.95. $\hat{R}_k$ is limited to $[R_{\min}, R_{\max}]$ to ensure good adaptation and filtering reliability, where $R_{\min}$ is taken as 0.01 times the measurement noise matrix and $R_{\max}$ as 100 times the measurement noise matrix.
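The innovation-based estimator of eqs (21) and (22) is easy to state in code. The scalar sketch below is illustrative only (the paper applies it to the full measurement covariance matrix; the values of $R_0$, $R_{\min}$, $R_{\max}$ here are arbitrary):

```python
import numpy as np

class AdaptiveR:
    """Innovation-based measurement-noise estimate, eqs (21)-(22), scalar sketch."""

    def __init__(self, R0, R_min, R_max, b=0.95):
        self.R, self.R_min, self.R_max = R0, R_min, R_max
        self.b = b            # fading factor, here 0.95 as in the text
        self.gamma = 1.0      # gamma_0 = 1

    def update(self, s, HPHt):
        rho = s * s - HPHt                              # eq. (21)
        self.gamma = self.gamma / (self.gamma + self.b)  # gamma_k recursion
        if rho < self.R_min:
            self.R = self.R_min                          # clamp below
        elif rho > self.R_max:
            self.R = self.R_max                          # clamp above
        else:
            self.R = (1.0 - self.gamma) * self.R + self.gamma * rho  # eq. (22)
        return self.R

est = AdaptiveR(R0=1.0, R_min=0.01, R_max=100.0)
R1 = est.update(s=0.0, HPHt=0.5)    # rho = -0.5  -> clamped to R_min
R2 = est.update(s=20.0, HPHt=0.5)   # rho = 399.5 -> clamped to R_max
R3 = est.update(s=1.0, HPHt=0.5)    # rho = 0.5   -> smoothed, stays in bounds
```

The decaying $\gamma_k$ makes early innovations count heavily and later ones only refine the estimate, while the clamp keeps a single outlier from blowing up $\hat{R}_k$.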
5 SIMULATION EXPERIMENT
To verify the effectiveness and robustness of the algorithm pro-
posed in this paper, the following simulation experiments are
FIGURE 5 Comparison of relative navigation speed estimation accuracy
of three algorithms under Gaussian distribution
FIGURE 6 Comparison of relative navigation attitude estimation
accuracy of three algorithms under Gaussian distribution
carried out by setting the trajectory, sensor parameters, and
noise parameters to compare and analyze different algorithms.
5.1 Simulation settings
5.1.1 UGV-UAV motion trajectory simulation
The following simulates the trajectory data of UAV during the
process of accompanying and landing on UGV. Among them,
the UGV performs a uniform linear motion in the whole pro-
cess. In the accompanying phase, the UAV makes a detour to
ensure motion observability. In the landing stage, the UAV’s
descending state is simulated by the sigmoid function [32], which
indicates that the closer it is to the ground, the slower the
descending speed. A total of 430 s of relative motion trajectories
TABLE 3 Summary of relative navigation estimation accuracy under Gaussian distribution

             EKF               Robust-EKF        Robust-Adaptive-EKF
Error        AME      STD      AME      STD      AME      STD
Pitch (°)    0.0103   0.0280   0.0121   0.0284   0.0096   0.0260
Roll (°)     0.0091   0.0302   0.00799  0.0242   0.0094   0.0255
Yaw (°)      0.0102   0.2663   0.0328   0.2248   0.0294   0.2250
VX (m/s)     0.0341   0.0274   0.0392   0.0143   0.0132   0.0256
VY (m/s)     0.0282   0.0231   0.0201   0.0192   0.0264   0.0257
VZ (m/s)     0.0252   0.0151   0.0132   0.0125   0.0202   0.0240
X (m)        0.1303   0.1367   0.1501   0.1098   0.1341   0.1442
Y (m)        0.1295   0.1592   0.1434   0.1647   0.1168   0.1351
Z (m)        0.1012   0.0591   0.1922   0.0850   0.1395   0.0480
FIGURE 7 Comparison of relative navigation position estimation
accuracy of three algorithms under non-Gaussian distribution
FIGURE 8 Comparison of relative navigation speed estimation accuracy
of three algorithms under non-Gaussian distribution
FIGURE 9 Comparison of relative navigation attitude estimation
accuracy of three algorithms under non-Gaussian distribution
TABLE 4 Summary of relative navigation estimation accuracy under non-Gaussian distribution

             EKF               Robust-EKF        Robust-Adaptive-EKF
Error        AME      STD      AME      STD      AME      STD
Pitch (°)    0.0244   0.0612   0.0094   0.0291   0.0135   0.0186
Roll (°)     0.0190   0.0280   0.0095   0.0154   0.0088   0.0213
Yaw (°)      0.4100   0.2647   0.3970   0.1179   0.2574   0.2666
VX (m/s)     0.0495   0.0540   0.0195   0.0179   0.0121   0.0108
VY (m/s)     0.0406   0.0397   0.0295   0.0257   0.0145   0.0142
VZ (m/s)     0.0332   0.0327   0.0301   0.0151   0.0237   0.0168
X (m)        0.5501   0.5409   0.1912   0.2037   0.1441   0.1343
Y (m)        0.4564   0.4551   0.2324   0.2478   0.1432   0.1415
Z (m)        0.7081   0.7102   0.2034   0.1844   0.1324   0.1276
are simulated, as shown in Figure 3. The
UGV motion trajectory is expressed as the following (23):

$$x_{UGV} = \tfrac{1}{2}t, \qquad y_{UGV} = t, \qquad z_{UGV} = 0 \qquad (23)$$

The UAV motion trajectory is expressed as the following (24):

$$x_{UAV} = \begin{cases} \tfrac{1}{2}t + 15\sin(0.02\pi t) + 0.1\cos(\pi t) & t \le 300\,\mathrm{s} \\ \tfrac{1}{2}t + 0.1\sin(\pi t) & t > 300\,\mathrm{s} \end{cases}$$

$$y_{UAV} = t + 0.2\sin(\pi t)$$

$$z_{UAV} = \begin{cases} 50 + 0.1\cos(\pi t) & t \le 300\,\mathrm{s} \\ 100 - \dfrac{100}{1 + e^{-0.05t}} & t > 300\,\mathrm{s} \end{cases} \qquad (24)$$
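Equations (23) and (24) generate the relative trajectory directly. The sketch below transcribes them as extracted from the paper (illustrative only; the 1 s sampling step is an assumption, and the sigmoid branch is reproduced verbatim):

```python
import numpy as np

def ugv_position(t):
    """UGV trajectory of eq. (23): uniform straight-line motion."""
    return np.array([0.5 * t, t, 0.0])

def uav_position(t):
    """UAV trajectory of eq. (24): detour while accompanying (t <= 300 s),
    sigmoid descent while landing (t > 300 s), as extracted from the paper."""
    if t <= 300.0:
        x = 0.5 * t + 15.0 * np.sin(0.02 * np.pi * t) + 0.1 * np.cos(np.pi * t)
        z = 50.0 + 0.1 * np.cos(np.pi * t)
    else:
        x = 0.5 * t + 0.1 * np.sin(np.pi * t)
        z = 100.0 - 100.0 / (1.0 + np.exp(-0.05 * t))
    y = t + 0.2 * np.sin(np.pi * t)
    return np.array([x, y, z])

# 430 s of relative motion, sampled at 1 s for illustration
rel = np.array([uav_position(t) - ugv_position(t)
                for t in np.arange(0.0, 430.0, 1.0)])
```

The relative altitude starts near 50 m during the accompanying phase and approaches zero by the end of the landing phase, matching the trajectory shape of Figure 3.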
TABLE 5 Summary of relative navigation estimation accuracy of 20 simulation experiments

                          Gaussian distribution                      Non-Gaussian distribution
Error (MAEs)              EKF      Robust-EKF  Robust-Adaptive-EKF   EKF      Robust-EKF  Robust-Adaptive-EKF
Relative attitude (°)     0.0167   0.0186      0.0175                0.0484   0.0345      0.0161
Relative velocity (m/s)   0.0201   0.0225      0.0221                0.0458   0.0342      0.0232
Relative position (m)     0.1201   0.1196      0.1213                0.4010   0.2431      0.1332
TABLE 6 The time required for the single-step execution of the three algorithms under non-Gaussian distribution

            EKF           Robust-EKF    Robust-Adaptive-EKF
Time (s)    6.75 × 10⁻³   8.11 × 10⁻³   9.15 × 10⁻³
5.1.2 Inertial sensor parameters
According to the model setting, the inertial sensor sys-
tem parameters of UGV are constant, and only the inertial
parameters of UAV are estimated, as shown in Table 1.
5.1.3 Vision sensor and cooperation sign
feature point parameters
In this paper, the focal length of the visual sensor is set to
50 mm, and the cooperation sign includes 12 feature points. The
positions of the feature points on the vehicle body are shown in
Table 2.
Here, a hybrid Gaussian probability density function is used to model the measurement noise distribution of the vision sensor. The contamination rate is zero for the Gaussian noise distribution and non-zero for the non-Gaussian noise distribution, which is expressed as the following (25):

$$g(\omega) = \frac{1-\varepsilon}{\sigma_1\sqrt{2\pi}}\exp\!\left(-\frac{\omega^2}{2\sigma_1^2}\right) + \frac{\varepsilon}{\sigma_2\sqrt{2\pi}}\exp\!\left(-\frac{\omega^2}{2\sigma_2^2}\right) \qquad (25)$$

where $\sigma_1 = 2500\,\mu\mathrm{rad}$ and $\sigma_2 = 4\sigma_1$. $\varepsilon = 0$ gives a Gaussian distribution, and $\varepsilon = 0.5$ gives a non-Gaussian distribution.
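Sampling from the contamination density of eq. (25) amounts to flipping an $\varepsilon$-biased coin per sample and drawing from the narrow or wide Gaussian accordingly. The sketch below is illustrative (the fixed seed and sample count are assumptions); the variance of the mixture is $(1-\varepsilon)\sigma_1^2 + \varepsilon(4\sigma_1)^2 = 8.5\,\sigma_1^2$ at $\varepsilon = 0.5$.

```python
import numpy as np

def contaminated_noise(n, sigma1=2500e-6, eps=0.5, rng=None):
    """Samples from the mixture of eq. (25): with probability 1-eps draw
    N(0, sigma1^2), with probability eps draw N(0, (4*sigma1)^2).
    eps = 0 reproduces the pure Gaussian case."""
    rng = rng or np.random.default_rng(0)
    wide = rng.random(n) < eps                  # which samples are contaminated
    scale = np.where(wide, 4.0 * sigma1, sigma1)
    return scale * rng.standard_normal(n)

w = contaminated_noise(100_000, eps=0.5)        # sigma1 = 2500 urad as in the text
```

The resulting sample is heavy-tailed relative to a single Gaussian of the same variance, which is exactly the regime in which the EKF degrades and the robust weighting pays off.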
5.1.4 Simulation scenario settings
Scenario 1: When the measurement noise of the vision sensor obeys the Gaussian distribution ($\varepsilon = 0$), compare the relative navigation state estimation results of the three filtering methods EKF, Robust-EKF, and Robust-Adaptive-EKF.
Scenario 2: When the measurement noise of the vision sensor obeys the non-Gaussian distribution ($\varepsilon = 0.5$), compare the relative navigation state estimation results of the same three filtering methods.
5.2 Relative localization results under
Gaussian distribution
The three filtering algorithms EKF, Robust-EKF, and Robust-
Adaptive-EKF are applied to the relative navigation state
estimation of UGV-UAV under the Gaussian distribution for
simulation, and the results of relative navigation position, rel-
ative navigation speed, and relative navigation attitude are
obtained. As shown in Figures 4 to 6, red, green, and blue represent the relative state estimation results obtained by the EKF, Robust-EKF, and Robust-Adaptive-EKF algorithms, respectively.
It can be seen from Figures 4–6 that under Gaussian distribution conditions, the estimation accuracy of the three filtering algorithms is comparable, and all of them can accurately estimate the relative navigation information between the UAV and the UGV. The accuracy of the relative position, velocity, and attitude results is further evaluated below, and the absolute mean error (AME) and standard deviation (STD) are reported in Table 3.
It can be seen from Table 3 that under the Gaussian distribution, the relative attitude errors of the three algorithms are about 0.01°, the relative velocity errors about 0.02 m/s, and the relative position errors about 0.12 m. All of them can satisfy relative navigation needs.
5.3 Relative positioning results under
non-Gaussian distribution
As in Section 5.2, the three filtering algorithms EKF, Robust-EKF, and Robust-Adaptive-EKF are applied to the UGV-UAV relative navigation state estimation problem under the non-Gaussian distribution. The resulting relative navigation position, speed, and attitude are shown in Figures 7 to 9.
The accuracy of the relative navigation state estimation results is further evaluated, and the AME and STD are reported in Table 4.
It can be seen from Figures 7–9 and Table 4 that under the non-Gaussian distribution, the Robust-Adaptive-EKF algorithm has the highest estimation accuracy, followed by the Robust-EKF algorithm, while the EKF algorithm has the lowest. From the STD statistics, the EKF algorithm shows a large deviation, and the Robust-Adaptive-EKF algorithm performs best.
5.4 Accuracy comparison of algorithms in
two scenarios and real-time testing under
non-Gaussian distribution
To further verify the accuracy of the Robust-Adaptive-EKF algorithm, 20 Monte Carlo simulation experiments are carried out to approximate the real environment, with different noise, trajectory parameters, and sensor parameters. The AME of the position errors for the 20 experiment groups are shown in Figures 10 and 11, where Figure 10 shows the error results under the Gaussian distribution and Figure 11 shows the error results under the non-Gaussian distribution. The statistical results of the mean absolute errors (MAEs) of the three algorithms (EKF, Robust-EKF, Robust-Adaptive-EKF) under the Gaussian and non-Gaussian distributions are shown in Table 5.
It can be seen from Figures 10 and 11 and Table 5 that under Gaussian measurement noise, the relative attitude, velocity, and position accuracy of the three algorithms is relatively high, and their errors are essentially the same. Under non-Gaussian measurement noise, the relative navigation parameter estimation accuracy of the Robust-Adaptive-EKF is improved by about two to three times compared with the EKF and Robust-EKF. The relative attitude error is about 0.01°, the relative velocity error about 0.02 m/s, and the relative position error about 0.13 m, which almost reaches the accuracy of relative navigation parameter estimation under the Gaussian distribution. This comparative analysis shows that the Robust-Adaptive-EKF algorithm proposed in this paper has strong adaptability to system modelling uncertainty and deviation in complex environments, with higher precision and stronger robustness.
To test the efficiency of the robust adaptive relative
localization algorithm, the computation times of the three
algorithms under a non-Gaussian distribution were measured. The
test environment was C++ on Windows, on a platform with an
Intel(R) Core(TM) i7-8550U CPU at 1.99 GHz. The times required
for a single-step execution of the three algorithms are shown in
Table 6. The single-step execution time of the
Robust-Adaptive-EKF algorithm was 9.15 × 10⁻³ s, which meets the
real-time requirements of practical applications.
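A single-step timing test of this kind can be reproduced in outline as below. The filter step here is a stand-in EKF-style gain-and-correction update with assumed state and measurement dimensions, not the paper's C++ implementation:

```python
import time
import numpy as np

def timed_single_step(n_state=9, n_meas=6, n_trials=1000, seed=0):
    """Average the wall-clock time of one EKF-style update step
    (innovation covariance, Kalman gain, state correction) over many
    trials, as a rough analogue of a single-step execution-time test."""
    rng = np.random.default_rng(seed)
    P = np.eye(n_state)                    # state covariance
    H = rng.standard_normal((n_meas, n_state))  # measurement matrix
    R = np.eye(n_meas)                     # measurement noise covariance
    x = np.zeros(n_state)
    z = rng.standard_normal(n_meas)
    t0 = time.perf_counter()
    for _ in range(n_trials):
        S = H @ P @ H.T + R                # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x = x + K @ (z - H @ x)            # state correction
    return (time.perf_counter() - t0) / n_trials  # seconds per step

print(f"single-step time: {timed_single_step():.2e} s")
```

Using `time.perf_counter()` and averaging over many trials smooths out scheduler jitter, which matters when the per-step time is in the millisecond range as reported in Table 6.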
6 CONCLUSIONS
In this paper, a relative navigation model based on IMU/Vision
is established, and the corresponding navigation filter is
designed. Aiming at the problems of system modelling deviation
and non-Gaussian measurement noise in relative navigation, a
robust adaptive Kalman filter algorithm is proposed by com-
bining the robust Kalman filter algorithm with the adaptive
algorithm. The simulation results show that, compared with
EKF and Robust-EKF, the relative navigation parameter accu-
racy of Robust-Adaptive-EKF is improved by about —two to
(c)
(a)
(b)
FIGURE 10 The AME of error (m) in the 20 experiment groups under
Gaussian distribution
DAI ET AL.193
(a)
(b)
(c)
FIGURE 11 The AME of errors (m) in the 20 experiment groups under
non-Gaussian distribution
three times, the relative attitude accuracy is 0.01◦, the relative
speed accuracy is 0.02 m/s, and the relative position accuracy
is 0.13 m. Robust-Adaptive-EKF under non-Gaussian distri-
bution can almost reach the accuracy of relative navigation
parameter under Gaussian distribution, and meet the real-time
requirements.
In conclusion, the robust adaptive Kalman filtering algorithm
proposed in this paper can provide reliable, adaptive, robust,
and high-precision localization information in complex
environments. However, the application of multi-source fusion to
relative navigation and positioning remains little studied. In
future work, we plan to apply multi-source fusion navigation and
positioning algorithms to relative navigation and positioning,
and to further validate them with hardware experiments in real
scenarios.
AUTHOR CONTRIBUTIONS
J.D., S.L., X.H., Z.R., X.Y. and Y.L. conceived and designed this
study; J.D., S.L. and X.H. performed the experiments; J.D. wrote
the paper; Z.R., X.Y. and Y.L. corrected the grammar errors and
formatted the manuscript. All authors have read and agreed to
the published version of the manuscript.
ACKNOWLEDGEMENTS
The research was supported by the Science and Technology
Department of Henan Province through the project on Research on
Key Technologies for Fully Autonomous Operation of Multi-rotor
Agricultural Plant Protection UAVs (No. 222102110029) and the
project on Research on Collaborative Assembly Technology of
Aviation Complex Parts Based on Multi-Agent Reinforcement
Learning (No. 222102220095). In addition, the research was
supported by the University Research Team Development Fund
(No. F4211).
CONFLICT OF INTEREST
The authors declare no conflict of interest.
DATA AVAILABILITY STATEMENT
The data that support the findings of this study are available
from the corresponding author upon reasonable request.
REFERENCES
1. Zhang, X.: Research on collaborative planning method of UAV/UGV
multi-point dynamic assembly of messenger mechanism. Beijing Institute
of Technology (2015)
2. Xu, B., Bai, B.L., Hao, Y.: The research status and progress of cooperative
navigation for multiple AUVs. Acta Autom. Sin. 41(3), 445–461 (2015)
3. Guo, J.F.: Summary of Key technologies for heterogeneous unmanned
system cooperative operations. J. Astronaut. 41(6), 686–696 (2020)
4. Xu, X.W.: A Literature Review on the Research Status and Progress
of Cooperative Navigation Technology for Multiple UAVs. Navigation
Positioning and Timing 4(4), 1–9 (2017)
5. Zhu, Y., Sun, Y., Zhao, W.: Relative navigation for autonomous aerial
refueling rendezvous phase. Optik 174, 665–675 (2018)
6. Gong, B.C., Wang, S., Hao, M.R.: Cooperative relative navigation algorithm
for multi-spacecraft close-range formation. J. Astronaut. 42(3), 344–350
(2021)
7. Sun, Y.R., Liu, Z.X., Zeng, Q.H.: INS/GNSS/Vision combined short-
range air refueling relative navigation algorithm. J. Chin. Inert. Technol.
29(6), 769–776 (2021)
8. Lee, J.Y., Kim, H.S., Choi, K.H.: Adaptive GPS/INS integration for relative
navigation. GPS Solut. 20(1), 63–75 (2016)
9. Wang, Q., Cui, X., Li, Y.: Performance enhancement of a USV
INS/CNS/DVL integration navigation system based on an adaptive
information sharing factor federated filter. Sensors 17(2), 239 (2017)
10. Dai, J., Hao, X., Liu, S., Ren, Z.: Research on UAV robust adaptive posi-
tioning algorithm based on IMU/GNSS/VO in complex scenes. Sensors
22(8), 2832 (2022)
11. Dai, J., Liu, S., Hao, X., Ren, Z., Yang, X.: UAV localization algorithm
based on factor graph optimization in complex scenes. Sensors 22(15),
5862 (2022)
12. Ebinuma, T., Bishop, R.H., Lightsey, E.G.: Integrated hardware inves-
tigation of precision spacecraft rendezvous using the global positioning
system. J. Guid. Control Dyn. 26(3), 425–433 (2003)
13. Yang, D.C., Zhao, C.H., Gu, D.Q.: Study on noncooperative target
long-distance rendezvous relative navigation based on laser distance mea-
surement and inertial navigation. Aerospace Shanghai 44(4), 75–80
(2016)
14. Kim, S.G., Crassidis, J.L., Cheng, Y.: Kalman filtering for relative space-
craft attitude and position estimation. J. Guid. Control Dyn. 30(1), 133–143
(2007)
15. Wang, D., Li, G.L.: Robust extended Kalman filter for GPS/INS relative
navigation. Trans. Beijing Inst. Technol. 34(12), 1278–1282 (2014)
16. Zhang, S.J., Ning, M.F., Chen, J.: Method of vision/inertial relative naviga-
tion for non-cooperative target and sensors self-calibration. J. Natl. Univ.
Defense Technol. 41(6), 25–32 (2019)
17. Fosbury, A.M.: Control and Kalman Filtering for Relative Dynamics of a
Formation of Uninhabited Autonomous Vehicles. Doctor Dissertation of
State University of New York at Buffalo, New York, pp. 48–84 (2006)
18. Qu, F.Y., Wang, X.G., Cui, N.G.: Fault-tolerance relative navigation
approach based on SINS/GPS/Vision for UAV. J. Chinese Inert. Technol.
21(6), 781–785 (2013)
19. Benjamin, L.T.: Vision-Based Deck State Estimation for Autonomous
Ship-Board Landing. American Helicopter Society Forum, pp. 1–19 (2013)
20. Chen, C., Wang, X., Qin, W.: Unmanned aerial vehicle landing navigation
algorithm based on robust filter. J. Chinese Inert. Technol. 25(3), 415–420
(2017)
21. Yan, G., Deng, Y.: Review on practical Kalman filtering techniques in tra-
ditional integrated navigation system. Navig. Position Timing 7, 50–64
(2020)
22. Wang, X.G., Guo, J.F., Cui, N.G.: Robust sigma-point filtering and its
application to relative navigation. Acta Aeronaut. et Astronaut. Sin.
5, 1024–1029 (2010)
23. Huber, P.J.: Robust estimation of a location parameter. Ann. Math. Stat.
35(2), 73–101 (1964)
24. Zhu, J., Liu, B., Wang, H.: State estimation based on improved cubature
Kalman filter algorithm. IET Sci. Meas. Technol. 14(5), 536–542 (2020)
25. Yang, Y.X.: The principle of equivalent weight-robust least squares solution
of parametric adjustment model. Bull. Surv. Mapp. 4, 33–35 (1994)
26. Yang, Y.X.: Robust Estimation Theory and Its Application. Bayi Press,
Beijing, China, p. 77 (1993)
27. Yang, Y.X.: Adaptive Dynamic Navigation and Positioning. Surveying and
Mapping Press, Beijing, China, pp. 191–192 (2006)
28. Xu, X., Pang, F., Ran, Y., Bai, Y.: An indoor mobile robot positioning
algorithm based on adaptive federated Kalman filter. IEEE Sens. J. 21,
23098–23107 (2021)
29. Zheng, S., Li, Z., Liu, Y.: An optimization-based UWB-IMU fusion
framework for UGV. IEEE Sens. J. 22(5), 4369–4377 (2022)
30. Zhang, L., Chang, G., Chen, C.: GNSS velocimeter by adaptively com-
bining carrier phase and Doppler measurements. IET Sci. Meas. Technol.
14(7), 762–771 (2020)
31. Albu, F., Tran, L., Nordholm, S.: The hybrid simplified Kalman fil-
ter for adaptive feedback cancellation. In: International Conference on
Communications (COMM), pp. 45–50 (2018)
32. Han, J., Morag, C.: The influence of the sigmoid function parameters
on the speed of backpropagation learning. In: International Work-
shop on Artificial Neural Networks: From Natural to Artificial Neural
Computation, pp. 195–201 (1995)
How to cite this article: Dai, J., Liu, S., Xiangyang, H.,
Ren, Z., Yang, X., Lv, Y.: Unmanned ground
vehicle-unmanned aerial vehicle relative navigation
robust adaptive localization algorithm. IET Sci. Meas.
Technol. 17, 183–194 (2023).
https://doi.org/10.1049/smt2.12141