Vision Based UAV Attitude Estimation: Progress
Abd El Rahman Shabayek, Cédric Demonceaux, Olivier Morel, David Fofi
April 14, 2011
Le2i - UMR CNRS 5158
IUT Le Creusot
Université de Bourgogne, France
Olivier.Morel@u-bourgogne.fr, David.Fofi@u-bourgogne.fr
Unmanned aerial vehicles (UAVs) are increasingly replacing manned systems
in situations that are dangerous, remote, or difficult for manned aircraft to access.
Their control tasks are empowered by computer vision technology. Visual sensors are
robustly used for stabilization, as primary or at least secondary sensors. Hence, UAV
stabilization by attitude estimation from visual sensors is a very active research area.
Vision based techniques are proving their effectiveness and robustness in handling
this problem. In this work a comprehensive review of UAV vision based attitude
estimation approaches is given, starting from horizon based methods and continuing
through vanishing points, optical flow, and stereoscopic based techniques. A novel
segmentation approach for UAV attitude estimation based on polarization is proposed.
Our future insights for attitude estimation from uncalibrated catadioptric sensors
are also discussed.
1 Introduction
In order to determine the pose of the vehicle accurately and rapidly, the regular approach
is to use inertial sensors together with other sensors and to apply sensor fusion. Some
sensors used for this purpose are the Global Positioning System (GPS) and the inertial
navigation system (INS), as well as other sensors such as altitude sensors (ALS) and
speedometers. These sensors have some limitations. The GPS sensor, for example, is not
available at some locations, or its readings are subject to error. The INS has the
disadvantage of error accumulation. To overcome these limitations, vision-based
navigation approaches have been developed. These approaches can be used where GPS or
INS systems are not available, or can be combined with other sensors to obtain better
estimates. UAV attitude estimation has been deeply studied in terms of data fusion of
multiple low cost sensors in a Kalman filter (KF) framework to obtain the vehicle's full
state of position and orientation. But in pure vision based methods, if a horizontal
world reference is visible (e.g. the horizon), the camera attitude can be obtained.
In order to control a flying vehicle, at least six parameters (the pose of the vehicle)
should be known: the Euler angles representing the orientation of the vehicle, and a
vector of coordinates representing its position. Pose estimation basically depends
on viewing an unchanging physical world reference (e.g. landmarks on the ground) for
accurate estimation. Our main concern in this work is to review the work that focuses
on attitude (roll, pitch, and yaw angles, shown in figure (1)) estimation rather than pose
estimation.
Figure 1: An illustrative sketch of the attitude (roll, pitch, and yaw angles)
In a typical flight, the demand for yaw angle will be largely constant and hence
disturbances tend to have a relatively small effect on yaw. Further, small steady state
errors are normally acceptable since (unlike roll and pitch) any errors will have no
further effect on the UAV motion. Therefore, for the sake of UAV stabilization, the most
important angles to be estimated are the pitch and roll angles, as most of the work in
the literature proposes. In this work, the focus will be on attitude estimation from
perspective and omnidirectional cameras. It is intended to give a complete review with
some views to enhance current work and propose novel ideas under investigation and
development by our research group.
1.1 Vision sensors for attitude estimation
Vision based methods were first introduced by . They proposed to equip a Micro Air
Vehicle (MAV) with a perspective camera to provide a vision-guided flight stability and
autonomy system. Omnidirectional sensors for attitude estimation were first introduced by
. The omnidirectional sensors (Fisheye and Catadioptric cameras, shown in figure (2))
were used in different scenarios. Catadioptric sensors are commercially available at
reasonable prices. A catadioptric sensor has two main parts, the mirror and the lens. The
lens can be telecentric or perspective. The sensor is in general assembled as shown in
figure (2).
Omnidirectional sensors were used alone or in stereo configurations. Omnidirec-
tional vision presents several advantages: a) a complete surrounding of the UAV can be
captured and the horizon is totally visible, b) possible occlusions will have lower impact
on the estimation of the final results, c) whatever the attitude of the UAV, the horizon is
always present in the image, even partially, and the angles can always be computed, d) it
is also possible to compute the roll and pitch angles without any prior hypothesis,
contrary to the applications using a perspective camera. Yet, catadioptric vision also
presents some drawbacks: a) a catadioptric image contains significant deformations due
to the geometry of the mirror and to the sampling of the camera, and b) catadioptric
cameras should be redesigned at a lower scale to be attached to a micro air vehicle (MAV).
Figure 2: Perspective and omnidirectional (Fisheye and Catadioptric) cameras
1.2 The main techniques for attitude estimation
In literature, the first group of methods tries to detect a horizontal reference frame in the
world to estimate the up direction and hence the attitude of the vehicle. The horizon,
if visible, is the best natural horizontal reference to be used . However, in urban
environments the horizon might not be visible. Hence, the second group tries to find the
vanishing points from parallel vertical and horizontal lines which are basic features of
man-made structures (e.g ). The third group was biologically inspired from insects; it
employs the UAV motion (optical flow) for the sake of the required estimation . Stereo
vision based techniques then came into play to open the door for more accurate estimation,
especially if combined with optical flow (e.g ). All these techniques will be discussed
in the following sections.
Most of the techniques employed in the literature use the Kalman filter (KF) or one of
its variations in order to obtain an accurate and reliable estimation, especially if more
than one sensor is used and their measurements are fused. For a general parameter
estimation problem, the extended Kalman filter (EKF) technique is widely adopted. Because
the EKF processes the system in a linearized manner, it may lead to sub-optimal estimation
and even filter divergence. Moreover, state estimation using the EKF assumes that both the
state recursion and the covariance propagation are Gaussian. The unscented Kalman filter
(UKF) addresses nonlinear parameter estimation and machine learning problems. It can
outperform the EKF, especially for highly nonlinear system dynamics/measurement processes,
and no Jacobians or derivatives of any functions are required under UKF processing . For
example in , using an EKF, the candidate horizon lines are propagated and tracked through
successive image frames, with statistically unlikely horizon candidates eliminated. In ,
they followed the EKF framework to combine inertial and visual sensors for real time
attitude estimation, and designed a KF for image line tracking.
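To make this concrete, the following minimal sketch shows one EKF predict/update cycle as
it might be used to fuse inertial propagation with a vision-based attitude measurement.
The state, models, and noise settings are illustrative assumptions, not taken from any of
the cited papers.

import numpy as np

def ekf_step(x, P, u, z, f, h, F, H, Q, R):
    # x, P : state estimate and covariance
    # u, z : control input (e.g. gyro rates) and measurement (e.g. vision attitude)
    # f, h : nonlinear process and measurement functions
    # F, H : their Jacobians evaluated at the current estimate
    # Q, R : process and measurement noise covariances
    x_pred = f(x, u)                         # predict through the dynamics
    P_pred = F @ P @ F.T + Q
    y = z - h(x_pred)                        # innovation from the vision measurement
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

The UKF replaces the Jacobians F and H with a deterministic set of sigma points propagated
through f and h directly, which is why no derivatives are needed under its processing.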
The paper is organized as follows: sections (2, 3, 4) review in detail the general
techniques for attitude estimation from visual sensors (perspective and omnidirectional
only). In section (2), horizon detection algorithms are briefly explained and reviewed.
Vanishing points based techniques are reviewed in section (3). The classical and hybrid
approaches using stereo-vision and optical flow are reviewed in section (4). Finally, we
conclude in section (5).
2 Horizon Detection
The visual sensor is not only self-contained and passive, like an INS, but also interactive
with its environment. An absolute attitude can be provided by detecting a reliable world
reference frame. Attitude computation by vision is based on the detection of the horizon,
which appears as a line in perspective images or a curve in omnidirectional images,
as shown in figure (3), and on the estimation of the angle between the horizon and a
reference axis of the image.
Due to the difficulty of obtaining ground-truth for aircraft attitude, most of the work
in the literature does not provide a quantitative measure of error in the estimates of
roll and pitch. In , they provided a complexity and performance comparison between their
method and other methods in the literature, including a comparison table of execution
times for various published studies on visual attitude estimation.
In the following subsections, we will cover in detail the different segmentation
approaches for horizon detection in section (2.1) and a proposal to segment using
polarization in section (2.2); both the perspective and omnidirectional scenarios will be
reviewed. Section (2.3) will briefly discuss horizon estimation and attitude computation
in the perspective case. Section (2.4) will briefly discuss the same in the
omnidirectional case, especially the catadioptric scenario, which is frequently used.
Figure 3: Horizon in a) a perspective image, b) a non-central catadioptric image
2.1 Sky/Ground Segmentation
As the segmentation of sky and ground is a crucial step toward extracting the horizon
line/curve used for attitude estimation, these segmentation methods will be reviewed
first. Simple segmentation fails in scenarios where the underlying Gaussian assumption
for the sky and ground appearances is not appropriate . These assumptions might be
improved within a statistical image modeling framework by building prior models of the
sky and ground which are then trained. Since the appearances of the sky and ground vary
enormously, no single feature is sufficient for accurate modeling; as such, these
algorithms rely on both color and texture as critical features. They may use hue and
intensity for color representation, and the complex wavelet transform for texture
representation. They may then use Hidden Markov Tree models as underlying statistical
models over the feature space . In , the algorithm is based on detecting lines in an
image which may correspond to the horizon, followed by testing the optical flow against
the measurements expected by each candidate. Other approaches use segmentation based on
color information , or carry out the sky/ground partitioning in the spherical image
thanks to the optimization of the Mahalanobis distance between these regions, where the
search for points in either region takes place in the RGB space . In order to isolate
the sky from the ground [12, 13], an approach based on the method employed by  weights
the RGB components of each pixel using the function f(RGB) = 3B^2/(R+G+B).
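As a small sketch of how this weighting can be applied (assuming NumPy and an RGB image;
the threshold value is an illustrative assumption, not taken from the cited work):

import numpy as np

def sky_weight(img):
    # img: HxWx3 float array with channels (R, G, B)
    R, G, B = img[..., 0], img[..., 1], img[..., 2]
    s = R + G + B + 1e-6            # avoid division by zero on black pixels
    return 3.0 * B**2 / s           # f(RGB) = 3B^2/(R+G+B): large for sky-like pixels

def sky_mask(img, thresh=120.0):    # threshold for 8-bit images, illustrative only
    return sky_weight(img.astype(np.float64)) > thresh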
In , they propose an algorithm which can be incorporated into any vision system
(e.g. narrow angle, wide angle or panoramic), irrespective of the way in which the
environment is imaged (e.g. through lenses or mirrors). The proposed horizon detection
method consists of four stages: a) enhancing sky/ground contrast, b) determining the
optimum threshold for sky and ground segmentation, c) converting horizon points to
vectors in the view sphere, and d) fitting a 3D plane to the horizon vectors to estimate
the attitude.
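A minimal sketch of stages c) and d), assuming the horizon pixels have already been back
projected to unit vectors on the view sphere by the calibrated sensor model (the attitude
sign conventions in the last step depend on the camera mounting and are our assumption):

import numpy as np

def fit_horizon_plane(horizon_vectors):
    # horizon_vectors: Nx3 unit view-sphere vectors on the horizon.
    # Fit a general 3D plane: center the points, then take the singular
    # vector associated with the smallest singular value as the normal.
    C = horizon_vectors - horizon_vectors.mean(axis=0)
    _, _, Vt = np.linalg.svd(C)
    n = Vt[-1]
    return n / np.linalg.norm(n)

def roll_pitch_from_normal(n):
    # One possible convention: roll about the optical axis, pitch as the
    # tilt of the estimated vertical; exact signs depend on the mounting.
    roll = np.arctan2(n[0], n[1])
    pitch = np.arcsin(np.clip(n[2], -1.0, 1.0))
    return roll, pitch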
In , they proposed segmentation using temperature readings from thermopile sensors in
the thermal infrared band. However, in this work, the focus will be on attitude
estimation from perspective and omnidirectional sensors only.
The previous segmentation solutions are either complex and/or time consuming. A method
based on polarization for segmentation is proposed in section (2.2). We believe it will
bring significant improvements in both complexity and run time due to its simplicity.
We propose a novel non-central catadioptric sensor where the mirror is a free-form shape
and the camera is polarimetric (e.g. the FD-1665P Polarization Camera ) to be used for
this purpose.
2.2 Polarization based segmentation
Instead of using color information or edge detection algorithms for segmentation, which
may require different complex models and offline processing as shown above, we propose to
use the polarization information which exists in the surrounding nature. Polarization
information is directly computed from three intensity images taken at three different
angles of a linear polarization filter (0, 45, and 90 degrees), or in one shot using a
polarimetric camera. Polarization has previously been used for rough surface
segmentation , material classification , water hazards detection for autonomous off-road
navigation , and similar applications. However, to the best of our knowledge, this is
the first time polarization has been proposed for sky/ground segmentation for UAV
attitude estimation.
The most important polarization parameters are the phase (angle) and the degree of
polarization. According to , writing the Stokes parameters as S0 = I0 + I90,
S1 = I0 - I90, and S2 = 2*I45 - I0 - I90, the phase of polarization is computed as
follows:

theta = (1/2) * arctan(S2 / S1),   phi = theta + 90  or  phi = theta - 90,

where the +/-90 degree term resolves the two-fold ambiguity of the arctangent, and the
degree of linear polarization is:

rho = sqrt(S1^2 + S2^2) / S0,

where I0, I45, and I90 are intensity images taken at 0, 45, and 90 degrees of the
rotating polarizer respectively (or in one shot from a polarimetric camera).
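A short sketch of this computation under the Stokes formulation written above (NumPy
arrays of equal shape; the segmentation threshold is an illustrative assumption):

import numpy as np

def polarization(I0, I45, I90):
    S0 = I0 + I90                    # total intensity
    S1 = I0 - I90
    S2 = 2.0 * I45 - I0 - I90
    phase = 0.5 * np.arctan2(S2, S1)                # angle of polarization
    degree = np.sqrt(S1**2 + S2**2) / (S0 + 1e-9)   # degree of linear polarization
    return phase, degree

def sky_ground_mask(I0, I45, I90, dop_thresh=0.2):
    # Sky and ground differ in their degree of polarization; the threshold
    # here is illustrative and would be tuned per sensor.
    _, degree = polarization(I0, I45, I90)
    return degree > dop_thresh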
Figure (4) shows the segmentation results for non-central catadioptric images, with the
horizon detected by simply finding the transition area. This technique is very simple and
can be optimized by a kind of binary search in the image, yielding very rapid and robust
detection of the horizon. Only a few regions of the image need to be inspected for their
degree or angle of polarization to decide on the search direction. Unlike conventional
segmentation methods, thanks to polarization, we do not face the illumination problem
caused by the sun being in the image.
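The binary-search idea can be sketched as follows, assuming the degree of polarization
along a scan line (e.g. a radius of the catadioptric image) is sky-like at one end and
ground-like at the other; the threshold is again illustrative:

def find_transition(dop_line, sky_level=0.2):
    # dop_line: 1D array of degree-of-polarization values along the scan line,
    # assumed sky-like at index 0 and ground-like at the last index.
    lo, hi = 0, len(dop_line) - 1
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if dop_line[mid] > sky_level:    # still sky-like: move outward
            lo = mid
        else:                            # ground-like: move inward
            hi = mid
    return hi    # first ground-like index, i.e. the horizon transition

Only O(log N) pixels per scan line need their polarization inspected, which is what makes
the method rapid.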
In future work, we will provide detailed algorithms with complexity and run time
comparisons against other methods found in the literature.
Figure 4: Sky/Ground segmentation and horizon extraction based on polarization from
non-central catadioptric images: (a) 0 degree, (b) 45 degree, (c) 90 degree polarizer
images; (d) segmentation based on the degree of polarization; (f) extracted horizon curve
2.3 Using perspective sensors
The horizon is projected as a line in the perspective image. Intuitively, it is required
to extract that line. Most methods first segment the image into sky/ground areas, then
take the separating points as the horizon line. The attitude depends on the gradient of
that horizon line on the image plane. In the literature, the general approach is to find
the normal to the plane of the horizon in order to estimate the roll and pitch angles.
The normal vector has a direct mathematical relation with the attitude, as expressed in
different methods. The works in [20, 21] are examples of successful autonomous control of
a MAV based on attitude estimation from the detected horizon.
In [1, 22], they proposed to equip a MAV with a perspective camera for vision-guided
flight stability, extracting the straight line that separates the sky from the ground
using the context difference of the two regions. In , they treated the horizon detection
problem as a subset of image segmentation and object recognition, and used the percentage
of the sky seen as an error signal to a flight stability controller on a MAV. The
resulting system was stable enough to be safely flown by an untrained operator in real
time. In contrast,  uses a direct edge-detection technique, followed by automatic
thresholding and a Hough-like algorithm to generate a “projection statistic” for the
horizon. It claims a 99% success rate over several hours of video. Importantly, it deals
only with detection, not estimation of attitude. In , they propose an algorithm similar
to  in that it uses an edge detection technique followed by a Hough transform; however,
they propose a different image pre-filtering. In [23, 24, 25, 14], they use the centroids
of sky and ground to extract the horizon and derive the different angles. They simplify
the computation by using a circular mask to reduce image asymmetry.
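A sketch of the centroid idea (our reading, with an assumed sign convention): the line
joining the sky and ground centroids is perpendicular to the horizon, so its direction
yields the roll angle directly.

import numpy as np

def roll_from_centroids(sky_mask):
    # sky_mask: HxW boolean array, True for sky pixels; assumes both sky and
    # ground are present inside the circular mask.
    h, w = sky_mask.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    disk = (xx - cx)**2 + (yy - cy)**2 <= min(cx, cy)**2   # circular mask
    sky = sky_mask & disk
    gnd = (~sky_mask) & disk
    sx, sy = xx[sky].mean(), yy[sky].mean()
    gx, gy = xx[gnd].mean(), yy[gnd].mean()
    # Direction from ground centroid to sky centroid; sign convention assumed.
    return np.arctan2(sx - gx, sy - gy)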
2.4 Using omnidirectional sensors
The use of a single perspective camera generates several drawbacks. Firstly, a partial
view of the environment and important occlusions in the horizon can have a serious
influence on the final result. Secondly, the horizon is visible only in a particular interval
of roll and pitch values. If the UAV gets out of this interval, the final image is exclusively
made of sky or earth and the horizon cannot be detected. Thirdly, it is only possible
to compute the roll angle while the pitch is only approximated thanks to a hypothesis
on the altitude of the UAV. All that pushed the need toward employing omnidirectional
sensors to capture the horizon in almost all scenarios. The horizon appears as a curve
in the omnidirectional image. It is common to use both fisheye and central catadioptric
sensors, as both are treated by the equivalence sphere theory proposed by . The
particular geometric characteristics of the catadioptric sensor will be briefly explained in
the next section. Once the horizon is detected, these characteristics are used to compute
the attitude of the UAV.
2.4.1 Central catadioptric projection of the horizon
As demonstrated in , a 3D sphere projects onto the equivalence sphere as a small circle,
and then onto the catadioptric image plane as an ellipse (see figure (5)). Consequently,
the attitude computation is based on searching for an ellipse in the omnidirectional
image, and the properties of the equivalent sphere allow one to deduce the roll and pitch
angles. Indeed, the normal of the projected horizon on the sphere, which also coincides
with the line passing through the center of the sphere of equivalence and through the
center of the earth, represents in fact the attitude of the UAV depending on the position
of the optical axis. The computation of the coordinates of the optical axis is then
sufficient to deduce the roll and pitch angles.
2.4.2 Horizon estimation and attitude computation
To estimate the horizon, the catadioptric image should first be segmented into sky and
ground to obtain the points belonging to the horizon. Next, the horizon points should be
back projected onto the equivalence sphere. Finally, the best plane passing through the
horizon points on that sphere should be estimated, and its normal deduced, which gives
the roll and pitch angles (e.g [2, 11]).
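A minimal sketch of this pipeline, assuming the unified (equivalence sphere) model with
mirror parameter xi and normalized image coordinates; the roll/pitch sign conventions
depend on the camera mounting and are an assumption here.

import numpy as np

def lift_to_sphere(pts, xi):
    # pts: Nx2 normalized image points (x, y); returns Nx3 unit vectors on
    # the equivalence sphere (standard unified-model back-projection).
    x, y = pts[:, 0], pts[:, 1]
    r2 = x**2 + y**2
    lam = (xi + np.sqrt(1.0 + (1.0 - xi**2) * r2)) / (r2 + 1.0)
    return np.stack([lam * x, lam * y, lam - xi], axis=1)

def attitude_from_horizon(pts, xi):
    S = lift_to_sphere(pts, xi)
    C = S - S.mean(axis=0)            # the horizon is a small circle, so fit
    _, _, Vt = np.linalg.svd(C)       # a general plane, not one through the origin
    n = Vt[-1] / np.linalg.norm(Vt[-1])
    roll = np.arctan2(n[0], n[1])     # signs depend on the camera mounting
    pitch = np.arcsin(np.clip(n[2], -1.0, 1.0))
    return roll, pitch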
In , they proposed to use an omnidirectional visual sensor in order to compute the
attitude of a UAV. They extended the work of [1, 22] to detect the curved horizon line.
They show an adaptation of Markov Random Fields (MRFs) to treat the deformations in
catadioptric images in order to detect the horizon, after which the catadioptric
geometric characteristics are used to compute the UAV attitude. This method gives
interesting results but does not sufficiently exploit the geometric characteristics of
catadioptric vision. Moreover, the segmentation step is time consuming and does not
permit a real time implementation. In , they achieve higher accuracy and lower
computation time. They use
Figure 5: The relation between the horizon projection and the roll and pitch angles.
(Adapted from ).
the geometric characteristics of the central catadioptric sensor to formulate the process
as an optimization problem, which is solved on the sphere of equivalence in order to
directly compute the attitude angles. In , a hybrid method that uses the horizon and the
homography is proposed. In [12, 13], they propose an approach similar to  for attitude
estimation, and a stereo-based system for height and motion estimation.
3 Vanishing Points
In [11, 2], the horizon was determined with Markov Random Fields or an RGB based
Mahalanobis distance. This approach requires conditions where the horizon is visible
(e.g. low altitude in urban environments). In addition, it cannot be used to estimate the
yaw angle. In urban environments, the world reference can be the parallel lines which
are a basic property of man-made structures. In these situations, vanishing points at the
intersection of parallel vertical and horizontal lines can be used for attitude
estimation.
In , a batch process was developed to recover the history of camera orientations from a
non-linear optimization (bundle adjustment) of the vanishing points. In , their approach
is based on vanishing point detection using raw line measurements directly to refine the
attitude; they do not require any line tracking, but instead fuse these line measurements
with IMU gyro angles and compare each line segment with the current best attitude
estimate.
Vanishing points have been further exploited with omnidirectional sensors. In , they use
lines that are available in urban areas, which avoids the limitations of horizon
determination; but it is still not possible to estimate the yaw angle, and determining
the sky is still required. Therefore, their approach is not suitable in dense city
environments or closed areas. A more recent work  proposes the use of vanishing points
and the infinite homography to estimate the helicopter attitude. This approach can be
used in urban environments; however, this method has never been applied to a real UAV.
In , they used the approach described in  to estimate helicopter attitude and improved
it using an EKF.
The research area of using vanishing points for attitude estimation is very active.
It provides an intuitive solution to the attitude estimation problem, especially in urban
environments. Due to its importance, the following subsections will explain it in more
detail, using perspective and omnidirectional sensors. For a comprehensive evaluation
of several approaches for vanishing point detection, the reader is referred to [31, 32].
3.1 Using perspective sensors
The perspective projections of parallel lines intersect at a single point in the image
called the vanishing point. In , given the camera calibration matrix, the geometric
relationship between the vanishing points, the horizon, and the camera orientation has
been well established on a Gaussian sphere using 2D projective geometry . All vanishing
points can be considered in a Gaussian sphere representation, even those at infinity.
For more details on representing vanishing points on a Gaussian sphere from a calibrated
camera (see figure (6)), the reader is referred to [33, 34, 8].
3.1.1 Gaussian sphere
Figure 6: Gaussian Sphere adapted from 
The Gaussian sphere is a unit sphere which shares the optical center of the pinhole
camera. In the 2D projective space, an image line is represented as the normal vector of
a great circle in homogeneous coordinates. The intersection of two parallel edges is a
vanishing point, which can be computed by the duality between points and lines in a
projective plane, i.e. v_ij = l_i x l_j, where v_ij is a vanishing point and l_i and l_j
are parallel lines. The vanishing point is the direction to the corresponding 3D point
at infinity.
In a calibrated camera, the vanishing points formed by vertical edges and those formed by
horizontal edges, taken as unit directions on the Gaussian sphere, are geometrically
constrained to:

v_vertical . v_horizontal,i = 0,   i = 1, ..., n.
Vanishing points that lie on the same plane define a vanishing line in the image. The
horizon is then the vanishing line that links any two horizontal vanishing points. The
horizon is dual to the vertical vanishing point. This can be geometrically explained by
noting that the horizon is the projection of the world ground plane, while the normal to
the ground plane projects to the vertical vanishing point, i.e., in normalized
(calibrated) coordinates:

l_horizon = v_vertical.
The UAV attitude can be determined when either the vertical vanishing point or at least
two horizontal vanishing points are recovered from the image, given that a) the great
circle on the Gaussian sphere has the same orientation as the world ground plane, and
b) the relative camera pose with respect to the UAV is known. In general, it is assumed
that the camera is attached to the UAV with the camera's principal axis aligned along the
vehicle's longitudinal axis.
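As a sketch of the dual construction used throughout this subsection, image lines and
their intersections are simple cross products in homogeneous coordinates (the point
coordinates below are purely illustrative and assumed already normalized by the
calibration):

import numpy as np

def line_through(p1, p2):
    # Homogeneous image line through two homogeneous points.
    return np.cross(p1, p2)

def vanishing_point(l1, l2):
    v = np.cross(l1, l2)              # v_ij = l_i x l_j
    return v / np.linalg.norm(v)      # unit direction on the Gaussian sphere

# Two image segments of parallel scene edges (coordinates made homogeneous):
l1 = line_through(np.array([10.0, 200.0, 1.0]), np.array([40.0, 120.0, 1.0]))
l2 = line_through(np.array([300.0, 210.0, 1.0]), np.array([280.0, 130.0, 1.0]))
v12 = vanishing_point(l1, l2)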
3.1.2 Vertical vanishing points
In urban environments, vertical edges meet at a single vanishing point in the same
direction as gravity in the world coordinates. The vertical vanishing point is the
perspective projection of the world z-axis under the camera pose matrix. Let
v_vertical = (v_x, v_y)^T be the vertical vanishing point; once it is found, the attitude
can be immediately computed by (see figure (7)):

roll = phi = atan2(v_x, v_y),   pitch = theta = atan(f / sqrt(v_x^2 + v_y^2)),

where f is the focal length. The horizon line on the image is then the line defined by
the vertical vanishing point through the pole-polar duality noted above.
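In code, the closed form above is immediate (the pitch expression follows our reading of
the truncated formula, with the focal length f in pixels):

import numpy as np

def attitude_from_vertical_vp(vx, vy, f):
    roll = np.arctan2(vx, vy)
    pitch = np.arctan(f / np.hypot(vx, vy))   # vanishing point at infinity => zero pitch
    return roll, pitch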
3.1.3 Horizontal vanishing points
In urban environments, horizontal edges, which are orthogonal to the gravity direction,
meet at vanishing points in the world ground plane (see figure (8)). One of the
horizontal vanishing points is the perspective projection of the world x-axis under the
camera pose matrix. The horizontal vanishing point is then:

v_horizontal = K R (cos psi, sin psi, 0)^T,

where psi is the yaw angle, K is the calibration matrix, and R is the camera rotation.
All the horizontal vanishing points lie along the horizon, and their locations are
determined by the different yaw angles.
Figure 7: Illustration of the relation between a vertical vanishing point and the roll
and pitch angles.
Figure 8: Horizontal vanishing points.
3.2 Using omnidirectional sensors
As previously mentioned, the projection of 3D world points onto the image plane can be
done in three steps: first the point is projected onto the equivalent sphere, then onto
the plane at infinity, and finally onto the image plane. Moreover, the projection of a
3D line generates a great circle on the equivalent sphere (see figure (5)). By back
projecting every candidate edge onto the sphere and checking whether each edge verifies
the great-circle constraint, one can decide which edges belong to real 3D lines. To do
this, the edges, divided according to their gradient orientations and selected by their
lengths, are back projected onto the sphere. The plane normal of each great circle is
then computed as the cross product of the first and last edgel directions. In addition,
parallel lines have the same
vanishing direction on the equivalent sphere. Therefore, dominant parallel lines can be
extracted by counting lines which satisfy some similarity threshold based on their van-
ishing direction. By excluding found parallel lines and repeating the same algorithm,
these dominant vanishing directions can be found. Based on an orthogonality threshold,
if |u_1 x u_2| >= OrthogonalityThreshold, the cross product u_3 = u_1 x u_2 is computed
to determine the third vanishing direction, where u_1 and u_2 are the two dominant,
mutually orthogonal vanishing directions. If the inequality is not satisfied, the
detection of orthogonal parallel lines has failed; attitude estimation at that frame is
therefore skipped, and the UAV is assumed not to have changed its orientation.
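A sketch of this voting-and-exclusion procedure on unit vanishing directions; both
thresholds are illustrative assumptions, and at least two distinct direction clusters
are assumed present.

import numpy as np

def dominant_direction(dirs, sim_thresh=0.95):
    # dirs: Nx3 unit vanishing directions. Greedily pick the direction
    # supported by the most near-parallel lines; return it and the rest.
    best, best_support, keep = None, -1, None
    for d in dirs:
        support = np.abs(dirs @ d) > sim_thresh     # sign-invariant similarity
        if support.sum() > best_support:
            best, best_support, keep = d, support.sum(), ~support
    return best, dirs[keep]

def orthogonal_triad(dirs, orth_thresh=0.95):
    u1, rest = dominant_direction(dirs)
    u2, _ = dominant_direction(rest)
    if np.linalg.norm(np.cross(u1, u2)) < orth_thresh:
        return None               # orthogonal pair not found: skip this frame
    u3 = np.cross(u1, u2)         # third vanishing direction
    return u1, u2, u3 / np.linalg.norm(u3)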
4 Stereo Vision And Optical Flow
Figure 9: Stereo vision and optical flow: (a) stereo vision system; (b) phase-based
estimation of the optical flow field, adapted from 
4.1 Stereo vision
Computer stereo vision is the part of computer vision in which two cameras capture the
same scene from viewpoints separated by a distance, as shown in figure (9a). A computer
compares the two images, shifting one over the other to find the parts that match; the
amount of shift is called the disparity.
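Although the reviewed papers use disparity in different ways, it is worth recalling the
standard rectified-pinhole relation (not stated explicitly in the cited works) that links
disparity to depth: with focal length f (in pixels), baseline B, and disparity d,

Z = f B / d,

so larger disparities correspond to closer scene points, which is what makes stereo
useful for altitude and obstacle estimation.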
In , the authors used a dual-CCD stereo vision system in order to improve the computation
of the attitude by determining the complete pose of the UAV, taking advantage of the UKF.
However, this system relies on the capture of ground targets/landmarks in both images,
which limits the environment in which the UAV can move. In , they presented a mixed
stereoscopic vision system made of fish-eye and perspective cameras for altitude
estimation. Since there exists a homography between the two captured views, where the
sensor is calibrated and the attitude is estimated by the fish-eye camera using the
techniques in [2, 3], the algorithm searches for the altitude which verifies this
homography; it allows a real time implementation. In [12, 13], the conventional stereo
system was used for altitude computation, but for attitude computation they again used
an approach similar to .
4.2 Optical flow
Optical flow is the approximation of the motion field which can be computed from
time-varying image sequences (see figure (9b)). It provides many important visual cues .
It is possible to estimate the flight altitude from the observed optical flow in the
downward direction: faster optic flow indicates a lower flight altitude. Obstacles can be
detected in the forward direction by detecting expansion, or divergence, in the forward
visual field.
Optical flow estimation methods are based on: a) differential techniques (dense motion
field), where spatial and temporal variations of the image brightness at all pixels are
considered; b) phase methods, where the responses of filters to energy signals are used;
and c) matching techniques (sparse motion field), where the disparity of special image
points (features) between frames is estimated.
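As a concrete example of a dense (differential) computation, the following sketch uses
OpenCV's Farneback method; this library choice is ours for illustration, not the method
of any cited paper. prev and curr are consecutive grayscale frames as 8-bit arrays.

import cv2

def dense_flow(prev, curr):
    flow = cv2.calcOpticalFlowFarneback(
        prev, curr, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    return flow    # HxWx2 array of per-pixel (dx, dy) displacements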
In [39, 40], they derived a form of the KF that uses the relationship between
vision-based measurements and the motion of the camera. The resulting implicit extended
Kalman filter (IEKF) can be used to recover the camera motion states. In , they reused
the work of [39, 40] for an aircraft state-estimation problem by incorporating aircraft
dynamics into the IEKF framework. The resulting formulation partially estimated the
aircraft states but exhibited relatively slow convergence. Improvements have been
demonstrated by [42, 43], who also used an aircraft model. Unfortunately, accurate MAV
models are often not available within an aggressive flight regime where the aerodynamics
are difficult to characterize.
Several techniques have utilized the kinematic relationship between camera motion and the
resulting optical flow to directly solve for unknown motion parameters using constrained
optimization. The techniques in [44, 45, 46] depend on at least partial knowledge of the
translational velocity for use in the optimization; this knowledge often depends on GPS
measurements. In , they addressed the problem of estimating aircraft states during a
GPS-denied mission segment. An iterative optimization approach is adopted to determine
the angular rates and the wind-axis angles; no knowledge of the vehicle velocity is
required. The coupled aircraft-camera kinematics are used to solve for aircraft states in
a similar fashion to previous efforts; however, the velocity dependencies are removed by
decoupling the optical flow resulting from angular and translational motion,
respectively. Angular rate estimates are obtained first and then used to set up a simple
linear least-squares problem for the aerodynamic angles. The performance of the
least-squares problem is further improved through the application of a weighting scheme
derived from parallax measurements.
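A sketch of the rotational part of such a decoupling: for distant scene points, the
translational term of the calibrated optical flow equations vanishes, leaving a model
that is linear in the angular rates, so a least-squares estimate applies. Coordinates are
normalized, and the signs follow one common form of the flow equations (an assumption on
conventions).

import numpy as np

def angular_rates_from_flow(x, y, u, v):
    # x, y: N normalized image coordinates; u, v: measured flow components.
    rows_u = np.stack([x * y, -(1 + x**2), y], axis=1)
    rows_v = np.stack([1 + y**2, -x * y, -x], axis=1)
    A = np.vstack([rows_u, rows_v])           # 2N x 3 rotational flow model
    b = np.concatenate([u, v])
    omega, *_ = np.linalg.lstsq(A, b, rcond=None)
    return omega                               # (wx, wy, wz) estimate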
However, optical flow is inherently noisy, and obtaining dense and accurate optical flow
images is computationally expensive. Additionally, systems that rely on optical flow for
extracting range information need to discount the components of optical flow that are
induced by rotations of the aircraft, and use only those components that are generated
by the translational motion of the vehicle. This either requires an often noisy, numerical
estimate of the roll, pitch, and yaw rates of the aircraft, or additional apparatus for their
explicit measurement, such as a three-axis gyroscope. Furthermore, the range perceived
from a downward facing camera or optical flow sensor is only dependent upon altitude,
velocity, and the aircraft’s attitude .
Optical flow has also been used for providing guidance for autonomous aircraft operating
in low-altitude or cluttered environments [5, 48]. In , the optical flow of the image for
each candidate horizon line is calculated,
and using these measurements from the perspective camera, they are able to estimate the
body rates of the aircraft. In , they estimate the heading of a small fixed-pitch
four-rotor helicopter. Heading estimates are computed using the optical flow technique of
phase correlation on images captured by a downward facing camera. The camera is fitted
with an omnidirectional lens, and the images are transformed into the log-polar domain
before the main computational step.
4.3 Optical flow from stereo vision
In [5, 48, 50], they proposed a stereo vision system from two non-central catadioptric
cameras. The profile of the mirror is designed to ensure that equally spaced points on
the ground, on a line parallel to the camera’s optical axis, are imaged to points that
are equally spaced in the camera’s image plane. However, they have not used physical
mirrors, but instead used high resolution video cameras equipped with wide-angle fish-
eye lenses and simulated the imaging properties of the mirrors by means of software
lookup tables. Given the measured disparity surface from the optical flow, the attitude
(roll and pitch) and altitude can be estimated by iteratively fitting the modelled surface
to the measurements. They propose to enhance their method by estimating attitude and
altitude with respect to an assumed ground plane by reprojecting the disparity points into
3D coordinates. In , he presents a technique for estimating the aerodynamic attitude
in the presence of dynamic obstacles. This technique relies on optical flow and stereo
vision to remove dynamic objects from the static background. The resulting flow field is
used for attitude computation from the calculated flow centroids.
5 Conclusions
Any UAV may fly at low, middle, or high altitudes. We believe that omnidirectional
sensors should always be used, because either the horizon is always visible (middle and
high altitudes) or the vanishing point directions are available (low altitudes). If the
horizon is visible, then the attitude should be estimated based on it; we have proposed a
simpler method for segmentation and horizon detection, based on polarization, which can
be used.
In urban environments, techniques based on vanishing points should be used. If obstacle
avoidance and altitude estimation are required along with attitude estimation, then
optical flow approaches from stereoscopic sensors are recommended.

Papers                                                      Roll   Pitch   Yaw
[1, 22, 23, 24, 25, 14, 11, 2, 7, 9]
[27, 12, 13]
Stereo vision and optical flow:
[46, 47, 7, 5, 48, 50, 51, 37]

Table 1: The estimated attitude angles (Roll, Pitch, and Yaw).
In the work presented in sections (2.4, 3), the catadioptric sensors used were assumed
to be central sensors having a single view point. But in practice, non-central catadioptric
sensors are more practical due to their higher resolution and simplicity of design. Even
sensors claimed to be central might be slightly non-central due to possible misalignments
of the lens. All of this pushed the need toward developing methods for attitude estimation
from non-central catadioptric sensors.
Currently, we develop an approach for UAV attitude estimation from uncalibrated
non-central catadioptric sensor with unknown mirror shape. We assume that the mirror is
a symmetric surface of revolution (SOR), the catadioptric image has two quadrics (inner
and outer cross sections) as shown in figure (4), and a horizontal reference is visible in
the captured image (e.g horizon, horizontal edges). We try to find the angle between the
axis of revolution and the normal of the plan containing the horizontal reference.
In summary, a comprehensive review of attitude estimation approaches from visual sensors
has been given. Table (1) shows the papers and the estimated angles in the reviewed work.
The main general approaches have been briefly discussed. Horizon detection, which is the
main key for attitude estimation at middle and high altitudes, has been discussed in the
light of current ongoing work using different visual sensors. The sky/ground segmentation
methods for horizon detection found in the literature have been reviewed, and a novel
approach based on polarization applied to UAV attitude estimation has been proposed. At
low altitudes the horizon is mostly invisible; hence, the line segments found in man-made
structures are exploited to obtain vanishing points for attitude computation.
Stereoscopic and optical flow based techniques have also been covered. Optical flow
computation from stereoscopic systems has been proposed in very recent works. To the best
of our knowledge, the main work done on UAV attitude estimation from vision sensors
(perspective and omnidirectional only) has been covered here.
 Scott M. Ettinger, Michael C. Nechyba, Peter G. Ifju, and Martin Waszak. Vision-
guided flight stability and control for micro air vehicles. In IEEE/RSJ Int Conf on
Robots and Systems, pages 2134–2140, 2002.
 Cédric Demonceaux, Pascal Vasseur, and Claude Pégard. Omnidirectional vision
on uav for attitude computation. In ICRA, pages 2842–2847, 2006.
 Cédric Demonceaux, Pascal Vasseur, and Claude Pégard. Uav attitude computation
by omnidirectional vision in urban environment. In ICRA, pages 2017–2022, 2007.
 Geoffrey L. Barrows, Javaan S. Chahl, and Mandyam V. Srinivasan. Biomimetic
visual sensing and flight control. In Proceedings Seventeenth International Un-
manned Air Vehicle Systems Conference, 2002.
 Richard J. D. Moore, Saul Thurrowgood, Daniel Bland, Dean Soccol, and
Mandyam V. Srinivasan. A stereo vision system for uav guidance. In IEEE/RSJ
International Conference on Intelligent Robots and Systems, 2009.
 Shan-Chih Hsieh, L.K. Wang, Fei-Bin Hsaio, Kou-Yuan Huang, and Fan-Jen Tsai.
Airborne attitude/ground target location determinations using unscented kalman
filter. In Aerospace Conference, 2004. Proceedings. 2004 IEEE, volume 3, 2004.
 Damien Dusha, Wageeh Boles, and Rodney Walker. Attitude estimation for a fixed-
wing aircraft using horizon detection and optical flow. In Proceedings of the 9th
Biennial Conference of the Australian Pattern Recognition Society on Digital Image
Computing Techniques and Applications, DICTA '07, pages 485–492, Washington,
DC, USA, 2007. IEEE Computer Society.
Myung Hwangbo and Takeo Kanade. Visual-inertial attitude estimation using urban
scene regularities. To appear in IEEE International Conference on Robotics and
Automation.
 Saul Thurrowgood, Dean Soccol, Richard J. D. Moore, Daniel Bland, and
Mandyam V. Srinivasan. A vision based system for attitude estimation of uavs.
In IEEE/RSJ International Conference on Intelligent Robots and Systems, 2009.
 Sinisa Todorovic and Michael C. Nechyba. Sky/ground modeling for autonomous
mav flight. In In IEEE International Conference on Robotics and Automation
(ICRA, pages 1422–1427, 2003.
 Cédric Demonceaux, Pascal Vasseur, and Claude Pégard. Robust attitude estima-
tion with catadioptric vision. In IROS, pages 3448–3453, 2006.
Iván F. Mondragón, Miguel A. Olivares-Méndez, Pascual Campoy, Carol Martínez,
and Luís Mejias. Unmanned aerial vehicles uavs attitude, height, motion estimation
and control using visual systems. Autonomous Robots, 29, 2010.
 Iván F. Mondragón, Pascual Campoy, Carol Martinez, and Miguel Olivares. Om-
nidirectional vision applied to unmanned aerial vehicles uavs attitude and heading
estimation. Robotics and Autonomous Systems, March 2010.
T. D. Cornall, G. K. Egan, and A. Price. Aircraft attitude estimation from horizon
video. Electronics Letters, 42(12), June 2006.
 B. Taylor, C. Bil, and S. Watkins. Horizon sensing attitude stabilisation: A vmc
autopilot. In 18th International UAV Systems Conference, 2003.
 Patrick Terrier, Vincent Devlaminck, and Jean Michel Charbois. Segmentation of
rough surfaces using a polarization imaging system. J. Opt. Soc. Am., pages 423–
 L. B. Wolff. Polarization-based material classification from specular reflection.
IEEE Trans. Pattern Anal. Mach. Intell., 12:1059–1071, November 1990.
 Bin Xie, Zhiyu Xiang, Huadong Pan, and Jilin Liu. Polarization-based water haz-
ards detection for autonomous off-road navigation. In Intelligent Robots and Sys-
tems, 2007. IROS 2007. IEEE/RSJ International Conference on, pages 3186–3190, 2007.
G. Bao, Z. Zhou, S. Xiong, X. Lin, and X. Ye. Towards micro air vehicle flight
autonomy: research on the method of horizon extraction. In Instrumentation and
Measurement Technology Conference, IMTC '03, Proceedings of the 20th IEEE, 2003.
 S. Todorovic and M. C. Nechyba. A vision system for intelligent mission profiles
of micro air vehicles. IEEE Transactions on Vehicular Technology, 53:1713–1725,
Scott M. Ettinger, Michael C. Nechyba, Peter G. Ifju, and Martin Waszak. Vision-
guided flight stability and control for micro air vehicles. Advanced Robotics,
17(7):617–640, November 2003.
 Terry D. Cornall and G. K. Egan. Measuring horizon angle from video on a small
unmanned air vehicle. In 2nd International Conference on Autonomous Robots and
 T. D. Cornall and G. K. Egan. Measuring horizon angle from video on a small un-
manned airborne vehicle. In 2nd International Conference on Autonomous Robots
and Agents, Palmerston North, New Zealand, 2004.
Terry D. Cornall and G. K. Egan. Measuring horizon angle from video on a small un-
manned air vehicle. Technical report, MONASH University, Department of Elec-
trical and Computer Systems Engineering, 2005.
 S. Baker and S. K. Nayar. A theory of catadioptric image formation. In Interna-
tional Conference on Computer Vision (ICCV03), pages 1351–1358, Oct 2003.
J. C. Bazin, I. S. Kweon, C. Demonceaux, and P. Vasseur. Uav attitude estimation
by combining horizon-based and homography-based approaches for catadioptric
image. In 6th IFAC/EURON Intelligent Autonomous Vehicles (IAV07), Toulouse,
France, 2007.
 M.E. Antone and S. Teller. Automatic recovery of relative camera rotations for
urban scenes. pages II: 282–289, 2000.
Jean Charles Bazin, Inso Kweon, Cédric Demonceaux, and Pascal Vasseur. Uav
attitude estimation by vanishing points in catadioptric images. In ICRA, 2008.
 Metin Tarhan and Erdinc Altug. Ekf based attitude estimation and stabilization of a
quadrotor uav using vanishing points in catadioptric images. Journal of Intelligent
& Robotic Systems, pages 1–21, September 2010.
 J. A. Shufelt. Performance evaluation and analysis of vanishing point detection
techniques. Pattern Analysis and Machine Intelligence, IEEE Transactions on,
 Patrick Denis, James H. Elder, and Francisco J. Estrada. Efficient edge-based meth-
ods for estimating manhattan frames in urban imagery. In Proceedings of the 10th
European Conference on Computer Vision: Part II, pages 197–210, Berlin, Heidel-
berg, 2008. Springer-Verlag.
S. T. Barnard. Interpreting perspective images. Artificial Intelligence, 21:435–462, 1983.
Carsten Rother. A new approach for vanishing point detection in architectural
environments. In Proc. 11th British Machine Vision Conference, pages 382–391, 2000.
 L.K. Wang, S.-C. Hsieh, E.C.-W. Hsueh, Fei-Bin Hsaio, and Kou-Yuan Huang.
Complete pose determination for low altitude unmanned aerial vehicle using stereo
vision. In Intelligent Robots and Systems, 2005. (IROS 2005). 2005 IEEE/RSJ
International Conference on, pages 108 – 113, 2005.
 D. Eynard, P. Vasseur, C. Demonceaux, and V. Fremont. Uav altitude estimation
by mixed stereoscopic vision. In Intelligent Robots and Systems (IROS), 2010
IEEE/RSJ International Conference on, pages 646 –651, 2010.
J. J. Gibson. The ecological approach to visual perception. Houghton Mifflin, 1979.
S. Soatto, R. Frezza, and P. Perona. Motion estimation via dynamic vision. IEEE
Transactions on Automatic Control, 41:393–413, 1996.
 S. Soatto and P. Perona. Recursive 3-d visual motion estimation using subspace
constraints. International Journal of Computer Vision, 22:235–259, 1997.
 P. Gurfil and H. Rotstein. Partial aircraft state estimation from visual motion using
the subspace constraints approach. Journal of Guidance, Control, and Dynamics,
 T. Webb, R. Prazenica, A. Kurdila, and R. Lind. Vision-based state estimation
for autonomous micro air vehicles. Proc. of the AIAA Guidance, Navigation, and
Control Conference, page 5249, 2004.
T. Webb, R. Prazenica, A. Kurdila, and R. Lind. Vision-based state estimation for
uninhabited aerial vehicles. Proc. of the AIAA Guidance, Navigation, and Control
Conference, page 5869, 2005.
 G. Gebert, D. Snyder, J. Lopez, N. Siddiqi, and J. Evers. Optical flow angular
rate determination. Proc. of the International Conference on Image Processing,
 R.V. Iyer, Z. He, and P.R. Chandler. On the computation of the ego-motion and
distance to obstacles for a micro air vehicle. Proc. of the IEEE American Control
Conference.
 J. Kehoe, R. Causey, A. Arvai, and R. Lind. Partial aircraft state estimation from
optical flow using non-model-based optimization. Proc. of the IEEE American
Control Conference, 2006.
Joseph J. Kehoe, Adam S. Watkins, Ryan S. Causey, and Rick Lind. State estimation
using optical flow from parallax-weighted feature tracking. Proceedings of the
AIAA Guidance, Navigation, and Control Conference, 2006.
 Richard J. D. Moore, Saul Thurrowgood, Dean Soccol, Daniel Bland, and
Mandyam V. Srinivasan. A bio-inspired stereo vision system for guidance of au-
tonomous aircraft. Advances in Theory and Applications of Stereo Vision, 2010.
 John Stowers, Andrew Bainbridge-Smith, Michael Hayes, and Steven Mills. Op-
tical flow for attitude estimation of a quadrotor helicopter. European Micro Air
Vehicle Conference, 2009.
 Richard J. D. Moore, Saul Thurrowgood, Daniel Bland, Dean Soccol, and
Mandyam V. Srinivasan. Uav altitude and attitude stabilisation using a coaxial
stereo vision system. In IEEE International Conference on Robotics and Automation.
 Chris Hedden. Vision-based uav aerodynamic attitude estimation in the presence
of dynamic obstacles. University of Kansas, 12 2010.