Conference Paper

Tracking ground targets with measurements obtained from a single monocular camera mounted on an unmanned aerial vehicle

Autonomous Vehicle Syst. Lab., Kansas State Univ., Manhattan, KS
DOI: 10.1109/ROBOT.2008.4543188 Conference: Robotics and Automation, 2008. ICRA 2008. IEEE International Conference on
Source: OAI

ABSTRACT: In this paper, a novel method is presented for tracking ground targets from an unmanned aerial vehicle (UAV) outfitted with a single monocular camera. The loss of observability resulting from the use of a single monocular camera is dealt with by constraining the target vehicle to follow the ground terrain. An unscented Kalman filter (UKF) provides a simultaneous localization and mapping solution for the estimation of aircraft states and feature locations, which define the target's local environment. A second, loosely coupled Kalman filter for the target states receives 3D measurements of target position, with estimated covariance obtained by an unscented transformation (UT). The UT uses the mean and covariance of the camera measurements and of the UKF-estimated aircraft states and feature locations to determine the estimated target mean and covariance. Simulation results confirm the concepts developed.
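The unscented transformation at the core of this measurement mapping can be sketched as follows. This is a generic UT implementation, not the paper's own code: the function name and the scaling parameters (alpha, beta, kappa) are illustrative defaults, and the nonlinear function `f` stands in for the camera-to-ground projection that the paper composes from aircraft states and feature locations.

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear function f:
    sample 2n+1 sigma points, push each through f, and recombine them
    with the standard UT weights to get the transformed mean and covariance."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    # Matrix square root of (n + lambda) * cov via Cholesky factorization
    S = np.linalg.cholesky((n + lam) * cov)
    # Sigma points: the mean, plus mean +/- each column of S
    sigma = np.vstack([mean, mean + S.T, mean - S.T])   # shape (2n+1, n)
    # Weights for the mean and covariance recombination
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    # Transform each sigma point and recombine
    y = np.array([f(s) for s in sigma])                 # shape (2n+1, m)
    y_mean = wm @ y
    d = y - y_mean
    y_cov = (wc[:, None] * d).T @ d
    return y_mean, y_cov
```

A quick sanity check on the design: for a linear function the UT is exact, so pushing a Gaussian through `f(x) = A @ x + b` should return exactly `A @ mean + b` and `A @ cov @ A.T`, which makes the routine easy to unit-test before substituting a nonlinear camera model.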

  • ABSTRACT: We present a method for simultaneously locking on to a ground target and estimating the position of an unmanned aerial vehicle (UAV) under countermeasure (CM) conditions, in which sensors are prevented from successfully tracking a target. Owing to the limited payload and power of UAVs, we employ a monocular camera and a global positioning system (GPS) to carry out vision-based simultaneous localization and mapping (SLAM) using both an unscented Kalman filter and a Kalman filter. Since this approach estimates the state of the UAV and the location of the target, we can estimate the position of the target in the image even in the presence of CMs. Our experiments show that the proposed method successfully locks on to the target and estimates the state of the UAV.
    2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, October 18-22, 2010, Taipei, Taiwan; 01/2010
  • ABSTRACT: Vision-based computation with a moving CCD camera in an outdoor environment is one of the most difficult tasks in computer vision (CV) research. To run even a single CV function reliably, one often needs a family of algorithms to cope with the inevitable changes of a complex environment, which raises the problem of choosing when to switch between these algorithms as the imaging definition changes abruptly. In this paper, we propose using Image Quality (IQ) as a measurement for finding proper switching occasions between different CV algorithms in an outdoor robot system. We first define three IQ metrics to describe the imaging definition of a CCD camera. We then present an ARMA-ARCH model based multiple-flows method to detect abrupt changes in these series. Finally, we use our method to cut the image sequence into multiple segments, each suited to processing by a different CV algorithm. Extensive experimental results demonstrate the validity of our method.
    Information and Automation (ICIA), 2010 IEEE International Conference on; 07/2010
  • ABSTRACT: For a ground-target tracking system using an unmanned helicopter, an on-board pan-tilt controller is proposed to adjust the attitude of the camera so as to keep the target at the center of the image plane while the target is in sight. When the target is temporarily out of view for any reason, the designed controller makes the camera quickly re-capture the target by estimating its state and regulating the orientation of the camera accordingly. Specifically, a novel state transformation is first introduced to make the error system independent of the target image. A nonlinear pan-tilt controller is then designed for the transformed error system, and Lyapunov techniques are employed to prove that the target tracking error in the image plane is driven to zero exponentially fast. To facilitate implementation of the constructed controller, a visual estimator with an effective nonlinear filter is also utilized to obtain the state of the target even when it is occasionally outside the camera's view. Simulation results are provided to validate the performance of the presented control system.
    Proceedings of the IEEE International Conference on Control Applications, CCA 2011, Denver, CO, USA, September 28-30, 2011; 01/2011
