Conference Paper

Effects of Parameters Variations in Particle Filter Tracking.

Multitel A.S.B.L., Mons
DOI: 10.1109/ICIP.2006.313126 Conference: Proceedings of the International Conference on Image Processing, ICIP 2006, October 8-11, Atlanta, Georgia, USA
Source: DBLP


Many visual tracking implementations have been proposed over the years, but the lack of a standard evaluation process has prevented fair comparison between them. In this paper, we evaluate different particle filter methods in people tracking applications. We introduce an objective metric and report results under different parameter variations. Finally, based on our evaluations, we propose a new particle filter configuration that outperforms other current implementations.
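For context, the parameters varied in such an evaluation typically include the number of particles and the noise models. The following is a minimal sketch of a generic bootstrap (SIR) particle filter for a 1-D state, with the particle count exposed as a tunable parameter; it is an illustrative assumption, not the configuration studied in the paper:

```python
import math
import random

def sir_particle_filter(observations, n_particles=100,
                        process_noise=1.0, obs_noise=1.0, seed=0):
    """Minimal bootstrap (SIR) particle filter for a 1-D random-walk state.

    `n_particles`, the noise levels, and the motion model are illustrative
    choices, not the paper's actual implementation.
    """
    rng = random.Random(seed)
    # Initialize particles around the first observation.
    particles = [rng.gauss(observations[0], obs_noise) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Predict: propagate each particle through the random-walk motion model.
        particles = [p + rng.gauss(0.0, process_noise) for p in particles]
        # Update: weight each particle by the Gaussian observation likelihood.
        weights = [math.exp(-0.5 * ((z - p) / obs_noise) ** 2) for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Estimate: weighted mean of the particle set.
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # Resample (multinomial) to concentrate particles on likely states.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates
```

Increasing `n_particles` trades computation time for estimation accuracy, which is exactly the kind of parameter variation an objective tracking metric can quantify.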



Available from: Céline Thillou, Oct 04, 2015
  • ABSTRACT: Object tracking in video sequences is an important task in many applications such as video surveillance, traffic monitoring, marketing, and sport analysis. Enhancing these technologies requires an objective performance evaluation, in which the system is run on a given dataset and its output is compared with the ground truth. One of the contributions of the TRICTRAC project is the supply to the video processing community of synthetic, high-definition video content from Pan-Tilt-Zoom (PTZ) cameras with 3D ground truth, including the parameters of the cameras and the mobile objects. This paper presents this novel dataset.
  • ABSTRACT: Robust real-time tracking of non-rigid objects is a challenging task. Particle filtering has proven very successful for non-linear and non-Gaussian estimation problems. This article presents the integration of color distributions into particle filtering, which has typically been used in combination with edge-based image features. Color distributions are applied because they are robust to partial occlusion, are rotation and scale invariant, and are computationally efficient. Since the color of an object can vary over time depending on the illumination, the visual angle, and the camera parameters, the target model is adapted during temporally stable image observations. An initialization based on an appearance condition is introduced, since tracked objects may disappear and reappear. Comparisons with the mean shift tracker, and with a combination of the mean shift tracker and Kalman filtering, show the advantages and limitations of the new approach.
    Image and Vision Computing 02/2003; 21(1):99-110. DOI:10.1016/S0262-8856(02)00129-4
  • ABSTRACT: A new method for real-time tracking of non-rigid objects seen from a moving camera is proposed. The central computational module is based on mean shift iterations and finds the most probable target position in the current frame. The dissimilarity between the target model (its color distribution) and the target candidates is expressed by a metric derived from the Bhattacharyya coefficient. Theoretical analysis shows that the approach relates to the Bayesian framework while providing a practical, fast, and efficient solution. The tracker's ability to handle partial occlusions, significant clutter, and target scale variations in real time is demonstrated on several image sequences.
    Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 02/2000
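A single mean shift step of the kind described above can be sketched in 1-D: each pixel is weighted by sqrt(q_u / p_u), where q is the target histogram and p the candidate histogram at the current location, and the window moves to the weighted mean of pixel positions. The uniform kernel and the precomputed bin assignments are simplifying assumptions for illustration:

```python
import math

def mean_shift_step(positions, pixel_bins, target_hist, candidate_hist):
    """One mean shift iteration over a 1-D window.

    positions:      pixel coordinates inside the current window
    pixel_bins:     histogram bin index of each pixel
    target_hist:    target model q (normalized)
    candidate_hist: candidate p at the current location (normalized)

    Returns the new window center: the weighted mean of positions, with
    per-pixel weights sqrt(q_u / p_u) derived from maximizing the
    Bhattacharyya coefficient (uniform kernel assumed for simplicity).
    """
    weights = [
        math.sqrt(target_hist[b] / candidate_hist[b]) if candidate_hist[b] > 0 else 0.0
        for b in pixel_bins
    ]
    total = sum(weights)
    if total == 0:
        return sum(positions) / len(positions)  # no evidence: stay at plain mean
    return sum(w * x for w, x in zip(weights, positions)) / total
```

Pixels whose color bin is under-represented in the candidate relative to the target pull the window toward themselves, so the center drifts toward regions that look more like the target model.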