Online Boosting for Vehicle Detection

Dept. of Electr. Eng., Nat. Taipei Univ. of Technol., Taipei, Taiwan
IEEE Transactions on Systems Man and Cybernetics Part B (Cybernetics) (Impact Factor: 3.78). 07/2010; 40(3):892 - 902. DOI: 10.1109/TSMCB.2009.2032527
Source: IEEE Xplore

ABSTRACT This paper presents a real-time vision-based vehicle detection system employing an online boosting algorithm: an online AdaBoost approach applied to a cascade of strong classifiers rather than a single strong classifier. Most existing cascades of classifiers must be trained offline and cannot be updated effectively when online tuning is required. The idea is to develop a cascade of strong classifiers for vehicle detection that can be trained online in response to changing traffic environments. To make the online algorithm tractable, the proposed system efficiently tunes parameters based on incoming images and the up-to-date performance of each weak classifier. The proposed online boosting method improves system adaptability and accuracy in dealing with novel types of vehicles and unfamiliar environments, whereas existing offline methods rely on far more extensive training to reach comparable results and cannot be updated online afterward. Our approach has been validated in real traffic environments through experiments with an onboard charge-coupled-device (CCD) camera in a roadway vehicle.
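The core mechanism the abstract describes, re-tuning each weak classifier's weight from incoming samples rather than from a fixed training set, can be sketched with Oza-style online AdaBoost for a single strong-classifier stage. This is a minimal illustrative sketch, not the paper's implementation: the class and variable names are assumptions, and the full system chains several such stages into a cascade.

```python
import math

class OnlineBoost:
    """Sketch of one online-boosted strong classifier (Oza-style online
    AdaBoost). Each weak classifier keeps running sums of correctly and
    wrongly classified sample weight; its voting weight is derived from
    the resulting error rate, so the ensemble adapts to streaming data."""

    def __init__(self, weak_learners):
        self.weak = weak_learners              # list of h(x) -> -1 or +1
        self.sc = [1e-10] * len(weak_learners)  # sum of correct sample weights
        self.sw = [1e-10] * len(weak_learners)  # sum of wrong sample weights

    def _eps(self, i):
        eps = self.sw[i] / (self.sc[i] + self.sw[i])
        return min(max(eps, 1e-6), 1 - 1e-6)    # clip to avoid log(0)

    def update(self, x, y):
        """Feed one labelled sample (y in {-1,+1}). The sample weight lam
        grows whenever a weak classifier errs, so later weak classifiers
        in the chain focus on the current mistakes."""
        lam = 1.0
        for i, h in enumerate(self.weak):
            correct = (h(x) == y)
            if correct:
                self.sc[i] += lam
            else:
                self.sw[i] += lam
            eps = self._eps(i)
            lam *= 0.5 / (1.0 - eps) if correct else 0.5 / eps

    def predict(self, x):
        """Weighted vote, alpha_i = 0.5 * ln((1 - eps_i) / eps_i)."""
        score = sum(0.5 * math.log((1 - self._eps(i)) / self._eps(i)) * h(x)
                    for i, h in enumerate(self.weak))
        return 1 if score >= 0 else -1


# usage: weak classifiers are simple threshold tests on a scalar feature
weak = [lambda x, t=t: 1 if x > t else -1 for t in (0.2, 0.5, 0.8)]
clf = OnlineBoost(weak)
for v in (0.9, 0.7, 0.6, 0.1, 0.3, 0.05):
    clf.update(v, 1 if v > 0.5 else -1)
print(clf.predict(0.95), clf.predict(0.1))  # → 1 -1
```

In the paper's setting, the weak classifiers would be Haar-like feature tests over image windows, and the same running-error statistics allow tuning without re-running an offline training pass.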

  • Neurocomputing 02/2015; 149:800-810. DOI:10.1016/j.neucom.2014.07.054 · 2.01 Impact Factor
  • 
    ABSTRACT: Vehicle counting systems have a wide range of applications, from visual surveillance to intelligent transportation. Because lighting conditions differ between day and night, there is no single unified method for capturing vehicles. To address this problem, we present a unified vehicle detection and counting algorithm based on a new multiple-feature background model that uses morphology and color difference. The novel contributions of this paper are: i) a unified vehicle detection and counting algorithm based on the proposed feature background model, ii) morphology filters that highlight vehicles in both day- and night-time imagery, and iii) integration of color-difference features to capture vehicles. The developed system has been implemented on an experimental camera and preliminarily tested in different situations. Experiments on a large number of highway scenes demonstrate that the proposed fast algorithm is more robust to illumination and background changes than competing works.
    2013 28th International Conference of Image and Vision Computing New Zealand (IVCNZ); 11/2013
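The morphology-plus-color-difference pipeline that abstract outlines can be sketched in a few lines: threshold the per-pixel color difference against a background model, then apply a morphological closing so each vehicle becomes one solid blob. This is an illustrative pure-NumPy stand-in under assumed names and thresholds, not the paper's implementation.

```python
import numpy as np

def morph_close(mask, k=3):
    """Binary closing (dilate then erode) with a k x k square element.
    Closing fills small holes and gaps so a fragmented vehicle response
    becomes one connected region."""
    pad = k // 2
    H, W = mask.shape

    def dilate(m):
        p = np.pad(m, pad)  # zero padding: border pixels can't grow inward
        return np.max([p[i:i + H, j:j + W]
                       for i in range(k) for j in range(k)], axis=0)

    def erode(m):
        p = np.pad(m, pad, constant_values=1)  # one padding for erosion
        return np.min([p[i:i + H, j:j + W]
                       for i in range(k) for j in range(k)], axis=0)

    return erode(dilate(mask))

def vehicle_mask(frame, background, diff_thresh=30):
    """Foreground mask from the per-pixel color difference against the
    background model (max over channels), cleaned by closing.
    diff_thresh is an illustrative value, not taken from the paper."""
    diff = np.abs(frame.astype(int) - background.astype(int)).max(axis=-1)
    return morph_close((diff > diff_thresh).astype(np.uint8))
```

A dark gap inside a bright vehicle blob (e.g. a windshield at night) survives the color-difference threshold as a hole; the closing step fills it, which is what makes the same pipeline usable in both day and night conditions.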
  • 
    ABSTRACT: This paper first introduces a framework for all-day traffic state recognition in traffic monitoring videos that requires no vehicle segmentation. It proposes the average intensity of the background-subtracted image, termed the gray characteristic, as a new feature describing the amount of traffic on the road, and shows a linear correlation between the gray characteristic and the occupation ratio. It also presents a method for extracting vehicle corners without vehicle segmentation. Second, the paper proposes a Gaussian group-based histogram (GBH) algorithm to build the background, and uses the average road intensity in the background, termed the illumination characteristic, as a feature to distinguish different traffic scenes. A group of classifiers is designed to recognize the traffic state in each scene, yielding all-day traffic state recognition. Because the method requires no vehicle segmentation, it reduces computational complexity and enhances system robustness. Testing shows that the method recognizes traffic states all day in real time, and its classifications agree closely with human classification.
    The Journal of China Universities of Posts and Telecommunications 12/2011; 18:1-11. DOI:10.1016/S1005-8885(10)60131-8
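The two scalar features that abstract builds on can be illustrated with a simplified histogram background model: the background intensity at each pixel is taken as the modal intensity bin over a frame buffer, and the illumination characteristic is the mean background intensity over the road region. This is a deliberately simplified stand-in, the Gaussian grouping of the actual GBH algorithm is omitted, and all names are assumptions.

```python
import numpy as np

def histogram_background(frames, bins=32):
    """Per-pixel background estimate: the centre of the most-populated
    intensity bin across a buffer of grayscale frames. A simplified
    stand-in for the Gaussian group-based histogram (GBH) model."""
    stack = np.stack(frames)                      # shape (T, H, W)
    edges = np.linspace(0, 256, bins + 1)
    idx = np.digitize(stack, edges) - 1           # bin index per pixel/frame
    H, W = stack.shape[1:]
    bg = np.empty((H, W))
    for r in range(H):
        for c in range(W):
            counts = np.bincount(idx[:, r, c], minlength=bins)
            b = counts.argmax()                   # modal bin = background
            bg[r, c] = (edges[b] + edges[b + 1]) / 2
    return bg

def illumination_characteristic(bg, road_mask):
    """Average background intensity over the road region - the scalar
    used to tell scenes (e.g. day vs. night) apart before choosing a
    scene-specific classifier."""
    return float(bg[road_mask].mean())
```

Taking the histogram mode rather than a running mean keeps transient vehicles out of the background estimate, since a pixel shows the road surface far more often than any single passing vehicle.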
