Multi-camera real-time three-dimensional tracking of multiple flying animals

California Institute of Technology, Bioengineering, Mailcode 138-78, Pasadena, CA 91125, USA.
Journal of The Royal Society Interface (Impact Factor: 3.92). 03/2011; 8(56):395-409. DOI: 10.1098/rsif.2010.0230
Source: PubMed


Automated tracking of animal movement allows analyses that would not otherwise be possible by providing large quantities of data. The additional capability of tracking in real time, with minimal latency, opens up the experimental possibility of manipulating sensory feedback, thus allowing detailed explorations of the neural basis for control of behaviour. Here, we describe a system capable of tracking the three-dimensional position and body orientation of animals such as flies and birds. The system operates with less than 40 ms latency and can track multiple animals simultaneously. To achieve these results, a multi-target tracking algorithm was developed based on the extended Kalman filter and the nearest neighbour standard filter data association algorithm. In one implementation, an 11-camera system is capable of tracking three flies simultaneously at 60 frames per second using a gigabit network of nine standard Intel Pentium 4 and Core 2 Duo computers. This manuscript presents the rationale and details of the algorithms employed and shows three implementations of the system. An experiment was performed using the tracking system to measure the effect of visual contrast on the flight speed of Drosophila melanogaster. At low contrasts, flight speed is more variable and faster on average than at high contrasts. Thus, the system is already a useful tool for studying the neurobiology and behaviour of freely flying animals. If combined with other techniques, such as 'virtual reality'-type computer graphics or genetic manipulation, the tracking system would offer a powerful new way to investigate the biology of flying animals.
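The core loop the abstract describes (predict each track, assign each track the nearest gated detection, update) can be sketched as follows. This is a minimal illustration, not the authors' Flydra implementation: for brevity it uses a linear constant-velocity Kalman filter rather than the paper's extended Kalman filter, and the class name, gate size, and noise parameters are assumptions.

```python
import numpy as np

class ConstantVelocityKF:
    """Minimal 3-D constant-velocity Kalman filter.
    State vector: [x, y, z, vx, vy, vz]."""
    def __init__(self, pos, dt=1/60, q=1e-3, r=1e-2):
        self.x = np.hstack([pos, np.zeros(3)])   # start at rest
        self.P = np.eye(6)                       # state covariance
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)          # position += dt * velocity
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position only
        self.Q = q * np.eye(6)                   # process noise
        self.R = r * np.eye(3)                   # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.H @ self.x                   # predicted observation

    def update(self, z):
        S = self.H @ self.P @ self.H.T + self.R  # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S) # Kalman gain
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(6) - K @ self.H) @ self.P

def nearest_neighbour_associate(tracks, detections, gate=0.05):
    """Nearest-neighbour standard filter: each track claims the closest
    unclaimed detection inside its gate; unmatched tracks get no update."""
    free = list(range(len(detections)))
    for t in tracks:
        pred = t.predict()
        if not free:
            continue
        dists = [np.linalg.norm(detections[i] - pred) for i in free]
        j = int(np.argmin(dists))
        if dists[j] < gate:
            t.update(detections[free.pop(j)])
```

Because each track consumes its claimed detection, two nearby animals cannot both be assigned the same observation, which is the essential property that lets the filter maintain multiple simultaneous targets.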

82 Reads
    • "Much research has assumed that adjacent camera views have overlap and utilized the spatial proximity of tracks in the overlapping area. As described in [2], tracks of objects observed in different camera views were stitched based on their spatial proximity [141], [142]. In order to track objects across disjoint camera views, appearance cues have been integrated with spatiotemporal reasoning [143], [144]. "
    ABSTRACT: Traffic surveillance has become an important topic in intelligent transportation systems (ITSs), which are aimed at monitoring and managing traffic flow. With the progress in computer vision, video-based surveillance systems have made great advances in traffic surveillance in ITSs. However, the performance of most existing surveillance systems degrades in challenging, complex traffic scenes (e.g., object occlusion, pose variation, and cluttered background). Moreover, existing related research focuses mainly on a single video sensor node, which is incapable of addressing the surveillance of traffic road networks. Accordingly, we present a review of the literature on video-based vehicle surveillance systems in ITSs. We analyze the existing challenges in video-based vehicle surveillance systems and present a general architecture for video surveillance systems, i.e., hierarchical and networked vehicle surveillance, to survey the different existing and potential techniques. Then, different methods are reviewed and discussed with respect to each module. Applications and future developments are discussed to identify future needs of ITS services.
    IEEE Transactions on Intelligent Transportation Systems 04/2015; 16(2):557-580. DOI:10.1109/TITS.2014.2340701 · 2.38 Impact Factor
    • "Infrared tracking was sufficient for summarizing group behavior, but ineffective in recognizing individual mice and required a backup video system to verify their data. More recently, image processing-based tracking methods have been used in tracking animals subjected to choice tests (Straw et al., 2011). Accordingly, the objectives of this study were based on employing image processing technology in tracking layers during choice tests. "
    ABSTRACT: Image processing systems have been widely used in monitoring livestock for many applications, including identification, tracking, behavior analysis, occupancy rates, and activity calculations. The primary goal of this work was to quantify image processing performance when monitoring laying hens by comparing length of stay in each compartment as detected by the image processing system with the actual occurrences registered by human observations. In this work, an image processing system was implemented and evaluated for use in an environmental animal preference chamber to detect hen navigation between 4 compartments of the chamber. One camera was installed above each compartment to produce top-view images of the whole compartment. An ellipse-fitting model was applied to captured images to detect whether the hen was present in a compartment. During a choice-test study, mean ± SD success detection rates of 95.9 ± 2.6% were achieved when considering total duration of compartment occupancy. These results suggest that the image processing system is currently suitable for determining the response measures for assessing environmental choices. Moreover, the image processing system offered a comprehensive analysis of occupancy while substantially reducing data processing time compared with the time-intensive alternative of manual video analysis. The above technique was used to monitor ammonia aversion in the chamber. As a preliminary pilot study, different levels of ammonia were applied to different compartments while hens were allowed to navigate between compartments. Using the automated monitor tool to assess occupancy, a negative trend of compartment occupancy with ammonia level was revealed, though further examination is needed.
    Poultry Science 07/2014; 93(10). DOI:10.3382/ps.2014-04078 · 1.67 Impact Factor
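The ellipse-fitting presence detection described in the poultry study can be sketched with image moments. This is a hypothetical reconstruction under assumed names and thresholds (the abstract does not publish code): fit the equivalent ellipse of the foreground mask from its second central moments, and declare the compartment occupied when enough foreground pixels are present.

```python
import numpy as np

def detect_hen(mask, min_area=200):
    """Decide whether a hen is present in a compartment's binary foreground
    mask; if so, fit the equivalent ellipse from image moments.
    Returns (present, centre, axes, angle_degrees). min_area is an
    assumed pixel-count threshold."""
    ys, xs = np.nonzero(mask)
    if xs.size < min_area:                 # too few foreground pixels: no hen
        return False, None, None, None
    cx, cy = xs.mean(), ys.mean()          # centroid (first moments)
    # second central moments define the equivalent ellipse
    cov = np.cov(np.vstack([xs - cx, ys - cy]))
    evals, evecs = np.linalg.eigh(cov)     # eigenvalues ascending
    axes = 2.0 * np.sqrt(evals)            # ~2-sigma axis lengths (minor, major)
    angle = np.degrees(np.arctan2(evecs[1, 1], evecs[0, 1]))  # major-axis angle
    return True, (cx, cy), axes, angle
```

Running this per compartment on each top-view frame, and accumulating the frames in which `present` is true, yields the length-of-stay measure the study compares against human observation.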
    • "A wireless neural recording device especially argues for a wireless behavioral tracking method in order to keep the animal model freely moving and untethered. While research is being conducted on visual and pattern recognition from video recordings, the real-time application and advanced computation required for such methodologies appear to be a limiting factor for their widespread application today [6]. "
    ABSTRACT: An entirely wireless neural and behavior recording platform has been developed for the purpose of studying cortical circuits during behavior of a freely moving animal with minimal observer intrusion. Our platform consists of (a) a 100-element microelectrode, fully implantable, wireless, broadband neural recording device [1], and (b) a 2-channel, wireless, event tracking system, both of which simultaneously stream asynchronous data to a custom programmed FPGA (field-programmable gate array). All data is packetized and transmitted over Ethernet. The data are received by custom software (in MATLAB) which simultaneously animates neural data and behavioral data for the observer to draw correlations and witness patterns. This platform has been implemented in a 10-month old female Yucatan mini-pig swine model (with the chronic, wireless, neural implant for over 9 months), extracting broadband neural data recorded simultaneously with time stamped behavioral recordings.
    Neural Engineering (NER), 2013 6th International IEEE/EMBS Conference on; 11/2013

