Motion Detail Preserving Optical Flow Estimation

The Chinese University of Hong Kong, Hong Kong.
IEEE Transactions on Pattern Analysis and Machine Intelligence (Impact Factor: 5.69). 12/2011; 34(9). DOI: 10.1109/TPAMI.2011.236
Source: PubMed

ABSTRACT A common problem of optical flow estimation in the multi-scale variational framework is that fine motion structures cannot always be correctly estimated, especially in regions with significant and abrupt displacement variation. A novel extended coarse-to-fine (EC2F) refinement framework is introduced in this paper to address this issue; it reduces the reliance of flow estimates on their initial values propagated from the coarser level and enables the recovery of motion details at each scale. The contributions of this paper also include adaptation of the objective function to handle outliers and development of a new optimization procedure. The effectiveness of our algorithm is borne out by the Middlebury optical flow benchmark and by experiments on challenging examples that involve large-displacement motion.
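As a toy illustration of the coarse-to-fine initialization that EC2F revisits, the sketch below estimates a single global integer shift through an image pyramid, doubling the coarse estimate at each finer level. The brute-force local search is a hypothetical stand-in for variational refinement; this is not the paper's method, only the propagation scheme it builds on:

```python
import numpy as np

def downsample(img):
    # 2x2 average pooling to build the next pyramid level
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def refine_shift(i1, i2, init, radius=1):
    # brute-force integer search around the initialization
    # (toy stand-in for a variational refinement step)
    best, best_err = init, np.inf
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            cand = (init[0] + dy, init[1] + dx)
            err = np.sum((np.roll(i1, cand, axis=(0, 1)) - i2) ** 2)
            if err < best_err:
                best, best_err = cand, err
    return best

def coarse_to_fine_shift(i1, i2, levels=4):
    pyr = [(i1, i2)]
    for _ in range(levels - 1):
        pyr.append((downsample(pyr[-1][0]), downsample(pyr[-1][1])))
    shift = (0, 0)
    for a, b in reversed(pyr):              # coarsest -> finest
        shift = (2 * shift[0], 2 * shift[1])  # propagate initialization up
        # a standard C2F method trusts this initialization; EC2F instead
        # re-examines it so fine motion details are not lost
        shift = refine_shift(a, b, shift)
    return shift
```

The failure mode the paper targets is visible here: if the true motion of a small structure differs sharply from the coarse-level estimate, the radius-limited refinement can never escape the propagated initialization.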

    ABSTRACT: The optical flow is a velocity field that describes the motion of pixels within a sequence (or set) of images. Its estimation plays an important role in areas such as motion compensation, object tracking and image registration. In this paper, we present a novel framework for estimating the optical flow using local all-pass filters. Instead of using the optical flow equation, the framework relates one image to another, on a local level, using an all-pass filter and then extracts the optical flow from the filter. Using this framework, we present a fast novel algorithm for estimating a smoothly varying optical flow, which we term the Local All-Pass (LAP) algorithm. We demonstrate that this algorithm is consistent and accurate, and that it outperforms three state-of-the-art algorithms when estimating constant and smoothly varying flows. We also show initial competitive results for real images.
    IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2015), Brisbane, Australia; 04/2015
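The local all-pass relation behind LAP can be illustrated in one dimension. Fit a first-order filter p = g + c1·g' (even Gaussian plus its odd derivative) so that p ∗ I2 ≈ p(−x) ∗ I1, then read the displacement off the filter as u ≈ 2·c1. The global, constant-flow version below is a hedged sketch of this idea, not the authors' implementation (which solves the fit per local window and uses more basis filters):

```python
import numpy as np

def lap_constant_flow_1d(I1, I2, sigma=2.0, K=8):
    """Estimate a single constant 1-D displacement between I1 and I2 by
    fitting a first-order all-pass filter p = g0 + c1*g1 in the relation
    p * I2 = p(-x) * I1  (simplified, global illustration of LAP)."""
    x = np.arange(-K, K + 1)
    g0 = np.exp(-x**2 / (2 * sigma**2))   # even basis filter (Gaussian)
    g0 /= g0.sum()
    g1 = np.gradient(g0)                  # odd basis filter (its derivative)
    # p * I2 = p(-x) * I1 with p = g0 + c1*g1 reduces to:
    #   g0 * (I2 - I1) + c1 * g1 * (I2 + I1) = 0
    a = np.convolve(I2 - I1, g0, mode="same")
    b = np.convolve(I2 + I1, g1, mode="same")
    a, b = a[2 * K:-2 * K], b[2 * K:-2 * K]   # drop boundary samples
    c1 = -np.sum(a * b) / np.sum(b * b)       # least-squares solution
    return 2.0 * c1                           # flow extracted from the filter
```

For a pure shift the exact all-pass filter is a half-pixel-offset delta, so the first-order fit recovers small displacements accurately, which is the property the LAP algorithm exploits.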
    ABSTRACT: This paper presents a new method to compute dense correspondences between two images using energy optimization and structured patches. Based on the properties of sparse features and the principle that the nearest sub-scenes and neighbors are the most similar, we design a new energy optimization to guide the dense matching process and find reliable correspondences. The sparse features are also employed to design a new structure that describes the patches. Both the transformation and the deformation of the structured patches are considered and incorporated into an energy optimization framework. Thus, our algorithm can match objects robustly in complicated scenes. Finally, a local refinement technique is proposed to correct perturbations of the matched patches. Experimental results demonstrate that our method outperforms state-of-the-art matching algorithms.
    IEEE Transactions on Multimedia 02/2015; 17(3):295-306. DOI:10.1109/TMM.2015.2395078 · 1.78 Impact Factor
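A minimal stand-in for this kind of matching energy combines a per-site data cost with a smoothness penalty encoding the "neighbors are similar" principle. The sketch below minimizes such an energy exactly on a 1-D chain of sites by dynamic programming; the authors' structured-patch energy and optimizer are more elaborate, so treat this only as an illustration of the energy-minimization idea:

```python
import numpy as np

def chain_energy_minimize(D, lam):
    """Exact minimization of E(l) = sum_i D[i, l_i] + lam * sum_i |l_i - l_{i+1}|
    over a 1-D chain of sites by dynamic programming (Viterbi)."""
    n, k = D.shape
    cost = D[0].copy()
    back = np.zeros((n, k), dtype=int)
    labels = np.arange(k)
    pair = lam * np.abs(labels[:, None] - labels[None, :])  # smoothness cost
    for i in range(1, n):
        tot = cost[:, None] + pair          # [prev_label, cur_label]
        back[i] = np.argmin(tot, axis=0)    # best predecessor per label
        cost = tot[back[i], labels] + D[i]
    l = np.empty(n, dtype=int)
    l[-1] = int(np.argmin(cost))
    for i in range(n - 1, 0, -1):           # backtrack the optimal labels
        l[i - 1] = back[i][l[i]]
    return l
```

With the smoothness weight set to zero each site picks its individually cheapest label; a positive weight suppresses isolated outlier matches, which is exactly the effect the neighbor-similarity principle is meant to achieve.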
    ABSTRACT: We propose four algorithms for computing the inverse optical flow between two images. We assume that the forward optical flow has already been obtained. The forward and backward flows can be related through a warping formula, which allows us to propose very efficient algorithms. These methods provide high accuracy with low memory requirements and low running times. Additionally, when objects move in a sequence, some regions may appear or disappear. Finding inverse flows in these situations is difficult and, in some cases, it is not possible to obtain a correct solution. Our algorithms deal with occlusions easily and reliably, and disocclusions are treated in a post-processing step. We analyze three approaches for filling disocclusions. In the experimental results, we use standard synthetic sequences to study the performance of our methods. The experiments show that our algorithms clearly outperform current state-of-the-art methods.
    Pattern Recognition Letters 01/2015; 52:32–39. DOI:10.1016/j.patrec.2014.09.009 · 1.55 Impact Factor
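The warping relation between the two flows, b(x + f(x)) = −f(x), can be sketched directly: splat each forward vector to its (rounded) target pixel and negate it; pixels that receive nothing are disocclusions. This nearest-pixel version is an illustration of the relation only, not one of the paper's four algorithms:

```python
import numpy as np

def invert_flow(fx, fy):
    """Invert a dense forward flow (fx, fy) via b(x + f(x)) = -f(x).
    Pixels that receive no forward vector (disocclusions) stay NaN; the
    paper fills them in a post-processing step. If several sources land
    on one target (occlusion), the last write wins in this toy version."""
    h, w = fx.shape
    bx = np.full((h, w), np.nan)
    by = np.full((h, w), np.nan)
    ys, xs = np.mgrid[0:h, 0:w]
    tx = np.rint(xs + fx).astype(int)   # forward target, nearest pixel
    ty = np.rint(ys + fy).astype(int)
    ok = (tx >= 0) & (tx < w) & (ty >= 0) & (ty < h)
    bx[ty[ok], tx[ok]] = -fx[ok]
    by[ty[ok], tx[ok]] = -fy[ok]
    return bx, by
```

Because every backward vector is derived from an already-computed forward vector, the inversion itself is a single pass over the image, which is why methods built on this relation can be fast and memory-light.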

