Conference Paper

# Continuous normalized convolution

Dept. of Biomedical Engineering, Linköping University, Sweden

DOI: 10.1109/ICME.2002.1035884 Conference: Multimedia and Expo, 2002. ICME '02. Proceedings. 2002 IEEE International Conference on, Volume: 1 Source: IEEE Xplore


**ABSTRACT:** A fast method for super-resolution (SR) reconstruction from low-resolution (LR) frames with known registration is proposed. The irregular LR samples are incorporated into the SR grid by stamping onto their 4 nearest neighbors with position certainties. The signal certainty reflects the errors in the LR pixels' positions (computed by cross-correlation or optic flow) and their intensities. Adaptive normalized averaging is used in the fusion stage to enhance local linear structure and minimize further blurring. The local structure descriptors, including orientation, anisotropy and curvature, are computed directly on the SR grid and used as steering parameters for the fusion. The optimum scale for local fusion is achieved by a sample density transform, which is also presented for the first time in this paper.

ASCI, 01/2004
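The normalized averaging used in the fusion stage above is normalized convolution with a constant basis: the certainty-weighted signal and the certainty map are filtered separately and then divided, so unreliable samples are filled in from their confident neighbors. A minimal 1-D sketch (the function name, toy signal, and box kernel are illustrative, not from the paper):

```python
import numpy as np

def normalized_average(signal, certainty, kernel):
    # Convolve the certainty-weighted signal and the certainty separately,
    # then divide: samples with certainty 0 are interpolated from
    # confident neighbors instead of dragging the average toward 0.
    num = np.convolve(signal * certainty, kernel, mode="same")
    den = np.convolve(certainty, kernel, mode="same")
    out = np.zeros_like(num)
    mask = den > 1e-12
    out[mask] = num[mask] / den[mask]
    return out

# One unreliable sample (index 2) in an otherwise constant signal.
signal = np.array([5.0, 5.0, 0.0, 5.0, 5.0])    # value at index 2 is garbage
certainty = np.array([1.0, 1.0, 0.0, 1.0, 1.0])  # ...and flagged as such
kernel = np.ones(3) / 3.0
restored = normalized_average(signal, certainty, kernel)
# restored[2] == 5.0: the hole is filled from its confident neighbors
```

The adaptive variant in the paper additionally steers the kernel by the local orientation, anisotropy, and curvature descriptors; the fixed box kernel here is only the isotropic base case.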

**ABSTRACT:** This thesis introduces a new signal transform, called polynomial expansion, and based on this develops novel methods for estimation of orientation and motion. The methods are designed exclusively in the spatial domain and can be used for signals of any dimensionality. Two important concepts in the use of the spatial domain for signal processing are projections onto subspaces, e.g. the subspace of second degree polynomials, and representations by frames, e.g. wavelets. It is shown how these concepts can be unified in a least squares framework for representation of finite dimensional vectors by bases, frames, subspace bases, and subspace frames. This framework is used to give a new derivation of normalized convolution, a method for signal analysis that takes uncertainty in signal values into account and also allows for spatial localization of the analysis functions. Polynomial expansion is a transformation which at each point transforms the signal into a set of expansion coefficients with respect to a polynomial local signal model. The expansion coefficients are computed using normalized convolution. As a consequence, polynomial expansion inherits the mechanism for handling uncertain signals, and the spatial localization feature allows good control of the properties of the transform. It is shown how polynomial expansion can be computed very efficiently. As an application of polynomial expansion, a novel method for estimation of orientation tensors is developed. A new concept for orientation representation, orientation functionals, is introduced, and it is shown that orientation tensors can be considered a special case of this representation. By evaluation on a test sequence it is demonstrated that the method performs excellently. Considering an image sequence as a spatiotemporal volume, velocity can be estimated from the orientations present in the volume.
Two novel methods for velocity estimation are presented, with the common idea of combining the orientation tensors over some region to estimate the velocity field according to a parametric motion model, e.g. affine motion. The first method involves a simultaneous segmentation and velocity estimation algorithm to obtain appropriate regions. The second method is designed for computational efficiency and uses local neighborhoods instead of trying to obtain regions with coherent motion. By evaluation on the Yosemite sequence, it is shown that both methods give substantially more accurate results than previously published methods. Another application of polynomial expansion is a novel displacement estimation algorithm, i.e. an algorithm which estimates motion from only two consecutive frames rather than from a whole spatiotemporal volume. This approach is necessary when the motion is not temporally coherent, e.g. because the camera is affected by vibrations. It is shown how moving objects can robustly be detected in such image sequences by using the plane+parallax approach to separate out the background motion. To demonstrate the power of being able to handle uncertain signals, it is shown how normalized convolution and polynomial expansion can be computed for interlaced video signals. Together with the displacement estimation algorithm, this gives a method to estimate motion from a single interlaced frame.

12/2002, Degree: PhD, Supervisor: Gösta Granlund
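In the least-squares framework described above, normalized convolution at a point amounts to a weighted least-squares fit of a local basis to the signal, with weights given by an applicability function (spatial localization) times the signal certainty. A 1-D sketch with a quadratic polynomial basis; the helper name, Gaussian applicability, and toy data are illustrative assumptions:

```python
import numpy as np

def norm_conv_point(f, c, a, B):
    # Weighted least squares: solve (B^T W B) r = B^T W f with W = diag(a * c),
    # giving the expansion coefficients r of the local polynomial model.
    W = np.diag(a * c)
    G = B.T @ W @ B
    return np.linalg.solve(G, B.T @ W @ f)

x = np.arange(-3, 4, dtype=float)
B = np.stack([np.ones_like(x), x, x**2], axis=1)  # basis {1, x, x^2}
a = np.exp(-x**2 / 2.0)                           # Gaussian applicability
f = 2.0 + 3.0 * x + 0.5 * x**2                    # signal: a true quadratic
c = np.ones_like(x)
c[1] = 0.0                                        # one sample marked unreliable
r = norm_conv_point(f, c, a, B)
# r recovers (2, 3, 0.5): the missing sample simply drops out of the fit
```

Because the unreliable sample gets zero weight rather than a guessed value, the fit is unbiased by it; this is the uncertainty-handling mechanism that polynomial expansion inherits.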
##### Conference Paper: Low complexity dense motion estimation using phase correlation


**ABSTRACT:** We propose a low-complexity dense motion estimation scheme particularly attractive for real-time video applications. Our scheme is based on overlapped block-based motion estimation using phase correlation at critical pixel locations. These form an irregularly sampled grid capturing salient motion features of a scene. The dense vector field is obtained by applying normalized convolution on the irregular grid. Our experiments show that our scheme provides reliable sub-pixel accuracy motion vectors corresponding to actual scene motion, outperforms differential and phase-based methods and yields comparable performance to more complex and time-consuming robust motion estimation techniques.

Digital Signal Processing, 16th International Conference on; 08/2009
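The phase-correlation step above estimates a block's translation from the normalized cross-power spectrum, whose inverse FFT is a sharp peak at the displacement. A minimal integer-shift sketch (the function name and test frames are illustrative; the paper's scheme additionally refines to sub-pixel accuracy and works on overlapped blocks):

```python
import numpy as np

def phase_correlation_shift(ref, shifted):
    # Normalize the cross-power spectrum to unit magnitude so that only the
    # phase difference (i.e. the translation) survives; its inverse FFT is
    # then a delta-like peak at the displacement.
    R = np.conj(np.fft.fft2(ref)) * np.fft.fft2(shifted)
    R /= np.maximum(np.abs(R), 1e-12)
    corr = np.fft.ifft2(R).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak indices back to signed shifts (FFT wrap-around).
    h, w = corr.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

rng = np.random.default_rng(0)
ref = rng.standard_normal((64, 64))
moved = np.roll(ref, (3, -2), axis=(0, 1))   # known shift of (+3, -2)
print(phase_correlation_shift(ref, moved))   # -> (3, -2)
```

The unit-magnitude normalization is what distinguishes phase correlation from plain cross-correlation: the peak stays sharp even under illumination changes, which is why it is cheap and robust enough for the real-time setting described above.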
