Improvement of the convergence speed and the tracking ability of the fast Newton type adaptive filtering (FNTF) algorithm
ABSTRACT: In this paper, five new versions of the fast Newton type adaptive filtering (FNTF) algorithm are presented. A brief preliminary presentation of these algorithms was given in Djendi et al. [Comparative study of new version of the Newton type adaptive filtering algorithm, in: Proceedings of the IEEE ICASSP 2004, Montreal, Canada, May 2004, pp. 677–680]. The first algorithm is based on a simple modification of the filtering part, introducing a scalar accelerator parameter. The second algorithm uses a temporal subdivision technique to update the local filter coefficients. The third algorithm modifies the second by using the final filtering errors to update the filter coefficients. The fourth and fifth algorithms combine, respectively, features of the first algorithm with those of the second and third algorithms. These new algorithms are proposed to improve the convergence speed of the original FNTF algorithm for the identification of acoustic impulse responses, and to improve its tracking ability when the system varies in time. A comparative study of each new algorithm against the original version of the FNTF algorithm is also presented.
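To make the problem setting concrete, the sketch below identifies an unknown FIR "acoustic" impulse response with a plain NLMS adapter, the baseline scenario whose convergence speed FNTF-type algorithms aim to improve. All names, signal lengths, and parameter values here are illustrative assumptions, not the paper's algorithms.

```python
import numpy as np

# Illustrative system-identification setup (assumed parameters, not the
# paper's): an adaptive FIR filter w tracks an unknown impulse response h.
rng = np.random.default_rng(0)

N = 32                                   # adaptive filter length
h = rng.standard_normal(N) * np.exp(-0.2 * np.arange(N))  # unknown system
w = np.zeros(N)                          # adaptive estimate
mu, eps = 0.5, 1e-8                      # step size, regularizer

x = rng.standard_normal(5000)            # white excitation
xbuf = np.zeros(N)                       # regressor (most recent sample first)
for n in range(len(x)):
    xbuf = np.roll(xbuf, 1); xbuf[0] = x[n]
    d = h @ xbuf                         # desired (true system) output
    e = d - w @ xbuf                     # a-priori error
    w += mu * e * xbuf / (xbuf @ xbuf + eps)  # NLMS update

misalignment = np.linalg.norm(h - w) / np.linalg.norm(h)
```

With a white, stationary input the misalignment decays steadily; the slow convergence of gradient-type updates under correlated (e.g. speech) input is what motivates Newton-type gain computations.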
- ABSTRACT: A class of adaptive algorithms for the estimation of FIR (finite impulse response) transversal filters is presented. The main characteristic of this class is the fast computation of the gain vector needed for the adaptation of the transversal filters. The method for deriving these algorithms is based on the assumption that the input signal is autoregressive of order M, where M can be much smaller than the order of the filter to be estimated. Under this assumption, the covariance matrix of the input signal is estimated by extending, in a min-max way, the order-M sample covariance matrix. This estimate can be regarded as a generalization of the diagonal covariance matrix used in LMS and leads to an efficient computation of the gain needed for the adaptation. The new class of algorithms contains the LMS and the fast versions of LS as special cases. The complexity grows linearly with M, starting from the complexity of the LMS (for M = 0) and ending at the complexity of the fast versions of LS. IEEE Transactions on Signal Processing, 11/1991.
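The gain vector in question is g = R⁻¹xₙ, where R is the input covariance estimate. The snippet below (our notation and toy data, not the paper's derivation) contrasts the full-covariance gain used by LS-type algorithms with the diagonal approximation that collapses to an LMS-style gain; the AR(M) assumption is what lets the former be computed cheaply without forming R explicitly.

```python
import numpy as np

# Toy comparison (assumed setup): full-covariance gain vs. diagonal
# (LMS-style, effectively M = 0) gain, on a correlated AR(1) input where
# the two differ visibly.
rng = np.random.default_rng(1)
N = 8

u = rng.standard_normal(2000)
x = np.zeros_like(u)
for n in range(1, len(u)):
    x[n] = 0.9 * x[n - 1] + u[n]          # AR(1) input

X = np.lib.stride_tricks.sliding_window_view(x, N)  # stacked regressors
R = X.T @ X / len(X)                                # sample covariance
x_n = X[-1]                                         # latest regressor

g_full = np.linalg.solve(R, x_n)   # LS-type gain: R^{-1} x_n
g_diag = x_n / np.diag(R)          # diagonal-covariance (LMS-style) gain
```

For white input R is nearly diagonal and the two gains coincide; for correlated input they diverge, which is where the Newton-type gain pays off.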
- ABSTRACT: Fast transversal filter (FTF) implementations of recursive-least-squares (RLS) adaptive-filtering algorithms are presented in this paper. Substantial improvements in transient behavior in comparison to stochastic-gradient or LMS adaptive algorithms are efficiently achieved by the presented algorithms. The true, not approximate, solution of the RLS problem is always obtained by the FTF algorithms, even during the critical initialization period (first N iterations) of the adaptive filter. This true solution is recursively calculated at a relatively modest increase in computational requirements in comparison to stochastic-gradient algorithms (a factor of 1.6 to 3.5, depending upon application). Additionally, the fast transversal filter algorithms are shown to offer substantial reductions in computational requirements relative to existing fast-RLS algorithms, such as the fast Kalman algorithms of Morf, Ljung, and Falconer (1976) and the fast ladder (lattice) algorithms of Morf and Lee (1977-1981). They are further shown to attain (steady-state unnormalized), or improve upon (first N initialization steps), the very low computational requirements of the efficient RLS solutions of Carayannis, Manolakis, and Kalouptsidis (1983). Finally, several efficient procedures are presented by which to ensure the numerical stability of the transversal-filter algorithms, including the incorporation of soft constraints into the performance criteria, internal bounding and rescuing procedures, and dynamic-range-increasing, square-root (normalized) variations of the transversal filters. IEEE Transactions on Acoustics, Speech and Signal Processing, 05/1984.
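For reference, the exact least-squares recursion that FTF algorithms reproduce at O(N) cost is the conventional RLS update below, which costs O(N²) per sample. This is a minimal textbook sketch with our own variable names and toy parameters, not the FTF algorithm itself.

```python
import numpy as np

# Conventional RLS (O(N^2) per sample), the exact-LS recursion that FTF
# implementations compute in O(N).  Parameters and data are illustrative.
rng = np.random.default_rng(2)
N, lam, delta = 16, 1.0, 1e3

h = rng.standard_normal(N)      # unknown FIR system to identify
w = np.zeros(N)                 # RLS estimate
P = delta * np.eye(N)           # inverse-covariance estimate

xbuf = np.zeros(N)
for n in range(2000):
    xbuf = np.roll(xbuf, 1); xbuf[0] = rng.standard_normal()
    d = h @ xbuf                             # desired signal (noise-free)
    k = P @ xbuf / (lam + xbuf @ P @ xbuf)   # gain vector
    e = d - w @ xbuf                         # a-priori error
    w += k * e                               # coefficient update
    P = (P - np.outer(k, xbuf @ P)) / lam    # inverse-covariance update

err = np.linalg.norm(h - w) / np.linalg.norm(h)
```

In this noise-free setting the RLS estimate converges to the true system essentially exactly once more than N samples have been observed, which is the transient advantage over LMS that the abstract refers to.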
- ABSTRACT: Underdetermined recursive least-squares (URLS) adaptive filtering is introduced. In particular, the URLS algorithm is derived and shown to be a direct consequence of the principle of minimal disturbance. By exploiting the Hankel structure of the filter input matrix, the fast transversal filter (FTF) version of the URLS algorithm (URLS-FTF) is derived, including sliding-window and growing-window types. The computational complexity is reduced to O(N)+O(m), where N is the adaptive filter length and m is the order of the URLS algorithm. In addition, the efficient URLS (EURLS) algorithm, which does not compute the filter coefficients explicitly, thereby significantly reducing the computational load, is presented. Some earlier adaptive algorithms, such as the averaged LMS, filtered-X LMS, and fast conjugate gradient, are shown to be suboptimal approximations of the URLS algorithm. Instrumental variable approximations are also discussed. The URLS algorithm has a whitening effect on the input signal, which provides immunity to the eigenvalue spread of the input signal correlation matrix. Although the algorithm is sensitive to observation noise, it has good tracking characteristics, and tradeoffs can be found by tuning the step size. The utility of the URLS algorithm, in its basic form and FTF realization, depends heavily on the practical applicability of the mth-order sliding-window estimate of the covariance matrix and mth-order FTF relations. The feasibility of the URLS family in practical applications is demonstrated in channel equalization and acoustic echo cancellation. IEEE Transactions on Signal Processing, 03/1997.
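One common reading of the minimal-disturbance principle behind order-m underdetermined updates is the affine-projection form sketched below: the smallest-norm weight change that cancels the errors on the m most recent (Hankel-structured) regressors. This is a hedged illustration under our own assumptions and naming, not the paper's exact URLS recursion.

```python
import numpy as np

# Affine-projection-style sketch of an order-m underdetermined update
# (assumed form): project the weight change onto the span of the m most
# recent regressors, taken from a Hankel-structured input block.
rng = np.random.default_rng(3)
N, m, mu, eps = 16, 4, 0.5, 1e-6

h = rng.standard_normal(N)      # unknown system
w = np.zeros(N)                 # adaptive estimate
xhist = np.zeros(N + m - 1)     # input history (newest sample first)

for n in range(3000):
    xhist = np.roll(xhist, 1); xhist[0] = rng.standard_normal()
    # m stacked regressors (rows) sharing samples: Hankel structure
    X = np.stack([xhist[i:i + N] for i in range(m)])
    d = X @ h                   # m desired samples (noise-free)
    e = d - X @ w               # m a-priori errors
    # minimal-norm step cancelling (a mu-fraction of) the m errors
    w += mu * X.T @ np.linalg.solve(X @ X.T + eps * np.eye(m), e)

err = np.linalg.norm(h - w) / np.linalg.norm(h)
```

Setting m = 1 recovers the NLMS update, while increasing m pre-whitens the effective input, illustrating the immunity to eigenvalue spread that the abstract describes; the same m-row solve also shows why observation noise on d feeds directly into the step.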