Conference Paper

# Variable step-size NLMS algorithms designed for echo cancellation

University Politehnica of Bucharest, Bucharest, Romania
DOI: 10.1109/ACSSC.2009.5469916 Conference: 2009 Conference Record of the Forty-Third Asilomar Conference on Signals, Systems and Computers
Source: IEEE Xplore

ABSTRACT A major issue in echo cancellation is recovering the near-end signal from the error signal of the adaptive filter. In this paper, we use this requirement to design a family of variable step-size normalized least-mean-square (VSS-NLMS) algorithms. The main parameter these algorithms need is an estimate of the near-end signal power. Several solutions to this problem are presented and evaluated with respect to practical aspects such as the parameters available and the computational complexity. Owing to this design, these VSS-NLMS algorithms are robust against near-end signal variations (e.g., double-talk) and are reliable candidates for real-world echo cancellation scenarios.
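The paper's exact recursions are not reproduced here, but the underlying principle of this family — choose the step size so that the error-signal power is driven toward the near-end signal power — can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the known near-end power `sigma_v2`, the forgetting factor `lam`, and the particular step-size formula `1 - sqrt(sigma_v2 / sigma_e2)` are assumptions for the sketch.

```python
import numpy as np

def vss_nlms(x, d, L, sigma_v2, lam=0.99, delta=1e-6):
    """Hedged sketch of a VSS-NLMS filter: the step size shrinks as the
    error power approaches the near-end signal power estimate sigma_v2
    (assumed known here for illustration)."""
    h = np.zeros(L)
    sigma_e2 = 0.0                      # running estimate of error power
    e_out = np.empty(len(d))
    for n in range(len(d)):
        xn = x[max(0, n - L + 1):n + 1][::-1]       # most recent sample first
        xn = np.pad(xn, (0, L - len(xn)))           # zero-pad during start-up
        e = d[n] - h @ xn
        e_out[n] = e
        sigma_e2 = lam * sigma_e2 + (1 - lam) * e * e
        # Step size -> 0 as the error power approaches the near-end power,
        # which freezes adaptation during double-talk-like conditions.
        mu = max(0.0, 1.0 - np.sqrt(sigma_v2 / (sigma_e2 + delta)))
        h += mu * e * xn / (xn @ xn + delta)
    return h, e_out
```

The clamp to zero is what provides the robustness mentioned in the abstract: when the near-end signal dominates the error, the filter simply stops adapting instead of diverging.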

##### Article: The NLMS algorithm with time-variant optimum stepsize derived from a Bayesian network perspective
ABSTRACT: In this article, we derive a new stepsize adaptation for the normalized least mean square (NLMS) algorithm by describing the task of linear acoustic echo cancellation from a Bayesian network perspective. Similar to the well-known Kalman filter equations, we model the acoustic wave propagation from the loudspeaker to the microphone by a latent state vector and define a linear observation equation (modeling the relation between the state vector and the observation) as well as a linear process equation (modeling the temporal progress of the state vector). Based on additional assumptions on the statistics of the random variables in the observation and process equations, we apply the expectation-maximization (EM) algorithm to derive an NLMS-like filter adaptation. By exploiting the conditional independence rules for Bayesian networks, we reveal that the resulting EM-NLMS algorithm has a stepsize update equivalent to the optimal-stepsize calculation proposed by Yamamoto and Kitayama in 1982, which has been adopted in many textbooks. The main difference is that the instantaneous stepsize value is estimated in the M step of the EM algorithm (instead of being approximated by artificially extending the acoustic echo path). The EM-NLMS algorithm is experimentally verified for synthesized scenarios with both white noise and male speech as input signals.
11/2014;
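The EM derivation itself is beyond a short sketch, but the idea of a time-variant stepsize falling out of a state-space model can be illustrated with a simplified Kalman-style recursion. This is not the article's EM-NLMS algorithm: the scalar per-tap covariance `p`, the process noise `q`, and the observation noise `sigma_v2` are assumptions chosen to keep the sketch self-contained.

```python
import numpy as np

def kalman_nlms(x, d, L, q=1e-6, sigma_v2=1e-2, delta=1e-8):
    """Simplified state-space sketch (not the article's EM derivation):
    a scalar-covariance Kalman filter over the echo path yields an
    NLMS-like update whose step size follows from the process noise q
    and the observation noise sigma_v2 (both assumed known here)."""
    h = np.zeros(L)
    p = 1.0                              # per-tap state uncertainty
    e_out = np.empty(len(d))
    for n in range(len(d)):
        xn = x[max(0, n - L + 1):n + 1][::-1]
        xn = np.pad(xn, (0, L - len(xn)))
        p_pred = p + q                   # process equation inflates uncertainty
        e = d[n] - h @ xn
        e_out[n] = e
        xx = xn @ xn + delta
        mu = p_pred * xx / (p_pred * xx + sigma_v2)   # time-variant step size
        h += mu * e * xn / xx            # NLMS-like correction
        p = (1.0 - mu / L) * p_pred     # diagonal covariance approximation
    return h, e_out
```

Note how the step size starts near 1 (high uncertainty, fast convergence) and decays as the covariance shrinks, mimicking an optimally scheduled NLMS without any hand tuning.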
##### Article: A New Variable Step-Size NLMS Algorithm and Its Performance Analysis
ABSTRACT: Numerous variable step-size normalized least mean-square (VSS-NLMS) algorithms have been derived over the past two decades to resolve the trade-off between fast convergence and low excess mean-square error. This paper proposes a new, easy-to-implement, nonparametric VSS-NLMS algorithm that employs the mean-square error and the estimated system noise power to control the step-size update. Theoretical analysis of its steady-state behavior shows that, when the input is zero-mean Gaussian distributed, the misadjustment depends only on a parameter $\beta$ controlling the step-size update. Simulation experiments show that the proposed algorithm performs very well, and the theoretical steady-state behavior is in very good agreement with the experimental results.
IEEE Transactions on Signal Processing 04/2012; 60(4):2055-2060. · 3.20 Impact Factor
##### Article: Adaptive LMS and NLMS algorithms for cancellation of Acoustic echo
ABSTRACT: Acoustic echo arises in hands-free communication environments due to the acoustic coupling between the microphone and the loudspeaker. This paper presents an implementation of acoustic echo cancellation using the gradient-based LMS algorithm, and also examines the NLMS algorithm, in order to remove the unwanted echo and improve speech quality in communication applications. The LMS algorithm uses estimates of the gradient vector computed from the available data; both LMS and NLMS apply an iterative procedure that updates the weight vector in the direction opposite to the gradient vector, which eventually leads to the minimum mean-square error.
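The two weight updates described above can be sketched side by side. The step sizes `mu` and the regularization `delta` are illustrative choices, not values from the paper.

```python
import numpy as np

def lms(x, d, L, mu=0.01):
    """Basic LMS: steepest descent on the instantaneous squared error."""
    h = np.zeros(L)
    e_out = np.empty(len(d))
    for n in range(len(d)):
        xn = x[max(0, n - L + 1):n + 1][::-1]
        xn = np.pad(xn, (0, L - len(xn)))
        e = d[n] - h @ xn
        e_out[n] = e
        h += mu * e * xn                 # step opposite the gradient estimate
    return h, e_out

def nlms(x, d, L, mu=0.5, delta=1e-6):
    """NLMS: the LMS step normalized by the input power, which makes
    convergence insensitive to the scale of the input signal."""
    h = np.zeros(L)
    e_out = np.empty(len(d))
    for n in range(len(d)):
        xn = x[max(0, n - L + 1):n + 1][::-1]
        xn = np.pad(xn, (0, L - len(xn)))
        e = d[n] - h @ xn
        e_out[n] = e
        h += mu * e * xn / (xn @ xn + delta)   # power-normalized step
    return h, e_out
```

The normalization in NLMS is exactly why it is preferred for speech: speech power fluctuates strongly over time, and plain LMS would need its fixed `mu` retuned for every input level.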