Adaptive step-size natural gradient algorithm based on separating degree
This paper proposes, for the first time, using the separating degree to control the step size of the natural gradient algorithm. After a detailed analysis of relevant fixed-step-size and variable-step-size gradient algorithms, the paper presents a new adaptive step-size natural gradient algorithm. Because the new algorithm's step size varies with the separating degree, its learning rate is chosen adaptively; the algorithm can therefore improve convergence speed and reduce the steady-state misadjustment error simultaneously. Computer simulations confirm the theoretical analysis and show that the algorithm's performance is superior to that of other natural gradient algorithms.
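The abstract does not give the update rule, so the following is only a minimal sketch of the idea it describes: a standard natural-gradient blind source separation update, with the step size modulated by a proxy for the separating degree. The proxy `d` (the norm of the natural-gradient term, which shrinks toward zero as the outputs become independent), the mapping `eta = eta_max * (1 - exp(-alpha * d))`, and the constants `eta_max` and `alpha` are all assumptions for illustration, not the paper's actual definitions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent super-Gaussian (Laplacian) sources, linearly mixed.
n = 20000
s = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.6], [0.5, 1.0]])    # unknown mixing matrix
x = A @ s                                  # observed mixtures

W = np.eye(2)                              # demixing matrix estimate
eta_max, alpha = 0.05, 5.0                 # hypothetical step-size constants
batch = 100

for epoch in range(10):
    for k in range(0, n, batch):
        y = W @ x[:, k:k + batch]
        phi = np.tanh(y)                   # score function for super-Gaussian sources
        # Natural-gradient search direction: (I - E[phi(y) y^T]) W.
        G = np.eye(2) - (phi @ y.T) / batch
        # Hypothetical separating-degree proxy: ||G|| is large while the
        # outputs are still dependent and shrinks near separation.
        d = np.linalg.norm(G)
        # Adaptive step size: large when far from separation (fast
        # convergence), small near separation (low misadjustment).
        eta = eta_max * (1.0 - np.exp(-alpha * d))
        W = W + eta * (G @ W)

# At separation the global matrix W @ A approaches a scaled permutation.
P = W @ A
```

This mirrors the trade-off the abstract describes: a fixed large step converges fast but leaves a large steady-state error, a fixed small step does the opposite, while tying the step size to the separation measure gives both.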