Adaptive step-size natural gradient algorithm based on separating degree
This paper proposes, for the first time, using the separating degree to control the step size of the natural gradient algorithm. After a detailed analysis of existing fixed step-size and variable step-size gradient algorithms, the paper presents a new adaptive step-size natural gradient algorithm. Because the new algorithm's step size varies with the separating degree, its learning rate is chosen adaptively according to the current degree of separation, so it can simultaneously improve convergence speed and reduce the steady-state misadjustment error. Computer simulations confirm the theoretical analysis and show that the algorithm's performance is superior to that of other natural gradient algorithms.
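The abstract does not give the algorithm's details, but the general idea can be sketched in code. The following is a minimal, hypothetical illustration of a natural gradient blind source separation update whose step size is scaled by a separation measure: since the paper's exact "separating degree" is not specified, it is approximated here by the off-diagonal norm of the sample estimate of E[g(y)yᵀ], which shrinks as the outputs become independent. The mixing matrix, nonlinearity, and step-size rule are all illustrative assumptions, not the authors' method.

```python
import numpy as np

def natural_gradient_bss(X, n_iter=500, eta0=0.1):
    """Natural gradient source separation with an adaptive step size.

    The 'separating degree' proxy used here (an assumption, not the
    paper's definition) is the off-diagonal norm of E[g(y) y^T]: it is
    large when outputs are still mixed and small near separation, so
    the effective learning rate eta0 * sep is large early (fast
    convergence) and small late (low steady-state misadjustment).
    """
    n, T = X.shape
    W = np.eye(n)                # unmixing matrix, initialized to identity
    sep_hist = []
    for _ in range(n_iter):
        Y = W @ X
        G = np.tanh(Y)           # score nonlinearity (super-Gaussian sources)
        C = (G @ Y.T) / T        # sample estimate of E[g(y) y^T]
        # Proxy for separating degree: off-diagonal energy of C.
        sep = np.linalg.norm(C - np.diag(np.diag(C)))
        sep_hist.append(sep)
        eta = eta0 * sep         # adaptive step size
        # Natural-gradient update: W <- W + eta (I - E[g(y)y^T]) W
        W = W + eta * (np.eye(n) - C) @ W
    return W, sep_hist

# Demo: mix two independent Laplacian (super-Gaussian) sources, then unmix.
rng = np.random.default_rng(0)
S = rng.laplace(size=(2, 5000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])   # hypothetical mixing matrix
X = A @ S
W, sep_hist = natural_gradient_bss(X)
```

As separation proceeds, the measured separating degree falls, automatically shrinking the step size; this is the mechanism by which an adaptive rule can get both fast initial convergence and a small steady-state error, rather than trading one for the other as a fixed step size must.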