The Bounds on the Rate of Uniform Convergence for Learning Machine

Conference Paper in Lecture Notes in Computer Science 3496:538-545 · May 2005
DOI: 10.1007/11427391_86 · Source: DBLP
Conference: Advances in Neural Networks - ISNN 2005, Second International Symposium on Neural Networks, Chongqing, China, May 30 - June 1, 2005, Proceedings, Part I

    Abstract

    Generalization performance is an important property of learning machines, and a desirable learning machine should be stable
    with respect to its training samples. We consider empirical risk minimization over function sets from which noise has been
    eliminated. By applying Kutin's inequality, we establish bounds on the rate of uniform convergence of the empirical risks to
    their expected risks for learning machines, and we compare these bounds with known results.
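
    For orientation, here is a minimal LaTeX sketch of the standard quantities the abstract refers to, assuming an i.i.d. sample
    (x_1, y_1), ..., (x_n, y_n) drawn from a distribution P and a loss function L bounded in [0, 1]. The displayed bound is the
    classical finite-class Hoeffding/union-bound form, shown only as a point of comparison; the paper's own bounds, derived via
    Kutin's inequality for the noise-eliminated function sets, are not reproduced here.

    % Expected risk and empirical risk of a function f in the class \mathcal{F}
    \[
      R(f) = \mathbb{E}_{(x,y) \sim P}\, L\bigl(y, f(x)\bigr),
      \qquad
      R_{\mathrm{emp}}(f) = \frac{1}{n} \sum_{i=1}^{n} L\bigl(y_i, f(x_i)\bigr).
    \]
    % Classical rate of uniform convergence for a finite class \mathcal{F}
    % (Hoeffding's inequality plus a union bound over \mathcal{F}):
    \[
      \Pr\Bigl\{ \sup_{f \in \mathcal{F}} \bigl| R(f) - R_{\mathrm{emp}}(f) \bigr| > \varepsilon \Bigr\}
      \;\le\; 2\, \lvert \mathcal{F} \rvert \, e^{-2 n \varepsilon^{2}}.
    \]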