
# A note on integral probability metrics and $\phi$-divergences

CoRR 01/2009; abs/0901.2698.
Source: DBLP

ABSTRACT We study some connections between integral probability metrics (21) of the form $\gamma_{\mathcal{F}}(P,Q) := \sup_{f \in \mathcal{F}} \left| \int f \, dP - \int f \, dQ \right|$, and $\phi$-divergences.

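For orientation, different choices of the function class $\mathcal{F}$ recover familiar distances. The following summary states standard facts about IPMs and is not quoted from the paper:

```latex
% The IPM from the abstract:
%   gamma_F(P, Q) = sup_{f in F} | int f dP - int f dQ |
% Standard special cases, depending on the choice of F
% (constants may differ by convention, e.g. for total variation):
\gamma_{\mathcal{F}}(P,Q) =
\begin{cases}
  \mathrm{TV}(P,Q)  & \mathcal{F} = \{f : \|f\|_\infty \le 1\} \\
  W_1(P,Q)          & \mathcal{F} = \{f : \|f\|_{\mathrm{Lip}} \le 1\} \\
  \mathrm{MMD}(P,Q) & \mathcal{F} = \{f : \|f\|_{\mathcal{H}} \le 1\}
\end{cases}
```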
##### Article: A kernel method for the two-sample problem
arXiv preprint arXiv:0805.2368, 05/2008.
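This paper studies the maximum mean discrepancy (MMD), the IPM obtained when $\mathcal{F}$ is the unit ball of a reproducing kernel Hilbert space. A minimal numpy sketch of the biased empirical MMD estimator follows; this is not the authors' code, and the Gaussian kernel and bandwidth are illustrative assumptions:

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # Pairwise Gaussian kernel values between rows of a and b.
    d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-d2 / (2 * sigma**2))

def mmd_biased(x, y, sigma=1.0):
    # Biased (V-statistic) estimate of MMD between samples x ~ P and y ~ Q:
    # MMD^2 = E[k(x,x')] + E[k(y,y')] - 2 E[k(x,y)].
    kxx = gaussian_kernel(x, x, sigma).mean()
    kyy = gaussian_kernel(y, y, sigma).mean()
    kxy = gaussian_kernel(x, y, sigma).mean()
    return np.sqrt(max(kxx + kyy - 2 * kxy, 0.0))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(500, 1))  # samples from P
y = rng.normal(0.5, 1.0, size=(500, 1))  # samples from Q
print(mmd_biased(x, y))  # larger when P and Q differ
```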
##### Article: A new metric for probability distributions.
ABSTRACT: We introduce a metric for probability distributions which is bounded, information-theoretically motivated, and has a natural Bayesian interpretation. The square root of the well-known $\chi^2$ distance is an asymptotic approximation to it. Moreover, it is a close relative of the capacitory discrimination and the Jensen-Shannon divergence.

I. Introduction. This paper is the result of the authors' search for a probability metric that is bounded and can be easily interpreted in terms of both information-theoretical and probabilistic concepts. Metric properties are the prerequisites for several important convergence theorems for iterative algorithms, e.g., Banach's fixed point theorem (2), which is the basis of several pattern-matching algorithms. Boundedness is a valuable property, too, when numerical applications are considered. We will limit the following discussion to discrete probability distributions, but the result can be generalized to probability density functions.
IEEE Transactions on Information Theory. 01/2003; 49:1858-1860.
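From the abstract, the metric is the square root of a close relative of the Jensen-Shannon divergence. A minimal sketch for discrete distributions, assuming the standard formulation $D^2_{PQ} = \sum_i \left( p_i \ln\frac{2p_i}{p_i+q_i} + q_i \ln\frac{2q_i}{p_i+q_i} \right)$ (the capacitory discrimination); the function name is hypothetical:

```python
import numpy as np

def es_metric(p, q):
    # Square root of the capacitory discrimination,
    # D^2 = KL(p || m) + KL(q || m) with m = (p + q) / 2,
    # i.e. twice the Jensen-Shannon divergence (natural log),
    # so the metric is bounded by sqrt(2 ln 2).
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)

    def kl(a):
        # sum_i a_i log(a_i / m_i), with the convention 0 log 0 = 0.
        mask = a > 0
        return np.sum(a[mask] * np.log(a[mask] / m[mask]))

    return np.sqrt(max(kl(p) + kl(q), 0.0))

p = np.array([0.5, 0.5, 0.0])
q = np.array([0.0, 0.5, 0.5])
print(es_metric(p, q), es_metric(q, p))  # symmetric in p and q
print(np.sqrt(2 * np.log(2)))            # bound, attained for disjoint supports
```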