ABSTRACT: We introduce a metric for probability distributions which is bounded, information-theoretically motivated, and has a natural Bayesian interpretation. The square root of the well-known χ² distance is an asymptotic approximation to it. Moreover, it is a close relative of the capacitory discrimination and the Jensen-Shannon divergence.

I. Introduction

This paper is the result of the authors' search for a probability metric that is bounded and can be easily interpreted in terms of both information-theoretic and probabilistic concepts. Metric properties are the prerequisites for several important convergence theorems for iterative algorithms, e.g. Banach's fixed point theorem [2], which is the basis of several pattern-matching algorithms. Boundedness is a valuable property, too, when numerical applications are considered. We will limit the following discussion to discrete probability distributions, but the results can be generalized to probability density functions.
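The metric described in the abstract can be sketched in a few lines. This is an illustrative implementation, assuming the metric is the square root of twice the Jensen-Shannon divergence (natural logarithm), i.e. D(P, Q)² = Σᵢ [pᵢ ln(2pᵢ/(pᵢ+qᵢ)) + qᵢ ln(2qᵢ/(pᵢ+qᵢ))]; the function name and argument conventions are hypothetical, not taken from the paper.

```python
import math

def metric(p, q):
    """Sketch of a bounded probability metric: the square root of twice
    the Jensen-Shannon divergence between discrete distributions p and q
    (given as equal-length sequences of probabilities summing to 1).
    This form is an assumption based on the abstract, not a verbatim
    reproduction of the paper's definition."""
    d2 = 0.0
    for pi, qi in zip(p, q):
        m = 0.5 * (pi + qi)  # mixture distribution
        if pi > 0:
            d2 += pi * math.log(pi / m)
        if qi > 0:
            d2 += qi * math.log(qi / m)
    return math.sqrt(d2)
```

Under this definition the distance between two disjoint distributions is √(2 ln 2) ≈ 1.177, which illustrates the boundedness claimed in the abstract; identical distributions are at distance 0, and the triangle inequality holds, as required of a metric.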
IEEE Transactions on Information Theory. 01/2003; 49:1858-1860.