Poisson approximation for random sums of Bernoulli random variables

Department of Mathematics, Royal Institute of Technology, S-10044 Stockholm, Sweden
Statistics & Probability Letters, 01/1991. DOI: 10.1016/0167-7152(91)90135-E

ABSTRACT: Bounds for the total variation distance between the distribution of the sum of a random number of Bernoulli summands and an appropriate Poisson distribution are given. The results can be used to derive limit theorems with rates of convergence for marked and thinned point processes. Some examples are given.
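As a rough illustration (not taken from the paper), the quantity being bounded here can be computed exactly for small cases: condition on the random number of summands N to get the distribution of the sum, then take the total variation distance to a Poisson law with the matching mean. The function names and the choice of N below are mine, for illustration only.

```python
from math import comb, exp, factorial

def binom_pmf(n, p, k):
    """P(Binomial(n, p) = k)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(lam, k):
    """P(Poisson(lam) = k)."""
    return exp(-lam) * lam ** k / factorial(k)

def tv_random_sum_vs_poisson(n_pmf, p):
    """Total variation distance between S = X_1 + ... + X_N and Poisson(p*E[N]),
    where N has pmf n_pmf on {0, ..., len(n_pmf)-1}, independent of the iid
    Bernoulli(p) summands X_i.  Conditionally on N = n, S is Binomial(n, p)."""
    max_n = len(n_pmf) - 1
    lam = p * sum(n * q for n, q in enumerate(n_pmf))
    # pmf of S by conditioning on N
    s_pmf = [sum(n_pmf[n] * binom_pmf(n, p, k) for n in range(k, max_n + 1))
             for k in range(max_n + 1)]
    # S is supported on 0..max_n; Poisson mass beyond max_n is pure discrepancy.
    tail = 1.0 - sum(poisson_pmf(lam, k) for k in range(max_n + 1))
    diff = sum(abs(s_pmf[k] - poisson_pmf(lam, k)) for k in range(max_n + 1))
    return 0.5 * (diff + tail)

# Example: N uniform on {0, ..., 10}, summand success probability p = 0.1
d = tv_random_sum_vs_poisson([1 / 11] * 11, 0.1)
```

The distance shrinks as p decreases, consistent with the small-p regime in which Poisson approximation is accurate; bounds of the kind given in the paper control exactly this quantity.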

  • ABSTRACT: Bounds for the distance between the distribution function of random sums of independent Bernoulli random variables and an appropriate Poisson distribution function are improved. The bounds in this study are more appropriate for measuring the accuracy of Poisson approximation.
    International Journal of Pure and Applied Mathematics, 11/2013, 89(1).
  • ABSTRACT: A new bound for the point metric between the distribution of random sums of independent Bernoulli random variables and an appropriate Poisson distribution is obtained. The bound in this study is sharper than those reported in [3].
    International Journal of Pure and Applied Mathematics, 01/2013, 89(2).
  • ABSTRACT: Rényi's thinning operation on a discrete random variable is a natural discrete analog of the scaling operation for continuous random variables. The properties of thinning are investigated in an information-theoretic context, especially in connection with information-theoretic inequalities related to Poisson approximation results. The classical Binomial-to-Poisson convergence (sometimes referred to as the “law of small numbers”) is seen to be a special case of a thinning limit theorem for convolutions of discrete distributions. A rate of convergence is provided for this limit, and nonasymptotic bounds are also established. This development parallels, in part, the development of Gaussian inequalities leading to the information-theoretic version of the central limit theorem. In particular, a “thinning Markov chain” is introduced and shown to play a role analogous to that of the Ornstein-Uhlenbeck process in connection with the entropy power inequality.
    IEEE Transactions on Information Theory, 10/2010.
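A small numerical sketch of the thinning limit theorem mentioned above (my own toy example, not drawn from the paper): α-thinning a discrete variable X replaces it by a Binomial(X, α) count, so thinning the n-fold convolution of a point mass at an integer μ by α = 1/n yields exactly Binomial(nμ, 1/n), which converges to Poisson(μ) as n grows.

```python
from math import comb, exp, factorial

def tv_thinned_convolution(n, mu=2):
    """TV distance between the (1/n)-thinning of the n-fold convolution of a
    point mass at the integer mu -- which is exactly Binomial(n*mu, 1/n) --
    and Poisson(mu).  The law of small numbers says this vanishes as n grows."""
    total, p = n * mu, 1.0 / n
    binom = [comb(total, k) * p ** k * (1 - p) ** (total - k)
             for k in range(total + 1)]
    pois = [exp(-mu) * mu ** k / factorial(k) for k in range(total + 1)]
    tail = 1.0 - sum(pois)  # Poisson mass beyond the binomial's support
    return 0.5 * (sum(abs(b - q) for b, q in zip(binom, pois)) + tail)

# The distance shrinks as the thinning parameter 1/n decreases:
d5, d50 = tv_thinned_convolution(5), tv_thinned_convolution(50)
```

Comparing d5 and d50 shows the convergence whose rate the cited paper quantifies, here in total variation rather than the information-theoretic distances the paper works with.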

