Poisson approximation for random sums of Bernoulli random variables

Department of Mathematics, Royal Institute of Technology, S-10044 Stockholm, Sweden
Statistics & Probability Letters 01/1991; DOI: 10.1016/0167-7152(91)90135-E

ABSTRACT: Bounds for the total variation distance between the distribution of the sum of a random number of Bernoulli summands and an appropriate Poisson distribution are given. The results can be used to derive limit theorems with rates of convergence for marked and thinned point processes. Some examples are given.
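As a quick illustration of the quantity being bounded, the sketch below computes the total variation distance between a fixed-index Bernoulli sum (a binomial law) and the Poisson law with matching mean; the paper's setting replaces the fixed index n by a random one, and its bounds are sharper than the classical Le Cam bound used as a sanity check here. Function names are illustrative, not from the paper.

```python
import math

def binom_pmf(n, p, k):
    # P(S_n = k) for S_n a sum of n independent Bernoulli(p) summands
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam**k / math.factorial(k)

def tv_distance(n, p, kmax=150):
    # d_TV(Bin(n, p), Poisson(np)) = (1/2) * sum_k |P(S_n = k) - P(Z = k)|
    lam = n * p
    return 0.5 * sum(abs(binom_pmf(n, p, k) - poisson_pmf(lam, k))
                     for k in range(kmax + 1))
```

For small p the distance stays below the Le Cam bound n·p², e.g. 0.04 for n = 100, p = 0.02.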

  • ABSTRACT: We consider the approximation of mixed Poisson distributions by Poisson laws and also by related finite signed measures of higher order. Upper bounds and asymptotic relations are given for several distances. Even in the case of the Poisson approximation with respect to the total variation distance, our bounds have better order than those given in the literature. In particular, our results hold under weaker moment conditions for the mixing random variable. As an example, we consider the approximation of the negative binomial distribution, which enables us to prove the sharpness of a constant in the upper bound of the total variation distance. The main tool is an integral formula for the difference of the counting densities of a Poisson distribution and a related finite signed measure.
    Journal of Statistical Planning and Inference, 01/2003.
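The negative binomial distribution mentioned above is a mixed Poisson law (with Gamma mixing), so one can check numerically that it approaches the Poisson law of the same mean as its variance shrinks toward that mean. A minimal sketch with illustrative names, not the paper's method:

```python
import math

def negbin_pmf(r, p, k):
    # P(X = k): number of failures before the r-th success; equivalently a
    # mixed Poisson law whose mixing distribution is Gamma with mean r(1-p)/p
    return math.comb(k + r - 1, k) * (1 - p)**k * p**r

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam**k / math.factorial(k)

def tv_negbin_poisson(r, p, kmax=150):
    # total variation distance to the Poisson law with the same mean r(1-p)/p
    lam = r * (1 - p) / p
    return 0.5 * sum(abs(negbin_pmf(r, p, k) - poisson_pmf(lam, k))
                     for k in range(kmax + 1))
```

Holding the mean fixed at 5 while increasing r (so the variance approaches the mean), the distance to the Poisson law decreases.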
  • ABSTRACT: Rényi's thinning operation on a discrete random variable is a natural discrete analog of the scaling operation for continuous random variables. The properties of thinning are investigated in an information-theoretic context, especially in connection with information-theoretic inequalities related to Poisson approximation results. The classical Binomial-to-Poisson convergence (sometimes referred to as the “law of small numbers”) is seen to be a special case of a thinning limit theorem for convolutions of discrete distributions. A rate of convergence is provided for this limit, and nonasymptotic bounds are also established. This development parallels, in part, the development of Gaussian inequalities leading to the information-theoretic version of the central limit theorem. In particular, a “thinning Markov chain” is introduced, and it is shown to play a role analogous to that of the Ornstein-Uhlenbeck process in connection to the entropy power inequality.
    IEEE Transactions on Information Theory, 10/2010.
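Rényi's α-thinning of a count x keeps each of its x points independently with probability α, so the mean scales to α·E[X], which is what makes it the discrete analog of scaling. A minimal simulation sketch (names are illustrative, not from the paper):

```python
import random

def thin(x, alpha, rng=random):
    # Rényi alpha-thinning: each of the x points survives independently
    # with probability alpha; the result is Binomial(x, alpha) given x
    return sum(1 for _ in range(x) if rng.random() < alpha)
```

Averaging many thinned draws of a deterministic count x = 10 with α = 0.3 recovers a mean close to 3.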
  • ABSTRACT: Consider a sequence of n Bernoulli (Success–Failure or 1–0) trials. The exact and limiting distribution of the random variable E_{n,k}, denoting the number of success runs of a fixed length k, 1 ≤ k ≤ n, is derived along with its mean and variance. An associated waiting time is examined as well. The exact distribution is given in terms of binomial coefficients, and an extension of it covering exchangeable sequences is also discussed. Limiting distributions of E_{n,k} are obtained using Poisson and normal approximations. The exact mean and variance of E_{n,k}, which are given in explicit forms, are also used to derive bounds and an additional approximation of the distribution of E_{n,k}. Numbers associated with E_{n,k} and related random variables, counting binary strings and runs of 1’s useful in applications of computer science, are provided. The overall study is illustrated by extensive numerical experimentation.
    Computers & Mathematics with Applications, 01/2011; 61:761–772.
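Counting success runs can be sketched as follows. Run-counting conventions differ (runs of exactly length k, of at least length k, or overlapping windows), and the abstract's E_{n,k} fixes one of them; this sketch counts maximal runs of 1's of length exactly k, which may not match the paper's definition.

```python
def count_runs(seq, k):
    # number of maximal success runs of length exactly k in a 0-1 sequence
    count = run = 0
    for b in list(seq) + [0]:  # trailing 0 sentinel closes a run at the end
        if b == 1:
            run += 1
        else:
            if run == k:
                count += 1
            run = 0
    return count
```

For example, the sequence 1 1 0 1 1 1 0 1 1 has maximal runs of lengths 2, 3, 2, hence two runs of length exactly 2.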

