Introducing Monte Carlo Methods with R

DOI: 10.1007/978-1-4419-1576-4
Source: OAI

Solutions to the exercises proposed in this book are freely accessible.

ABSTRACT: Computational techniques based on simulation have now become an essential part of the statistician's toolbox. It is thus crucial to provide statisticians with a practical understanding of those methods, and there is no better way to develop intuition and skills for simulation than to use simulation to solve statistical problems. Introducing Monte Carlo Methods with R covers the main tools used in statistical simulation from a programmer's point of view, explaining the R implementation of each simulation technique and providing the output for better understanding and comparison. While this book constitutes a comprehensive treatment of simulation methods, the theoretical justification of those methods has been considerably reduced compared with Robert and Casella (2004). Similarly, the more exploratory and less stable solutions are not covered here.

This book requires no preliminary exposure to the R programming language or to Monte Carlo methods, nor an advanced mathematical background. While many examples are set within a Bayesian framework, advanced expertise in Bayesian statistics is not required. The book covers basic random generation algorithms, Monte Carlo techniques for integration and optimization, convergence diagnostics, Markov chain Monte Carlo methods, including the Metropolis–Hastings and Gibbs algorithms, and adaptive algorithms. All chapters include exercises, and all R programs are available as an R package called mcsm. The book appeals to anyone with a practical interest in simulation methods but no previous exposure. It is meant to be useful for students and practitioners in areas such as statistics, signal processing, communications engineering, control theory, econometrics, finance, and more. The programming parts are introduced progressively to be accessible to any reader.
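To illustrate the kind of Monte Carlo integration the book covers, here is a minimal sketch (in Python rather than the book's R, and not taken from the book or the mcsm package): an integral is estimated by averaging the integrand over uniform draws.

```python
import random

def mc_integrate(f, a, b, n=100_000, seed=42):
    """Estimate the integral of f over [a, b] by averaging f at n
    uniform draws on [a, b] and scaling by the interval length."""
    rng = random.Random(seed)
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

# Example: the integral of x^2 over [0, 1] is exactly 1/3.
est = mc_integrate(lambda x: x * x, 0.0, 1.0)
```

The error of such an estimator shrinks like 1/sqrt(n), which is the basic convergence behavior the book's diagnostics chapters examine in detail.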

  • A Continuous Time Mover-Stayer Model for Labor Market in a Northern Italian Area. Chapter, pages 101–110; Springer, 01/2012. ISBN: 9783642288937
    ABSTRACT: Frequently a researcher is interested in a theoretical distribution or in characteristics of that distribution, such as its mean, standard deviation, or 2.5 and 97.5 percentiles. One hundred or even 50 years ago, computing limitations restricted us in practice to theoretical distributions described by an explicit equation, such as the binomial or multivariate normal distribution. Using mathematical models of distributions often requires considerable mathematical ability, and also imposes rather severe and often intractable assumptions on applied researchers (e.g., normality, independence, variance assumptions, and so on). But computer simulations now provide more flexibility in specifying distributions, which in turn provides more flexibility in specifying models. One contemporary simulation technique is Markov chain Monte Carlo (MCMC) simulation, which can specify arbitrarily complex and nested multivariate distributions. It can even combine different theoretical families of variates. Another contemporary technique is the bootstrap, which can construct sampling distributions of conventional statistics that are free from most (but not all) assumptions. It can even create sampling distributions for new or exotic test statistics that the researcher created for a specific experiment.
    Bootstrapping and Monte Carlo methods. In Cooper, H., Camic, P. M., Long, D. L., Panter, A. T., Rindskopf, D., & Sher, K. J. (Eds.), APA handbook of research methods in psychology, pages 407–425. American Psychological Association, 01/2012.
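The percentile bootstrap described in the abstract above can be sketched in a few lines. This is a generic illustration in Python (not code from the chapter); the dataset and function name are made up for the example.

```python
import random

def bootstrap_mean_ci(data, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for the mean: resample the data with
    replacement, recompute the mean each time, and take the empirical
    alpha/2 and 1 - alpha/2 quantiles of those resampled means."""
    rng = random.Random(seed)
    n = len(data)
    means = sorted(
        sum(rng.choice(data) for _ in range(n)) / n
        for _ in range(n_boot)
    )
    lo = means[int((alpha / 2) * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

data = [2.1, 3.4, 1.9, 4.0, 2.8, 3.1, 2.5, 3.7, 2.2, 3.0]
lo, hi = bootstrap_mean_ci(data)
```

No distributional form is assumed here: the sampling distribution of the mean is approximated entirely from resamples of the observed data, which is exactly the freedom from assumptions the abstract emphasizes.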
  • Source
    [Show abstract] [Hide abstract]
    ABSTRACT: This chapter is the first of a series on simulation methods based on Markov chains. However, it is a somewhat strange introduction because it contains a description of the most general algorithm of all. The next chapter (Chapter 8) concentrates on the more specific slice sampler, which then introduces the Gibbs sampler (Chapters 9 and 10), which, in turn, is a special case of the Metropolis–Hastings algorithm. (However, the Gibbs sampler is different in both fundamental methodology and historical motivation.)
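The general algorithm the chapter introduces, random-walk Metropolis–Hastings, fits in a few lines. The following is a minimal Python sketch (not the chapter's code, which is in R), targeting a standard normal density known only up to a constant:

```python
import math
import random

def metropolis_hastings(log_target, x0, n=10_000, scale=1.0, seed=1):
    """Random-walk Metropolis-Hastings: propose x' = x + scale * N(0, 1)
    and accept with probability min(1, target(x') / target(x)),
    computed on the log scale for numerical stability."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    chain = []
    for _ in range(n):
        prop = x + scale * rng.gauss(0.0, 1.0)
        lp_prop = log_target(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept/reject step
            x, lp = prop, lp_prop
        chain.append(x)  # the current state is recorded either way
    return chain

# Target: standard normal, up to its normalizing constant.
chain = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0)
mean = sum(chain) / len(chain)
```

Because only a ratio of target values enters the acceptance step, the normalizing constant cancels; the Gibbs sampler of the later chapters is the special case where proposals are full conditional draws and are always accepted.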