
SMC Samplers

An Introduction to Sequential Monte Carlo

Part of the book series: Springer Series in Statistics (SSS)


Summary

This chapter covers SMC samplers, that is, particle algorithms able to track a sequence of probability distributions \(\mathbb{P}_t(\mathrm{d}\theta)\) related by the recursion \(\mathbb{P}_t(\mathrm{d}\theta) = \ell_t G_t(\theta) \mathbb{P}_{t-1}(\mathrm{d}\theta)\). Chapter 3 gave a few examples of applications of these algorithms. In some, one is genuinely interested in approximating each distribution in a given sequence, e.g. sequential Bayesian learning, in which case the sequence corresponds to the incorporation of more and more data; in others, one is interested in a single target distribution, and an artificial sequence is designed so that an SMC sampler can be implemented successfully, the sequence corresponding to increasing “temperatures”. Despite the different applications and appearances, the two approaches are not that different: we can think of the former as “data tempering” and of the latter as “temperature tempering”, and in both cases the early distributions are easier to sample from than the later ones.
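To make the parallel concrete, the two schemes can be written as follows (notation ours, with \(\nu\) a prior or reference distribution; this is an illustrative sketch, not a formula quoted from the chapter):

```latex
% Data tempering: P_t is the posterior given the first t observations,
% so G_t(theta) = p(y_t | y_{1:t-1}, theta)
\mathbb{P}_t(\mathrm{d}\theta) \propto \nu(\mathrm{d}\theta)\, p(y_{1:t} \mid \theta)
% Temperature tempering: a geometric bridge between nu and the target,
% with exponents 0 = \lambda_0 < \lambda_1 < \dots < \lambda_T = 1,
% so G_t(theta) = L(\theta)^{\lambda_t - \lambda_{t-1}}
\mathbb{P}_t(\mathrm{d}\theta) \propto \nu(\mathrm{d}\theta)\, L(\theta)^{\lambda_t}
```

In both cases each \(\mathbb{P}_t\) is obtained from \(\mathbb{P}_{t-1}\) by multiplying in a potential \(G_t\), which is exactly the recursion stated above.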

When used to approximate a single distribution, SMC samplers represent an alternative to MCMC, with the following advantages. First, they offer a simple way to estimate the normalising constant of the target; this quantity is of interest in many problems, in particular Bayesian model choice. Second, they are easier to parallelise. Third, it is reasonably easy to develop adaptive SMC samplers, that is, algorithms that calibrate their tuning parameters automatically, using the current sample of particles.

We start by describing a Feynman-Kac model that formalises a large class of SMC samplers, based on invariant kernels. We then provide practical recipes to obtain good performance in the following scenarios: Bayesian sequential learning, tempering, rare-event simulation, and likelihood-free inference. Finally, we explain how to further extend SMC samplers by using more general Markov kernels.
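The invariant-kernel construction described above can be sketched on a toy example. The following is a minimal sketch in the spirit of the chapter, not code from the book: a temperature-tempering SMC sampler bridging a Gaussian prior \(\nu = N(0, s^2)\) to the target \(\propto \nu(\theta)\exp(-\theta^2/2)\) along a fixed temperature ladder, with multinomial resampling and one random-walk Metropolis step (invariant with respect to the current tempered distribution) per temperature. It also accumulates the normalising-constant estimate mentioned earlier; all names (`smc_temper`, etc.) are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_prior(theta, s=3.0):
    # log density of nu = N(0, s^2)
    return -0.5 * theta**2 / s**2 - np.log(s * np.sqrt(2 * np.pi))

def log_potential(theta):
    # log L(theta); here L(theta) = exp(-theta^2 / 2), unnormalised
    return -0.5 * theta**2

def smc_temper(n=2000, lambdas=np.linspace(0.0, 1.0, 21), s=3.0):
    theta = rng.normal(0.0, s, size=n)   # exact draws from nu (lambda_0 = 0)
    log_Z = 0.0
    for lam_prev, lam in zip(lambdas[:-1], lambdas[1:]):
        # Incremental importance weights G_t(theta) = L(theta)^(lam - lam_prev)
        logw = (lam - lam_prev) * log_potential(theta)
        # Accumulate the normalising-constant estimate: Z_t / Z_{t-1} ~ mean(w)
        m = logw.max()
        log_Z += np.log(np.mean(np.exp(logw - m))) + m
        # Multinomial resampling
        W = np.exp(logw - m)
        W /= W.sum()
        theta = rng.choice(theta, size=n, p=W)
        # One random-walk Metropolis step, invariant wrt pi_lam
        prop = theta + 0.5 * rng.normal(size=n)
        log_acc = (log_prior(prop, s) + lam * log_potential(prop)
                   - log_prior(theta, s) - lam * log_potential(theta))
        accept = np.log(rng.uniform(size=n)) < log_acc
        theta = np.where(accept, prop, theta)
    return theta, np.exp(log_Z)

theta, Z_hat = smc_temper()
print(Z_hat)  # should be close to the exact value 1/sqrt(s^2 + 1) ~ 0.316
```

The exact normalising constant here is \(\int \nu(\theta) L(\theta)\,\mathrm{d}\theta = (s^2+1)^{-1/2}\), which gives a direct check on the estimator; an adaptive version would instead choose each \(\lambda_t\) on the fly, e.g. to keep the effective sample size above a threshold.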




Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Chopin, N., Papaspiliopoulos, O. (2020). SMC Samplers. In: An Introduction to Sequential Monte Carlo. Springer Series in Statistics. Springer, Cham. https://doi.org/10.1007/978-3-030-47845-2_17
