Summary
This chapter covers SMC samplers, that is, particle algorithms able to track a sequence of probability distributions \(\mathbb {P}_t(\mathrm {d}\theta )\) related by the recursion \(\mathbb {P}_t(\mathrm {d}\theta ) = \ell _t^{-1} G_t(\theta ) \mathbb {P}_{t-1}(\mathrm {d}\theta )\), where \(\ell _t\) is a normalising constant. Chapter 3 gave a few examples of applications of these algorithms. In some, one is genuinely interested in approximating each distribution in a given sequence, e.g., sequential Bayesian learning, in which case the sequence corresponds to the incorporation of more and more data. In others, one is interested in a single target distribution, but an artificial sequence is designed so as to successfully implement an SMC sampler; the sequence then corresponds to increasing “temperatures”. Despite the different applications and appearances, the two approaches are not that different: we can think of the former as “data tempering” and of the latter as “temperature tempering”, and in both situations sampling from early distributions is easier than from later ones.
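As a toy illustration (not taken from the chapter), consider temperature tempering for a conjugate Gaussian model: the intermediate targets are \(\mathbb {P}_t(\mathrm {d}\theta ) \propto \pi_0(\theta) L(\theta)^{\lambda_t}\), so the potential reduces to \(G_t(\theta) = L(\theta)^{\lambda_t - \lambda_{t-1}}\). The model, the temperature ladder, and the particle count below are all illustrative choices, not prescriptions from the book:

```python
import numpy as np

# Toy model (illustrative): theta ~ N(0, 10) prior, y_i | theta ~ N(theta, 1).
# Tempered sequence: P_t(dtheta) proportional to prior(theta) * L(theta)^lambda_t,
# so the potential is G_t(theta) = L(theta)^(lambda_t - lambda_{t-1}).
rng = np.random.default_rng(0)
data = rng.normal(1.0, 1.0, size=20)

def log_lik(theta):
    # log-likelihood L(theta), vectorised over an array of particles
    return np.sum(-0.5 * (data[None, :] - theta[:, None]) ** 2, axis=1)

lambdas = np.linspace(0.0, 1.0, 11)                 # fixed "temperature" ladder
theta = rng.normal(0.0, np.sqrt(10.0), size=1000)   # N particles from P_0 = prior
# first incremental log-weight: log G_1(theta) = (lambda_1 - lambda_0) * log L(theta)
logG = (lambdas[1] - lambdas[0]) * log_lik(theta)
```

Each subsequent step reweights by the corresponding increment \((\lambda_t - \lambda_{t-1})\log L(\theta)\), which is what makes early (flatter) distributions easier to sample than later ones.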
When used to approximate a single distribution, SMC samplers represent an alternative to MCMC, with the following advantages. First, they offer a simple way to estimate the normalising constant of the target; this quantity is of interest in many problems, in particular Bayesian model choice. Second, they are easier to parallelise. Third, it is reasonably easy to develop adaptive SMC samplers, that is, algorithms which calibrate their tuning parameters automatically, using the current sample of particles.
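The normalising-constant estimate mentioned above is the product of the weight averages accumulated across steps. A minimal sketch (a toy example of ours, not the chapter's): for a conjugate Gaussian model the evidence is available in closed form, and even a single importance-sampling step from the prior, the degenerate one-step case of an SMC sampler, recovers it, since the average likelihood weight is an unbiased estimate of the normalising constant:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy conjugate model: theta ~ N(0, s2), y_i | theta ~ N(theta, 1)
s2, n = 10.0, 20
data = rng.normal(1.0, 1.0, size=n)

# Exact log-evidence: marginally, y ~ N(0, I + s2 * 1 1^T)
Sigma = np.eye(n) + s2 * np.ones((n, n))
_, logdet = np.linalg.slogdet(2 * np.pi * Sigma)
exact = -0.5 * (logdet + data @ np.linalg.solve(Sigma, data))

# One-step estimate: average the likelihood weights of prior draws
theta = rng.normal(0.0, np.sqrt(s2), size=10**5)
loglik = np.sum(-0.5 * (data[None, :] - theta[:, None]) ** 2
                - 0.5 * np.log(2 * np.pi), axis=1)
m = loglik.max()
estimate = m + np.log(np.mean(np.exp(loglik - m)))  # log-sum-exp for stability
```

In an actual SMC sampler the same logic applies per step: the log-evidence estimate is the sum of the log-averages of the incremental weights, so it comes essentially for free.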
We start by describing a Feynman-Kac model that formalises a large class of SMC samplers, based on invariant kernels. We then provide practical recipes to obtain good performance in the following scenarios: Bayesian sequential learning, tempering, rare-event simulation, and likelihood-free inference. Finally, we explain how to further extend SMC samplers by using more general Markov kernels.
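To make the ingredients above concrete, here is a compact sketch of an adaptive tempering SMC sampler on the same toy Gaussian model used earlier: the next temperature is chosen by bisection so the effective sample size (ESS) stays near a target, particles are reweighted and resampled, then rejuvenated by a random-walk Metropolis kernel left invariant by the current tempered target. The model, ESS target, step size, and number of move steps are illustrative choices of ours, not the chapter's recommendations:

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(1.0, 1.0, size=20)
s2 = 10.0  # prior variance (toy model: theta ~ N(0, s2), y_i | theta ~ N(theta, 1))

def log_lik(theta):
    return np.sum(-0.5 * (data[None, :] - theta[:, None]) ** 2, axis=1)

def ess(logw):
    w = np.exp(logw - logw.max())
    return w.sum() ** 2 / (w ** 2).sum()

def next_lambda(lam, ll, target):
    # bisect for the largest temperature whose incremental weights keep ESS >= target
    if ess((1.0 - lam) * ll) >= target:
        return 1.0
    lo, hi = lam, 1.0
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if ess((mid - lam) * ll) >= target else (lo, mid)
    return lo

N, lam, log_Z = 1000, 0.0, 0.0
theta = rng.normal(0.0, np.sqrt(s2), size=N)        # particles from P_0 = prior
while lam < 1.0:
    ll = log_lik(theta)
    new_lam = next_lambda(lam, ll, target=N / 2)
    logw = (new_lam - lam) * ll                     # incremental log-weights
    m = logw.max()
    log_Z += m + np.log(np.mean(np.exp(logw - m)))  # evidence increment
    w = np.exp(logw - m); w /= w.sum()
    theta = theta[rng.choice(N, size=N, p=w)]       # multinomial resampling
    # random-walk Metropolis moves, invariant w.r.t. the current tempered target
    for _ in range(5):
        prop = theta + 0.3 * rng.standard_normal(N)
        log_ratio = (new_lam * log_lik(prop) - 0.5 * prop ** 2 / s2
                     - new_lam * log_lik(theta) + 0.5 * theta ** 2 / s2)
        accept = np.log(rng.random(N)) < log_ratio
        theta = np.where(accept, prop, theta)
    lam = new_lam
```

The same skeleton covers the other scenarios listed above: only the sequence of targets (data batches, rare-event thresholds, ABC tolerances) and the invariant move kernel change.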
© 2020 Springer Nature Switzerland AG
Chopin, N., Papaspiliopoulos, O. (2020). SMC Samplers. In: An Introduction to Sequential Monte Carlo. Springer Series in Statistics. Springer, Cham. https://doi.org/10.1007/978-3-030-47845-2_17