Foundations and Trends® in Machine Learning > Vol 12 > Issue 3

Elements of Sequential Monte Carlo

By Christian A. Naesseth, Columbia University, USA, christian.a.naesseth@columbia.edu | Fredrik Lindsten, Linköping University, Sweden, fredrik.lindsten@liu.se | Thomas B. Schön, Uppsala University, Sweden, thomas.schon@it.uu.se

 
Suggested Citation
Christian A. Naesseth, Fredrik Lindsten and Thomas B. Schön (2019), "Elements of Sequential Monte Carlo", Foundations and Trends® in Machine Learning: Vol. 12: No. 3, pp 307-392. http://dx.doi.org/10.1561/2200000074

Publication Date: 28 Nov 2019
© 2019 C. A. Naesseth, F. Lindsten and T. B. Schön
 
Subjects
Bayesian learning, Learning and statistical methods, Sampling
 

In this article:
1. Introduction
2. Importance Sampling to Sequential Monte Carlo
3. Learning Proposals and Twisting Targets
4. Nested Monte Carlo: Algorithms and Applications
5. Conditional SMC: Algorithms and Applications
6. Discussion
Acknowledgments
References

Abstract

A core problem in statistics and probabilistic machine learning is to compute probability distributions and expectations. This is the fundamental problem of Bayesian statistics and machine learning, which frames all inference as expectations with respect to the posterior distribution. The key challenge is to approximate these intractable expectations. In this tutorial, we review sequential Monte Carlo (SMC), a random-sampling-based class of methods for approximate inference. First, we explain the basics of SMC, discuss practical issues, and review theoretical results. We then examine two of the main user design choices: the proposal distributions and the so-called intermediate target distributions. We review recent results on how variational inference and amortization can be used to learn efficient proposals and target distributions. Next, we discuss the SMC estimate of the normalizing constant and how this can be used for pseudo-marginal inference and inference evaluation. Throughout the tutorial we illustrate the use of SMC on various models commonly used in machine learning, such as stochastic recurrent neural networks, probabilistic graphical models, and probabilistic programs.
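To make the abstract's description concrete, the following is a minimal sketch of the simplest SMC instance, a bootstrap particle filter, applied to an illustrative one-dimensional linear-Gaussian state-space model (the model, parameter values, and function names here are assumptions for illustration, not taken from the monograph). It shows the propagate-weight-resample recursion and the SMC estimate of the log normalizing constant mentioned in the abstract.

```python
import numpy as np

# Illustrative model (not from the monograph):
#   x_t = 0.9 * x_{t-1} + v_t,  v_t ~ N(0, 1)      (transition)
#   y_t = x_t + e_t,            e_t ~ N(0, 0.5^2)  (observation)

def bootstrap_pf(y, n_particles=500, phi=0.9, sigma_v=1.0, sigma_e=0.5, seed=0):
    """Bootstrap particle filter; returns the SMC estimate of log p(y_{1:T})."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigma_v, size=n_particles)  # initial particles
    log_z = 0.0  # running log normalizing-constant estimate
    for t in range(len(y)):
        # Weight each particle by the observation likelihood p(y_t | x_t)
        log_w = -0.5 * ((y[t] - x) / sigma_e) ** 2 \
                - np.log(sigma_e * np.sqrt(2.0 * np.pi))
        m = log_w.max()
        w = np.exp(log_w - m)
        log_z += m + np.log(w.mean())  # log of average unnormalized weight
        w /= w.sum()
        # Multinomial resampling, then propagation through the transition
        idx = rng.choice(n_particles, size=n_particles, p=w)
        x = phi * x[idx] + rng.normal(0.0, sigma_v, size=n_particles)
    return log_z

# Simulate a data set from the model and estimate its log marginal likelihood
rng = np.random.default_rng(1)
T = 50
ys = np.zeros(T)
x = rng.normal()
for t in range(T):
    x = 0.9 * x + rng.normal()
    ys[t] = x + 0.5 * rng.normal()

log_z = bootstrap_pf(ys)
```

The returned `log_z` is an unbiased estimate of the marginal likelihood on the natural scale; richer proposal and target choices, as surveyed in the monograph, refine this same basic recursion.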

DOI:10.1561/2200000074
ISBN: 978-1-68083-632-5
134 pp. $90.00
Buy book (pb)
 
ISBN: 978-1-68083-633-2
134 pp. $140.00
Buy E-book (.pdf)

Elements of Sequential Monte Carlo

A key strategy in machine learning is to break down a problem into smaller and more manageable parts, then process data or unknown variables recursively. Sequential Monte Carlo (SMC) is a technique for solving statistical inference problems recursively. Over the last 20 years, SMC has been developed to enable inference in increasingly complex and challenging models in signal processing and statistics. This monograph shows how this powerful technique can be applied to machine learning problems such as probabilistic programming, variational inference, and inference evaluation, to name a few.

Written in a tutorial style, Elements of Sequential Monte Carlo introduces the basics of SMC, discusses practical issues, and reviews theoretical results before guiding the reader through a series of advanced topics to give a complete overview of the topic and its application to machine learning problems.

This monograph provides an accessible treatment for researchers of a topic that has recently gained significant interest in the machine learning community.

 
MAL-074