Variational Sequential Monte Carlo.
Many recent advances in large scale probabilistic inference rely on variational methods. The success of variational approaches depends on (i) formulating a flexible parametric family of distributions, and (ii) optimizing the parameters to find the member of this family that most closely approximates the exact posterior. In this paper we present a new approximating family of distributions, the variational sequential Monte Carlo (VSMC) family, and show how to optimize it in variational inference. VSMC melds variational inference (VI) and sequential Monte Carlo (SMC), providing practitioners with flexible, accurate, and powerful Bayesian inference. The VSMC family is a variational family that can approximate the posterior arbitrarily well, while still allowing for efficient optimization of its parameters. We demonstrate its utility on state space models, stochastic volatility models for financial data, and deep Markov models of brain neural circuits.
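To make the idea concrete, the sketch below shows one way the VSMC objective can be estimated: run SMC with a parametric proposal whose parameters play the role of variational parameters, and return the log of the SMC estimate of the marginal likelihood, whose expectation lower-bounds log p(y_{1:T}). This is a minimal illustration only; the specific model (a 1-D linear-Gaussian state space model), the proposal form, and all function and variable names are assumptions made for this example, not the paper's implementation.

```python
import numpy as np

def vsmc_log_estimate(y, lam, num_particles=100, rng=None):
    """One Monte Carlo sample of the VSMC objective: the log of the SMC
    estimate of p(y_{1:T}), whose expectation lower-bounds log p(y_{1:T}).

    Assumed model (for illustration): 1-D linear-Gaussian state space model
        x_t ~ N(0.9 * x_{t-1}, 1.0),   y_t ~ N(x_t, 0.5**2).
    Assumed proposal with variational parameters lam = (a, b, log_s):
        q(x_t | x_{t-1}, y_t) = N(a * x_{t-1} + b * y_t, exp(log_s)**2).
    """
    rng = np.random.default_rng() if rng is None else rng
    a, b, log_s = lam
    s = np.exp(log_s)
    x = np.zeros(num_particles)  # simplification: particles start at the prior mean
    log_Z = 0.0
    for t in range(len(y)):
        # Propose particles from the parametric proposal q (reparameterized draw).
        mean_q = a * x + b * y[t]
        x_new = mean_q + s * rng.standard_normal(num_particles)
        # Unnormalized weights: w = f(x_t | x_{t-1}) g(y_t | x_t) / q(x_t | x_{t-1}, y_t).
        log_f = -0.5 * (x_new - 0.9 * x) ** 2 - 0.5 * np.log(2 * np.pi)
        log_g = -0.5 * ((y[t] - x_new) / 0.5) ** 2 - np.log(0.5 * np.sqrt(2 * np.pi))
        log_q = -0.5 * ((x_new - mean_q) / s) ** 2 - np.log(s * np.sqrt(2 * np.pi))
        log_w = log_f + log_g - log_q
        # Accumulate log of the average weight: log (1/N) sum_i w_t^i.
        m = log_w.max()
        log_Z += m + np.log(np.mean(np.exp(log_w - m)))
        # Multinomial resampling before the next step.
        probs = np.exp(log_w - m)
        probs /= probs.sum()
        x = x_new[rng.choice(num_particles, size=num_particles, p=probs)]
    return log_Z

# Illustrative use: average several runs to approximate the VSMC lower bound
# at a given setting of the variational parameters lam.
rng = np.random.default_rng(0)
y = rng.standard_normal(50)          # placeholder observations
lam = (0.5, 0.5, np.log(1.0))        # illustrative initial parameters
bound = np.mean([vsmc_log_estimate(y, lam, rng=rng) for _ in range(10)])
```

In practice the parameters would be optimized with stochastic gradients of this estimate, e.g. reparameterizing the proposal draws as above, with the non-differentiable resampling step handled separately; the exact gradient estimator is a design choice and this sketch only shows the objective itself.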