A Generative Deep Recurrent Model for Exchangeable Data.
We present a novel model architecture which leverages deep learning tools to perform exact Bayesian inference on sets of high dimensional, complex observations. Our model is provably exchangeable, meaning that the joint distribution over observations is invariant under permutation: this property lies at the heart of Bayesian inference. The model does not require variational approximations to train, and new samples can be generated conditional on previous samples, with cost linear in the size of the conditioning set. The advantages of our architecture are demonstrated on learning tasks requiring generalisation from short observed sequences while modelling sequence variability, such as conditional image generation, few-shot learning, set completion, and anomaly detection.
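The paper's architecture is not reproduced here, but the exchangeability property the abstract describes — the predictive distribution conditioned on previous observations is unchanged under any permutation of those observations, and can be computed in time linear in the size of the conditioning set — can be illustrated with a minimal toy sketch. All names and constants below are illustrative stand-ins, not the paper's model:

```python
import numpy as np

def predictive_params(context):
    """Toy predictive distribution over the next observation.

    The conditioning set is summarized by a permutation-invariant
    statistic (an average of per-item embeddings), so the predictive
    parameters are identical under any reordering of the context.
    A single pass over the context gives cost linear in its size.
    """
    embed = np.tanh  # illustrative per-item "encoder"
    n = len(context)
    summary = sum(embed(x) for x in context) / max(n, 1)
    mu = 0.5 * summary          # illustrative predictive mean
    sigma = 1.0 / (1.0 + n)     # uncertainty shrinks as context grows
    return mu, sigma

rng = np.random.default_rng(0)
context = list(rng.normal(size=5))
mu1, s1 = predictive_params(context)
mu2, s2 = predictive_params(context[::-1])  # permuted conditioning set
assert np.isclose(mu1, mu2) and s1 == s2    # predictions are unchanged
```

Any symmetric summary of the conditioning set (sum, mean, or a learned pooling) yields this invariance; the paper's contribution is making such an exchangeable model expressive and exactly trainable with deep recurrent components.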