Publications

Martingale Posterior Distributions

arXiv, 2021

We introduce the martingale posterior distribution, which returns Bayesian uncertainty directly on any statistic of interest without requiring a likelihood or prior; the distribution can be sampled through a computational scheme we name predictive resampling. To that end, we introduce new predictive methodologies for multivariate density estimation, regression and classification that build upon recent work on bivariate copulas.

E. Fong, C. Holmes, S. G. Walker (2021). "Martingale Posterior Distributions." arXiv. Link
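
As a rough illustration of predictive resampling, the sketch below imputes future observations from a simple Pólya-urn-style empirical predictive and recomputes the statistic of interest (here, the mean) on each completed sample. The predictive update, sample sizes and function names are illustrative assumptions, not the copula-based methodology of the paper.

```python
import numpy as np

def predictive_resample_mean(y, n_forward=1000, n_draws=200, rng=None):
    """Sketch of predictive resampling for the posterior of the mean.

    Each future point is drawn uniformly from all points seen so far
    (a Polya-urn-style predictive); the paper builds far richer
    copula-based predictives, so this is only illustrative.
    """
    rng = np.random.default_rng() if rng is None else rng
    draws = []
    for _ in range(n_draws):
        sample = list(y)
        for _ in range(n_forward):
            # Recursively impute the next observation from the current predictive.
            sample.append(sample[rng.integers(len(sample))])
        # Compute the statistic of interest on the completed sample.
        draws.append(np.mean(sample))
    # Approximate martingale posterior samples of the mean.
    return np.array(draws)

y_obs = np.random.default_rng(0).normal(size=50)
posterior_mean_draws = predictive_resample_mean(y_obs)
```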

On the marginal likelihood and cross-validation

Biometrika, 2020

We show that the marginal likelihood is formally equivalent to exhaustive leave-p-out cross-validation averaged over all values of p and all held-out test sets when using the log posterior predictive probability as the scoring rule. Moreover, the log posterior predictive score is the only coherent scoring rule under data exchangeability.

E. Fong, C. Holmes (2020). "On the marginal likelihood and cross-validation." Biometrika. Link
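
The equivalence can be checked numerically in a small conjugate example. The sketch below uses a Beta-Bernoulli model (chosen here purely because its marginal likelihood and posterior predictive have closed forms) and compares the log marginal likelihood with the exhaustive leave-p-out scores, averaged per held-out point and summed over p; the data and hyperparameters are placeholders.

```python
import numpy as np
from itertools import combinations
from math import comb, lgamma, log

# Conjugate Beta(a, b)-Bernoulli model: closed-form marginal likelihood and
# posterior predictive, so the identity can be checked exactly.
a, b = 1.0, 1.0
y = np.array([1, 0, 1, 1, 0, 1])
n = len(y)

def log_marginal(data):
    """log p(data): ratio of Beta functions."""
    s, m = int(np.sum(data)), len(data)
    return (lgamma(a + s) + lgamma(b + m - s) + lgamma(a + b)
            - lgamma(a) - lgamma(b) - lgamma(a + b + m))

def log_post_pred(y_j, train):
    """Log posterior predictive p(y_j | train) under the scoring rule."""
    s, m = int(np.sum(train)), len(train)
    p1 = (a + s) / (a + b + m)
    return log(p1) if y_j == 1 else log(1.0 - p1)

def s_cv(p):
    """Exhaustive leave-p-out CV score with log posterior predictive scoring."""
    total = 0.0
    for held_out in combinations(range(n), p):
        train = np.delete(y, held_out)
        total += np.mean([log_post_pred(y[j], train) for j in held_out])
    return total / comb(n, p)

lhs = log_marginal(y)
rhs = sum(s_cv(p) for p in range(1, n + 1))
print(lhs, rhs)  # the two quantities agree up to floating point
```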

Scalable Nonparametric Sampling from Multimodal Posteriors with the Posterior Bootstrap

ICML, 2019

We present a scalable Bayesian nonparametric learning routine that enables posterior sampling through the optimization of suitably randomized objective functions. A Dirichlet process prior on the unknown data distribution accounts for model misspecification, and admits an embarrassingly parallel posterior bootstrap algorithm that generates independent and exact samples from the nonparametric posterior distribution. Our method is particularly adept at sampling from multimodal posterior distributions via a random restart mechanism.

E. Fong, S. Lyddon, C. Holmes (2019). "Scalable Nonparametric Sampling from Multimodal Posteriors with the Posterior Bootstrap." ICML. Link
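
A minimal sketch of the idea, in the Bayesian-bootstrap limit (no pseudo-samples from the Dirichlet process centring measure): each posterior draw re-weights the log-likelihood with Dirichlet weights and optimizes the randomized objective from several random restarts. The two-component Gaussian mixture, its parameter values and the helper names are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Toy data from a two-component Gaussian mixture with unknown means
# (equal weights, unit variances) -- a hypothetical example model.
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(-2, 1, 100), rng.normal(2, 1, 100)])

def weighted_neg_loglik(theta, w):
    # Randomized objective: Dirichlet-weighted log-likelihood.
    dens = 0.5 * norm.pdf(y, theta[0], 1.0) + 0.5 * norm.pdf(y, theta[1], 1.0)
    return -np.sum(w * np.log(dens))

def posterior_bootstrap(n_samples=100, n_restarts=5):
    samples = []
    for _ in range(n_samples):  # each draw is independent, hence embarrassingly parallel
        w = rng.dirichlet(np.ones(len(y)))   # Bayesian-bootstrap weights (DP concentration -> 0)
        best = None
        for _ in range(n_restarts):          # random restarts guard against local optima
            theta0 = rng.normal(0, 3, size=2)
            res = minimize(weighted_neg_loglik, theta0, args=(w,), method="L-BFGS-B")
            if best is None or res.fun < best.fun:
                best = res
        samples.append(best.x)
    return np.array(samples)

theta_draws = posterior_bootstrap()  # draws from the nonparametric posterior over the means
```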