Seminar

How good is your Laplace approximation of the Bayesian posterior? Finite-sample error bounds for a variety of useful divergences

Mikolaj Kasprzak (University of Luxembourg)

20 March 2023, 09:30–10:45

Auditorium A3

Maths Job Market Seminar

Abstract

The Laplace approximation is a popular method for obtaining estimates of intractable expectations with respect to Bayesian posteriors. But can we trust these estimates for practical use? A theoretical justification for this method comes from the Bernstein–von Mises theorem, also known as the Bayesian Central Limit Theorem (BCLT), which gives conditions under which the posterior looks asymptotically Gaussian. One might therefore consider using rate-of-convergence bounds for the BCLT to construct non-asymptotic quality guarantees for the Laplace approximation. But the bounds in the existing versions of the BCLT either require knowing the true data-generating parameter, are asymptotic in the number of samples, do not control the posterior mean and variance, or apply only to narrow classes of models. Such bounds are therefore of limited use in real-life applications. Our work provides the first closed-form, finite-sample quality bounds for the Laplace approximation that simultaneously (1) do not require knowing the true parameter, (2) control posterior means and variances, (3) control a variety of distances that metrize weak convergence, and (4) apply generally to models that satisfy the conditions of the asymptotic BCLT. In fact, our bounds work even in the presence of misspecification. We compute exact constants in our bounds for a variety of standard models, including logistic regression, and numerically demonstrate their utility. We also provide a framework for the analysis of more complex models. Among the technical tools we use are Stein's method and the log-Sobolev inequality.
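For readers unfamiliar with the method discussed in the abstract, the following is a minimal illustrative sketch (not taken from the talk) of the Laplace approximation for a Bayesian logistic regression posterior: the posterior is replaced by a Gaussian centred at the MAP estimate, with covariance equal to the inverse Hessian of the negative log posterior at that point. The synthetic data, the Gaussian prior with variance 10, and all variable names are assumptions made purely for illustration.

```python
# Sketch: Laplace approximation for Bayesian logistic regression (illustrative only).
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # logistic sigmoid

rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))
theta_true = np.array([1.0, -2.0, 0.5])
y = rng.binomial(1, expit(X @ theta_true))          # synthetic binary responses
prior_var = 10.0                                     # isotropic Gaussian prior N(0, prior_var * I)

def neg_log_posterior(theta):
    # negative (log-likelihood + log-prior), up to additive constants
    logits = X @ theta
    loglik = np.sum(y * logits - np.logaddexp(0.0, logits))
    logprior = -0.5 * np.dot(theta, theta) / prior_var
    return -(loglik + logprior)

def hessian_neg_log_posterior(theta):
    # Hessian of the negative log posterior: X^T W X + (1/prior_var) I
    p = expit(X @ theta)
    W = p * (1.0 - p)
    return X.T @ (W[:, None] * X) + np.eye(d) / prior_var

# MAP estimate = mode of the posterior
theta_map = minimize(neg_log_posterior, np.zeros(d), method="BFGS").x

# Laplace approximation: posterior ~ N(theta_map, H(theta_map)^{-1})
cov_laplace = np.linalg.inv(hessian_neg_log_posterior(theta_map))
print("Laplace mean:", theta_map)
print("Laplace covariance:\n", cov_laplace)
```

The work presented in the talk addresses how far such a Gaussian approximation can be trusted, via finite-sample bounds of the kind described above.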
