The nice properties of MMD for statistical estimation

Pierre Alquier (RIKEN AIP)

November 10, 2022, 11:00–12:15


Room A3

MAD-Stat. Seminar


Maximum likelihood estimation (MLE) enjoys strong optimality properties for statistical estimation, but only under strong assumptions. When these assumptions are not satisfied, MLE is no longer optimal, and can sometimes fail catastrophically. In this talk, after a pedagogical introduction to the subject, we will explore alternative estimators based on the minimization of well-chosen distances. In particular, we will see that the Maximum Mean Discrepancy (MMD) leads to estimation procedures that are consistent without any assumption on the model or on the true distribution of the data. In practice, this yields very strong robustness properties.

In the second part of the talk, we will focus on Bayesian-type estimation. ABC (Approximate Bayesian Computation) is a popular algorithm for computing approximations of the posterior distribution. However, it relies on the choice of a so-called "summary statistic", which is not always clear in practice. Here again, we will show that a summary statistic built on the MMD leads to an approximation of the posterior that is actually far more robust than the exact posterior.
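To make the abstract's central claim concrete, here is a minimal sketch (not taken from the talk; all names, the Gaussian kernel, the bandwidth, and the grid-search setup are illustrative choices) of a minimum-MMD location estimator compared with the MLE on contaminated Gaussian data. The MLE of the mean is the sample average, which outliers pull away from the true value, while minimizing an empirical MMD between the data and samples from the model remains close to it.

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth=1.0):
    # k(x, y) = exp(-(x - y)^2 / (2 * bandwidth^2)) for one-dimensional data
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * bandwidth ** 2))

def mmd2(x, y, bandwidth=1.0):
    # Biased (V-statistic) estimate of squared MMD between samples x and y:
    # mean k(x, x') + mean k(y, y') - 2 mean k(x, y)
    kxx = gaussian_kernel(x, x, bandwidth).mean()
    kyy = gaussian_kernel(y, y, bandwidth).mean()
    kxy = gaussian_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2.0 * kxy

rng = np.random.default_rng(0)
# Data: N(0, 1) with 10% gross outliers at 10 (the contaminated setting
# in which MLE is "totally catastrophic")
data = np.concatenate([rng.normal(0.0, 1.0, 180), np.full(20, 10.0)])

# Minimum-MMD estimator of the location theta in the model N(theta, 1),
# found by grid search: compare the data to fresh samples from each model
grid = np.linspace(-2.0, 5.0, 141)
losses = [mmd2(data, rng.normal(theta, 1.0, 500)) for theta in grid]
theta_mmd = grid[int(np.argmin(losses))]

theta_mle = data.mean()  # MLE of the mean; dragged toward the outliers

print(f"MLE:  {theta_mle:.2f}")   # close to 1.0 (0.9*0 + 0.1*10)
print(f"MMD:  {theta_mmd:.2f}")   # close to 0, the uncontaminated mean
```

The grid search is only for readability; in the literature, minimum-MMD estimators are typically computed by stochastic gradient descent on an unbiased MMD estimate.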