November 9, 2023, 11:00–12:15
Room Auditorium 3 - JJ Laffont
Otto and Villani's HWI inequality links entropy, Fisher information, and Wasserstein distance on the space of probability measures. It was introduced as a consequence of the convexity of entropy on that space, and has since been studied from several points of view (geometric, via Ricci curvature; probabilistic, via Bakry-Émery calculus; analytic, via gradient descent algorithms and Łojasiewicz inequalities). In this talk, I will present a proof of the special case of a Gaussian reference measure, due to Yihong Wu, which uses couplings of diffusion processes. I will then explain how this idea adapts to certain simple graphs (the hypercube, the discrete torus), and leads to new functional inequalities, different from those obtained via the convexity point of view.
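For reference, a standard statement of the HWI inequality (with H the relative entropy, I the relative Fisher information, and W_2 the quadratic Wasserstein distance; the Gaussian case discussed in the talk corresponds to K = 1):

```latex
% HWI inequality (Otto--Villani): for a reference measure
% d\nu = e^{-V}\,dx on \mathbb{R}^n with \nabla^2 V \succeq K\,\mathrm{Id},
% and any probability measure \mu with finite relative entropy,
H(\mu \mid \nu) \;\le\; W_2(\mu,\nu)\,\sqrt{I(\mu \mid \nu)} \;-\; \frac{K}{2}\,W_2(\mu,\nu)^2 .
% For \nu the standard Gaussian, V(x) = |x|^2/2 and K = 1.
```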