September 25, 2025, 11:00–12:15
Toulouse
Room Auditorium 3
MAD-Stat. Seminar
Abstract
How can one go beyond the squared distance d² in optimization algorithms and flows in metric spaces? Replacing it with a general cost function c(x,y) and using a majorize-minimize framework, I will detail a generic class of algorithms encompassing Newton's method, mirror descent, natural and Riemannian gradient descent, Sinkhorn, and EM, by reframing each of them as an alternating minimization for a different cost c(x,y). Rooted in cross-differences, the convergence theory, both to the infimum and to the continuous flow, is based on a (discrete) evolution variational inequality (EVI) which enjoys properties similar to those of the EVI with d² regularizer. This provides a theoretical framework for studying splitting schemes beyond the usual implicit Euler scheme in gradient flows. This talk is based on the works https://arxiv.org/abs/2305.04917 with Flavien Léger (INRIA Paris) and https://arxiv.org/abs/2505.00559 with Giacomo Sodini and Ulisse Stefanelli (University of Vienna).
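For orientation, the alternating-minimization viewpoint mentioned in the abstract can be sketched as follows. This is a minimal sketch only: the joint objective φ is a generic stand-in (built from the cost c and the function being minimized), and the exact surrogate used in the talk may differ; the cross-difference notation δ_c is taken in its standard four-point form.

```latex
% Generic alternating minimization of a joint objective \varphi(x, y):
\[
  x_{n+1} \in \operatorname*{arg\,min}_{x} \varphi(x, y_n),
  \qquad
  y_{n+1} \in \operatorname*{arg\,min}_{y} \varphi(x_{n+1}, y).
\]
% Cross-difference of the cost c (four-point quantity underlying the
% convergence theory):
\[
  \delta_c(x, y; x', y') := c(x, y) - c(x, y') - c(x', y) + c(x', y').
\]
```

Different choices of the cost c(x,y) in this scheme recover the algorithms listed in the abstract (Newton, mirror, natural/Riemannian gradient descent, Sinkhorn, EM).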