Article

Nonconvex Lagrangian-based optimization: Monitoring schemes and global convergence

Jérôme Bolte, Shoham Sabach and Marc Teboulle

Abstract

We introduce a novel approach to the global analysis of a difficult class of nonconvex nonsmooth optimization problems within the important framework of Lagrangian-based methods. This genuinely nonlinear class captures many problems in modern, disparate fields of application. It features complex geometries, and qualification conditions and other regularity properties may fail to hold everywhere. To address these issues, we work along several research lines to develop an original, general Lagrangian methodology that can deal with all of the above obstacles at once. A first innovative feature of our approach is the concept of Lagrangian sequences for a broad class of algorithms. Central to this methodology is the idea of turning an arbitrary descent method into a multiplier method. Second, we equip these methods with a transitional regime that allows us to identify, in finitely many steps, a zone where the step sizes of the algorithm can be tuned for the final converging regime. Then, despite the min-max nature of Lagrangian methods, we use an original Lyapunov method to prove that every bounded sequence generated by the resulting monitoring schemes converges globally to a critical point, for some fundamental Lagrangian-based methods in the broad semialgebraic setting. To the best of our knowledge, these are the first results of this kind.
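For orientation, the following is a minimal LaTeX sketch of a generic augmented-Lagrangian (multiplier) template for an equality-constrained problem, illustrating how a primal descent step is combined with a dual multiplier update. The notation (f, A, b, the penalty β, and the abstract descent operator T_β) is ours for illustration and is not taken from the paper, which studies more general monitored schemes.

% Generic multiplier template for  min_x f(x)  s.t.  Ax = b.
% T_beta denotes one step of an arbitrary descent method applied to
% L_beta(., y^k); this operator is illustrative notation, not the
% paper's specific monitoring scheme.
\[
  L_\beta(x, y) \;=\; f(x) \;+\; \langle y,\, Ax - b \rangle
  \;+\; \frac{\beta}{2}\,\| Ax - b \|^2,
\]
\[
  x^{k+1} \;=\; T_\beta\!\left(x^k, y^k\right), \qquad
  y^{k+1} \;=\; y^k + \beta \left( Ax^{k+1} - b \right).
\]

In this template, the primal update is any descent step on the augmented Lagrangian in x, and the multiplier update is a gradient ascent step in y; the paper's monitoring schemes additionally tune the step sizes after a transitional regime.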

Reference

Jérôme Bolte, Shoham Sabach and Marc Teboulle, "Nonconvex Lagrangian-based optimization: Monitoring schemes and global convergence", Mathematics of Operations Research, Vol. 43, No. 4, November 2018, pp. 1051–1404.
