Article

Inexact subgradient methods for semialgebraic functions

Jérôme Bolte, Tam Le, Eric Moulines and Edouard Pauwels

Abstract

Motivated by the extensive application of approximate gradients in machine learning and optimization, we investigate inexact subgradient methods subject to persistent additive errors. Within a nonconvex semialgebraic framework, assuming boundedness or coercivity, we establish that the method yields iterates that eventually fluctuate near the critical set at a distance proportional to ε^γ, where ε denotes the magnitude of the subgradient evaluation errors and the exponent γ encapsulates geometric characteristics of the underlying problem. Our analysis comprehensively addresses both vanishing and constant step-size regimes. Notably, the latter regime inherently enlarges the fluctuation region, yet this enlargement remains on the order of ε^γ. In the convex scenario, employing a universal error bound applicable to coercive semialgebraic functions, we derive novel complexity results concerning averaged iterates. Additionally, our study produces auxiliary results of independent interest, including descent-type lemmas for nonsmooth nonconvex functions and an invariance principle governing the behavior of algorithmic sequences under small-step limits.
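The scheme studied in the abstract can be illustrated with a minimal sketch. The function f(x) = |x|, the uniform error model, and all parameter values below are illustrative assumptions, not the authors' experimental setup; the sketch only shows an inexact subgradient iteration x_{k+1} = x_k − α (g_k + e_k) with a persistent additive error bounded by ε, whose iterates end up fluctuating near the critical set.

```python
import random

def subgradient_abs(x):
    # A Clarke subgradient of f(x) = |x|: sign(x), taking 0 at x = 0.
    return (x > 0) - (x < 0)

def inexact_subgradient(x0, step, eps, n_iters, seed=0):
    """Iterate x <- x - step * (g + e), where g is a subgradient of |x|
    and e is a persistent additive error with |e| <= eps."""
    rng = random.Random(seed)
    x = x0
    for _ in range(n_iters):
        g = subgradient_abs(x)
        e = rng.uniform(-eps, eps)  # persistent additive error, never vanishing
        x = x - step * (g + e)
    return x

# For f(x) = |x| the critical set is {0}; with a constant step size the
# iterates do not converge but fluctuate in a neighborhood of 0 whose
# size scales with the step and the error level eps.
x_final = inexact_subgradient(x0=5.0, step=0.01, eps=0.1, n_iters=2000)
```

With a vanishing step-size schedule instead of a constant one, the fluctuation region shrinks toward the one dictated by the error level alone, in line with the two regimes discussed above.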

Keywords

Inexact subgradient; Clarke subdifferential; Nonsmooth nonconvex optimization; Path differentiable functions; First-order methods; Semialgebraic functions

Reference

Jérôme Bolte, Tam Le, Eric Moulines and Edouard Pauwels, "Inexact subgradient methods for semialgebraic functions", Mathematical Programming, June 2025.

Published in

Mathematical Programming, June 2025