Counterexamples to some long-standing optimization problems in the smooth convex coercive setting are provided. We show that block-coordinate, steepest descent with exact line search, or Bregman descent methods do not generally converge. Other failures of various desirable features are established: directional convergence of Cauchy's gradient curves, convergence of Newton's flow, finite length of the Tikhonov path, convergence of central paths, or the smooth Kurdyka-Łojasiewicz inequality. All examples are planar. These examples are based on general smooth convex interpolation results. Given a decreasing sequence of positively curved C^k convex compact sets in the plane, we provide a level-set interpolation by a C^k smooth convex function, where k ≥ 2 is arbitrary. If the intersection is reduced to a single point, our interpolant has a positive definite Hessian; otherwise, its Hessian is positive definite outside the solution set. Furthermore, given a decreasing sequence of polygons, we provide an interpolant agreeing with the vertices and whose gradients coincide with prescribed normals.
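For orientation, a minimal sketch (not taken from the paper) of one of the schemes whose general convergence is disproved: steepest descent with exact line search. On a planar convex quadratic f(x) = ½ xᵀAx the exact step length along the negative gradient has a closed form and the iterates do reach the minimizer; the paper's counterexamples show this behavior can fail for general smooth convex coercive functions.

```python
import numpy as np

# Illustrative benign case: steepest descent with exact line search on
# f(x) = 0.5 * x^T A x in the plane, with A positive definite.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])  # symmetric positive definite => f strictly convex

def grad(x):
    return A @ x

x = np.array([2.0, -1.5])  # arbitrary starting point
for _ in range(50):
    g = grad(x)
    if np.linalg.norm(g) < 1e-12:
        break
    # Exact line search: t minimizes f(x - t g), closed form for quadratics.
    t = (g @ g) / (g @ (A @ g))
    x = x - t * g

print(x)  # close to the unique minimizer (0, 0) in this quadratic case
```

Here convergence is guaranteed by strong convexity of the quadratic; the paper constructs smooth convex coercive planar functions on which this exact-search iteration does not converge to a minimizer.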
Jérôme Bolte and Edouard Pauwels, "Curiosities and counterexamples in smooth convex optimization", TSE Working Paper, no. 20-1080, March 2020.