Author
Boumal, N
Absil, P
Cartis, C
Journal title
IMA Journal of Numerical Analysis
DOI
10.1093/imanum/drx080
Issue
1
Volume
39
Page
1-33
Abstract
We consider the minimization of a cost function $f$ on a manifold $M$ using Riemannian gradient descent and Riemannian trust regions (RTR). We focus on satisfying necessary optimality conditions within a tolerance $\varepsilon$. Specifically, we show that, under Lipschitz-type assumptions on the pullbacks of $f$ to the tangent spaces of $M$, both of these algorithms produce points with Riemannian gradient smaller than $\varepsilon$ in $O(1/\varepsilon^2)$ iterations. Furthermore, RTR returns a point where, in addition, the least eigenvalue of the Riemannian Hessian is larger than $-\varepsilon$, in $O(1/\varepsilon^3)$ iterations. There are no assumptions on initialization. The rates match their (sharp) unconstrained counterparts as a function of the accuracy $\varepsilon$ (up to constants) and hence are sharp in that sense. These are the first general results for global rates of convergence to approximate first- and second-order KKT points on manifolds. They apply in particular to optimization constrained to compact submanifolds of $\mathbb{R}^n$, under simpler assumptions.
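To make the first of the two algorithms concrete, here is a minimal sketch of Riemannian gradient descent on the unit sphere $S^{n-1}$ (a compact submanifold of $\mathbb{R}^n$), minimizing the Rayleigh quotient $f(x) = x^\top A x$. The cost matrix, the constant step size and the stopping tolerance are illustrative assumptions for this sketch, not taken from the paper, which analyses general retractions and Lipschitz-type step conditions.

```python
import numpy as np

# Minimal sketch (not the paper's general algorithm): Riemannian
# gradient descent on the unit sphere S^{n-1}, minimizing the
# Rayleigh quotient f(x) = x' A x. The matrix A, the constant
# step size and the tolerance are illustrative assumptions.

rng = np.random.default_rng(0)
n = 50
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                       # symmetric cost matrix

def f(x):
    return x @ A @ x

def grad_f(x):
    # Riemannian gradient: the Euclidean gradient 2*A*x projected
    # onto the tangent space T_x S^{n-1} = {v : x' v = 0}.
    g = 2 * A @ x
    return g - (x @ g) * x

def retract(x, v):
    # Metric-projection retraction: move in the tangent space,
    # then renormalize back onto the sphere.
    y = x + v
    return y / np.linalg.norm(y)

x = rng.standard_normal(n)
x /= np.linalg.norm(x)
step, eps = 0.02, 1e-6                  # assumed constants

for k in range(50_000):
    g = grad_f(x)
    if np.linalg.norm(g) < eps:         # approximate first-order point
        break
    x = retract(x, -step * g)

# At an approximate first-order point, f(x) is close to the
# smallest eigenvalue of A.
print(k, f(x), np.linalg.eigvalsh(A)[0])
```

The fixed step size stands in for the Lipschitz-based step rules the paper studies; the stopping test $\|\mathrm{grad}\, f(x)\| < \varepsilon$ is exactly the approximate first-order condition whose $O(1/\varepsilon^2)$ iteration bound the abstract states. RTR additionally uses Hessian information in a trust-region model to certify the approximate second-order condition.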
Publication type
Journal Article
Publication date
07 Feb 2018