Riemannian optimization is a powerful and active area of research that studies the optimization of functions defined on manifolds with geometric structure. A class of particular interest is the set of geodesically convex (g-convex) functions: functions that are convex when restricted to every geodesic. In this talk, we will present an accelerated first-order method for optimizing smooth g-convex or strongly g-convex functions defined on the hyperbolic space or on a subset of the sphere, nearly matching the rates of accelerated gradient descent in Euclidean space. We will also discuss the accelerated optimization of another, non-convex problem in Euclidean space that we solve as a proxy. Additionally, for any Riemannian manifold of bounded sectional curvature, we will present reductions from optimization methods for smooth g-convex functions to methods for smooth and strongly g-convex functions, and vice versa.
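As a sketch of the central definition (following the standard convention; the symbols below are not taken from the abstract itself), geodesic convexity can be stated as:

```latex
% A function f : M \to \mathbb{R} on a Riemannian manifold M is
% geodesically convex (g-convex) if, for every geodesic
% \gamma : [0,1] \to M and every t \in [0,1],
f(\gamma(t)) \le (1-t)\, f(\gamma(0)) + t\, f(\gamma(1)).
% It is \mu-strongly g-convex if, in addition,
f(\gamma(t)) \le (1-t)\, f(\gamma(0)) + t\, f(\gamma(1))
  - \frac{\mu}{2}\, t(1-t)\, d(\gamma(0),\gamma(1))^2,
% where d(\cdot,\cdot) is the Riemannian distance on M.
```

In Euclidean space, geodesics are straight lines, so these conditions reduce to the usual definitions of convexity and strong convexity.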
This talk is based on the paper https://arxiv.org/abs/2012.03618.
A link for this talk will be sent to our mailing list a day or two in advance. If you are not on the list and wish to be sent a link, please contact email@example.com.
- Numerical Analysis Group Internal Seminar