Date: Tue, 13 Feb 2018
Time: 14:00 - 14:30
Location: L5
Speaker: Man-Chung Yue
Organisation: Imperial College

In this talk, we revisit the cubic regularization (CR) method for solving smooth non-convex optimization problems and study its local convergence behaviour. In their seminal paper, Nesterov and Polyak showed that the sequence of iterates of the CR method converges quadratically to a local minimum under a non-degeneracy assumption, which implies that the local minimum is isolated. However, many optimization problems arising from applications such as phase retrieval and low-rank matrix recovery have non-isolated local minima. In the absence of the non-degeneracy assumption, the result was downgraded to superlinear convergence of function values. In particular, they showed that the sequence of function values enjoys superlinear convergence of order 4/3 (resp. 3/2) if the function is gradient dominated (resp. star-convex and globally non-degenerate). To remedy the situation, we propose a unified local error bound (EB) condition and show that the sequence of iterates of the CR method converges quadratically to a local minimum under the EB condition. Furthermore, we prove that the EB condition holds if the function is gradient dominated or if it is star-convex and globally non-degenerate, thus improving the results of Nesterov and Polyak in three aspects: a weaker assumption, a faster rate, and convergence of iterates instead of function values. Finally, we apply our results to two concrete non-convex optimization problems that arise from phase retrieval and low-rank matrix recovery. For both problems, we prove that with overwhelming probability, the local EB condition is satisfied and the CR method converges quadratically to a global optimizer. We also present some numerical results on these two problems.
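The CR method itself is not spelled out in the abstract. As a rough illustration only, here is a minimal one-dimensional sketch: each iteration minimizes a cubic-regularized second-order model g*s + (h/2)*s^2 + (M/6)*|s|^3, where M bounds the Lipschitz constant of the second derivative. The test function, starting point, and value of M below are invented for illustration and are not from the talk.

```python
import math

def cr_step_1d(g, h, M):
    """Closed-form minimizer of the 1-D cubic model
    m(s) = g*s + 0.5*h*s**2 + (M/6)*|s|**3.
    Stationarity g + h*s + (M/2)*s*|s| = 0 is solved sign-wise."""
    if g == 0.0:
        return 0.0 if h >= 0 else -2.0 * h / M
    step = (math.sqrt(h * h + 2.0 * M * abs(g)) - h) / M  # always >= 0
    return -math.copysign(step, g)  # move against the gradient

def cubic_regularization(f1, f2, x0, M, iters=30):
    """Cubic-regularized Newton iteration in one dimension.
    f1, f2: first and second derivatives of the objective;
    M: (assumed) Lipschitz bound on the second derivative."""
    x = x0
    for _ in range(iters):
        x += cr_step_1d(f1(x), f2(x), M)
    return x

# Illustrative non-convex example: f(x) = x^4 - 2x^2, minima at x = +/-1.
f1 = lambda x: 4 * x**3 - 4 * x
f2 = lambda x: 12 * x**2 - 4
x_star = cubic_regularization(f1, f2, x0=1.5, M=48.0)
```

Unlike a plain Newton step, the cubic term keeps the step well defined and damped even where the second derivative is negative, which is the feature that makes CR attractive for non-convex problems; in higher dimensions the subproblem is solved over a vector step rather than in closed form.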
