Date: Tue, 20 Oct 2020
Time: 12:45 - 13:30
Speaker: Zhen Shao (Oxford University)

We propose a subspace Gauss-Newton method for nonlinear least squares problems that builds a sketch of the Jacobian at each iteration. We provide global rates of convergence for regularization and trust-region variants, both in expectation and as a tail bound, for diverse choices of sketching matrix suitable for dense and sparse problems. We also present encouraging computational results on machine learning problems.
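The abstract does not spell out the algorithm, but the idea it describes can be sketched as follows: at each iteration, draw a random sketching matrix, restrict the Gauss-Newton step to the resulting random subspace of the variables, and safeguard the step with a regularization term. The code below is a minimal illustration under these assumptions, not the authors' implementation; the toy residual function, the Gaussian sketch, the fixed damping parameter `sigma`, and the simple accept-if-improved rule are all choices made here for the sake of a runnable example.

```python
import numpy as np

def residuals(x):
    # Hypothetical toy nonlinear least-squares residuals (Rosenbrock-like).
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0], x[2] - 1.0])

def jacobian(x):
    # Analytic Jacobian of the residuals above.
    return np.array([
        [-20.0 * x[0], 10.0, 0.0],
        [-1.0,          0.0, 0.0],
        [0.0,           0.0, 1.0],
    ])

def sketched_gauss_newton(x0, sketch_dim=2, sigma=1.0, iters=50, seed=0):
    """Subspace Gauss-Newton sketch: parameters here are illustrative, not
    the tuned values from the talk."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        r = residuals(x)
        J = jacobian(x)
        f = 0.5 * r @ r
        # Gaussian sketch in variable space: the step is restricted to the
        # random subspace spanned by the rows of S.
        S = rng.standard_normal((sketch_dim, n)) / np.sqrt(sketch_dim)
        Js = J @ S.T  # sketched Jacobian, shape (m, sketch_dim)
        # Regularized subspace Gauss-Newton subproblem:
        #   min_z 0.5 * ||Js z + r||^2 + 0.5 * sigma * ||z||^2
        z = np.linalg.solve(Js.T @ Js + sigma * np.eye(sketch_dim), -Js.T @ r)
        x_trial = x + S.T @ z
        r_trial = residuals(x_trial)
        # Crude safeguard standing in for a trust-region/regularization test:
        # accept the step only if the objective decreases.
        if 0.5 * r_trial @ r_trial < f:
            x = x_trial
    return x
```

Because the solve is against a `sketch_dim x sketch_dim` system rather than `n x n`, each iteration is cheap when `sketch_dim << n`; the convergence analysis mentioned in the abstract is what justifies that such random low-dimensional steps still make progress in expectation.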
