A Randomised Subspace Gauss-Newton Method for Nonlinear Least-Squares

20 October 2020, 12:45
Abstract

We propose a randomised subspace Gauss-Newton method for nonlinear least-squares problems that builds a sketch of the Jacobian at each iteration. We provide global rates of convergence for regularisation and trust-region variants, both in expectation and as a tail bound, for diverse choices of the sketching matrix that are suitable for dense and sparse problems. We also present encouraging computational results on machine learning problems.
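To make the idea concrete, here is a minimal NumPy sketch of one randomised subspace Gauss-Newton loop, under several simplifying assumptions: a Gaussian sketching matrix, a fixed Levenberg-Marquardt-style regularisation parameter sigma instead of the adaptive regularisation or trust-region machinery analysed in the talk, and illustrative names (rsgn, r, jac) that are not taken from the authors' code.

```python
import numpy as np

def rsgn(r, jac, x0, subspace_dim=5, sigma=1e-2, max_iter=100, tol=1e-8, seed=0):
    """Illustrative randomised subspace Gauss-Newton sketch (not the authors' implementation).

    r   : callable returning the residual vector r(x) in R^n
    jac : callable returning the Jacobian J(x) in R^{n x d}
    Each step is restricted to a random subspace spanned by the columns
    of a Gaussian sketching matrix S in R^{d x p}, with p << d.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = x.size
    for _ in range(max_iter):
        res = r(x)
        J = jac(x)
        g = J.T @ res                       # gradient of 1/2 ||r(x)||^2
        if np.linalg.norm(g) < tol:
            break
        # Draw a fresh Gaussian sketch and form the sketched Jacobian J S (n x p).
        S = rng.standard_normal((d, subspace_dim)) / np.sqrt(subspace_dim)
        Js = J @ S
        # Regularised reduced subproblem: min_s 1/2 ||res + Js s||^2 + sigma/2 ||s||^2
        A = Js.T @ Js + sigma * np.eye(subspace_dim)
        s_hat = np.linalg.solve(A, -Js.T @ res)
        x = x + S @ s_hat                   # lift the reduced step back to R^d
    return x
```

In the variants discussed in the abstract, the fixed sigma above would be replaced by an adaptive regularisation parameter or a trust-region radius, and the Gaussian sketch could be swapped for other sketching matrices better suited to sparse problems.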


  • Junior Applied Mathematics Seminar