A Randomised Subspace Gauss-Newton Method for Nonlinear Least-Squares
Abstract
We propose a randomised subspace Gauss-Newton method for nonlinear least-squares problems that builds a sketch of the Jacobian at each iteration. We provide global rates of convergence for regularisation and trust-region variants, both in expectation and as a tail bound, for diverse choices of the sketching matrix suitable for dense and sparse problems. We also report encouraging computational results on machine learning problems.
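To make the iteration described above concrete, the following is a minimal illustrative sketch of one regularised subspace Gauss-Newton step in Python. It assumes a Gaussian sketching matrix for concreteness (the abstract also covers choices suited to sparse problems, not reproduced here), and the names residual, jacobian, sketch_dim and sigma are illustrative assumptions rather than the paper's own notation or pseudocode.

import numpy as np

def rsgn_step(residual, jacobian, x, sketch_dim, sigma, rng):
    """One regularised randomised-subspace Gauss-Newton step (illustrative only).

    residual   : callable returning r(x) in R^m
    jacobian   : callable returning J(x) in R^{m x d}
    sketch_dim : subspace dimension l <= d (assumed hyperparameter)
    sigma      : regularisation weight (assumed hyperparameter)
    """
    d = x.size
    r = residual(x)
    J = jacobian(x)
    # Draw a fresh Gaussian sketching matrix S in R^{l x d} each iteration;
    # scaling by 1/sqrt(l) keeps the sketch roughly norm-preserving.
    S = rng.standard_normal((sketch_dim, d)) / np.sqrt(sketch_dim)
    Js = J @ S.T  # sketched Jacobian, m x l
    # Regularised subspace Gauss-Newton subproblem over s_hat in R^l:
    #   min_{s_hat} 0.5*||r + Js @ s_hat||^2 + 0.5*sigma*||s_hat||^2
    A = Js.T @ Js + sigma * np.eye(sketch_dim)
    s_hat = np.linalg.solve(A, -Js.T @ r)
    # Lift the low-dimensional step back to the full space R^d.
    return x + S.T @ s_hat

The key cost saving is that the linear-algebra work is done with the m x l sketched Jacobian rather than the full m x d Jacobian; a trust-region variant would replace the sigma-regularised subproblem with a norm-constrained one.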