A Derivative-Free Gauss-Newton Method

Authors: 

Cartis, C.
Roberts, L.

Publication Date: 

1 January 2019

Journal: 

Mathematical Programming Computation

Last Updated: 

2020-05-25T19:43:08.49+01:00

DOI: 

10.1007/s12532-019-00161-7

Abstract: 

We present DFO-GN, a derivative-free version of the Gauss-Newton method for solving nonlinear least-squares problems. As is common in derivative-free optimization, DFO-GN uses interpolation of function values to build a model of the objective, which is then used within a trust-region framework to give a globally convergent algorithm requiring $O(\epsilon^{-2})$ iterations to reach approximate first-order criticality within tolerance $\epsilon$. This algorithm is a simplification of the method from [H. Zhang, A. R. Conn, and K. Scheinberg, A Derivative-Free Algorithm for Least-Squares Minimization, SIAM J. Optim., 20 (2010), pp. 3555-3576], in which we replace the quadratic models for each residual with linear models. We demonstrate that DFO-GN performs comparably to the method of Zhang et al. in terms of objective evaluations, while also having a substantially faster runtime and improved scalability.
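
As a rough illustration of the idea described above, the Python sketch below builds a linear interpolation model of each residual from sampled function values and then takes a Gauss-Newton step restricted to a trust region. This is a minimal conceptual sketch, not the authors' DFO-GN implementation: the residual function resid, the fixed coordinate sampling pattern, and the simple step scaling are illustrative assumptions (DFO-GN maintains and updates an interpolation set and solves the trust-region subproblem properly).

import numpy as np

def resid(x):
    # Example residual vector r(x); the least-squares solution satisfies r(x) = 0.
    return np.array([x[0] ** 2 + x[1] - 1.0, x[0] - x[1]])

def linear_models(xk, Y, R, rk):
    # Fit linear models m_i(s) = r_i(xk) + g_i^T s by imposing the interpolation
    # conditions m_i(y_j - xk) = r_i(y_j) at the n sample points y_j (rows of Y).
    S = Y - xk                        # n x n matrix of displacements y_j - xk
    J = np.linalg.solve(S, R - rk).T  # row i is g_i, an approximate Jacobian row
    return J

def dfo_gn_step(xk, delta):
    # One simplified derivative-free Gauss-Newton step: sample n points around xk,
    # build linear residual models, minimize the Gauss-Newton model of the
    # objective, and scale the step back to the trust region ||s|| <= delta
    # (a crude safeguard, not an exact trust-region subproblem solve).
    n = xk.size
    rk = resid(xk)
    Y = xk + delta * np.eye(n)        # fresh coordinate samples for simplicity;
                                      # DFO-GN reuses and updates its interpolation set
    R = np.array([resid(y) for y in Y])
    J = linear_models(xk, Y, R, rk)
    s = np.linalg.lstsq(J, -rk, rcond=None)[0]   # Gauss-Newton step
    norm_s = np.linalg.norm(s)
    if norm_s > delta:
        s *= delta / norm_s
    return xk + s

x = np.array([0.9, 0.9])
for _ in range(30):
    x = dfo_gn_step(x, delta=0.2)
print(x, resid(x))   # approaches the root near (0.618, 0.618)

Because each residual model is linear, the model of the objective is the usual Gauss-Newton quadratic built from an approximate Jacobian, obtained here purely from sampled function values.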

Symplectic ID: 

890917

Submitted to ORA: 

Submitted

Publication Type: 

Journal Article