Authors
Cartis, C
Roberts, L
Journal title
Mathematical Programming Computation
DOI
10.1007/s12532-019-00161-7
Last updated
2020-05-25T19:43:08.49+01:00
Abstract
We present DFO-GN, a derivative-free version of the Gauss-Newton method for solving nonlinear least-squares problems. As is common in derivative-free optimization, DFO-GN uses interpolation of function values to build a model of the objective, which is then used within a trust-region framework to give a globally convergent algorithm requiring $O(\epsilon^{-2})$ iterations to reach approximate first-order criticality within tolerance $\epsilon$. This algorithm is a simplification of the method from [H. Zhang, A. R. Conn, and K. Scheinberg, A Derivative-Free Algorithm for Least-Squares Minimization, SIAM J. Optim., 20 (2010), pp. 3555-3576], in which we replace the quadratic models for each residual with linear models. We demonstrate that DFO-GN performs comparably to the method of Zhang et al. in terms of objective evaluations, while achieving a substantially faster runtime and improved scalability.
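To illustrate the idea described in the abstract — building linear models of each residual by interpolating function values only, then taking a Gauss-Newton step inside a trust region — the following is a minimal sketch, not the DFO-GN implementation itself. All names (`build_linear_models`, `dfo_gn_sketch`), the coordinate interpolation set, and the simple radius-update rule are illustrative assumptions; the actual algorithm uses more sophisticated geometry management and convergence safeguards.

```python
import numpy as np

def build_linear_models(residual, base, directions):
    # Fit linear models r(x) ~ r0 + J @ (x - base) from function values only,
    # using the interpolation conditions r(base + d_i) = r0 + J @ d_i.
    r0 = residual(base)
    D = np.array(directions)                                      # (n, n) displacement rows
    R = np.array([residual(base + d) - r0 for d in directions])   # (n, m) value differences
    J = np.linalg.solve(D, R).T                                   # (m, n) model Jacobian
    return r0, J

def trust_region_gn_step(r0, J, delta):
    # Gauss-Newton step minimizing ||r0 + J s||, truncated to radius delta
    # (a crude surrogate for an exact trust-region subproblem solve).
    s, *_ = np.linalg.lstsq(J, -r0, rcond=None)
    norm = np.linalg.norm(s)
    if norm > delta:
        s *= delta / norm
    return s

def dfo_gn_sketch(residual, x0, delta=1.0, max_iter=50, tol=1e-8):
    # Tiny derivative-free Gauss-Newton loop with a basic trust-region update.
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(max_iter):
        directions = delta * np.eye(n)        # simple coordinate interpolation set
        r0, J = build_linear_models(residual, x, directions)
        if np.linalg.norm(J.T @ r0) < tol:    # approximate first-order criticality
            break
        s = trust_region_gn_step(r0, J, delta)
        f_old = 0.5 * r0 @ r0
        r_new = residual(x + s)
        f_new = 0.5 * r_new @ r_new
        pred = f_old - 0.5 * np.linalg.norm(r0 + J @ s) ** 2
        rho = (f_old - f_new) / pred if pred > 0 else -1.0
        if rho > 0.1:                         # sufficient decrease: accept step
            x = x + s
            if rho > 0.7:
                delta *= 2.0
        else:                                 # poor model: shrink the region
            delta *= 0.5
    return x
```

For a linear residual such as `r(x) = A @ x - b`, the interpolated model is exact and the loop converges to the least-squares solution in a few iterations; for genuinely nonlinear residuals the interpolation set would need the geometry safeguards of the full method.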
Symplectic ID
890917
Publication type
Journal Article
Publication date
1 January 2019
Created on 25 Jul 2018 - 17:30.