Date: Tue, 22 Jan 2019
Time: 14:00 - 14:30
Location: L5
Speaker: Trond Steihaug
Organisation: Bergen

In this talk, we consider solving nonlinear systems of equations and the unconstrained minimization problem using Newton’s method and methods from the Halley class. The methods in this class generally have a local, third-order rate of convergence, while Newton’s method converges quadratically. In the unconstrained optimization case, the Halley methods require the second and third derivatives. Third-order methods will, in most cases, use fewer iterations than a second-order method to reach the same accuracy; however, the number of arithmetic operations per iteration is higher for third-order methods than for a second-order method. We will demonstrate that for a large class of problems, the ratio of the number of arithmetic operations per iteration of Halley’s method to that of Newton’s method is constant, independent of the number of unknowns.
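For orientation, one standard parametrisation of the Halley class for solving F(x) = 0 (this explicit formula is an illustration added here, not quoted from the talk) is

\[
  x_{k+1} = x_k - \Bigl[\, I + \tfrac{1}{2}\, L(x_k)\bigl(I - \alpha L(x_k)\bigr)^{-1} \Bigr] F'(x_k)^{-1} F(x_k),
  \qquad
  L(x) = F'(x)^{-1} F''(x)\, F'(x)^{-1} F(x),
\]

where \(\alpha = 0\), \(\tfrac{1}{2}\) and \(1\) give the Chebyshev, Halley and super-Halley methods, and dropping the bracketed correction term recovers Newton's method. In the minimization setting one takes \(F = \nabla f\), so \(F''\) is the third derivative of \(f\); this extra derivative is the source of the additional work per iteration discussed above.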

We say that the sparsity pattern of the third derivative (or tensor) is induced by the sparsity pattern of the Hessian matrix. We will discuss some data structures for sparse matrices from which the indices of the nonzero elements of the tensor can be computed. Historical notes will be woven into the talk.
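As a rough sketch of what such a computation could look like, the snippet below enumerates the tensor indices admitted by a Hessian sparsity pattern. Both the code and the rule it uses, namely that a tensor entry (i, j, k) is admitted only when all three index pairs (i, j), (i, k), (j, k) are structurally nonzero in the Hessian, are assumptions made for illustration and are not taken from the talk.

    def induced_tensor_pattern(hessian_pattern, n):
        """Return the indices (i, j, k), i <= j <= k, of tensor entries admitted
        by the Hessian pattern, under the assumed rule that every pair of the
        three indices must be a structurally nonzero Hessian entry."""
        pairs = {(min(i, j), max(i, j)) for (i, j) in hessian_pattern}
        pattern = []
        for i in range(n):
            for j in range(i, n):
                for k in range(j, n):
                    if (i, j) in pairs and (i, k) in pairs and (j, k) in pairs:
                        pattern.append((i, j, k))
        return pattern

    # Example: a tridiagonal Hessian pattern on 5 variables.
    n = 5
    hess = [(i, i) for i in range(n)] + [(i, i + 1) for i in range(n - 1)]
    print(induced_tensor_pattern(hess, n))

The triple loop above is only for clarity; a practical data structure would instead walk the adjacency (row) lists of the sparse Hessian so that the work is proportional to the number of admitted tensor entries rather than to n^3.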
