Date: Tue, 21 Feb 2023
Time: 14:30 - 15:00
Location: Lecture Room 3
Speaker: Karl Welzel

At the heart of all quasi-Newton methods is an update rule that gradually improves the Hessian approximation using the already available gradient evaluations. Theoretical results show that the global performance of optimization algorithms can be improved with higher-order derivatives. This motivates an investigation of generalizations of quasi-Newton update rules to obtain, for example, third derivatives (which are tensors) from Hessian evaluations. Our generalization is based on the observation that quasi-Newton updates are least-change updates satisfying the secant equation, with different methods using different norms to measure the size of the change. We present a full characterization of least-change updates in weighted Frobenius norms (satisfying an analogue of the secant equation) for derivatives of arbitrary order. Moreover, we establish convergence of the approximations to the true derivative under standard assumptions and explore the quality of the generated approximations in numerical experiments.
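
To make the least-change viewpoint concrete, here is a small illustration (added to this page, not taken from the talk). The classical secant equation requires the updated approximation B_new to satisfy B_new s = y, where s is the step between iterates and y the corresponding difference of gradients. The Powell-symmetric-Broyden (PSB) update is the least-change symmetric update in the unweighted Frobenius norm; the weighted Frobenius norms mentioned in the abstract recover other classical updates such as DFP. A minimal Python/NumPy sketch of the PSB case:

    import numpy as np

    def psb_update(B, s, y):
        # Powell-symmetric-Broyden update: the symmetric matrix closest to B
        # in the Frobenius norm that satisfies the secant equation B_new @ s = y.
        r = y - B @ s              # residual of the secant equation at B
        ss = s @ s
        return (B
                + (np.outer(r, s) + np.outer(s, r)) / ss
                - ((s @ r) / ss**2) * np.outer(s, s))

    # Quick check on random data: the updated matrix satisfies the secant equation.
    rng = np.random.default_rng(0)
    n = 5
    B = rng.standard_normal((n, n))
    B = (B + B.T) / 2              # symmetric starting approximation
    s = rng.standard_normal(n)
    y = rng.standard_normal(n)
    assert np.allclose(psb_update(B, s, y) @ s, y)

The higher-order generalization described in the abstract replaces the matrix approximation by a symmetric tensor and the secant equation by its analogue one order up, e.g. with a difference of two Hessian evaluations in place of the gradient difference.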
