Date
Thu, 17 Jan 2019
Time
14:00 - 15:00
Location
L4
Speaker
Dr Anton Schiela
Organisation
Bayreuth

Just as optimization needs derivatives, shape optimization needs shape derivatives. Their definition and computation are a classical subject, at least for first order shape derivatives. Second derivatives have been studied as well, but some aspects of their theory remain a bit mysterious to practitioners. As a result, most algorithms for shape optimization are first order methods.

To understand this situation better and in a general way, we consider first and second order shape sensitivities of integrals on smooth submanifolds using a variant of shape differentiation. Instead of computing the second derivative as the derivative of the first derivative, we choose a one-parameter family of perturbations and compute first and second derivatives with respect to that parameter. The result is a quadratic form in terms of a perturbation vector field that yields a second order quadratic model of the perturbed functional, which can be used as the basis of a second order shape optimization algorithm. We discuss the structure of this derivative, derive domain expressions and Hadamard forms in a general geometric framework, and give a detailed geometric interpretation of the arising terms.
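The one-parameter construction described above can be sketched in standard shape-calculus notation (the symbols J, Omega, and V below are generic notation for a shape functional, a domain, and a perturbation vector field; they are illustrative, not taken from the talk):

```latex
% Perturbation family generated by a vector field V:
%   \Omega_t = (\mathrm{id} + tV)(\Omega)
% First and second derivatives along this family:
%   dJ(\Omega)[V]      = \frac{d}{dt} J(\Omega_t) \Big|_{t=0}
%   d^2 J(\Omega)[V,V] = \frac{d^2}{dt^2} J(\Omega_t) \Big|_{t=0}
% Second order quadratic model of the perturbed functional:
%   J(\Omega_t) \approx J(\Omega) + t\, dJ(\Omega)[V]
%                      + \tfrac{t^2}{2}\, d^2 J(\Omega)[V,V]
```

Minimizing this quadratic model over V is what makes a second order (Newton-type) shape optimization step possible.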

Finally, we use our results to construct a second order SQP algorithm for shape optimization that indeed exhibits fast local convergence.
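The payoff of second order information can be illustrated on a finite-dimensional toy problem (this example is mine, not from the talk): a Newton step, which minimizes a second order quadratic model at each iterate, converges quadratically, while a fixed-step gradient method converges only linearly.

```python
import math

# Toy objective f(x) = exp(x) - x with unique minimizer x* = 0,
# where f'(0) = 0 and f''(0) = 1 > 0 (nondegenerate minimum).
def fprime(x):
    return math.exp(x) - 1.0

def fsecond(x):
    return math.exp(x)

x_newton = 1.0   # Newton: minimize the local quadratic model each step
x_grad = 1.0     # gradient descent with a fixed step size
step = 0.5

for _ in range(6):
    x_newton -= fprime(x_newton) / fsecond(x_newton)  # quadratic convergence
    x_grad -= step * fprime(x_grad)                    # linear convergence

# After 6 iterations, the Newton iterate is accurate to machine-level
# precision, while gradient descent still carries an O(1e-3) error.
print(abs(x_newton), abs(x_grad))
```

The same contrast motivates the SQP method of the talk: each iteration solves a subproblem built from the second order quadratic model rather than taking a plain descent step.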
