Seminar series
Date: Mon, 14 Nov 2022, 14:00
Location: L4
Speaker: Jalal Fadili
Organisation: CNRS-ENSICAEN-Université Caen

In this talk, I will show how the dynamical-system perspective provides deep insight into the convergence guarantees of first-order algorithms with inertial features for convex optimization in a Hilbert space setting.

Such algorithms are widely popular in various areas of data science (data processing, machine learning, inverse problems, etc.).
They can be viewed as discrete-time versions of an inertial second-order dynamical system involving different types of damping (viscous damping, Hessian-driven geometric damping).
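
For concreteness, a representative instance of such a dynamic (written here in standard notation as a sketch; the exact system studied in the talk may differ) couples an asymptotically vanishing viscous damping α/t with a Hessian-driven geometric damping of coefficient β:

\[
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \beta\,\nabla^{2} f(x(t))\,\dot{x}(t) + \nabla f(x(t)) = 0,
\]

where f is the smooth convex objective. With β = 0 and α = 3, a suitable discretisation recovers Nesterov's accelerated gradient method.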

The dynamical-system perspective not only offers a powerful way to understand the geometry underlying the dynamics, but also provides a versatile framework for deriving fast, scalable and new algorithms with strong convergence guarantees (including fast rates). In addition, this framework encompasses known algorithms and dynamics, such as Nesterov-type accelerated gradient methods, and the introduction of time-scaling factors makes it possible to accelerate these algorithms further. The framework is versatile enough to handle non-smooth and non-convex objectives, which are ubiquitous in applications.
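
As a hedged illustration of the discrete-time side, the sketch below implements a Nesterov-type accelerated gradient iteration, which can be read as a discretisation of the above dynamic with β = 0. The quadratic test objective, the step size 1/L and the choice α = 3 are assumptions made only for this example and are not taken from the talk.

    import numpy as np

    # Minimal sketch (not the speaker's code): a Nesterov-type accelerated
    # gradient iteration, viewed as a discretisation of the inertial ODE
    #     x''(t) + (alpha/t) x'(t) + grad f(x(t)) = 0,
    # applied to an illustrative quadratic f(x) = 0.5 x^T A x - b^T x.

    def accelerated_gradient(grad_f, x0, step, alpha=3.0, n_iter=500):
        """Inertial iteration with vanishing viscous damping alpha/k."""
        x = x_prev = np.asarray(x0, dtype=float)
        for k in range(1, n_iter + 1):
            # Extrapolation (momentum) step: the factor (k-1)/(k+alpha-1)
            # is the discrete counterpart of the alpha/t viscous damping.
            y = x + (k - 1) / (k + alpha - 1) * (x - x_prev)
            x_prev = x
            # Explicit gradient step taken at the extrapolated point.
            x = y - step * grad_f(y)
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.standard_normal((20, 20))
        A = A.T @ A + np.eye(20)           # symmetric positive definite
        b = rng.standard_normal(20)
        grad_f = lambda x: A @ x - b       # gradient of the quadratic
        step = 1.0 / np.linalg.norm(A, 2)  # 1/L, L = Lipschitz constant of grad f
        x_star = np.linalg.solve(A, b)
        x_hat = accelerated_gradient(grad_f, np.zeros(20), step)
        print("distance to minimiser:", np.linalg.norm(x_hat - x_star))

For β > 0, a Hessian-driven damping term would typically enter the discretisation through differences of successive gradients; that correction is omitted in this simple sketch.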
