Seminar series
Date
Thu, 23 Jul 2020
Time
16:00 - 17:00
Location
Virtual
Speaker
Franck Gabriel
Organisation
École Polytechnique Fédérale de Lausanne

The random initialisation of Artificial Neural Networks (ANNs) makes it possible to describe, in function space, the limiting behaviour of an ANN as its width tends to infinity. In this limit, an ANN is a Gaussian process at initialisation and, during training, follows a kernel gradient descent governed by a kernel called the Neural Tangent Kernel (NTK).

Connecting neural networks to the well-established theory of kernel methods allows us to understand the dynamics of neural networks and their generalization capabilities. In practice, it helps in selecting appropriate architectural features for the network to be trained. In addition, it provides new tools to address the finite-width setting.
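To make the object of the talk concrete, here is a minimal sketch of the empirical Neural Tangent Kernel for a one-hidden-layer network f(x) = (1/√m) · a·tanh(w·x) at finite width m: the NTK evaluated at two inputs is the inner product of the network's parameter gradients. All names (m, w, a, ntk) are illustrative choices for this sketch, not notation from the talk; in the infinite-width limit described above, this random kernel concentrates around a deterministic one.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 10_000                      # hidden width; the NTK limit is m -> infinity
w = rng.standard_normal(m)      # input weights, N(0, 1) at initialisation
a = rng.standard_normal(m)      # output weights, N(0, 1) at initialisation

def grad_f(x):
    """Gradient of f(x) = (1/sqrt(m)) * sum_j a_j * tanh(w_j * x)
    with respect to all parameters (a, w)."""
    h = np.tanh(w * x)
    da = h / np.sqrt(m)                      # d f / d a_j
    dw = a * (1 - h**2) * x / np.sqrt(m)     # d f / d w_j
    return np.concatenate([da, dw])

def ntk(x1, x2):
    """Empirical Neural Tangent Kernel: inner product of parameter gradients."""
    return grad_f(x1) @ grad_f(x2)

print(ntk(0.5, 1.0))
```

During training by gradient descent, the function-space update at a point is driven by this kernel applied to the residuals, which is the "kernel gradient descent" picture the abstract refers to.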

Last updated on 03 Apr 2022 01:32.