Date: Tue, 14 May 2019
Time: 14:30 - 15:00
Location: L3
Speaker: Timo Welti
Organisation: ETHZ

Numerical simulations indicate that deep artificial neural networks (DNNs) seem to be able to overcome the curse of dimensionality in many computational problems, in the sense that the number of real parameters used to describe the DNN grows at most polynomially in both the reciprocal of the prescribed approximation accuracy and the dimension of the function which the DNN aims to approximate. However, there are only a few special situations in which results in the literature rigorously explain the success of DNNs in approximating high-dimensional functions.
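In symbols, this notion of overcoming the curse of dimensionality can be sketched as follows (the notation $\Phi_{d,\varepsilon}$ for the approximating network and the generic constants $C, c$ are illustrative, not taken from the talk):

```latex
\[
  \#\mathrm{params}\bigl(\Phi_{d,\varepsilon}\bigr)
  \;\le\; C \, d^{c} \, \varepsilon^{-c},
\]
```

where $\Phi_{d,\varepsilon}$ denotes a DNN approximating the $d$-dimensional target function to accuracy $\varepsilon$, and $C, c > 0$ are constants independent of $d$ and $\varepsilon$. The crucial point is that the right-hand side grows only polynomially, rather than exponentially, in $d$.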

In this talk it is revealed that DNNs do indeed overcome the curse of dimensionality in the numerical approximation of Kolmogorov PDEs with constant diffusion and nonlinear drift coefficients. A crucial ingredient in our proof of this result is the fact that the artificial neural network used to approximate the PDE solution is in fact a deep artificial neural network with a large number of hidden layers.
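For context, a Kolmogorov PDE of the type described (constant diffusion, nonlinear drift; the specific notation here is assumed, not quoted from the talk) can be written as

```latex
\[
  \frac{\partial u}{\partial t}(t,x)
  \;=\;
  \tfrac{1}{2}\operatorname{Trace}\!\bigl(\sigma \sigma^{*}\,
  \operatorname{Hess}_x u(t,x)\bigr)
  \;+\;
  \bigl\langle \mu(x),\, \nabla_x u(t,x) \bigr\rangle,
  \qquad
  u(0,x) \;=\; \varphi(x),
\]
```

with a constant diffusion matrix $\sigma \in \mathbb{R}^{d \times d}$, a (possibly nonlinear) drift $\mu \colon \mathbb{R}^{d} \to \mathbb{R}^{d}$, and initial condition $\varphi \colon \mathbb{R}^{d} \to \mathbb{R}$. The dimension $d$ may be large, which is precisely where the curse of dimensionality arises for classical grid-based methods.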

Last updated on 04 Apr 2022 14:57.