Date: Mon, 13 Feb 2023
Time: 15:30 - 16:30
Location: L1
Speaker: Nikolas Tapia

Using rough path techniques, we provide a priori estimates for the output of Deep Residual Neural Networks in terms of both the input data and the (trained) network weights. As trained network weights are typically very rough when seen as functions of the layer, we propose to derive stability bounds in terms of the total p-variation of trained weights for any p∈[1,3]. Unlike the C¹-theory underlying the neural ODE literature, our estimates remain bounded even in the limiting case of weights behaving like Brownian motions, as suggested in [Cohen-Cont-Rossier-Xu, "Scaling Properties of Deep Residual Networks", 2021]. Mathematically, we interpret residual neural networks as solutions to (rough) difference equations, and analyse them based on recent results on discrete-time signatures and rough path theory. Based on joint work with C. Bayer and P. K. Friz.
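To make the difference-equation viewpoint concrete, here is a minimal sketch in our own notation (the activation σ, the depth N, and the exact placement of the weights are illustrative assumptions, not details taken from the abstract). A depth-N residual network maps an input x₀ to an output x_N via the difference equation

\[
  x_{k+1} \;=\; x_k + \sigma(W_k x_k), \qquad k = 0, 1, \dots, N-1,
\]

and the stability bounds are phrased in terms of the total p-variation of the weight sequence (W_k),

\[
  \|W\|_{p\text{-var}} \;=\; \Big( \sup_{0 = k_0 < k_1 < \dots < k_m = N} \sum_{i=0}^{m-1} \|W_{k_{i+1}} - W_{k_i}\|^p \Big)^{1/p}, \qquad p \in [1,3].
\]

This quantity stays finite for weights as rough as Brownian paths, which have finite p-variation exactly for p > 2, whereas C¹-type norms blow up in that regime.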
 
