Author
Boullé, N
Nakatsukasa, Y
Townsend, A
Journal title
Advances in Neural Information Processing Systems 33 (NeurIPS 2020)
Last updated
30 May 2023
Abstract
We consider neural networks with rational activation functions. The choice of the nonlinear activation function in deep learning architectures is crucial and heavily impacts the performance of a neural network. We establish optimal bounds in terms of network complexity and prove that rational neural networks approximate smooth functions more efficiently than ReLU networks with exponentially smaller depth. The flexibility and smoothness of rational activation functions make them an attractive alternative to ReLU, as we demonstrate with numerical experiments.
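As a hedged illustration of the idea described in the abstract (not code from the paper itself): the sketch below shows a trainable rational activation of type (3, 2), the numerator/denominator degree pair the authors study, written as a PyTorch module. The class name RationalActivation, the identity initialization, and the absolute-value denominator safeguard are illustrative assumptions; the paper initializes the coefficients to a best rational approximant of ReLU.

import torch
import torch.nn as nn

class RationalActivation(nn.Module):
    """Learnable rational activation r(x) = P(x)/Q(x) of type (3, 2).
    The coefficients are trained jointly with the network weights.
    Initial values here are simple placeholders (r(x) = x), not the
    paper's ReLU-approximating initialization."""

    def __init__(self):
        super().__init__()
        self.a = nn.Parameter(torch.tensor([0.0, 1.0, 0.0, 0.0]))  # numerator coefficients
        self.b = nn.Parameter(torch.tensor([0.0, 0.0]))            # denominator coefficients

    def forward(self, x):
        p = self.a[0] + self.a[1] * x + self.a[2] * x**2 + self.a[3] * x**3
        # Writing Q(x) = 1 + |b1*x + b2*x^2| keeps the denominator bounded
        # away from zero, a common safeguard against poles during training
        # (an assumption here, not necessarily the paper's construction).
        q = 1.0 + torch.abs(self.b[0] * x + self.b[1] * x**2)
        return p / q

# Used as a drop-in replacement for ReLU in a small feed-forward network:
model = nn.Sequential(
    nn.Linear(2, 32), RationalActivation(),
    nn.Linear(32, 32), RationalActivation(),
    nn.Linear(32, 1),
)

Because each activation adds only a handful of trainable coefficients, the flexibility highlighted in the abstract comes at negligible parameter cost relative to the linear layers.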
Symplectic ID
1146199
Publication type
Conference Paper
Publication date
12 Dec 2020