Seminar series
Date
Tue, 10 May 2022
Time
14:00 - 15:00
Location
L6
Speaker
Sheheryar Zaidi and Bryn Elesedy
Organisation
Oxford

One core aim of (supervised) machine learning is to approximate an unknown function given a dataset of input-output pairs. Real-world examples of such functions include the mapping from an image to its label or the mapping from a molecule to its energy. For a variety of such functions, while the precise mapping is unknown, we often have knowledge of its properties. For example, the label of an image may be invariant to rotations of the input image. Generally, such properties formally correspond to the function being equivariant to certain group actions on its input and output spaces: acting on the input and then applying the function gives the same result as applying the function and then acting on the output. This has led to much research on building equivariant function classes, i.e., equivariant neural networks. In this talk, we survey this growing field of equivariance in deep learning for a mathematical audience, motivating the need for equivariance, covering concrete examples of equivariant neural networks, and offering a learning-theoretic perspective on the benefits of equivariance.
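A classic instance of the equivariance described above is that circular convolution is equivariant to cyclic shifts: shifting the input signal and then convolving gives the same result as convolving and then shifting the output. The sketch below (an illustrative example not taken from the talk, using NumPy) checks this identity numerically:

```python
import numpy as np

def circ_conv(x, k):
    """Circular convolution of signal x with kernel k, computed via the FFT."""
    n = len(x)
    k_padded = np.zeros(n)
    k_padded[:len(k)] = k
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k_padded)))

rng = np.random.default_rng(0)
x = rng.standard_normal(8)          # a random 1-D signal
k = np.array([1.0, -2.0, 0.5])      # an arbitrary small kernel

shift = lambda v, s: np.roll(v, s)  # the group action: cyclic shift by s

lhs = circ_conv(shift(x, 3), k)     # act on the input, then apply the map
rhs = shift(circ_conv(x, k), 3)     # apply the map, then act on the output

assert np.allclose(lhs, rhs)        # equivariance: the two orders agree
```

The same pattern, with rotations in place of cyclic shifts, underlies rotation-equivariant image networks: the network is built so that transforming the input commutes with applying the network.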

Last updated on 05 May 2022 11:29.