Low-rank functions in machine learning
Abstract
Speaker: Edward Tansley
Functions that vary along a low-dimensional subspace of their input space, often called multi-index or low-rank functions, frequently arise in machine learning. Understanding how such structure emerges can provide insight into the learning dynamics of neural networks. One line of work that explores how networks learn low-rank data representations is the Neural Feature Ansatz (NFA), which states that after training, the Gram matrix of the first-layer weights of a deep network is proportional to some power of the average gradient outer product (AGOP) of the network with respect to its inputs. Existing results prove this relationship for 2-layer linear networks under balanced initialization. In this work, we extend these results to general L-layer linear networks and remove the assumption of balanced initialization for networks trained with weight decay.
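As a rough illustration (a sketch of our own, not the speaker's code or results), the ansatz for a 2-layer linear network f(x) = W2 W1 x with balanced initialization (W1 W1^T = W2^T W2) predicts that the Gram matrix W1^T W1 equals the square root of the AGOP along the training trajectory. The Python snippet below checks this numerically; all dimensions, the learning rate, the initialization scale, and the synthetic target are assumptions chosen for illustration.

```python
# Minimal numerical check of the Neural Feature Ansatz (NFA) for a 2-layer
# linear network f(x) = W2 @ W1 @ x. Under balanced initialization
# (W1 W1^T = W2^T W2), the ansatz predicts W1^T W1 = AGOP^(1/2).
# Dimensions, learning rate, and target below are illustrative assumptions.
import torch

torch.manual_seed(0)
d, h, n = 10, 32, 500                        # input dim, hidden width, samples

# Synthetic single-index regression target: y depends on x through one direction.
beta = torch.randn(d, 1)
X = torch.randn(n, d)
y = X @ beta + 0.01 * torch.randn(n, 1)

# Balanced initialization: W1 = s u v^T, W2 = s u^T, so W1 W1^T = W2^T W2 = s^2 u u^T.
s = 0.1
u = torch.nn.functional.normalize(torch.randn(h, 1), dim=0)
v = torch.nn.functional.normalize(torch.randn(d, 1), dim=0)
W1 = (s * u @ v.T).requires_grad_(True)      # h x d
W2 = (s * u.T).requires_grad_(True)          # 1 x h

opt = torch.optim.SGD([W1, W2], lr=1e-2)
for _ in range(5000):
    opt.zero_grad()
    loss = ((X @ W1.T @ W2.T - y) ** 2).mean()
    loss.backward()
    opt.step()

with torch.no_grad():
    # For a linear network, grad_x f is the constant vector (W2 W1)^T, so the
    # AGOP collapses to a single rank-one outer product g g^T.
    g = (W2 @ W1).T                          # d x 1
    agop = g @ g.T                           # d x d
    agop_sqrt = agop / g.norm()              # square root of a rank-one PSD matrix
    gram = W1.T @ W1
    rel_err = (gram - agop_sqrt).norm() / gram.norm()
    print(f"relative error between W1^T W1 and AGOP^(1/2): {rel_err.item():.3e}")
```

Because exact balancedness is conserved only under gradient flow, discrete gradient descent introduces a small drift, so the printed relative error should be small but not exactly zero.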
