Seminar series
Date
Tue, 12 May 2026
Time
14:00 - 15:00
Location
C3
Speaker
Ramón Nartallo-Kaluarachchi
Organisation
Mathematical Institute, University of Oxford

Recurrent neural networks (RNNs) provide a theoretical framework for understanding computation in biological neural circuits, yet classical results, such as Hopfield's model of associative memory, rely on symmetric connectivity that restricts network dynamics to gradient-like flows. In contrast, biological networks support rich time-dependent behaviour facilitated by their asymmetry. In this talk, I will introduce a general framework, known as ‘drift-diffusion matching’, for training continuous-time RNNs to represent arbitrary stochastic dynamical systems within a low-dimensional latent subspace. Allowing asymmetric connectivity, I will show that RNNs can embed the drift and diffusion of an arbitrary stochastic differential equation, including nonlinear and nonequilibrium dynamics such as chaotic attractors. As an application, we have constructed RNN realisations of stochastic systems that transiently explore various attractors through both input-driven switching and autonomous transitions driven by nonequilibrium currents, which we interpret as models of associative and sequential (episodic) memory. To elucidate how these dynamics are encoded in the network, I will introduce decompositions of the RNN based on its asymmetric connectivity and its time-irreversibility. These results extend attractor neural network theory beyond equilibrium, showing that asymmetric neural populations can implement a broad class of dynamical computations within low-dimensional manifolds, unifying ideas from associative memory, nonequilibrium statistical mechanics, and neural computation.
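As a rough illustration of the setting the abstract describes (not the speaker's actual drift-diffusion matching procedure), one can simulate a noisy continuous-time RNN with asymmetric connectivity via the Euler-Maruyama scheme. The network size, noise amplitude, and tanh nonlinearity below are illustrative assumptions; because W is not symmetric, the drift is generally not a gradient flow, which is the regime the talk concerns.

```python
import math
import random

random.seed(0)

N = 8          # number of neurons (illustrative choice)
dt = 0.01      # Euler-Maruyama time step
sigma = 0.1    # diffusion (noise) amplitude, assumed

# Asymmetric connectivity: W != W^T in general, so the deterministic
# drift -x + W tanh(x) need not be the gradient of an energy function.
W = [[random.gauss(0.0, 1.0 / math.sqrt(N)) for _ in range(N)]
     for _ in range(N)]

def step(x):
    """One Euler-Maruyama step of dx = (-x + W tanh(x)) dt + sigma dB."""
    phi = [math.tanh(v) for v in x]
    drift = [-x[i] + sum(W[i][j] * phi[j] for j in range(N))
             for i in range(N)]
    return [x[i] + drift[i] * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
            for i in range(N)]

x = [random.gauss(0.0, 1.0) for _ in range(N)]
for _ in range(1000):
    x = step(x)
```

In the framework described above, a low-dimensional linear readout of such a trajectory would be trained so that its drift and diffusion match those of a target stochastic differential equation; the sketch here only sets up the underlying network dynamics.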
