Welcome to the homepage of the Networks seminars, a weekly seminar series on networks, complex systems, and related topics held at the Mathematical Institute. In this year's series, we will alternate between regular talks and "fresh from the arXiv" talks (FFTA), in which we invite the author of a recently published (pre)print to discuss their work. Suggestions are always welcome!

The Networks seminar usually takes place on Tuesdays from 14:00 to 15:00. In line with current regulations, we are excited to announce that the seminars now run in a hybrid format: attendees can choose to join our group in person in room C1 at the Mathematical Institute or to attend remotely via Zoom. A link to the event will be made available in the schedule of upcoming talks below (for logged-in users) and via the mailing list.

To sign up to our mailing list, simply send an empty email to the following address:
@email

If you would like to give a presentation at our seminar, please do not hesitate to contact the organisers, Erik Hörmann and Yu Tian. The presentation can be about either your own work or a (recent) interesting article on networks or complex systems in general.

In case you missed any of the talks, we will also make recordings of the talks available on our YouTube channel.


Upcoming Seminars

Tue, 12 May 2026

14:00 - 15:00
C3

Embedding Dynamics in Latent Manifolds of Asymmetric Neural Networks

Ramón Nartallo-Kaluarachchi
(Mathematical Institute, University of Oxford)
Abstract

Recurrent neural networks (RNNs) provide a theoretical framework for understanding computation in biological neural circuits, yet classical results, such as Hopfield's model of associative memory, rely on symmetric connectivity that restricts network dynamics to gradient-like flows. In contrast, biological networks support rich time-dependent behaviour facilitated by their asymmetry. In this talk, I will introduce a general framework, known as ‘drift-diffusion matching’, for training continuous-time RNNs to represent arbitrary stochastic dynamical systems within a low-dimensional latent subspace. Allowing asymmetric connectivity, I will show that RNNs can embed the drift and diffusion of an arbitrary stochastic differential equation, including nonlinear and nonequilibrium dynamics such as chaotic attractors. As an application, we have constructed RNN realisations of stochastic systems that transiently explore various attractors through both input-driven switching and autonomous transitions driven by nonequilibrium currents, which we interpret as models of associative and sequential (episodic) memory. To elucidate how these dynamics are encoded in the network, I will introduce decompositions of the RNN based on its asymmetric connectivity and its time-irreversibility. These results extend attractor neural network theory beyond equilibrium, showing that asymmetric neural populations can implement a broad class of dynamical computations within low-dimensional manifolds, unifying ideas from associative memory, nonequilibrium statistical mechanics, and neural computation.

Tue, 02 Jun 2026

14:00 - 15:00
C3

TBA

Torben Berndt
(Heidelberg Institute for Theoretical Studies)

Tue, 16 Jun 2026

14:00 - 15:00
C3

TBA

Thilo Gross
(University of Oldenburg)

You can also find a list of all talks (with abstracts) prior to 2018 here, and the former website
of the Networks journal club at the Oxford CABDyN Complexity Centre here.

Unsubscribing?

Simply send an empty email to:
@email

Last updated on 29 Nov 2024, 12:47pm. Please contact us with feedback and comments about this page.