A coordinate descent algorithm on the Stiefel manifold for deep neural network training
Abstract
We propose a stochastic Riemannian coordinate descent method on the Stiefel manifold for deep neural network training. The algorithm successively rotates two columns of the matrix, an operation that can be implemented efficiently as multiplication by a Givens matrix. When the coordinate is selected uniformly at random at each iteration, we prove convergence of the algorithm under standard assumptions on the loss function, stepsize, and minibatch noise. Experiments on benchmark deep neural network training problems demonstrate the effectiveness of the proposed algorithm.
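A minimal NumPy sketch of one such coordinate step is given below, assuming access to the Euclidean gradient of the loss; the function name, step rule, and rotation convention are illustrative assumptions, not the authors' exact implementation. Rotating two columns by a Givens angle keeps the iterate exactly on the Stiefel manifold.

```python
import numpy as np

def givens_coordinate_step(X, grad, lr, rng):
    """One stochastic coordinate step on the Stiefel manifold (sketch):
    rotate two randomly chosen columns of X, which preserves column
    orthonormality exactly. `grad` is the Euclidean gradient at X."""
    n_cols = X.shape[1]
    i, j = rng.choice(n_cols, size=2, replace=False)
    # d/dtheta L(X G(i, j, theta)) at theta = 0 for the rotation below
    dtheta = grad[:, i] @ X[:, j] - grad[:, j] @ X[:, i]
    theta = -lr * dtheta                       # gradient step in the angle
    c, s = np.cos(theta), np.sin(theta)
    xi, xj = X[:, i].copy(), X[:, j].copy()
    X[:, i] = c * xi + s * xj                  # right-multiply by the Givens matrix
    X[:, j] = -s * xi + c * xj
    return X

rng = np.random.default_rng(0)
X, _ = np.linalg.qr(rng.standard_normal((8, 3)))   # random point on the Stiefel manifold
grad = rng.standard_normal(X.shape)                # stand-in minibatch gradient
X = givens_coordinate_step(X, grad, lr=0.1, rng=rng)
print(np.allclose(X.T @ X, np.eye(3)))             # True: still on the manifold
```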
26 Years at Oxford
Abstract
I will reflect on my time as Professor of Numerical Analysis.
13:00
Knot Homologies from Landau-Ginzburg Models
Abstract
In her recent work, Mina Aganagic proposed novel perspectives on computing knot homologies associated with any simple Lie algebra. One of her proposals relies on counting intersection points between Lagrangians in Landau-Ginzburg models on symmetric powers of Riemann surfaces. In my talk, I am going to present a concrete algebraic algorithm for finding such intersection points, turning the proposal into an actual calculational tool. I will illustrate the construction on the example of the $\mathfrak{sl}_2$ invariant of the Hopf link. I will also comment on the extension of the story to homological invariants associated with $\mathfrak{gl}(m|n)$ Lie superalgebras, solving this long-standing problem. The talk is based on our work in progress with Mina Aganagic and Elise LePage.
14:15
Categorical and K-theoretic Donaldson-Thomas theory of $\mathbb{C}^3$
Abstract
Donaldson-Thomas theory associates integers (virtual counts of sheaves) to a Calabi-Yau threefold $X$. The simplest example is that of $\mathbb{C}^3$, where the Donaldson-Thomas (DT) invariant of sheaves with zero-dimensional support and length $d$ is $p(d)$, the number of plane partitions of $d$. The DT invariants have several refinements, for example a cohomological one, where instead of a DT invariant one studies a graded vector space whose Euler characteristic equals the DT invariant. I will talk about two other refinements (categorical and K-theoretic) of DT invariants, focusing on the explicit case of $\mathbb{C}^3$. In particular, we show that the K-theoretic DT invariant for $d$ points on $\mathbb{C}^3$ also equals $p(d)$. This is joint work with Yukinobu Toda.
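For concreteness, $p(d)$ is given by MacMahon's generating function $\sum_{d \ge 0} p(d)\, q^d = \prod_{n \ge 1} (1 - q^n)^{-n}$. The short Python sketch below (variable names are illustrative) truncates the product to tabulate the first few values:

```python
# Count plane partitions p(d) via MacMahon's generating function
#   sum_d p(d) q^d = prod_{n>=1} (1 - q^n)^(-n),
# truncated at degree D.
D = 10
coeffs = [0] * (D + 1)
coeffs[0] = 1
for n in range(1, D + 1):
    # multiply the series by (1 - q^n)^(-n), i.e. n copies of 1/(1 - q^n)
    for _ in range(n):
        for d in range(n, D + 1):
            coeffs[d] += coeffs[d - n]
print(coeffs)  # [1, 1, 3, 6, 13, 24, 48, 86, 160, 282, 500]
```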
Mapper-type algorithms for complex data and relations
Abstract
Mapper and Ball Mapper are Topological Data Analysis tools for exploring high-dimensional point clouds and visualizing scalar-valued functions on them. Inspired by open questions in knot theory, we add new features to Ball Mapper that encode the structure, internal relations, and symmetries of the point cloud. Moreover, we combine the strengths of the Mapper and Ball Mapper constructions to create a tool for comparing high-dimensional data descriptors of a single dataset. This new hybrid algorithm, Mapper on Ball Mapper, is applicable to high-dimensional lens functions. As a proof of concept, we include applications to knot theory and game theory, as well as materials science and cancer research.
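As a rough illustration of the Ball Mapper construction (a minimal sketch assuming a greedy epsilon-net cover and Euclidean distance; the function and variable names are illustrative), the graph is built by covering the cloud with epsilon-balls and connecting balls whose covers overlap:

```python
import numpy as np

def ball_mapper(points, eps):
    """Minimal Ball Mapper sketch: greedily pick landmarks so every point
    lies within eps of one, cover the cloud with eps-balls around the
    landmarks, and connect two balls whenever their covers overlap."""
    n = len(points)
    covered = np.zeros(n, dtype=bool)
    landmarks, cover = [], []
    for idx in range(n):
        if not covered[idx]:
            dists = np.linalg.norm(points - points[idx], axis=1)
            ball = np.flatnonzero(dists <= eps)   # points in B(points[idx], eps)
            landmarks.append(idx)
            cover.append(set(ball.tolist()))
            covered[ball] = True
    edges = [(a, b)
             for a in range(len(cover))
             for b in range(a + 1, len(cover))
             if cover[a] & cover[b]]              # shared points give an edge
    return landmarks, edges

rng = np.random.default_rng(1)
cloud = rng.standard_normal((200, 3))
landmarks, edges = ball_mapper(cloud, eps=1.5)
print(len(landmarks), len(edges))                 # ball count and edge count
```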