Mon, 14 Jun 2021

15:45 - 16:45
Virtual

The slope of a link computed via C-complexes

Ana Lecuona
(University of Glasgow)
Abstract

Together with Alex Degtyarev and Vincent Florens, we introduced a new invariant, called the slope, of a colored link in an integral homology sphere. In this talk I will define the invariant and highlight some of its most interesting properties, as well as its relationship to Conway polynomials and to the Kojima–Yamasaki eta-function. The emphasis in this talk will be on our latest computational progress: a formula to calculate the slope from a C-complex.

Fri, 14 May 2021

16:00 - 17:00
Virtual

Academic positions between PhD and permanent jobs - a panel discussion

Candy Bowtell and Luci Basualdo Bonatto
(University of Oxford)
Abstract

In this session we will host a Q&A with current researchers who have recently gone through successful applications as well as more senior staff who have been on interview panels and hiring committees for postdoctoral positions in mathematics. The session will be a chance to get varied perspectives on the application process and find out about the different types of academic positions to apply for.

The panel members will be Candy Bowtell, Luci Basualdo Bonatto, Mohit Dalwadi, Ben Fehrman and Frances Kirwan. 


Mon, 07 Jun 2021

15:45 - 16:45
Virtual

The Farrell-Jones conjecture for hyperbolic-by-cyclic groups

Mladen Bestvina
(University of Utah)
Abstract

Most of the talk will be about the Farrell-Jones conjecture from the point of view of an outsider. I'll try to explain what the conjecture is about, why one wants to know it, and how to prove it in some cases. The motivation for the talk is my recent work with Fujiwara and Wigglesworth where we prove this conjecture for (virtually torsion-free hyperbolic)-by-cyclic groups. If there is time I will outline the proof of this result.

Round up, the Oxford Mathematics Annual Newsletter, is a calculated attempt to describe our lives, mathematical and non-mathematical, over the past 12 months. From a summary of some of our research into the Coronavirus to a moving tribute to Peter Neumann by Martin Bridson, via articles on diversity, fantasy football and of course our Nobel Prize winner (pictured), it throws a little light, we hope, on what we did during the year that was 2020.

Our 'Fantastic Voyage' through Oxford Mathematics Student Lectures brings us to four 3rd Year lectures by Dominic Joyce on Topological Surfaces. These lectures are shown pretty much as they are seen by the students (they use a different platform with a few more features but the lectures are the same) as we all get to grips with the online world. Lectures on Linear Algebra, Integral transforms, Networks, Set Theory, Maths History and much more will be shown over the next few weeks.

Tue, 18 May 2021
14:00
Virtual

Hashing embeddings of optimal dimension, with applications to linear least squares

Zhen Shao
(Mathematical Institute (University of Oxford))
Abstract

We investigate theoretical and numerical properties of sparse sketching for both dense and sparse Linear Least Squares (LLS) problems. We show that sketching with hashing matrices --- with one nonzero entry per column and of size proportional to the rank of the data matrix --- generates a subspace embedding with high probability, provided the given data matrix has low coherence; thus optimal residual values are approximately preserved when the LLS matrix has similarly important rows. We then show that $s$-hashing matrices, with $s>1$ nonzero entries per column, satisfy similarly good sketching properties for a larger class of low-coherence data matrices.

Numerically, we introduce our solver Ski-LLS for solving generic dense or sparse LLS problems. Ski-LLS builds upon the successful strategies employed in the Blendenpik and LSRN solvers, which use sketching to calculate a preconditioner before applying the iterative LLS solver LSQR. Ski-LLS significantly improves upon these sketching solvers by judiciously using sparse hashing sketching while also allowing rank-deficiency of the input; furthermore, when the data matrix is sparse, Ski-LLS also applies a sparse factorization to the sketched input. Extensive numerical experiments show that Ski-LLS is competitive with other state-of-the-art direct and preconditioned iterative solvers for sparse LLS, and outperforms them in the significantly over-determined regime.
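The sketch-and-precondition pattern described in the abstract can be illustrated in a few lines of Python (NumPy/SciPy). This is not the Ski-LLS code: the `s_hashing_sketch` helper, the sketch size `m = 8 * n_cols`, and the random test problem are illustrative assumptions, sketching only the Blendenpik-style workflow the abstract refers to (form an $s$-hashing sketch, QR-factorize the sketched matrix, then run LSQR on the preconditioned system).

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)

def s_hashing_sketch(m, n, s=3):
    """Build an m x n s-hashing matrix: each column has exactly s
    nonzero entries, each +-1/sqrt(s), placed in s distinct random rows."""
    rows = np.concatenate([rng.choice(m, size=s, replace=False) for _ in range(n)])
    cols = np.repeat(np.arange(n), s)
    vals = rng.choice([-1.0, 1.0], size=s * n) / np.sqrt(s)
    return csr_matrix((vals, (rows, cols)), shape=(m, n))

# A dense, over-determined LLS problem: min ||Ax - b||
n_rows, n_cols = 2000, 50
A = rng.standard_normal((n_rows, n_cols))
b = rng.standard_normal(n_rows)

# Sketch-and-precondition: QR of the (small) sketched matrix SA
# yields a right preconditioner R for the iterative solver.
m = 8 * n_cols                       # sketch size proportional to column count
S = s_hashing_sketch(m, n_rows, s=3)
SA = S @ A                           # m x n_cols, cheap to factorize
_, R = np.linalg.qr(SA)

# Solve min ||(A R^{-1}) y - b|| with LSQR, then recover x = R^{-1} y.
AR_inv = np.linalg.solve(R.T, A.T).T   # A @ inv(R) without forming inv(R)
y = lsqr(AR_inv, b, atol=1e-12, btol=1e-12, iter_lim=500)[0]
x = np.linalg.solve(R, y)

# Reference solution from a dense direct solve, for comparison.
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
```

When the sketch is a good subspace embedding, $AR^{-1}$ is well-conditioned, so LSQR converges in few iterations; the expensive factorization is done on the small sketched matrix rather than on $A$ itself.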

A link for this talk will be sent to our mailing list a day or two in advance.  If you are not on the list and wish to be sent a link, please contact @email.
