11:30
Non-archimedean parametrizations and some bialgebraicity results
Abstract
We will provide a general overview on some recent work on non-archimedean parametrizations and their applications. We will start by presenting our work with Cluckers and Comte on the existence of good Yomdin-Gromov parametrizations in the non-archimedean context and a $p$-adic Pila-Wilkie theorem. We will then explain how this is used in our work with Chambert-Loir to prove bialgebraicity results in products of Mumford curves.
14:15
From calibrated geometry to holomorphic invariants
Abstract
Calibrated geometry, more specifically Calabi-Yau geometry, occupies a modern, rather sophisticated, cross-roads between Riemannian, symplectic and complex geometry. We will show how, stripping this theory down to its fundamental holomorphic backbone and applying ideas from classical complex analysis, one can generate a family of purely holomorphic invariants on any complex manifold. We will then show how to compute them, and describe various situations in which these invariants encode, in an intrinsic fashion, properties not only of the given manifold but also of moduli spaces.
Interest in these topics, if initially lacking, will arise spontaneously during this informal presentation.
Anna Seigal, one of Oxford Mathematics' Hooke Fellows and a Junior Research Fellow at The Queen's College, has been awarded the 2020 Society for Industrial and Applied Mathematics (SIAM) Richard C. DiPrima Prize. The prize recognises an early career researcher in applied mathematics and is based on their doctoral dissertation.
Compressed Sensing or common sense?
Abstract
We present a simple algorithm that successfully reconstructs a sine wave sampled vastly below the Nyquist rate, provided the sampling time intervals have small random perturbations. We show that the fact that it works is just common sense, but then go on to discuss how the procedure relates to Compressed Sensing. It is not Compressed Sensing as traditionally stated, because the sampling transformation is not linear. Some published results do cover non-linear sampling transformations, but we would like a better understanding of the extent to which the relevant CS properties (reconstruction up to a given probability) are known in certain relatively simple but non-linear cases that could be relevant to industrial applications.
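The idea can be illustrated with a short sketch (not the speaker's algorithm; all parameters are illustrative): a 50 Hz sine is sampled at a mean rate of only 12 Hz, far below the 100 Hz Nyquist rate, but with small random jitter on the sampling instants. Because the jitter breaks the exact aliasing of uniform sampling, a brute-force least-squares fit over candidate frequencies recovers the true frequency.

```python
import numpy as np

rng = np.random.default_rng(0)

# True signal: a 50 Hz sine. Sample times have a mean spacing of 1/12 s
# plus a small random perturbation (known exactly to the reconstructor).
f_true = 50.0
n = 60
t = np.arange(n) / 12.0 + rng.uniform(-0.01, 0.01, n)  # jittered times
y = np.sin(2 * np.pi * f_true * t)

# For each candidate frequency, fit amplitude and phase by linear least
# squares and keep the frequency with the smallest residual.
freqs = np.linspace(1.0, 80.0, 4000)
best_f, best_res = None, np.inf
for f in freqs:
    A = np.column_stack([np.sin(2 * np.pi * f * t),
                         np.cos(2 * np.pi * f * t)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = np.sum((A @ coef - y) ** 2)
    if r < best_res:
        best_f, best_res = f, r

print(best_f)  # a frequency close to the true 50 Hz
```

With strictly uniform 12 Hz sampling, every alias 50 ± 12k Hz would fit the data equally well; the known jitter scrambles the phases at the alias frequencies while leaving the fit at 50 Hz essentially exact, which is the "common sense" behind the reconstruction.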
Adaptive Gradient Descent without Descent
Abstract
We show that two rules are sufficient to automate gradient descent: 1) don't increase the stepsize too fast and 2) don't overstep the local curvature. No function values are needed, no line search, no information about the function except its gradients. By following these rules, you get a method adaptive to the local geometry, with convergence guarantees depending only on smoothness in a neighbourhood of a solution. Provided the problem is convex, our method converges even if the global smoothness constant is infinite; as an illustration, it can minimize an arbitrary twice continuously differentiable convex function. We examine its performance on a range of convex and nonconvex problems, including matrix factorization and the training of ResNet-18.
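The two rules can be sketched as follows (one formulation from the literature on adaptive gradient descent; the initial stepsize and test problem are illustrative, and details may differ from the talk):

```python
import numpy as np

def adgd(grad, x0, steps=200):
    """Gradient descent with an automatic stepsize:
    rule 1 -- don't grow the stepsize too fast (sqrt(1 + theta) factor);
    rule 2 -- don't overstep the local curvature, estimated from
              successive gradients. Uses only gradients: no function
              values and no line search."""
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad(x_prev)
    lam = 1e-6                    # tiny initial stepsize
    theta = np.inf                # ratio lambda_k / lambda_{k-1}
    x = x_prev - lam * g_prev
    for _ in range(steps):
        g = grad(x)
        diff_x = np.linalg.norm(x - x_prev)
        diff_g = np.linalg.norm(g - g_prev)
        # Rule 2: cap by an inverse local-curvature estimate.
        lam_curv = diff_x / (2.0 * diff_g) if diff_g > 0 else np.inf
        # Rule 1: allow only slow growth of the stepsize.
        lam_new = min(np.sqrt(1.0 + theta) * lam, lam_curv)
        theta = lam_new / lam
        lam = lam_new
        x_prev, g_prev = x, g
        x = x - lam * g
    return x

# Minimize a simple quadratic f(x) = 0.5 x^T A x with A = diag(1, 2).
A = np.diag([1.0, 2.0])
x_star = adgd(lambda x: A @ x, np.array([1.0, 1.0]))
print(x_star)  # converges towards the minimizer [0, 0]
```

Note that the curvature cap makes the stepsize track the local Lipschitz constant of the gradient, so no global smoothness constant ever enters the method.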
Elastic deformations of a thin component moved by a robot
Abstract
Many manufacturing processes require the use of robots to transport parts around a factory line. Very thin parts (e.g. car doors) are prone to elastic deformations as they are moved around by a robot, and these should be avoided at all costs. A problem recently raised by F.E.E. (Fleischmann Elektrotech Engineering) at the ESGI 158 study group in Barcelona was to determine the stresses in a part undergoing a prescribed motion by a robot. We present a simple model based on the Kirchhoff-Love theory of flat plates and show how it can be adapted. We then outline how the solutions of the model can be used to determine the stresses.
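For reference, the standard static Kirchhoff-Love equation for the transverse deflection of a thin flat plate has the following form (the talk's adapted model, which must account for the prescribed motion, may differ):

```latex
% Kirchhoff-Love bending of a thin plate: deflection w(x,y) under a
% transverse load q(x,y), with flexural rigidity D.
D \, \nabla^4 w = q, \qquad D = \frac{E h^3}{12(1 - \nu^2)},
% where E is Young's modulus, h the plate thickness and \nu Poisson's
% ratio. Stresses follow from the curvatures of w, e.g. the bending
% moment M_x = -D\,(\partial_x^2 w + \nu\,\partial_y^2 w).
```

This is how the solutions of the plate model yield the stresses: once the deflection is known, the bending moments, and hence the stresses through the thickness, are obtained by differentiating it.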
Some remarkable women shaped Oxford computing: Dorothy Hodgkin won the Nobel Prize for work on insulin; Susan Hockey pioneered digital humanities; Shirley Carter, Linda Hayes and Joan Walsh got the pioneering software company NAG off the ground in 1970; and female operators and programmers were at the heart of the early large-scale computing efforts powering 20th-century science.