Anna Seigal, one of Oxford Mathematics' Hooke Fellows and a Junior Research Fellow at The Queen's College, has been awarded the 2020 Society for Industrial and Applied Mathematics (SIAM) Richard C. DiPrima Prize. The prize recognises an early career researcher in applied mathematics on the basis of their doctoral dissertation.

Spatiotemporal variability in case fatality ratios for the 2013–2016 Ebola epidemic in West Africa
Forna, A Dorigatti, I Nouvellet, P Donnelly, C International Journal of Infectious Diseases volume 93 48-55 (28 Apr 2020)
Fri, 28 Feb 2020

10:00 - 11:00
L3

Compressed Sensing or common sense?

Christopher Townsend
(Leonardo)
Abstract

We present a simple algorithm that successfully reconstructs a sine wave sampled vastly below the Nyquist rate, provided the sampling time intervals have small random perturbations. We show that the fact it works is just common sense, but then go on to discuss how the procedure relates to Compressed Sensing. It is not exactly Compressed Sensing as traditionally stated, because the sampling transformation is not linear. Some published results do exist that cover non-linear sampling transformations, but we would like a better understanding of the extent to which the relevant CS properties (of reconstruction up to probability) are known in certain relatively simple but non-linear cases that could be relevant to industrial applications.
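The talk's algorithm is not specified here, but the idea it describes can be sketched in a few lines: sample a sine far below the Nyquist rate of the mean sampling interval, jitter the sample times slightly, and recover the frequency by a grid search with a linear least-squares fit of amplitude and phase at each candidate. The frequency, jitter size, and search grid below are illustrative assumptions, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# True signal: a single sine wave (frequency chosen for illustration).
f_true = 37.0          # Hz, far above the Nyquist limit of the mean rate below
n_samples = 40
mean_dt = 0.5          # mean sampling rate 2 Hz << 2 * f_true

# Jittered (non-uniform) sample times: uniform grid plus small perturbations.
t = np.arange(n_samples) * mean_dt + rng.uniform(-0.01, 0.01, n_samples)
y = np.sin(2 * np.pi * f_true * t)

# Grid search over candidate frequencies; for each, fit amplitude and phase
# by linear least squares on the basis (sin, cos) and keep the best residual.
freqs = np.linspace(30.0, 45.0, 1501)
best_f, best_res = None, np.inf
for f in freqs:
    A = np.column_stack([np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = np.sum((A @ coef - y) ** 2)
    if r < best_res:
        best_f, best_res = f, r

print(f"recovered frequency: {best_f:.3f} Hz")
```

The jitter is what makes this work: with exactly uniform 2 Hz sampling, every frequency of the form f_true + 2k would fit the data equally well, whereas the random perturbations break those aliases, which is the "common sense" the abstract refers to.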

Fri, 14 Feb 2020

12:00 - 13:00
L4

Adaptive Gradient Descent without Descent

Konstantin Mishchenko
(King Abdullah University of Science and Technology (KAUST))
Abstract

We show that two rules are sufficient to automate gradient descent: 1) don't increase the stepsize too fast and 2) don't overstep the local curvature. No need for function values, no line search, no information about the function except its gradients. By following these rules, you get a method adaptive to the local geometry, with convergence guarantees depending only on smoothness in a neighbourhood of a solution. Provided the problem is convex, our method converges even if the global smoothness constant is infinite. As an illustration, it can minimize an arbitrary twice continuously differentiable convex function. We examine its performance on a range of convex and nonconvex problems, including matrix factorization and training of ResNet-18.
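One way to read the two rules in code: cap the stepsize growth factor per iteration (rule 1), and never exceed a finite-difference estimate of the local inverse curvature, ||x_k - x_{k-1}|| / (2 ||g_k - g_{k-1}||) (rule 2). This is a minimal sketch under those assumptions, not the speaker's exact scheme; the initialisation and constants are illustrative.

```python
import numpy as np

def adaptive_gd(grad, x0, n_iter=300, lam0=1e-6):
    """Gradient descent whose stepsize is set by two rules: it may grow only
    by a bounded factor per iteration, and it never exceeds the local inverse
    curvature estimate ||dx|| / (2 ||dg||) built from successive gradients."""
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad(x_prev)
    x = x_prev - lam0 * g_prev          # one tiny probing step to start
    lam_prev, theta = lam0, np.inf
    for _ in range(n_iter):
        g = grad(x)
        dx, dg = x - x_prev, g - g_prev
        # Rule 2: local curvature estimate (guard against a zero denominator).
        dg_norm = np.linalg.norm(dg)
        curv = np.linalg.norm(dx) / (2 * dg_norm) if dg_norm > 0 else np.inf
        # Rule 1: don't increase the stepsize too fast.
        lam = min(np.sqrt(1 + theta) * lam_prev, curv)
        x_prev, g_prev = x, g
        x = x - lam * g
        theta, lam_prev = lam / lam_prev, lam
    return x

# Usage: minimise a simple quadratic f(x) = 0.5 x^T A x, gradient A x.
A = np.diag([1.0, 2.0])
x_star = adaptive_gd(lambda x: A @ x, np.array([5.0, 3.0]))
print(x_star)
```

Note that no function values are evaluated anywhere: the stepsize is driven entirely by the gradients, which is the point of the abstract.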

Tue, 11 Feb 2020

12:45 - 14:00
C3

Elastic deformations of a thin component moved by a robot

Oliver Bond
(Oxford University)
Abstract

Many manufacturing processes require the use of robots to transport parts around a factory line. Some parts which are very thin (e.g. car doors) are prone to elastic deformations as they are moved around by a robot, and these should be avoided at all costs. A problem recently raised by F.E.E. (Fleischmann Elektrotech Engineering) at the ESGI 158 study group in Barcelona was to determine the stresses in a part undergoing a prescribed motion by a robot. We present a simple model using the Kirchhoff-Love theory of flat plates and show how it can be adapted. We outline how the solutions of the model can then be used to determine the stresses.
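For reference, the classical Kirchhoff-Love theory governs the small transverse deflection w(x,y) of a thin flat plate by the biharmonic equation (the talk's adaptation to a prescribed rigid-body motion is not reproduced here):

```latex
D\,\nabla^4 w = q, \qquad D = \frac{E h^3}{12(1-\nu^2)},
```

where q is the transverse load per unit area, E the Young's modulus, h the plate thickness, and \nu the Poisson ratio; the bending stresses then follow from the second derivatives (curvatures) of w.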

Some remarkable women shaped Oxford computing: Dorothy Hodgkin won the Nobel Prize for work on insulin; Susan Hockey pioneered digital humanities; Shirley Carter, Linda Hayes and Joan Walsh got the pioneering software company NAG off the ground in 1970; and female operators and programmers were at the heart of the early large-scale computing efforts powering 20th-century science.

Index formulae for line bundle cohomology on complex surfaces
Brodie, C Constantin, A Deen, R Lukas, A Fortschritte der Physik / Progress of Physics volume 68 issue 2 (26 Jan 2020)
Scalable Metropolis-Hastings for exact Bayesian inference with large datasets
Cornish, R Vanetti, P Bouchard-Côté, A Deligiannidis, G Doucet, A 36th International Conference on Machine Learning, ICML 2019 volume 2019-June 2398-2429 (01 Jan 2019)