Fri, 26 Jan 2024
16:00
L1

North meets South

Dr Cedric Pilatte (North Wing) and Dr Boris Shustin (South Wing)
Abstract

Speaker: Cedric Pilatte 
Title: Convolution of integer sets: a galaxy of (mostly) open problems

Abstract: Let S be a set of integers. Define f_k(n) to be the number of representations of n as the sum of k elements from S. Behind this simple definition lie fascinating conjectures that are very easy to state but seem unattackable. For example, a famous conjecture of Erdős and Turán predicts that if f_2 is bounded then it has infinitely many zeroes. This talk is designed as an accessible overview of these questions. 
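The representation-counting function from the abstract is easy to experiment with. A minimal sketch (the function name and the choice of counting *ordered* k-tuples are my own assumptions; conventions for f_k vary):

```python
from itertools import product

def representation_counts(S, k, n_max):
    """f_k(n): number of ordered k-tuples from S summing to n, for n <= n_max."""
    counts = {}
    for tup in product(S, repeat=k):
        n = sum(tup)
        if n <= n_max:
            counts[n] = counts.get(n, 0) + 1
    return counts

# Example: S = the squares up to 100, f_2 = representations as a sum of two squares
S = [i * i for i in range(11)]
f2 = representation_counts(S, 2, 100)
print(f2.get(25, 0))  # 25 = 0+25 = 25+0 = 9+16 = 16+9 -> 4
```

For this S, f_2 is unbounded as n grows (sums of two squares can have many representations), so it does not directly test the Erdős–Turán conjecture; it only illustrates the definition.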
 
Speaker: Boris Shustin

Title: Manifold-Free Riemannian Optimization

Abstract: Optimization problems constrained to a smooth manifold can be solved via the framework of Riemannian optimization. To that end, a geometrical description of the constraining manifold, e.g., tangent spaces, retractions, and cost function gradients, is required. In this talk, we present a novel approach that allows performing approximate Riemannian optimization based on a manifold learning technique, in cases where only a noiseless sample set of the cost function and the manifold’s intrinsic dimension are available.

Tue, 28 Nov 2023

16:00 - 17:00
L1

Euclidean Ramsey Theory

Imre Leader
(University of Cambridge)
Abstract

Euclidean Ramsey Theory is a natural multidimensional version of Ramsey Theory. A subset of Euclidean space is called Ramsey if, for any $k$, whenever we partition Euclidean space of sufficiently high dimension into $k$ classes, one class must contain a congruent copy of our subset. It is still unknown which sets are Ramsey. We will discuss background on this and then proceed to some recent results.

Mon, 20 Nov 2023
16:00
L1

Post-Quantum Cryptography (and why I’m in the NT corridor)

Patrick Hough
(University of Oxford)
Abstract

In this talk I will give a brief introduction to the field of post-quantum (PQ) cryptography, introducing a few of the most popular computational hardness assumptions. Second, I will give an overview of a recent work of mine on PQ electronic voting. I’ll finish by presenting a short selection of ‘exotic’ cryptographic constructions that I think are particularly hot at the moment (no, not blockchain). The talk will be definitionally light since I expect the area will be quite new to many and I hope this will make for a more engaging introduction.

Tue, 21 Nov 2023

17:00 - 18:00
L1

THE 16th BROOKE BENJAMIN LECTURE: Advances in Advancing Interfaces: The Mathematics of Manufacturing of Industrial Foams, Fluidic Devices, and Automobile Painting

James Sethian
(UC Berkeley)
Abstract

Complex dynamics underlying industrial manufacturing depend in part on multiphase multiphysics, in which fluids and materials interact across orders of magnitude variations in time and space. In this talk, we will discuss the development and application of a host of numerical methods for these problems, including Level Set Methods, Voronoi Implicit Interface Methods, implicit adaptive representations, and multiphase discontinuous Galerkin Methods.  Applications for industrial problems will include modeling how foams evolve, how electro-fluid jetting devices work, and the physics and dynamics of rotary bell spray painting across the automotive industry.

Fri, 17 Nov 2023
16:00
L1

Careers outside academia

V-Nova and Dr Anne Wolfes (Careers Service)
Abstract

What opportunities are available outside of academia? Beyond a strong academic background, what skills are companies looking for in candidates transitioning to industry? Come along and hear from video technology company V-Nova and Dr Anne Wolfes from the Careers Service to get some invaluable advice on careers outside academia.


Fri, 27 Oct 2023
16:00
L1

Academic job application workshop

Abstract

Job applications involve a lot of work and can be overwhelming. Join us for a workshop and Q+A session focused on breaking down academic applications: we’ll talk about approaching reference letter writers, writing research statements, and what makes a great CV and covering letter.

Fri, 13 Oct 2023
16:00
L1

You and Your Supervisor

Abstract

How do you make the most of graduate supervisions? Whether you are a first-year graduate wanting to learn how to manage meetings with your supervisor, or a later-year DPhil student, postdoc or faculty member willing to share your experiences and give advice, please come along to this informal discussion led by DPhil students for the first Fridays@4 session of the term. You can also continue the conversation and learn more about graduate student life at Oxford at Happy Hour afterwards.

Mon, 06 Nov 2023
16:00
L1

A Basic Problem in Analytic Number Theory

George Robinson
(University of Oxford)
Abstract

I will discuss a basic problem in analytic number theory which has appeared recently in my work. This will be a gentle introduction to the Gauss circle problem, hopefully with a discussion of some extensions and applications to understanding L-functions.
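The Gauss circle problem mentioned above asks how closely the number of integer lattice points inside a circle of radius r tracks its area, πr². A small sketch for counting the lattice points directly (function name is my own):

```python
import math

def lattice_points_in_circle(r):
    """Count integer points (x, y) with x^2 + y^2 <= r^2, for integer r."""
    total = 0
    for x in range(-r, r + 1):
        # for fixed x, y ranges over |y| <= sqrt(r^2 - x^2)
        y_max = math.isqrt(r * r - x * x)
        total += 2 * y_max + 1
    return total

r = 100
N = lattice_points_in_circle(r)
print(N, math.pi * r * r, N - math.pi * r * r)
```

The error N − πr² is trivially O(r); the conjectured truth, still open, is O(r^{1/2+ε}) for every ε > 0.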

Fri, 20 Oct 2023

16:00 - 17:00
L1

Generalized Tensor Decomposition: Utility for Data Analysis and Mathematical Challenges

Tamara Kolda
(MathSci.ai)
Further Information

Tamara Kolda is an independent mathematical consultant under the auspices of her company MathSci.ai, based in California. From 1999 to 2021, she was a researcher at Sandia National Laboratories in Livermore, California. She specializes in mathematical algorithms and computational methods for tensor decompositions, tensor eigenvalues, graph algorithms, randomized algorithms, machine learning, network science, numerical optimization, and distributed and parallel computing.

From the website: https://www.mathsci.ai/

Abstract

Tensor decomposition is an unsupervised learning methodology that has applications in a wide variety of domains, including chemometrics, criminology, and neuroscience. We focus on low-rank tensor decomposition using the canonical polyadic or CANDECOMP/PARAFAC format. A low-rank tensor decomposition is the minimizer of some nonlinear program. The usual objective function is the sum of squares error (SSE) comparing the data tensor and the low-rank model tensor. This leads to a nicely structured problem whose subproblems are linear least squares problems that can be solved efficiently in closed form. However, the SSE metric is not always ideal, so we consider other objective functions. For instance, KL divergence is an alternative metric that is useful for count data and results in a nonnegative factorization. In the context of nonnegative matrix factorization, KL divergence was popularized by Lee and Seung (1999). We can also consider various objectives such as logistic odds for binary data, beta-divergence for nonnegative data, and so on. We show the benefits of alternative objective functions on real-world data sets. We consider the computation of generalized tensor decompositions based on other objective functions, summarize the work that has been done thus far, and illuminate open problems and challenges. This talk includes joint work with David Hong and Jed Duersch.
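The SSE case described above, where each subproblem is a closed-form linear least squares solve, can be sketched for rank 1 on a 3-way tensor. This is a toy alternating-least-squares illustration under my own naming, not the speaker's implementation:

```python
import numpy as np

def rank1_cp_als(T, iters=50):
    """Rank-1 CP fit T ~ a (outer) b (outer) c, by alternating least squares
    on the sum-of-squares error (SSE)."""
    I, J, K = T.shape
    rng = np.random.default_rng(0)
    a, b, c = rng.standard_normal(I), rng.standard_normal(J), rng.standard_normal(K)
    for _ in range(iters):
        # with two factors fixed, each update is a closed-form least squares solve
        a = np.einsum('ijk,j,k->i', T, b, c) / ((b @ b) * (c @ c))
        b = np.einsum('ijk,i,k->j', T, a, c) / ((a @ a) * (c @ c))
        c = np.einsum('ijk,i,j->k', T, a, b) / ((a @ a) * (b @ b))
    return a, b, c

# Recover an exactly rank-1 tensor; SSE should be near zero
a0, b0, c0 = np.arange(1, 4.0), np.arange(1, 5.0), np.arange(1, 6.0)
T = np.einsum('i,j,k->ijk', a0, b0, c0)
a, b, c = rank1_cp_als(T)
sse = np.sum((T - np.einsum('i,j,k->ijk', a, b, c)) ** 2)
print(sse)
```

Swapping SSE for KL divergence or beta-divergence, as the abstract discusses, removes this closed-form structure, which is one source of the computational challenges the talk addresses.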
