Fri, 25 Oct 2024

14:00 - 15:00
L1

How to Write a Good Maths Solution

Dr Luciana Basualdo Bonatto
Abstract

In this interactive workshop, we'll discuss what mathematicians are looking for in written solutions. How can you set out your ideas clearly, and what are the standard mathematical conventions? Please bring a pen or pencil!

This session is likely to be most relevant for first-year undergraduates, but all are welcome.

Fri, 18 Oct 2024

14:00 - 15:00
L1

Making the Most of Intercollegiate Classes

Dr Luciana Basualdo Bonatto, Prof. Dmitry Belyaev, Dr Chris Hollings and Dr Neil Laws
Abstract

What should you expect in intercollegiate classes?  What can you do to get the most out of them?  In this session, experienced class tutors will share their thoughts, and a current student will offer tips and advice based on their experience.

All undergraduate and masters students welcome, especially Part B and MSc students attending intercollegiate classes. (Students who attended the Part C/OMMS induction event will find significant overlap between the advice offered there and this session!)

Wed, 13 Nov 2024
11:00
L4

Flow equation approach for the stochastic Burgers equation

Andrea Pitrone
(Mathematical Institute)
Abstract

I will present the basic idea of the flow equation approach developed by Paweł Duch to study singular stochastic partial differential equations. In particular, I will show how it can be used to prove the existence of a solution of the stochastic Burgers equation on the one-dimensional torus.
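For context, one common form of the equation in question is sketched below (conventions for constants and the noise term vary across the literature, so this particular normalisation is an assumption, not taken from the abstract). On the one-dimensional torus $\mathbb{T}$, the stochastic Burgers equation is often written as

$$\partial_t u = \partial_x^2 u + \partial_x\bigl(u^2\bigr) + \partial_x \xi, \qquad x \in \mathbb{T},$$

where $\xi$ denotes space-time white noise. The distributional nonlinearity $\partial_x(u^2)$ is what makes the equation singular, and renormalisation schemes such as Duch's flow equation approach are designed to make sense of it.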

Today, 5pm, Common Room

Today's Happy Hour will be free for everyone, in celebration of the beginning of term. We've also arranged a special welcome from the Head of Department, and prizes will be handed out. It's a great opportunity for the whole department to come together, meet the new arrivals, and start the term on a positive note.

Mon, 18 Nov 2024

14:00 - 15:00
Lecture Room 3

Model-based (unfolding) neural networks and where to find them: from practice to theory

Vicky Kouni
Abstract

In recent years, a new class of deep neural networks has emerged which has its roots in model-based iterative algorithms for solving inverse problems. We call these model-based neural networks deep unfolding networks (DUNs). The term reflects their formulation: the iterations of optimization algorithms are "unfolded" as layers of a neural network, which solves the inverse problem at hand. Since their advent, DUNs have been employed to tackle assorted problems, e.g., compressed sensing (CS), denoising, super-resolution, and pansharpening.

In this talk, we will revisit the application of DUNs to the CS problem, which pertains to reconstructing data from incomplete observations. We will present recent trends in the broader family of DUNs for CS and dive into their theory, which mainly revolves around their generalization performance; the latter is important because it tells us how a neural network behaves on examples it has never been trained on.
In particular, we will focus on overparameterized DUNs, which exhibit remarkable performance in terms of reconstruction and generalization error. As supported by our theoretical and empirical findings, the generalization performance of overparameterized DUNs depends on their structural properties. Our analysis sets a solid mathematical ground for developing more stable, robust, and efficient DUNs, boosting their real-world performance.
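As a minimal sketch of the unfolding idea described above (the function names and fixed parameters are illustrative assumptions; in an actual DUN the per-layer step sizes and thresholds, and possibly the matrices, would be learned from data), here is a Python example in which each ISTA iteration for the compressed sensing problem $y \approx Ax$ plays the role of one network layer:

    import numpy as np

    def soft_threshold(v, theta):
        # Proximal operator of the l1-norm: shrinks each entry towards zero.
        return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

    def unfolded_ista(y, A, n_layers=200, theta=0.1):
        # Each loop iteration corresponds to one "layer" of the unfolded
        # network; in a trained DUN, step and theta would be learnable.
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
        x = np.zeros(A.shape[1])
        for _ in range(n_layers):
            # Gradient step on 0.5 * ||y - A x||^2, then soft-thresholding.
            x = soft_threshold(x + step * A.T @ (y - A @ x), step * theta)
        return x

    # Toy usage: recover a sparse signal from m < n random measurements.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100)) / np.sqrt(50)
    x_true = np.zeros(100)
    x_true[[3, 17, 42]] = [1.0, -2.0, 0.5]
    x_hat = unfolded_ista(A @ x_true, A)
    print(np.linalg.norm(x_hat - x_true))

Learned variants such as LISTA replace the fixed matrices and thresholds with trained weights, which is exactly where the generalization questions discussed in the talk arise.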

Thu, 05 Dec 2024
16:00
Lecture Room 3

Zeros of polynomials with restricted coefficients: a problem of Littlewood

Benjamin Bedert
(University of Oxford)
Abstract

The study of polynomials whose coefficients lie in a given set $S$ (the most notable examples being $S=\{0,1\}$ or $\{-1,1\}$) has a long history leading to many interesting results and open problems. We begin with a brief general overview of this topic and then focus on the following old problem of Littlewood. Let $A$ be a set of positive integers, let $f_A(x)=\sum_{n\in A}\cos(nx)$ and define $Z(f_A)$ to be the number of zeros of $f_A$ in $[0,2\pi]$. The problem is to estimate the quantity $Z(N)$ which is defined to be the minimum of $Z(f_A)$ over all sets $A$ of size $N$. We discuss recent progress showing that $Z(N)\geqslant (\log \log N)^{1-o(1)}$ which provides an exponential improvement over the previous lower bound. 

A closely related question due to Borwein, Erdélyi and Littmann asks about the minimum number of zeros of a cosine polynomial with $\pm 1$-coefficients. Until recently it was unknown whether this even tends to infinity with the degree $N$. We also discuss work confirming this conjecture.
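As a small numerical illustration of the quantity $Z(f_A)$ (an illustrative sketch, not part of the talk; counting sign changes on a grid only lower-bounds the true number of zeros, since it misses zeros of even multiplicity), one can experiment in Python:

    import numpy as np

    def count_sign_changes(A, grid=200_000):
        # Estimate Z(f_A) for f_A(x) = sum_{n in A} cos(n x) by counting
        # sign changes on a fine grid over [0, 2*pi]; tangential zeros
        # are missed, so this is a lower bound on the zero count.
        x = np.linspace(0.0, 2.0 * np.pi, grid)
        f = np.cos(np.outer(x, np.asarray(list(A)))).sum(axis=1)
        return int(np.count_nonzero(np.sign(f[:-1]) != np.sign(f[1:])))

    print(count_sign_changes(range(1, 11)))              # A = {1, ..., 10}
    print(count_sign_changes([2**k for k in range(10)])) # a lacunary set

Comparing dense sets like $\{1,\dots,N\}$ with sparse ones like geometric progressions gives a feel for why the minimum $Z(N)$ is so hard to pin down.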


Thu, 28 Nov 2024
16:00
Lecture Room 3

Large sieve inequalities for exceptional Maass forms and applications

Alexandru Pascadi
(University of Oxford)
Abstract

A number of results on classical problems in analytic number theory rely on bounds for multilinear forms of Kloosterman sums, which in turn use deep inputs from the spectral theory of automorphic forms. We’ll discuss our recent work available at arxiv.org/abs/2404.04239, which uses this interplay between counting problems, exponential sums, and automorphic forms to improve results on the greatest prime factor of $n^2+1$, and on the exponents of distribution of primes and smooth numbers in arithmetic progressions.
The key ingredients in this work are certain "large sieve inequalities" for exceptional Maass forms, which improve classical results of Deshouillers-Iwaniec in special settings. These act as on-average substitutes for Selberg's eigenvalue conjecture, narrowing (and sometimes completely closing) the gap between previous conditional and unconditional results.
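For readers who want a concrete feel for the greatest-prime-factor quantity mentioned above, here is a tiny Python sketch (the helper name and the trial-division method are purely illustrative assumptions, unrelated to the techniques of the paper):

    def greatest_prime_factor(m):
        # Largest prime factor by trial division; adequate for small m.
        largest, p = 1, 2
        while p * p <= m:
            while m % p == 0:
                largest = p
                m //= p
            p += 1
        return m if m > 1 else largest

    # P(n^2 + 1) for small n; the results discussed in the talk concern
    # how often n^2 + 1 has a prime factor larger than a fixed power of n.
    for n in range(1, 11):
        print(n, greatest_prime_factor(n * n + 1))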
