14:30
Numerical solution of linear systems in low rank tensor form
Abstract
A link for this talk will be sent to our mailing list a day or two in advance. If you are not on the list and wish to be sent a link, please send email to @email.
Let W be the Witt algebra of vector fields on the punctured complex plane, and let Vir be the Virasoro algebra, the unique nontrivial central extension of W. We discuss work in progress with Alexey Petukhov to analyse Poisson ideals of the symmetric algebra of Vir.
We focus on understanding maximal Poisson ideals, which can be given as the Poisson cores of maximal ideals of Sym(Vir) and of Sym(W). We give a complete classification of maximal ideals of Sym(W) which have nontrivial Poisson cores. We then lift this classification to Sym(Vir), and use it to show that if $\lambda \neq 0$, then $(z-\lambda)$ is a maximal Poisson ideal of Sym(Vir).
Randomized experiments, or "A/B" tests, remain the gold standard for evaluating the causal effect of a policy intervention or product change. However, experimental settings such as social networks, where users are interacting and influencing one another, violate the conventional no-interference assumptions needed for credible causal inference. Existing solutions include accounting for the fraction or count of treated neighbors in a user's network, among other strategies. Yet there are often many researcher degrees of freedom in specifying network interference conditions, and most current methods do not account for the local network structure beyond simply counting the number of neighbors. Capturing local network structure is important because it can account for theories such as structural diversity and echo chambers. Our study provides an approach that accounts for both the local structure in a user's social network, via motifs, and the assignment conditions of neighbors. We propose a two-part approach: we first introduce and employ "causal network motifs," i.e., network motifs that characterize the assignment conditions in local ego networks; we then propose a tree-based algorithm for identifying different network interference conditions and estimating their average potential outcomes. We test our method on a real-world experiment on a large-scale network and in a synthetic network setting, which highlights how accounting for local structure can better capture different interference patterns in networks.
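The contrast the abstract draws, between simply counting treated neighbors and conditioning on local structure, can be illustrated on a toy ego network. The data and the specific motif below are illustrative assumptions, not the paper's definitions:

```python
# Toy ego network as adjacency lists, with a binary treatment per node
# (illustrative data; not from the study)
neighbors = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1], 3: [0]}
treated = {0: 1, 1: 1, 2: 1, 3: 0}

ego = 0
# The conventional exposure feature: fraction of treated neighbors
frac_treated = sum(treated[v] for v in neighbors[ego]) / len(neighbors[ego])

# A motif-style condition: triangles through the ego whose two other
# vertices are both treated (one hypothetical "causal network motif")
treated_triangles = sum(
    1
    for i, u in enumerate(neighbors[ego])
    for v in neighbors[ego][i + 1:]
    if v in neighbors[u] and treated[u] and treated[v]
)
print(frac_treated, treated_triangles)  # two different exposure summaries
```

Two ego networks can have the same fraction of treated neighbors but different motif counts, which is exactly the information the neighbor-counting approach discards.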
Preconditioning techniques are widely used for speeding up the iterative solution of systems of linear equations, often by transforming the system into one with lower condition number. Even though the condition number also serves as the determining constant in simple bounds for the numerical error of the solution, simple experiments and bounds show that such preconditioning on the matrix level is not guaranteed to reduce this error. Transformations on the operator level, on the other hand, improve both accuracy and speed of iterative methods as predicted by the change of the condition number. We propose to investigate such methods under a common framework, which we call full operator preconditioning, and show practical examples.
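As a minimal numerical illustration of matrix-level preconditioning (the construction below is a toy example of my own, not from the talk), symmetric Jacobi scaling removes bad row/column scaling and shrinks the condition number, even though, as the abstract notes, this alone does not guarantee a smaller numerical error:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
C = np.eye(n) + 0.01 * rng.standard_normal((n, n))
B = C @ C.T                                # well-conditioned SPD core
d = np.logspace(0, 4, n)                   # badly scaled rows/columns
A = (d[:, None] * B) * d[None, :]          # A = D B D is ill-conditioned

# Symmetric Jacobi preconditioning: scale by diag(A)^{-1/2} on both sides
s = 1.0 / np.sqrt(np.diag(A))
A_prec = (s[:, None] * A) * s[None, :]

cond_before = np.linalg.cond(A)
cond_after = np.linalg.cond(A_prec)
print(f"{cond_before:.2e} -> {cond_after:.2e}")
```

The condition number drops by many orders of magnitude, which speeds up iterative solvers; the talk's point is that the corresponding claim about numerical accuracy needs the operator-level view.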
A link for this talk will be sent to our mailing list a day or two in advance. If you are not on the list and wish to be sent a link, please send an email to @email.
Part of the Oxford Discrete Maths and Probability Seminar, held via Zoom. Please see the seminar website for details.
We study the minimum total weight of a disk triangulation using any number of vertices out of $\{1,\dots,n\}$ where the boundary is fixed and the $\binom{n}{3}$ triangles have independent rate-1 exponential weights. We show that, with high probability, the minimum weight is equal to $(c+o(1))n^{-1/2}\log n$ for an explicit constant $c$. Further, we prove that, with high probability, the minimum weights of a homological filling and a homotopical filling of the cycle $(123)$ are both attained by the minimum weight disk triangulation. We will discuss a related open problem concerning simple-connectivity of random simplicial complexes, where a similar phenomenon is conjectured to hold. Based on joint works with Itai Benjamini, Eyal Lubetzky, and Zur Luria.
What do random spanning trees, graph embeddings, random walks, simplices and graph curvature have in common? As you may have guessed from the title, they are indeed all intimately connected to the effective resistance on graphs! While originally invented as a tool to study electrical circuits, the effective resistance has proven time and again to be a graph characteristic with a variety of interesting and often surprising properties. Starting from a number of equivalent but complementary definitions of the effective resistance, we will take a stroll through some classical theorems (Rayleigh monotonicity, Foster's theorem), a few modern results (Klein's metricity, Fiedler's graph-simplex correspondence) and finally discuss a number of recent developments (variance on graphs, discrete curvature and graph embeddings).
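As a concrete illustration of the circuit definition, the effective resistance between two nodes can be computed from the Moore-Penrose pseudoinverse of the graph Laplacian; this is a standard computation, not specific to the talk:

```python
import numpy as np

def effective_resistance(L, a, b):
    """Effective resistance between nodes a and b, given the Laplacian L."""
    Lp = np.linalg.pinv(L)            # Moore-Penrose pseudoinverse of L
    e = np.zeros(L.shape[0])
    e[a], e[b] = 1.0, -1.0
    return e @ Lp @ e

# Path graph 0-1-2: two unit resistors in series give R(0, 2) = 2
L = np.array([[ 1, -1,  0],
              [-1,  2, -1],
              [ 0, -1,  1]], dtype=float)
print(effective_resistance(L, 0, 2))  # → 2.0
```

On this path graph one can also check Foster's theorem by hand: summing the effective resistance over the two edges gives $1 + 1 = n - 1 = 2$.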
Motivated by an attempt to construct a theory of quantum gravity as a perturbation around some flat background, Penrose showed that, despite Schwarzschild spacetime being asymptotically flat, its causal structure at infinity is inconsistent with that of Minkowski spacetime. This suggests that such a perturbative approach cannot possibly work. However, the proof of this inconsistency is specific to 4 spacetime dimensions. In this talk I will discuss how this result extends to higher (and lower) dimensions. More generally, I will consider examples of how the causal structure of asymptotically flat spacetimes is affected by dimension and by the presence of mass (both positive and negative). I will then show how these ideas can be used to prove a higher dimensional extension of the positive mass theorem of Penrose, Sorkin and Woolgar.
Stochastic quantisation is, broadly speaking, the use of a stochastic differential equation to construct a given probability distribution. Usually this refers to Markovian Langevin evolution with a given invariant measure. However, we will show that it is possible to construct other kinds of equations (elliptic stochastic partial differential equations) whose solutions have prescribed marginals. This connection was discovered in the 1980s by Parisi and Sourlas in the context of dimensional reduction of statistical field theories in random external fields. This purely probabilistic result has a proof which depends on a supersymmetric formulation of the problem, i.e. a formulation involving a non-commutative random field defined on a non-commutative space. This talk is based on joint work with S. Albeverio and F. C. de Vecchi.
Sieve methods are analytic tools that we can use to tackle problems in additive number theory. This talk will serve as a gentle introduction to the area. At the end we will discuss recent progress on a variation on the prime $k$-tuples conjecture which involves sums of two squares. No knowledge of sieves is required!
We propose a modulated free energy which combines the method previously developed by the speaker with the modulated energy introduced by S. Serfaty. This modulated free energy may be understood as introducing appropriate weights in the relative entropy to cancel the more singular terms involving the divergence of the flow. It allows us to treat singular interactions of gradient-flow type and to handle potentials with a large smooth part, a small attractive singular part and a large repulsive singular part. As an example, a full rigorous derivation (with quantitative estimates) of some chemotaxis models, such as the Patlak-Keller-Segel system in the subcritical regimes, is obtained.
We study a group theoretic analog of Dehn fillings of 3-manifolds and derive a spectral sequence to compute the cohomology of Dehn fillings of hyperbolically embedded subgroups. As applications, we generalize the results of Dahmani-Guirardel-Osin and Hull on SQ-universality and common quotients of acylindrically hyperbolic groups by adding cohomological finiteness conditions. This is joint work with Nansen Petrosyan.
In the talk I will survey the fast growing field of metric measure spaces satisfying a lower bound on Ricci Curvature, in a synthetic sense via optimal transport. Particular emphasis will be given to discuss how such (possibly non-smooth) spaces naturally (and usefully) extend the class of smooth Riemannian manifolds with Ricci curvature bounded below.
We will discuss recent progress in understanding (ordinary and generalized) symmetries, dualities and classification of superconformal field theories in 5d and 6d, which involves the study of M-theory and F-theory compactified on Calabi-Yau threefolds.
We consider weakly-coupled QFT in AdS at finite temperature. We compute the holographic thermal two-point function of scalar operators in the boundary theory. We present analytic expressions for leading corrections due to local quartic interactions in the bulk, with an arbitrary number of derivatives and for any number of spacetime dimensions. The solutions are fixed by judiciously picking an ansatz and imposing consistency conditions. The conditions include analyticity properties, consistency with the operator product expansion, and the Kubo-Martin-Schwinger condition. For the case without any derivatives we show agreement with an explicit diagrammatic computation. The structure of the answer is suggestive of a thermal Mellin amplitude. Additionally, we derive a simple dispersion relation for thermal two-point functions which reconstructs the function from its discontinuity.
Talking maths on YouTube is a lot of fun. Your audience will contain maths enthusiasts, young people, and the general public. These are people who are interested in what you have to say, and want to learn something new. Maths videos on YouTube can be used to teach maths, or to just show people something interesting. Making videos doesn't have to be technically difficult, but is good practice in explaining difficult concepts in clear and succinct ways. In this session we will discuss how to make your first YouTube video, including questions about content, presentation and video making.
Dr James Grime started making his first maths YouTube videos while working as a postdoc in 2008. James has made maths videos with Cambridge University, the Royal Institution, and MathsWorldUK, and is also a presenter on the popular YouTube channel Numberphile, which now has over 3 million subscribers worldwide.
The session will be a panel discussion addressing practical aspects of doing a research degree. We will take questions from the audience so will discuss whatever people wish to ask us, but we expect to talk about the process of applying, why you might want to consider doing a research degree, the experience of doing research, and what people do after they have completed their degree.
Signalling pathways can be modelled as biochemical reaction networks. When the kinetics are assumed to follow mass-action kinetics, the resulting mathematical model is a polynomial dynamical system. I will give an overview of approaches to analysing these models with steady-state data using computational algebraic geometry and statistics. Then I will present how to analyse such models with time-course data using differential algebra and geometry for model identifiability. Finally, I will present how topological data analysis can help distinguish models and data.
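A minimal sketch of the first step, assuming a toy reversible reaction A + B <-> C with made-up rate constants: mass-action kinetics turns the network into a polynomial ODE system, and its steady states are solutions of polynomial equations:

```python
# Toy reaction network A + B <-> C under mass-action kinetics
# (illustrative rate constants; not a model from the talk)
k1, k2 = 2.0, 1.0

def rhs(a, b, c):
    """Polynomial right-hand sides: da/dt, db/dt, dc/dt."""
    v = k1 * a * b - k2 * c          # net forward reaction rate
    return -v, -v, v

# Crude forward-Euler integration toward the steady state
a, b, c = 1.0, 1.0, 0.0
for _ in range(100_000):
    da, db, dc = rhs(a, b, c)
    a, b, c = a + 1e-3 * da, b + 1e-3 * db, c + 1e-3 * dc

# At steady state the polynomial k1*a*b - k2*c vanishes,
# so steady-state data constrains solutions of polynomial equations.
print(a, b, c)
```

With the conservation laws a = b and a + c = 1, the steady state solves the quadratic 2a^2 + a - 1 = 0, i.e. a = 1/2, and the algebraic-geometry viewpoint studies exactly such polynomial steady-state varieties.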
When faced with a data analysis, learning, or statistical inference problem, the amount and quality of data available fundamentally determines whether such tasks can be performed with certain levels of accuracy. With the growing size of datasets however, it is crucial not only that the underlying statistical task is possible, but also that it is doable by means of efficient algorithms. In this talk we will discuss methods aiming to establish limits of when statistical tasks are possible with computationally efficient methods or when there is a fundamental ``Statistical-to-Computational gap'' in which an inference task is statistically possible but inherently computationally hard. We will focus on Hypothesis Testing and the ``Low Degree Method'' and also address hardness of certification via ``quiet plantings''. Guiding examples will include Sparse PCA, bounds on the Sherrington Kirkpatrick Hamiltonian, and lower bounds on Chromatic Numbers of random graphs.
Deep convolutional networks achieve spectacular performance that remains mostly not understood. Numerical experiments show that they classify by progressively concentrating each class in separate regions of a low-dimensional space. To explain these properties, we introduce a concentration and separation mechanism with multiscale tight frame contractions. Applications are shown for image classification and statistical physics models of cosmological structures and turbulent fluids.
The development of high-frequency and algorithmic trading has considerably reduced the bid-ask spread by increasing liquidity in limit order books. Beyond the problem of optimal placement of market and limit orders, the possibility to cancel orders for free leaves room for price manipulations, one of which is spoofing. Detecting spoofing from a regulatory viewpoint is challenging due to the sheer number of orders and the difficulty of discriminating between legitimate and manipulative flows of orders. However, there is empirical evidence that volume imbalance, reflecting offer and demand on both sides of the limit order book, has an impact on subsequent price movements. Spoofers use this effect to artificially modify the imbalance by posting limit orders, and then execute market orders at subsequently better prices while cancelling their previous limit orders at high speed. In this work we set up a model to determine where a spoofer would place its limit orders to maximize its gains, as a function of the impact of the imbalance on the price movement. We study the solution of this non-local optimization problem as a function of the imbalance. With this at hand, we calibrate on real data from TMX the impact of the imbalance (as a function of its depth) on the resulting price movement. Based on this calibration and the theoretical results, we then provide methods and numerical results for detecting possible spoofing behavior in real time, based on Wasserstein distances. Joint work with Tao Xuan (SJTU), Ling Lan (SJTU) and Andrew Day (Western University)
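For reference, a common definition of the volume imbalance at the best quotes is the signed, normalized difference of bid and ask volume; the talk's depth-dependent variant is not specified in the abstract, so this is only the standard textbook quantity:

```python
def imbalance(bid_volume, ask_volume):
    """Signed imbalance in [-1, 1]: positive when demand outweighs supply."""
    return (bid_volume - ask_volume) / (bid_volume + ask_volume)

# A demand-heavy book: the state a spoofer tries to fake with limit orders
# it never intends to execute, before cancelling them at high speed
print(imbalance(300, 100))  # → 0.5
```

A spoofer inflates `bid_volume` (or `ask_volume`) with cancellable limit orders to push this statistic, and hence the expected price move, in the desired direction.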
Materials made from a mixture of liquid and solid are, instinctively, very obviously complex. From dilatancy (the reason wet sand becomes dry when you step on it) to extreme shear-thinning (quicksand) or shear-thickening (cornflour oobleck) there is a wide range of behaviours to explain and predict. I'll discuss the seemingly simple case of solid spheres suspended in a Newtonian fluid matrix, which still has plenty of surprises up its sleeve.
Note the day is a Thursday!
Let $k$ be a field and $A$ a $k$-algebra. The classical Quillen's Lemma states that if $A$ is equipped with an exhaustive filtration such that the associated graded ring is a commutative finitely generated $k$-algebra, then for any finitely generated $A$-module $M$, every element of the endomorphism ring of $M$ is algebraic over $k$. In particular, Quillen's Lemma may be applied to the enveloping algebra of a finite dimensional Lie algebra. I aim to present an affinoid version of Quillen's Lemma which strengthens a theorem proved by Ardakov and Wadsley. Depending on time, I will show how this leads to an (almost) classification of the primitive spectrum of the affinoid enveloping algebra of a semisimple Lie algebra.
Whitney elements on simplices are perhaps the most widely used finite elements in computational electromagnetics. They offer the simplest construction of polynomial discrete differential forms on simplicial complexes. Their associated degrees of freedom (dofs) have a very clear physical meaning and give a recipe for discretizing physical balance laws, e.g., Maxwell’s equations. As interest grew for the use of high order schemes, such as hp-finite element or spectral element methods, higher-order extensions of Whitney forms have become an important computational tool, appreciated for their better convergence and accuracy properties. However, it has remained unclear what kind of cochains such elements should be associated with: Can the corresponding dofs be assigned to precise geometrical elements of the mesh, just as, for instance, a degree of freedom for the space of Whitney 1-forms belongs to a specific edge? We address this localization issue. Why is this an issue? The existing constructions of high order extensions of Whitney elements follow the traditional FEM path of using higher and higher “moments” to define the needed dofs. As a result, such high order finite k-elements in d dimensions include dofs associated to q-simplices, with k < q ≤ d, whose physical interpretation is obscure. The present paper offers an approach based on the so-called “small simplices”, a set of subsimplices obtained by homothetic contractions of the original mesh simplices, centered at mesh nodes (or more generally, when going up in degree, at points of the principal lattice of each original simplex). Degrees of freedom of the high-order Whitney k-forms are then associated with small simplices of dimension k only. We provide an explicit basis for these elements on simplices and we justify this approach from a geometric point of view (in the spirit of Hassler Whitney's approach, still successful 30 years after his death).
A link for this talk will be sent to our mailing list a day or two in advance. If you are not on the list and wish to be sent a link, please send email to @email.
A dagger category is a category where for every morphism f:x --> y there is a chosen adjoint f*:y --> x, as for example in the category of Hilbert spaces. I will explain this definition in elementary terms and give a few examples. The only prerequisites for this talk are the notions of category, functor, and Hilbert space.
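In the motivating example of finite-dimensional Hilbert spaces, morphisms are matrices and the chosen adjoint is the conjugate transpose. A quick numerical check of the dagger laws (the matrices are arbitrary choices of mine):

```python
import numpy as np

# In finite-dimensional Hilbert spaces, the dagger of a linear map
# is its conjugate transpose
dagger = lambda m: m.conj().T

f = np.array([[1, 2j], [0, 1]])   # f : C^2 --> C^2
g = np.array([[0, 1], [1j, 0]])   # g : C^2 --> C^2

# Dagger laws: contravariance (g f)† = f† g†, and involutivity f†† = f
assert np.allclose(dagger(g @ f), dagger(f) @ dagger(g))
assert np.allclose(dagger(dagger(f)), f)
```

The point of the definition is that this assignment is part of the structure: a dagger category remembers a specific choice of adjoint for every morphism, rather than merely asserting that adjoints exist.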
Dagger categories are a great categorical framework for some concepts from functional analysis, such as C*-algebras, and they also allow us to state Atiyah's definition of unitary topological field theories in categorical language. There is however a problem with dagger categories: they are what category theorists like to call 'evil'. This isn't really meant as a moral judgement; it just means that many ways of thinking about ordinary categories don't quite translate to dagger categories.
For example, not every fully faithful and essentially surjective dagger functor is also a dagger equivalence. I will present a notion of 'indefinite completion' that I came up with to describe dagger categories in less 'evil' terms. (Those of you who know Karoubi completion will see a lot of similarities.) I'll also explain how this can be used to compute categories of dagger functors, and more specifically groupoids of unitary TFTs.