Thu, 01 Feb 2024
14:00
Lecture Room 3

A strongly polynomial algorithm for the minimum-cost generalized flow problem

Laszlo Vegh
(LSE)
Abstract

We give a strongly polynomial algorithm for minimum-cost generalized flow, and as a consequence, for all linear programs with at most two nonzero entries per row, or at most two nonzero entries per column. While strongly polynomial algorithms for the primal and dual feasibility problems have been known for a long time, the various combinatorial approaches used for those problems did not seem to carry over to the minimum-cost variant.

Our approach is to show that the ‘subspace layered least squares’ interior point method, from earlier joint work with Allamigeon, Dadush, Loho and Natura, requires only a strongly polynomial number of iterations for minimum-cost generalized flow. We achieve this by bounding the straight-line complexity, a notion introduced in the same paper. The talk will give an overview of the interior point method as well as the combinatorial straight-line complexity analysis for this particular setting. This is joint work with Daniel Dadush, Zhuan Khye Koh, Bento Natura, and Neil Olver.

Tue, 18 May 2021
15:15
Virtual

Factors in randomly perturbed graphs

Amedeo Sgueglia
(LSE)
Further Information

Part of the Oxford Discrete Maths and Probability Seminar, held via Zoom. Please see the seminar website for details.

Abstract

We study the model of randomly perturbed dense graphs, which is the union of any $n$-vertex graph $G_\alpha$ with minimum degree at least $\alpha n$ and the binomial random graph $G(n,p)$. In this talk, we shall examine the central question in this area: when does $G_\alpha \cup G(n,p)$ contain an $H$-factor, i.e. a spanning subgraph consisting of vertex-disjoint copies of the graph $H$? We offer several new sharp and stability results.
This is joint work with Julia Böttcher, Olaf Parczyk, and Jozef Skokan.
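To make the notion of an $H$-factor concrete, here is a minimal brute-force check (an illustrative sketch, unrelated to the proof techniques of the talk) for the case $H = K_3$, i.e. a partition of the vertex set into vertex-disjoint triangles:

```python
from itertools import combinations

def has_triangle_factor(n, edges):
    """Brute-force test for a K_3-factor: a partition of the n vertices
    into vertex-disjoint triples, each of which spans a triangle."""
    adj = {frozenset(e) for e in edges}
    def solve(remaining):
        if not remaining:
            return True
        v = min(remaining)
        rest = remaining - {v}
        # try every triangle through the smallest remaining vertex
        for a, b in combinations(sorted(rest), 2):
            if {frozenset((v, a)), frozenset((v, b)), frozenset((a, b))} <= adj:
                if solve(rest - {a, b}):
                    return True
        return False
    return n % 3 == 0 and solve(frozenset(range(n)))

# K_6 (complete graph on 6 vertices) trivially has a triangle factor;
# the 6-cycle C_6 contains no triangle at all.
k6 = [(i, j) for i in range(6) for j in range(i + 1, 6)]
c6 = [(i, (i + 1) % 6) for i in range(6)]
print(has_triangle_factor(6, k6))  # True
print(has_triangle_factor(6, c6))  # False
```

The exhaustive search is exponential in $n$; the point of the results discussed in the talk is to guarantee existence of such factors, not to find them.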

Tue, 05 May 2020
14:00
Virtual

Ryser's conjecture and more

Liana Yepremyan
(LSE)
Further Information

Part of the Oxford Discrete Maths and Probability Seminar, held via Zoom. Please see the seminar website for details.

Abstract

A Latin square of order $n$ is an $n \times n$ array filled with $n$ symbols such that each symbol appears exactly once in every row and every column, and a transversal is a collection of cells no two of which share a row, column or symbol. The study of Latin squares goes back more than 200 years to the work of Euler. One of the most famous open problems in this area is a conjecture of Ryser, Brualdi and Stein from the 1960s, which says that every Latin square of order $n$ contains a transversal of size $n-1$. A closely related problem is a 40-year-old conjecture of Brouwer that every Steiner triple system of order $n$ contains a matching of size $\frac{n-4}{3}$. The third problem we'd like to mention asks how many distinct symbols in a Latin array suffice to guarantee a full transversal. In this talk we discuss a novel approach to attacking these problems. Joint work with Peter Keevash, Alexey Pokrovskiy and Benny Sudakov.
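As a concrete illustration of these definitions (a toy sketch, not related to the methods of the talk), one can verify a full transversal of a small Latin square directly:

```python
def is_latin_square(sq):
    """Each of the n symbols appears exactly once in every row and column."""
    n = len(sq)
    return all(len(set(row)) == n for row in sq) and \
           all(len(set(col)) == n for col in zip(*sq))

def is_transversal(sq, cells):
    """A full transversal: n cells, no two sharing a row, column or symbol."""
    n = len(sq)
    rows = {r for r, c in cells}
    cols = {c for r, c in cells}
    syms = {sq[r][c] for r, c in cells}
    return len(cells) == n and len(rows) == n and len(cols) == n and len(syms) == n

# The cyclic Latin square of order 3: entry (r, c) holds (r + c) mod 3.
sq = [[(r + c) % 3 for c in range(3)] for r in range(3)]
diag = [(0, 0), (1, 1), (2, 2)]   # symbols 0, 2, 1: all distinct
print(is_latin_square(sq), is_transversal(sq, diag))  # True True
```

For even $n$ the cyclic square has no full transversal at all, which is why the Ryser-Brualdi-Stein conjecture asks only for a transversal of size $n-1$.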

Thu, 03 May 2018

16:00 - 17:30
L4

Generalized McKean-Vlasov stochastic control problems

Beatrice Acciaio
(LSE)
Abstract

I will consider McKean-Vlasov stochastic control problems where the cost functions and the state dynamics depend upon the joint distribution of the controlled state and the control process. First, I will provide a suitable version of the Pontryagin stochastic maximum principle, showing that, in the present general framework, pointwise minimization of the Hamiltonian with respect to the control is not a necessary optimality condition. Then I will take a different perspective, and present a variational approach to study a weak formulation of such control problems, thereby establishing a new connection between those and optimal transport problems on path space.

The talk is based on a joint project with J. Backhoff-Veraguas and R. Carmona.

Thu, 01 Dec 2016

16:00 - 17:30
L4

A Bayesian Methodology for Systemic Risk Assessment in Financial Networks

Luitgard A. M. Veraart
(LSE)
Abstract

We develop a Bayesian methodology for systemic risk assessment in financial networks such as the interbank market. Nodes represent participants in the network and weighted directed edges represent liabilities. Often, for every participant, only the total liabilities and total assets within this network are observable. However, systemic risk assessment needs the individual liabilities. We propose a model for the individual liabilities, which, following a Bayesian approach, we then condition on the observed total liabilities and assets and, potentially, on certain observed individual liabilities. We construct a Gibbs sampler to generate samples from this conditional distribution. These samples can be used in stress testing, giving probabilities for the outcomes of interest. As one application we derive default probabilities of individual banks and discuss their sensitivity with respect to the prior information used to model the network. An R-package implementing the methodology is provided. (This is joint work with Axel Gandy (Imperial College London).)
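To illustrate the structure of the observation model (a toy sketch of the general idea, not the authors' sampler or their R package): with only row sums (total liabilities) and column sums (total assets) observed, many liability matrices are consistent with the data, and a standard Gibbs-style move resamples a 2x2 off-diagonal submatrix while preserving both margins. The off-diagonal choice keeps the zero diagonal (no self-liabilities) intact.

```python
import random

def margin_preserving_move(L, i, k, j, l, rng):
    """Resample the 2x2 submatrix (rows i,k; columns j,l) of the integer
    liability matrix L uniformly among nonnegative fillings with the same
    row and column sums; every network total is unchanged."""
    a, b = L[i][j], L[i][l]
    c, d = L[k][j], L[k][l]
    ri, rk = a + b, c + d              # submatrix row sums
    cj = a + c                         # submatrix column-j sum
    lo, hi = max(0, cj - rk), min(ri, cj)
    x = rng.randint(lo, hi)            # new value for L[i][j]
    L[i][j], L[i][l] = x, ri - x
    L[k][j], L[k][l] = cj - x, rk - (cj - x)

rng = random.Random(1)
L = [[0, 2, 3, 1],
     [4, 0, 2, 2],
     [1, 5, 0, 3],
     [2, 1, 4, 0]]
row_sums = [sum(r) for r in L]
col_sums = [sum(c) for c in zip(*L)]
margin_preserving_move(L, 0, 1, 2, 3, rng)   # rows 0,1; columns 2,3
print([sum(r) for r in L] == row_sums, [sum(c) for c in zip(*L)] == col_sums)  # True True
```

Chaining such moves walks through the set of matrices with the observed margins; the paper's sampler additionally targets the conditional distribution implied by the Bayesian model.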

Tue, 10 Mar 2015
14:00
C4

Algebraic characterization of autonomy of PDEs

Amol Sasane
(LSE)
Abstract

Given an ideal $I$ in the polynomial ring $\mathbb{C}[x_1,\dots,x_n]$, the variety $V(I)$ of $I$ is the set of common zeros in $\mathbb{C}^n$ of all the polynomials belonging to $I$. In algebraic geometry, one tries to link geometric properties of $V(I)$ with algebraic properties of $I$.

Analogously, given a system of linear, constant-coefficient partial differential equations, one can consider its zeros, that is, its solutions in various function and distribution spaces. One could then hope to link analytic properties of the set of solutions with algebraic properties of the polynomials which describe the PDEs.

In this talk, we will focus on one such analytic property, called autonomy, and we will provide an algebraic characterization for it.
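As a toy illustration of the first paragraph (my own example, not from the talk): for the ideal $I = (x^2 + y^2 - 1,\ x - y)$ in $\mathbb{C}[x, y]$, the variety $V(I)$ is the intersection of the unit circle with the line $x = y$, and membership of a candidate point can be checked directly against the generators:

```python
import math

# Generators of the ideal I = (x^2 + y^2 - 1, x - y) in C[x, y].
generators = [
    lambda x, y: x**2 + y**2 - 1,   # unit circle
    lambda x, y: x - y,             # line x = y
]

def in_variety(x, y, tol=1e-12):
    """A point lies in V(I) iff every generator of I vanishes there
    (up to floating-point tolerance)."""
    return all(abs(g(x, y)) < tol for g in generators)

s = 1 / math.sqrt(2)
print(in_variety(s, s), in_variety(1.0, 0.0))  # True False
```

Here $V(I)$ consists of exactly the two points $\pm(1/\sqrt{2}, 1/\sqrt{2})$, a simple instance of the geometry-algebra dictionary the abstract alludes to.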

Thu, 23 Oct 2014

16:00 - 17:30
L4

The Use of Randomness in Time Series Analysis (Joint Nomura-OMI Seminar)

Professor Piotr Fryzlewicz
(LSE)
Abstract
This is an exploratory talk in which we describe different potential uses of randomness in time series analysis.

In the first part, we talk about Wild Binary Segmentation for change-point detection, where randomness is used as a device for sampling from the space of all possible contrasts (change-point detection statistics) in order to reduce the computational complexity from cubic to just over linear in the number of observations, without compromising on the accuracy of change-point estimates. We also discuss an interesting related measure of change-point certainty/importance, and extensions to more general nonparametric problems.
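A minimal sketch of the random-interval idea behind Wild Binary Segmentation (single change-point, noiseless toy data; the function names are my own, not from the WBS paper):

```python
import math, random

def cusum(x, s, e, b):
    """CUSUM contrast on x[s..e] (inclusive) at split b: a large |value|
    indicates a mean change between x[s..b] and x[b+1..e]."""
    n = e - s + 1
    n1, n2 = b - s + 1, e - b
    m1 = sum(x[s:b + 1]) / n1
    m2 = sum(x[b + 1:e + 1]) / n2
    return math.sqrt(n1 * n2 / n) * (m1 - m2)

def wbs_single(x, n_intervals=200, seed=0):
    """Locate one change-point by maximising |CUSUM| over many random
    intervals (plus the full sample) rather than the full sample alone."""
    rng = random.Random(seed)
    intervals = [(0, len(x) - 1)]
    for _ in range(n_intervals):
        s = rng.randrange(0, len(x) - 1)
        intervals.append((s, rng.randrange(s + 1, len(x))))
    best, best_b = -1.0, None
    for s, e in intervals:
        for b in range(s, e):
            v = abs(cusum(x, s, e, b))
            if v > best:
                best, best_b = v, b
    return best_b

x = [0.0] * 50 + [5.0] * 50
print(wbs_single(x))  # 49 (last index of the first segment)
```

The computational point of the talk is that drawing intervals at random, instead of scanning all of them, keeps the cost just over linear in the number of observations.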

In the second part, we use random contemporaneous linear combinations of time series panel data coming from high-dimensional factor models and argue that this gives the effect of "compressively sensing" the components of the multivariate time series, often with not much loss of information but with reduction in the dimensionality of the model.
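A toy version of the second part (purely illustrative; the one-factor model and all dimensions are my assumptions): a $p$-dimensional panel is compressed into a handful of random contemporaneous linear combinations, and the common factor remains visible in the compressed data.

```python
import random

rng = random.Random(0)
T, p, k = 300, 100, 8              # time points, panel dimension, sketches

# One-factor panel: X[t][i] = loading[i] * f[t] + noise.
f = [rng.gauss(0, 1) for _ in range(T)]
loading = [rng.gauss(0, 1) for _ in range(p)]
X = [[loading[i] * f[t] + rng.gauss(0, 1) for i in range(p)] for t in range(T)]

# k random contemporaneous linear combinations of the p series.
W = [[rng.gauss(0, 1) for _ in range(p)] for _ in range(k)]
Y = [[sum(W[j][i] * X[t][i] for i in range(p)) for j in range(k)] for t in range(T)]

def corr(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    cu = [a - mu for a in u]
    cv = [b - mv for b in v]
    num = sum(a * b for a, b in zip(cu, cv))
    den = (sum(a * a for a in cu) * sum(b * b for b in cv)) ** 0.5
    return num / den

# At least one of the k compressed series tracks the factor noticeably,
# even though the dimension dropped from p = 100 to k = 8.
best = max(abs(corr([Y[t][j] for t in range(T)], f)) for j in range(k))
print(best > 0.2)
```

This is only meant to convey the "compressive sensing" intuition; the talk concerns genuine high-dimensional factor models, not this stylized setup.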

In the final part, we speculate on the use of random filtering in time series analysis. As an illustration, we show how the appropriate use of this device can reduce the problem of estimating changes in the autocovariance structure of the process to the problem of estimating changes in variance, the latter typically being an easier task.
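The reduction in the final part can be illustrated with a simple identity (my sketch of the general idea, not the speaker's construction): for a filter $a$, the filtered series $y_t = \sum_k a_k x_{t-k}$ has variance $a^\top \Gamma a$, where $\Gamma$ is the autocovariance matrix of $x$. So a change in autocovariance that leaves the marginal variance fixed still shows up as a variance change after filtering:

```python
import random

def ar1(phi, n, rng):
    """Simulate an AR(1) series x_t = phi * x_{t-1} + eps_t."""
    x, prev = [], 0.0
    for _ in range(n):
        prev = phi * prev + rng.gauss(0.0, 1.0)
        x.append(prev)
    return x

def var(v):
    m = sum(v) / len(v)
    return sum((u - m) ** 2 for u in v) / len(v)

rng = random.Random(0)
n = 20000
# Two regimes with the SAME marginal variance 1/(1 - 0.25) but opposite
# lag-1 autocovariance: a plain variance-based detector sees nothing ...
x1, x2 = ar1(0.5, n, rng), ar1(-0.5, n, rng)
# ... until we filter with a = (1, 1): Var(y) = 2*gamma_0 + 2*gamma_1,
# roughly 3 times larger in the first regime than in the second.
y1 = [x1[t] + x1[t - 1] for t in range(1, n)]
y2 = [x2[t] + x2[t - 1] for t in range(1, n)]
print(round(var(x1) / var(x2), 1), round(var(y1) / var(y2), 1))
```

In this sense a change in autocovariance structure is turned into a change in variance of the filtered series, which is typically the easier estimation problem.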
 
Tue, 03 Jun 2014

12:30 - 13:30
Oxford-Man Institute

Information Aggregation in a Competitive Economy

Rohit Rahi
(LSE)
Abstract

We consider the market for a risky asset for which agents have interdependent private valuations. We study competitive rational expectations equilibria under the standard CARA-normal assumptions. Equilibrium is partially revealing even though there are no noise traders. Complementarities in information acquisition arise naturally in this setting. We characterize stable equilibria with endogenous information acquisition. Our framework encompasses the classical REE models in the CARA-normal tradition.

Wed, 13 Nov 2013
16:00
C4

Baire, Berz, Burton Jones and Steinhaus: linearity from subadditivity

Adam Ostaszewski
(LSE)
Abstract

Berz used the Hahn-Banach Theorem over $\mathbb{Q}$ to prove that the graph of a measurable subadditive function that is non-negatively $\mathbb{Q}$-homogeneous consists of two lines through the origin. I will give a proof using the density topology and Steinhaus’ Sum-set Theorem. This dualizes to a much simpler category version: a ‘Baire-Berz Theorem’. I will give the broader picture of this using F. Burton Jones’ analysis of additivity versus linearity. Shift-compactness and special subsets of R will be an inevitable ingredient. The talk draws on recent work with Nick Bingham and separately with Harry I. Miller.
