Fri, 01 Jun 2018
12:00
N3.12

Offset Hypersurfaces and Persistent Homology of Algebraic Varieties

Maddie Weinstein
(UC Berkeley)
Abstract

We will discuss the algebraicity of two quantities central to the computation of persistent homology. We will also connect persistent homology and algebraic optimization. Namely, we will express the degree corresponding to the distance variable of the offset hypersurface in terms of the Euclidean distance degree of the starting variety, obtaining a new way to compute these degrees. Finally, we will describe the non-properness locus of the offset construction and use this to describe the set of points that are topologically interesting (the medial axis and center points of the bounded components of the complement of the variety) and relevant to the computation of persistent homology.

Fri, 25 May 2018
12:00
N3.12

Persistent homology and the approximation of intrinsic volumes

Florian Pausinger
(Queen's University Belfast)
Abstract

Persistent homology is an algebraic tool for quantifying topological features of shapes and functions, which has recently found wide applications in data and shape analysis. In the first and introductory part of this talk I recall the underlying ideas and basic concepts of this very active field of research. In the second part, I plan to sketch a concrete application of this concept to digital image processing. 

Fri, 18 May 2018
12:00
N3.12

Which neural codes are convex?

Anne Shiu
(Texas A&M University)
Abstract

This talk focuses on algebraic and combinatorial-topological problems motivated by neuroscience. Neural codes allow the brain to represent, process, and store information about the world. Combinatorial codes, composed of binary patterns of neural activity, encode information via the collective behavior of populations of neurons. A code is called convex if its codewords correspond to regions defined by an arrangement of convex open sets in Euclidean space. Convex codes have been observed experimentally in many brain areas, including sensory cortices and the hippocampus, where neurons exhibit convex receptive fields. What makes a neural code convex? That is, how can we tell from the intrinsic structure of a code if there exists a corresponding arrangement of convex open sets?

This talk describes how to use tools from combinatorics and commutative algebra to uncover a variety of signatures of convex and non-convex codes.

This talk is based on joint works with Aaron Chen and Florian Frick, and with Carina Curto, Elizabeth Gross, Jack Jeffries, Katie Morrison, Mohamed Omar, Zvi Rosen, and Nora Youngs.
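One combinatorial signature of convexity that appears in this literature (stated here as background, not as a result of the talk) is intersection-completeness: a code that is closed under taking intersections of its codewords is known to be convex, though the converse fails. A minimal sketch of the check, with made-up example codes:

```python
# Sketch: check whether a combinatorial neural code (a set of codewords,
# each a set of active-neuron indices) is closed under intersection.
# Intersection-completeness is one known *sufficient* condition for
# convexity; the example codes below are illustrative, not from the talk.

from itertools import combinations

def is_intersection_complete(code):
    """True if the intersection of any two codewords is again a codeword."""
    codewords = {frozenset(c) for c in code}
    return all(a & b in codewords for a, b in combinations(codewords, 2))

# A code containing all pairwise intersections of its codewords:
closed_code = [{1, 2, 3}, {1, 2}, {1}, set()]
# A code missing the intersection {2} of its two codewords:
gap_code = [{1, 2}, {2, 3}]

print(is_intersection_complete(closed_code))  # True
print(is_intersection_complete(gap_code))     # False
```

Codes failing this check may still be convex, which is what makes the classification question in the talk nontrivial.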

Fri, 11 May 2018
12:00
N3.12

Multi-parameter Topological Data Analysis

Steve Oudot
(Ecole Polytechnique)
Abstract

How can we adapt the Topological Data Analysis (TDA) pipeline to use several filter functions at the same time? Two orthogonal approaches can be considered: (1) running the standard 1-parameter pipeline and doing statistics on the resulting barcodes; (2) running a multi-parameter version of the pipeline, still to be defined. In this talk I will present two recent contributions, one for each approach. The first contribution considers intrinsic compact metric spaces and shows that the so-called Persistent Homology Transform (PHT) is injective over a dense subset of those. When specialized to metric graphs, our analysis yields a stronger result, namely that the PHT is injective over a subset of full measure, which allows for sufficient statistics. The second contribution investigates the bi-parameter version of the TDA pipeline and shows a decomposition result "à la Crawley-Boevey" for a subcategory of the 2-parameter persistence modules called "exact modules". This result has an impact on the study of interlevel-sets persistence and on that of sheaves of vector spaces on the real line. 

This is joint work with Elchanan Solomon on the one hand, and with Jérémy Cochoy on the other.

Fri, 04 May 2018
12:00
N3.12

Geometric invariants for Chemical Reaction Networks

Michael Adamer
(University of Oxford)
Abstract

Steady state chemical reaction models can be thought of as algebraic varieties whose properties are determined by the network structure. In experimental set-ups we often encounter the problem of noisy data points for which we want to find the corresponding steady state predicted by the model. Depending on the network there may be many such points, the number of which is given by the Euclidean distance degree (ED degree). In this talk I show how certain properties of networks relate to the ED degree and how the runtime of numerical algebraic geometry computations scales with the ED degree.
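As a standard toy illustration of the ED degree (not specific to the talk's reaction networks): for the parabola y = x^2 and a data point (a, b), the critical points of the squared distance (x-a)^2 + (x^2-b)^2 satisfy the cubic 4x^3 + (2-4b)x - 2a = 0, so the ED degree is 3, while the number of *real* critical points depends on the data point. A minimal sketch counting real critical points by sign changes:

```python
# Toy ED degree example (not from the talk): nearest points on the
# parabola y = x^2 to a data point (a, b). Critical points of the squared
# distance satisfy 4x^3 + (2 - 4b)x - 2a = 0, a cubic, so the ED degree
# is 3; the count of real solutions varies with (a, b).

def count_real_critical_points(a, b, lo=-100.0, hi=100.0, steps=200000):
    """Count sign changes of the cubic 4x^3 + (2-4b)x - 2a on [lo, hi]."""
    f = lambda x: 4 * x**3 + (2 - 4 * b) * x - 2 * a
    count, prev = 0, f(lo)
    h = (hi - lo) / steps
    for i in range(1, steps + 1):
        cur = f(lo + i * h)
        if prev == 0 or prev * cur < 0:
            count += 1
        prev = cur
    return count

print(count_real_critical_points(0, 2))   # 3: x = 0 and x = ±sqrt(3/2)
print(count_real_critical_points(0, -1))  # 1: only x = 0
```

The ED degree bounds the number of complex critical points a numerical solver must track, which is why runtimes scale with it.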

Fri, 27 Apr 2018
12:00
N3.12

Multiparameter Persistence Landscapes

Oliver Vipond
(Oxford University)
Abstract

Single parameter persistent homology has proven to be a useful data analytic tool and single parameter persistence modules enjoy a concise description as a barcode, a complete invariant. [Bubenik, 2012] derived a topological summary closely related to the barcode called the persistence landscape which is amenable to statistical analysis and machine learning techniques.

The theory of multidimensional persistence modules is presented in [Carlsson and Zomorodian, 2009]; unlike the single-parameter case, where one may associate a barcode to a module, there is no analogous complete discrete invariant in the multiparameter setting. We propose an incomplete invariant derived from the rank invariant associated to a multiparameter persistence module, which generalises the single-parameter persistence landscape of [Bubenik, 2012] and satisfies similar stability properties with respect to the interleaving distance. Our invariant lies naturally in a Banach space and so is endowed with a distance function; it is also well suited to statistical analysis, since there is a uniquely defined mean associated to multiple landscapes. We shall present computational examples in the 2-parameter case using the RIVET software presented in [Lesnick and Wright, 2015].
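The single-parameter landscape of [Bubenik, 2012] that this construction generalises can be computed directly from a barcode: the k-th landscape function at t is the k-th largest "tent" value max(0, min(t - birth, death - t)) over the bars. A minimal sketch with a made-up barcode:

```python
# Minimal sketch of the single-parameter persistence landscape of
# [Bubenik, 2012]: for a barcode of (birth, death) pairs, the k-th
# landscape function at t is the k-th largest tent value
# max(0, min(t - birth, death - t)). The barcode below is made up.

def landscape(barcode, k, t):
    """k-th landscape function (k = 1, 2, ...) evaluated at t."""
    tents = sorted(
        (max(0.0, min(t - b, d - t)) for b, d in barcode),
        reverse=True,
    )
    return tents[k - 1] if k <= len(tents) else 0.0

barcode = [(0.0, 4.0), (1.0, 3.0)]
print(landscape(barcode, 1, 2.0))  # 2.0 (tent of the long bar)
print(landscape(barcode, 2, 2.0))  # 1.0 (tent of the short bar)
print(landscape(barcode, 3, 2.0))  # 0.0 (only two bars)
```

Because landscapes are functions rather than multisets of intervals, they can be averaged pointwise, which is the property the multiparameter generalisation aims to preserve.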

Tue, 12 Jun 2018
14:30
L2

Random graph coloring and the cavity method

Will Perkins
(Birmingham)
Abstract

The last remaining open problem from Erdős and Rényi's original paper on random graphs is the following: for q at least 3, what is the largest d so that the random graph G(n,d/n) is q-colorable with high probability?  A lot of interesting work in probabilistic combinatorics has gone into proving better and better bounds on this q-coloring threshold, but the full answer remains elusive.  However, a non-rigorous method from the statistical physics of glasses - the cavity method - gives a precise prediction for the threshold.  I will give an introduction to the cavity method, with random graph coloring as the running example, and describe recent progress in making parts of the method rigorous, emphasizing the role played by tools from extremal combinatorics.  Based on joint work with Amin Coja-Oghlan, Florent Krzakala, and Lenka Zdeborová.
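To get a hands-on feel for the question (purely illustrative, and unrelated to the cavity method), one can sample G(n, d/n) and greedily colour it; greedy colouring only upper-bounds the chromatic number, so this says nothing rigorous about the threshold. A sketch:

```python
# Illustrative only: sample an Erdős–Rényi graph G(n, d/n) and greedily
# colour it. First-fit greedy gives an upper bound on the chromatic
# number, so this merely gives a feel for the q-colourability question.

import random
from itertools import combinations

def sample_gnp(n, p, rng):
    """Adjacency sets of G(n, p), each edge present independently."""
    adj = {v: set() for v in range(n)}
    for u, v in combinations(range(n), 2):
        if rng.random() < p:
            adj[u].add(v)
            adj[v].add(u)
    return adj

def greedy_colors(adj):
    """Number of colours used by first-fit greedy colouring."""
    color = {}
    for v in adj:
        used = {color[u] for u in adj[v] if u in color}
        color[v] = min(c for c in range(len(adj)) if c not in used)
    return max(color.values()) + 1 if color else 0

rng = random.Random(0)
n, d = 200, 4.0
g = sample_gnp(n, d / n, rng)
print(greedy_colors(g))  # typically overshoots the true chromatic number
```

Locating the exact q-colourability threshold in d is precisely what the cavity method predicts and what rigorous work tries to confirm.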

Tue, 05 Jun 2018
14:30
L6

Fractional decompositions of dense graphs

Richard Montgomery
(Cambridge)
Abstract

It is difficult to determine when a graph G can be edge-covered by edge-disjoint copies of a fixed graph F, that is, when it has an F-decomposition. However, if G is large and has a high minimum degree then it has an F-decomposition, as long as some simple divisibility conditions hold. Recent research allows us to prove bounds on the necessary minimum degree by studying a relaxation of this problem, where a fractional decomposition is sought.

I will show how a relatively simple random process can give a good approximation to a fractional decomposition of a dense graph, and how it can then be made exact. This improves the best known bounds for this problem.
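For concreteness, a standard example of the divisibility conditions (not specific to the talk): for F = K_3, a triangle decomposition requires every vertex degree to be even and the edge count to be divisible by 3. K_7 satisfies both (and indeed admits a Steiner triple system), while K_5 does not. A minimal check:

```python
# Standard example of the divisibility conditions for F = K_3: a graph
# can only have a triangle decomposition if every degree is even and the
# number of edges is divisible by 3. Necessary, not sufficient.

def k3_divisibility_ok(degrees):
    """Check the K_3 divisibility conditions from a degree sequence."""
    num_edges = sum(degrees) // 2
    return all(d % 2 == 0 for d in degrees) and num_edges % 3 == 0

# Complete graphs K_n have degree sequence (n-1, ..., n-1):
print(k3_divisibility_ok([6] * 7))  # True: K_7 has 21 edges, degrees 6
print(k3_divisibility_ok([4] * 5))  # False: K_5 has 10 edges, not divisible by 3
```

The minimum-degree results in the talk say that for large graphs these simple conditions are essentially the only obstruction.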
 

Tue, 15 May 2018
14:30
L6

The Erdős Matching Conjecture and related questions

Andrey Kupavskii
(Birmingham University)
Abstract

Consider a family of k-element subsets of an n-element set, and assume that the family does not contain s pairwise disjoint sets. The well-known Erdős Matching Conjecture suggests the maximum size of such a family. Finding the maximum is trivial for n<(s+1)k and is relatively easy for n large in comparison to s and k. There was a splash of activity around the conjecture in recent years, and, as far as the original question is concerned, the best result is due to Peter Frankl, who verified the conjecture for all n>2sk. In this work, we improve the bound of Frankl for any k and large enough s. We also discuss the connection of the problem to an old question on deviations of sums of random variables going back to the work of Hoeffding and Shrikhande.
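As a toy sanity check (illustrative only, not from the talk): the conjectured maximum is the larger of two natural constructions, all k-sets inside a ground set of size sk-1, or all k-sets meeting a fixed (s-1)-element set. In the smallest nontrivial case n=5, k=2, s=2, a brute force over all families confirms the conjectured value 4, attained by a star of pairs through one point:

```python
# Toy brute-force check of the Erdős Matching Conjecture for n=5, k=2,
# s=2: the largest family of 2-subsets of {0,...,4} with no 2 pairwise
# disjoint members, compared with the conjectured value
# max(C(sk-1, k), C(n, k) - C(n-s+1, k)). Feasible only for tiny cases.

from itertools import combinations
from math import comb

def max_family_no_s_disjoint(n, k, s):
    """Largest family of k-subsets of [n] with no s pairwise disjoint sets."""
    sets = [frozenset(c) for c in combinations(range(n), k)]
    for size in range(len(sets), 0, -1):
        for fam in combinations(sets, size):
            # s sets are pairwise disjoint iff their union has s*k elements.
            has_matching = any(
                len(frozenset.union(*m)) == s * k
                for m in combinations(fam, s)
            )
            if not has_matching:
                return size
    return 0

n, k, s = 5, 2, 2
found = max_family_no_s_disjoint(n, k, s)
conjectured = max(comb(s * k - 1, k), comb(n, k) - comb(n - s + 1, k))
print(found, conjectured)  # 4 4
```

For this case the family is simply an intersecting family of pairs, so the answer also follows from classical extremal set theory; the conjecture concerns the full range of n, k, s.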
 
