14:15
CANCELLED
Abstract
A well-known problem in algebraic geometry is to construct smooth projective Calabi--Yau varieties $Y$. In the smoothing approach, we first construct a degenerate (reducible) Calabi--Yau scheme $V$ by gluing pieces. Then we aim to find a family $f\colon X \to C$ with special fiber $X_0 = f^{-1}(0) \cong V$ and smooth general fiber $X_t = f^{-1}(t)$. In this talk, we see how infinitesimal logarithmic deformation theory solves the second step of this approach: the construction of a family out of the degenerate fiber $V$. This is achieved via the logarithmic Bogomolov--Tian--Todorov theorem as well as its variant for pairs of a log Calabi--Yau space $f_0\colon X_0 \to S_0$ and a line bundle $\mathcal{L}_0$ on $X_0$.
15:30
Higher Order Lipschitz Functions in Data Science
Abstract
The notion of $\mathrm{Lip}(\gamma)$ functions, for a parameter $\gamma > 0$, was introduced by Stein in the 1970s, building on earlier work of Whitney. It provides a higher order notion of Lipschitz regularity that is well-defined on arbitrary closed subsets (including, in particular, finite subsets), interacts well with the classical notion of smoothness on open subsets, is instrumental in the area of Rough Path Theory initiated by Lyons, and is central in recent works of Fefferman. In this talk we will survey the historical development of $\mathrm{Lip}(\gamma)$ functions and illustrate some fundamental properties that make them an attractive class of functions to work with from a machine learning perspective. In particular, models learnt within the class of $\mathrm{Lip}(\gamma)$ functions are well-suited both for inference on new unseen input data and for cost-effective inference via the use of sparse approximations found via interpolation-based reduction techniques. Parts of this talk will be based upon the works https://arxiv.org/abs/2404.06849 and https://arxiv.org/abs/2406.03232.
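For concreteness, a sketch of the definition in the scalar one-dimensional case, following Stein's formulation (the notation below is illustrative and not taken from the abstract): for $\gamma \in (k, k+1]$ with $k$ a nonnegative integer and $F \subseteq \mathbb{R}$ closed, a $\mathrm{Lip}(\gamma)$ function is a tuple of proposed derivatives satisfying Taylor-type remainder bounds.

```latex
% Lip(gamma) on a closed set F (1D scalar sketch; the general case on R^n
% uses multi-indices). A tuple (f^{(0)}, ..., f^{(k)}) of functions on F
% belongs to Lip(gamma, F) with constant M if, for all x, y in F and
% every 0 <= j <= k, the j-th component is bounded and its Taylor
% expansion of degree k - j about y has remainder of order gamma - j:
\[
  \bigl| f^{(j)}(x) \bigr| \le M,
  \qquad
  \Bigl| f^{(j)}(x) - \sum_{l=0}^{k-j} \frac{f^{(j+l)}(y)}{l!}\,(x-y)^{l} \Bigr|
  \;\le\; M\,|x-y|^{\gamma - j}.
\]
```

On an open set these conditions recover bounded $C^{k}$ functions whose $k$-th derivative is $(\gamma-k)$-Hölder; on a finite set they are exactly the compatibility constraints under which Whitney-type extension to all of $\mathbb{R}$ is possible, which is what makes the class well-defined on arbitrary closed subsets.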