In this talk I explain the fertile relationship between the foundations of inference and learning, on the one hand, and combinatorial geometry, on the other.
My presentation contains several powerful examples where famous theorems in discrete geometry answered natural questions from machine learning and statistical inference:
In this tasting tour I will include the problem of deciding the existence of a maximum likelihood estimator in multiclass logistic regression, the variability in the behavior of the k-means algorithm under distinct random initializations and the shapes of the resulting clusters, and the estimation of the number of samples needed in chance-constrained optimization models. These obviously only scratch the surface of what one could do with extra free time. Along the way we will see fascinating connections to the coupon collector problem, topological data analysis, measures of separability of data, and the computation of Tukey centerpoints of data clouds (a high-dimensional generalization of the median). All new theorems are joint work with subsets of the following wonderful folks: T. Hogan, D. Oliveros, E. Jaramillo-Rodriguez, and A. Torres-Hernandez.
Two relevant papers, published or to appear, are
You can find out more about Professor De Loera here: https://www.math.ucdavis.edu/~deloera/