Title: Geometric Methods for Machine Learning and Optimization
Abstract: A key challenge in machine learning and optimization is the identification of geometric structure in high-dimensional data. Such structural understanding is of great value for the design of efficient algorithms and for developing fundamental guarantees for their performance. Motivated by the observation that many applications involve non-Euclidean data, such as graphs, strings, or matrices, we discuss how Riemannian geometry can be exploited in machine learning and optimization. First, we consider the task of learning a classifier in hyperbolic space. Such spaces have received a surge of interest for representing large-scale, hierarchical data, since they achieve better representation accuracy with fewer dimensions. Second, we consider the problem of optimizing a function on a Riemannian manifold. Specifically, we consider classes of optimization problems where exploiting Riemannian geometry can deliver algorithms that are computationally superior to standard (Euclidean) approaches.
Title: A general overview of the different projects explored during my DPhil in Statistics.
Abstract: In the first half of the talk, I will present my work on statistical models for complex networks. I will propose a model, underpinned by Bayesian nonparametric theory, to describe sparse spatial random graphs, and discuss asymptotic properties of a more general class of such models regarding sparsity, degree distribution, and clustering coefficients.
The second half will be devoted to the statistical quantification of the risk of disclosure, a quantity used to evaluate the level of privacy that can be achieved by publishing a microdata file without modifications. I propose two ways to estimate the risk of disclosure, using both frequentist and Bayesian nonparametric statistics.