Fri, 19 Jan 2024

15:00 - 16:00
L4

The Function-Rips Multifiltration as an Estimator

Steve Oudot
(INRIA - Ecole Normale Supérieure)
Abstract

Say we want to view the function-Rips multifiltration as an estimator. Then what is the target, and what kind of consistency, bias, or convergence rate should we expect? In this talk I will present ongoing joint work with Ethan André (Ecole Normale Supérieure) that aims at laying the algebro-topological groundwork for answering these questions.
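
For context, the object in the title can be sketched concretely. Below is a minimal Python sketch assuming the standard two-parameter definition, which the abstract does not spell out: a simplex enters the function-Rips bifiltration at grade (r, s), where r is its diameter (the Rips scale) and s is the maximum of the function over its vertices.

# A minimal sketch of the function-Rips bifiltration on a finite point
# cloud, assuming the standard two-parameter definition: a simplex enters
# at grade (r, s) with r its diameter and s the max of f over its vertices.
from itertools import combinations

import numpy as np


def function_rips_grades(points, f_values, max_dim=2):
    """Return {simplex: (rips_scale, function_scale)} for all simplices
    up to dimension max_dim on a finite point cloud."""
    n = len(points)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    grades = {}
    for dim in range(max_dim + 1):
        for simplex in combinations(range(n), dim + 1):
            # Rips scale: diameter of the simplex (0 for vertices).
            r = max((dist[i, j] for i, j in combinations(simplex, 2)),
                    default=0.0)
            # Function scale: sublevel-set parameter of f.
            s = max(f_values[i] for i in simplex)
            grades[simplex] = (r, s)
    return grades


# Example: 20 planar points filtered by distance to the origin.
rng = np.random.default_rng(0)
pts = rng.standard_normal((20, 2))
grades = function_rips_grades(pts, np.linalg.norm(pts, axis=1))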

Wed, 29 Jun 2022

16:00 - 17:00

Information theory with kernel methods

Francis Bach
(INRIA - Ecole Normale Supérieure)
Abstract

I will consider the analysis of probability distributions through their associated covariance operators from reproducing kernel Hilbert spaces. In this talk, I will show that the von Neumann entropy and relative entropy of these operators are intimately related to the usual notions of Shannon entropy and relative entropy, and share many of their properties. They come together with efficient estimation algorithms from various oracles on the probability distributions. I will also show how these new notions of relative entropy lead to new upper bounds on log-partition functions, which can be used together with convex optimization within variational inference methods, providing a new family of probabilistic inference methods (based on https://arxiv.org/pdf/2202.08545.pdf; see also https://francisbach.com/information-theory-with-kernel-methods/).
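
As a concrete illustration of the link between covariance operators and entropy, here is a minimal plug-in sketch in Python. It assumes a normalized RBF kernel with k(x, x) = 1, so the empirical covariance operator has unit trace and its nonzero eigenvalues coincide with those of K/n for the kernel matrix K; the more refined estimators from the talk's various oracles are not reproduced here.

# A minimal sketch of the empirical von Neumann entropy -tr(Sigma log Sigma)
# of a kernel covariance operator, assuming an RBF kernel with k(x, x) = 1.
import numpy as np


def von_neumann_entropy(x, bandwidth=1.0):
    """Plug-in estimate of -tr(Sigma log Sigma) for the empirical
    covariance operator of the sample x under an RBF kernel."""
    sq = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2.0 * bandwidth ** 2))
    # Nonzero eigenvalues of the empirical covariance operator are
    # those of K / n; they are nonnegative and sum to 1.
    eigvals = np.linalg.eigvalsh(K / len(x))
    eigvals = eigvals[eigvals > 1e-12]  # drop numerically zero modes
    return float(-np.sum(eigvals * np.log(eigvals)))


rng = np.random.default_rng(0)
print(von_neumann_entropy(rng.standard_normal((200, 2))))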

Thu, 24 Nov 2016

14:00 - 15:00
Rutherford Appleton Laboratory, nr Didcot

Stochastic methods for inverting matrices as a tool for designing stochastic quasi-Newton methods

Dr Robert Gower
(INRIA - Ecole Normale Supérieure)
Abstract

I will present a broad family of stochastic algorithms for inverting a matrix, including specialized variants that maintain symmetry or positive definiteness of the iterates. All methods in the family converge globally and linearly, with explicit rates. In special cases, the methods obtained are stochastic block variants of several quasi-Newton updates, including bad Broyden (BB), good Broyden (GB), Powell-symmetric-Broyden (PSB), Davidon-Fletcher-Powell (DFP) and Broyden-Fletcher-Goldfarb-Shanno (BFGS). After a pause for questions, I will present a block stochastic BFGS method based on the stochastic method for inverting positive definite matrices. In this method, the maintained estimate of the inverse Hessian matrix is updated at each iteration using a sketch of the Hessian, i.e., a randomly generated compressed form of the Hessian. I will propose several sketching strategies, present a new quasi-Newton method that combines stochastic block BFGS updates with the variance-reduction approach SVRG to compute batch stochastic gradients, and prove linear convergence of the resulting method. Numerical tests on large-scale logistic regression problems reveal that our method is more robust than, and substantially outperforms, current state-of-the-art methods.
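
To make the symmetric member of this family concrete, here is a minimal Python sketch of a sketch-and-project iteration for inverting a positive definite matrix A; with curvature pairs (S, AS) the update has the form of a block BFGS update. The Gaussian sketches, sketch size, and iteration count are illustrative assumptions, not choices from the talk.

# A minimal sketch of a symmetric stochastic matrix-inversion iteration
# for positive definite A, assuming Gaussian sketches. Each step enforces
# X = A^{-1} on the random subspace spanned by S while keeping X symmetric.
import numpy as np


def stochastic_inverse(A, sketch_size=5, iters=400, seed=0):
    """Iterate X <- L + (I - L A) X (I - A L), where
    L = S (S^T A S)^{-1} S^T for a fresh random sketch S."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    I = np.eye(n)
    X = I.copy()  # any symmetric initial guess
    for _ in range(iters):
        S = rng.standard_normal((n, sketch_size))
        L = S @ np.linalg.solve(S.T @ A @ S, S.T)
        X = L + (I - L @ A) @ X @ (I - A @ L)
    return X


# Example: invert a random 30 x 30 positive definite matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((30, 30))
A = M @ M.T + 30 * np.eye(30)
X = stochastic_inverse(A)
print(np.linalg.norm(X @ A - np.eye(30)))  # should be small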
