Date: Thu, 24 Nov 2016
Time: 14:00 - 15:00
Location: Rutherford Appleton Laboratory, nr Didcot
Speaker: Dr Robert Gower
Organisation: INRIA - Ecole Normale Supérieure

I will present a broad family of stochastic algorithms for inverting a matrix, including specialized variants that maintain symmetry or positive definiteness of the iterates. All methods in the family converge globally and linearly, with explicit convergence rates. In special cases, the methods obtained are stochastic block variants of several quasi-Newton updates, including bad Broyden (BB), good Broyden (GB), Powell-symmetric-Broyden (PSB), Davidon-Fletcher-Powell (DFP) and Broyden-Fletcher-Goldfarb-Shanno (BFGS).

After a pause for questions, I will present a block stochastic BFGS method built on the stochastic method for inverting positive definite matrices. In this method, the estimate of the inverse Hessian is updated at each iteration using a sketch of the Hessian, i.e., a randomly generated compressed form of the Hessian. I will propose several sketching strategies, present a new quasi-Newton method that combines stochastic block BFGS updates with the variance reduction approach SVRG to compute batch stochastic gradients, and prove linear convergence of the resulting method. Numerical tests on large-scale logistic regression problems show that our method is more robust than, and substantially outperforms, current state-of-the-art methods.
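To give a flavour of the family of stochastic inversion methods described in the abstract, the following is a minimal sketch-and-project iteration for approximating A⁻¹: at each step a random sketch S is drawn and the current estimate X is projected, in the Frobenius norm, onto the set of matrices satisfying the sketched equation SᵀAX = Sᵀ. The Gaussian sketch distribution, parameter values, and function names here are illustrative assumptions, not details from the talk.

```python
import numpy as np

def sketch_and_project_inverse(A, num_iters=2000, sketch_size=4, seed=0):
    """Illustrative sketch-and-project iteration for approximating A^{-1}.

    Each step draws a random sketch S and projects X (Frobenius norm)
    onto the affine set {X : S^T A X = S^T}, so the sketched part of
    A X = I holds exactly after the update. For an invertible A and a
    suitable sketch distribution, X converges linearly to A^{-1}.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    I = np.eye(n)
    X = np.zeros((n, n))                   # initial estimate of A^{-1}
    for _ in range(num_iters):
        S = rng.standard_normal((n, sketch_size))  # Gaussian sketch (an assumption)
        AtS = A.T @ S
        M = S.T @ A @ AtS                  # S^T A A^T S  (sketch_size x sketch_size)
        R = S.T @ (I - A @ X)              # residual of the sketched equation
        X = X + AtS @ np.linalg.solve(M, R)  # projection step
    return X
```

With a full-rank sketch (sketch_size = n) the constraint SᵀAX = Sᵀ forces AX = I, so a single step recovers A⁻¹ exactly; smaller sketches trade per-step cost for more iterations.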
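The block BFGS update mentioned in the second part of the abstract can be sketched as follows: given a block of sampled directions D and the corresponding Hessian-sketch action Delta = (Hessian sketch) @ D, the inverse-Hessian estimate H is updated so the block secant condition H_new @ Delta = D holds while symmetry is preserved. This is an illustrative implementation of the standard block BFGS formula, with names chosen for exposition, not code from the talk.

```python
import numpy as np

def block_bfgs_update(H, D, Delta):
    """One block BFGS update of an inverse-Hessian estimate H.

    D (n x q) holds sampled directions and Delta (n x q) the products
    of a Hessian sketch with those directions. The returned matrix
    satisfies the block secant condition H_new @ Delta = D and is
    symmetric whenever H and D.T @ Delta are symmetric.
    """
    n = H.shape[0]
    M = np.linalg.inv(D.T @ Delta)         # q x q; symmetric for a symmetric Hessian
    left = np.eye(n) - D @ M @ Delta.T     # oblique projector used by the update
    return left @ H @ left.T + D @ M @ D.T
```

For q = 1 this reduces to the classical BFGS update with curvature pair (d, Δ); larger blocks incorporate a richer sketch of the Hessian per iteration, which is the idea the talk combines with SVRG-style variance-reduced gradients.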

Last updated on 04 Apr 2022 14:57.