Checkpoint-blocker-induced autoimmunity is associated with favourable outcome in metastatic melanoma and distinct T-cell expression profiles
Ye, W; Olsson-Brown, A; Watson, R; Cheung, V; Morgan, R; Nassiri, I; Cooper, R; Taylor, C; Akbani, U; Brain, O; Matin, R; Coupe, N; Middleton, M; Coles, M; Sacco, J; Payne, M; Fairfax, B. British Journal of Cancer volume 124 issue 10 1661-1669 (11 May 2021)
Formal control synthesis via simulation relations and behavioral theory for discrete-time descriptor systems
Haesaert, S; Chen, F; Abate, A; Weiland, S. IEEE Transactions on Automatic Control volume 66 issue 3 1024-1039 (04 May 2020)
A Stein goodness-of-fit test for exponential random graph models
Xu, W; Reinert, G. 415-423 (18 Mar 2021)
Exploring the relationship between pain and self-harm thoughts and behaviours in young people using network analysis
Hinze, V; Ford, T; Evans, R; Gjelsvik, B; Crane, C. Psychological Medicine volume 52 issue 15 3560-3569 (15 Mar 2021)

Our 'Fantastic Voyage' through Oxford Mathematics Student Lectures brings us to four 3rd Year lectures by Dominic Joyce on Topological Surfaces. These lectures are shown pretty much as they are seen by the students (they use a different platform with a few more features, but the lectures are the same) as we all get to grips with the online world. Lectures on Linear Algebra, Integral Transforms, Networks, Set Theory, Maths History and much more will be shown over the next few weeks.

The Syndrome-Trellis Sampler for Generative Steganography
Nakajima, T; Ker, A. volume 00 1-6 (11 Dec 2020)
Modular Deep Reinforcement Learning for Continuous Motion Planning With Temporal Logic
Cai, M; Hasanbeig, M; Xiao, S; Abate, A; Kan, Z. volume 6 issue 4 7973-7980
Tue, 18 May 2021
14:00
Virtual

Hashing embeddings of optimal dimension, with applications to linear least squares

Zhen Shao
(Mathematical Institute (University of Oxford))
Abstract

We investigate theoretical and numerical properties of sparse sketching for both dense and sparse Linear Least Squares (LLS) problems. We show that sketching with hashing matrices --- with one nonzero entry per column and of size proportional to the rank of the data matrix --- generates a subspace embedding with high probability, provided the given data matrix has low coherence; thus optimal residual values are approximately preserved when the LLS matrix has similarly important rows. We then show that $s$-hashing matrices, with $s>1$ nonzero entries per column, satisfy similarly good sketching properties for a larger class of low-coherence data matrices. Numerically, we introduce our solver Ski-LLS for solving generic dense or sparse LLS problems. Ski-LLS builds upon the successful strategies employed in the Blendenpik and LSRN solvers, which use sketching to compute a preconditioner before applying the iterative LLS solver LSQR. Ski-LLS significantly improves upon these sketching solvers by judiciously using sparse hashing sketches while also allowing rank-deficient input; furthermore, when the data matrix is sparse, Ski-LLS applies a sparse factorization to the sketched input. Extensive numerical experiments show that Ski-LLS is competitive with other state-of-the-art direct and preconditioned iterative solvers for sparse LLS, and outperforms them in the significantly over-determined regime.
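The sketch-to-precondition pipeline the abstract describes (apply an $s$-hashing sketch to the tall data matrix, QR-factorize the small sketched matrix, and use the triangular factor as a right preconditioner for LSQR) can be illustrated with a minimal NumPy/SciPy sketch. This is not the Ski-LLS solver itself; the function names, the oversampling factor, and the handling of duplicate hash collisions are illustrative assumptions, and no rank-deficiency handling or sparse factorization is attempted.

```python
import numpy as np
import scipy.sparse as sp
from scipy.linalg import solve_triangular
from scipy.sparse.linalg import LinearOperator, lsqr


def s_hashing_sketch(m, n, s, rng):
    """Build an m x n s-hashing matrix: each column holds s entries of
    +/- 1/sqrt(s) placed in uniformly random rows.  (For simplicity we do
    not force the s rows within a column to be distinct; colliding entries
    are summed, which is acceptable for this illustration.)"""
    cols = np.repeat(np.arange(n), s)
    rows = rng.integers(0, m, size=s * n)
    vals = rng.choice([-1.0, 1.0], size=s * n) / np.sqrt(s)
    return sp.csr_matrix((vals, (rows, cols)), shape=(m, n))


def sketch_precond_lsqr(A, b, s=2, oversample=2.0, seed=0):
    """Solve min ||Ax - b|| for a tall full-rank A by sketching:
    QR-factorize S @ A and run LSQR on the preconditioned system A R^{-1}."""
    rng = np.random.default_rng(seed)
    N, d = A.shape
    m = int(oversample * d)               # sketch size proportional to d
    S = s_hashing_sketch(m, N, s, rng)
    R = np.linalg.qr(S @ A, mode="r")     # small (d x d) triangular factor

    # LinearOperator for the preconditioned matrix A R^{-1}
    Aop = LinearOperator(
        (N, d),
        matvec=lambda y: A @ solve_triangular(R, y),
        rmatvec=lambda z: solve_triangular(R, A.T @ z, trans="T"),
    )
    y = lsqr(Aop, b, atol=1e-12, btol=1e-12)[0]
    return solve_triangular(R, y)         # undo the preconditioning
```

With a good subspace embedding, $A R^{-1}$ is well conditioned, so LSQR converges in few iterations; the expensive dense work is confined to the small $m \times d$ sketched matrix.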

A link for this talk will be sent to our mailing list a day or two in advance.  If you are not on the list and wish to be sent a link, please contact @email.
