What's all the fuss about AI and robots? Kraftwerk predicted it over 40 years ago with the Computer World album. Here they are singing about the loneliness of home computer life and online dating. Earlier they had imagined robots playing their gigs. They eventually did.

ChatGPT's favourite band?

Sat, 11 Nov 2023
14:00
Mathematical Institute

The Vicky Neale Celebration

Various
Further Information

This autumn Oxford Mathematics and Balliol College will be hosting an afternoon to celebrate the life and contributions of Vicky Neale, who died in May of this year.

November 11, 2023, 14:00–16:30
Mathematical Institute, University of Oxford
Woodstock Road, OX2 6GG

If you would like to join us, please register here by October 6th.

You can leave your memories of Vicky here.


Many modern challenges facing industry and government relate to cleaning: removing pollution from water or industrial waste gases, or decontaminating the environment after chemical spills. Mathematicians in Oxford are collaborating with various industrial and government partners to model specific cleaning challenges, deepening understanding of these processes and working towards optimal cleaning solutions.

Fri, 20 Oct 2023

16:00 - 17:00
L1

Generalized Tensor Decomposition: Utility for Data Analysis and Mathematical Challenges

Tamara Kolda
(MathSci.ai)
Further Information

Tamara Kolda is an independent mathematical consultant under the auspices of her company MathSci.ai, based in California. From 1999 to 2021, she was a researcher at Sandia National Laboratories in Livermore, California. She specializes in mathematical algorithms and computational methods for tensor decompositions, tensor eigenvalues, graph algorithms, randomized algorithms, machine learning, network science, numerical optimization, and distributed and parallel computing.

From the website: https://www.mathsci.ai/

Abstract

Tensor decomposition is an unsupervised learning methodology that has applications in a wide variety of domains, including chemometrics, criminology, and neuroscience. We focus on low-rank tensor decomposition using the canonical polyadic or CANDECOMP/PARAFAC (CP) format. A low-rank tensor decomposition is defined as the minimizer of a nonlinear program. The usual objective function is the sum of squares error (SSE) comparing the data tensor and the low-rank model tensor. This leads to a nicely structured problem whose subproblems are linear least squares problems that can be solved efficiently in closed form. However, the SSE metric is not always ideal, so we consider other objective functions. For instance, KL divergence is an alternative metric that is useful for count data and results in a nonnegative factorization; in the context of nonnegative matrix factorization, KL divergence was popularized by Lee and Seung (1999). We can also consider objectives such as logistic odds for binary data, beta-divergence for nonnegative data, and so on. We show the benefits of alternative objective functions on real-world data sets. We consider the computational challenges of generalized tensor decomposition based on these other objective functions, summarize the work that has been done thus far, and illuminate open problems and challenges. This talk includes joint work with David Hong and Jed Duersch.
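By way of illustration (not material from the talk), here is a minimal sketch of the SSE case the abstract describes: a rank-R CP model of a 3-way tensor fitted by alternating least squares, where each factor update is a closed-form linear least-squares solve. The function names and the use of NumPy are assumptions made for this sketch; the generalized objectives mentioned above (KL divergence, logistic loss, beta-divergence) would replace these closed-form solves with gradient-based updates.

import numpy as np

def unfold(X, mode):
    # Matricize X along `mode`: rows index that mode, columns the rest (C order).
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def khatri_rao(A, B):
    # Column-wise Kronecker product: row (i*J + j) holds A[i, :] * B[j, :].
    I, R = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def cp_als(X, rank, n_iters=100, seed=0):
    # Fit a rank-`rank` CP model [[A, B, C]] to a 3-way tensor X by alternating
    # least squares (SSE objective); each factor update is a closed-form solve.
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((X.shape[0], rank))
    B = rng.standard_normal((X.shape[1], rank))
    C = rng.standard_normal((X.shape[2], rank))
    for _ in range(n_iters):
        A = unfold(X, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(X, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(X, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Quick check on a synthetic rank-3 tensor: the fit should be near-exact.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((n, 3)) for n in (10, 12, 14))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(X, rank=3)
X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print("relative SSE:", np.sum((X - X_hat) ** 2) / np.sum(X ** 2))

Production implementations (e.g. the Tensor Toolbox or TensorLy) handle tensors of arbitrary order, missing data, and the generalized objectives discussed in the talk; this sketch only shows why the SSE case decomposes into efficient linear least-squares subproblems.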

Sampling-based Nyström approximation and kernel quadrature
Hayakawa, S., Oberhauser, H. and Lyons, T. Proceedings of the 40th International Conference on Machine Learning, volume 202, 12678–12699 (8 May 2023)
Interacting particle systems approximations of the Kushner-Stratonovitch equation
Crişan, D., Del Moral, P. and Lyons, T. Advances in Applied Probability, volume 31, issue 3, 819–838 (1 Sep 1999)
A Personal Perspective on Raghu Varadhan’s Role in the Development of Stochastic Analysis
Lyons, T. The Abel Prize, 289–314 (1 Dec 2010)
Sketching the order of events
Lyons, T. and Oberhauser, H. (31 Aug 2017)
A signature-based machine learning model for bipolar disorder and borderline personality disorder
Arribas, I., Saunders, K., Goodwin, G. and Lyons, T. (22 Jul 2017)