Colloquia
The Colloquia are followed by a reception that gives attendees the opportunity for more informal contact with the speaker. A book display will be available at this time in the common room. The series is funded, in part, through the generous support of Oxford University Press.
The colloquia are aimed at a general mathematical audience.
Please note that the list below shows only forthcoming events, and may not include regular events that have not yet been entered for the forthcoming term. Please see the past events page for a list of all seminar series that the department has on offer.
Generalized Tensor Decomposition: Utility for Data Analysis and Mathematical Challenges
Tamara Kolda is an independent mathematical consultant under the auspices of her company MathSci.ai, based in California. From 1999 to 2021, she was a researcher at Sandia National Laboratories in Livermore, California. She specializes in mathematical algorithms and computational methods for tensor decompositions, tensor eigenvalues, graph algorithms, randomized algorithms, machine learning, network science, numerical optimization, and distributed and parallel computing.
From the website: https://www.mathsci.ai/
Abstract
Tensor decomposition is an unsupervised learning methodology with applications in a wide variety of domains, including chemometrics, criminology, and neuroscience. We focus on low-rank tensor decomposition in the canonical polyadic or CANDECOMP/PARAFAC (CP) format. A low-rank tensor decomposition is computed as the minimizer of a nonlinear optimization problem. The usual objective function is the sum of squared errors (SSE) between the data tensor and the low-rank model tensor. This leads to a nicely structured problem whose subproblems are linear least squares problems that can be solved efficiently in closed form.

However, the SSE metric is not always ideal, so we consider other objective functions. For instance, KL divergence is an alternative metric that is useful for count data and results in a nonnegative factorization; in the context of nonnegative matrix factorization, KL divergence was popularized by Lee and Seung (1999). We can also consider objectives such as logistic odds for binary data, beta-divergence for nonnegative data, and so on.

We show the benefits of alternative objective functions on real-world data sets. We consider the computational aspects of generalized tensor decomposition based on these other objective functions, summarize the work that has been done thus far, and illuminate open problems and challenges. This talk includes joint work with David Hong and Jed Duersch.
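To make the contrast with the SSE objective concrete, here is a minimal sketch in Python/NumPy, assuming a small dense three-way count tensor; it is not the speakers' code. It fits a nonnegative CP model by minimizing the Poisson/KL objective, the sum over entries of m - x log m, using multiplicative updates in the spirit of Lee and Seung (1999). The helper names cp_model and cp_kl, the synthetic data, and the iteration count are all illustrative assumptions.

```python
# A minimal sketch (not the speakers' implementation) of generalized CP
# decomposition: fit a rank-r model to a 3-way count tensor under a
# Poisson/KL objective, via Lee-and-Seung-style multiplicative updates.
import numpy as np

def cp_model(A, B, C):
    # Full model tensor: M[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r].
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def cp_kl(X, rank, iters=500, seed=0, eps=1e-10):
    # Minimize sum_ijk (m_ijk - x_ijk * log m_ijk) over nonnegative
    # factor matrices A, B, C; multiplicative updates keep them >= 0.
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.random((I, rank)) + 0.1
    B = rng.random((J, rank)) + 0.1
    C = rng.random((K, rank)) + 0.1
    for _ in range(iters):
        R = X / (cp_model(A, B, C) + eps)          # elementwise x / m
        A *= np.einsum('ijk,jr,kr->ir', R, B, C) / (B.sum(0) * C.sum(0) + eps)
        R = X / (cp_model(A, B, C) + eps)
        B *= np.einsum('ijk,ir,kr->jr', R, A, C) / (A.sum(0) * C.sum(0) + eps)
        R = X / (cp_model(A, B, C) + eps)
        C *= np.einsum('ijk,ir,jr->kr', R, A, B) / (A.sum(0) * B.sum(0) + eps)
    return A, B, C

# Usage: recover a rank-2 nonnegative model from synthetic count data.
rng = np.random.default_rng(1)
A0, B0, C0 = rng.random((4, 2)), rng.random((5, 2)), rng.random((6, 2))
X = rng.poisson(5 * cp_model(A0, B0, C0)).astype(float)
A, B, C = cp_kl(X, rank=2)
print(np.abs(X - cp_model(A, B, C)).mean())        # rough fit check
```

Under the SSE objective, each multiplicative update above would instead be a closed-form linear least squares solve; the trade-off in the generalized setting is giving up that closed form in exchange for an objective better matched to the data, which is part of what makes the computation challenging.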