Fri, 16 Nov 2018

12:00 - 13:00
L4

Topological adventures in neuroscience

Kathryn Hess
(École Polytechnique Fédérale de Lausanne (EPFL))
Abstract

Over the past decade, and particularly over the past five years, research at the interface of topology and neuroscience has grown remarkably fast.  In this talk I will briefly survey a few quite different applications of topology to neuroscience in which members of my lab have been involved over the past four years: the algebraic topology of brain structure and function, topological characterization and classification of neuron morphologies, and (if time allows) topological detection of network dynamics.

Tue, 22 May 2018

14:00 - 14:30
L5

Storage optimal semidefinite programming

Volkan Cevher
(École Polytechnique Fédérale de Lausanne (EPFL))
Abstract

Semidefinite convex optimization problems often have low-rank solutions that can be represented with O(p)-storage. However, semidefinite programming methods require us to store the matrix decision variable with size O(p^2), which prevents the application of virtually all convex methods at large scale.
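
As a rough illustration of this storage gap (the dimensions below are hypothetical, not taken from the talk), compare the memory footprint of a dense p x p decision variable with that of a rank-r factor U representing X = U U^T:

```python
# Hypothetical sizes: a dense p x p float64 decision variable versus a
# rank-r factor U with X = U @ U.T, whose storage is O(p r) rather than O(p^2).
p, r = 10_000, 5
bytes_dense = p * p * 8           # full matrix decision variable
bytes_factor = p * r * 8          # low-rank representation
print(f"dense X : {bytes_dense / 1e9:.1f} GB")   # prints 0.8 GB
print(f"factor U: {bytes_factor / 1e6:.1f} MB")  # prints 0.4 MB
```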

Indeed, storage, not arithmetic computation, is now the obstacle that prevents us from solving large-scale optimization problems. A grand challenge in contemporary optimization is therefore to design storage-optimal algorithms that provably and reliably solve large-scale optimization problems in key scientific and engineering applications. An algorithm is called storage optimal if its working storage is within a constant factor of the memory required to specify a generic problem instance and its solution.

So far, convex methods have completely failed to satisfy storage optimality. As a result, the literature has largely focused on storage-optimal non-convex methods to obtain numerical solutions. Unfortunately, these algorithms have been shown to be provably correct only under unverifiable and unrealistic statistical assumptions on the problem template. They can also sacrifice the key benefits of convexity, since their cost functions do not exploit convex geometric properties.
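
One widely studied example of such a non-convex route is a Burer-Monteiro-style factorization. The sketch below, with assumed problem sizes and synthetic data, illustrates the general idea only; it is not the method discussed in the talk.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative Burer-Monteiro-style factorization (not the talk's method):
# parameterize the PSD variable as X = U @ U.T so that only the p x r factor U
# is ever stored, and fit it to affine measurements y_i = <A_i, X>.
rng = np.random.default_rng(0)
p, r, m = 60, 2, 400

A = rng.standard_normal((m, p, p))
A = (A + A.transpose(0, 2, 1)) / 2                    # symmetric measurement matrices
U_true = rng.standard_normal((p, r))
y = np.einsum('mij,ij->m', A, U_true @ U_true.T)      # synthetic measurements

def loss_and_grad(u_flat):
    U = u_flat.reshape(p, r)
    resid = np.einsum('mij,ij->m', A, U @ U.T) - y    # A(U U^T) - y
    grad = 4 * np.einsum('m,mij->ij', resid, A) @ U   # gradient of ||A(U U^T) - y||^2
    return resid @ resid, grad.ravel()

res = minimize(loss_and_grad, rng.standard_normal(p * r), jac=True,
               method='L-BFGS-B')
U_hat = res.x.reshape(p, r)
err = (np.linalg.norm(U_hat @ U_hat.T - U_true @ U_true.T)
       / np.linalg.norm(U_true @ U_true.T))
print(f"relative error of recovered X: {err:.2e}")
```

The non-convexity enters through the factorization itself: the landscape in U is not convex, which is why correctness guarantees for such methods typically rest on statistical assumptions about the measurements.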

To this end, my talk introduces a new convex optimization algebra to obtain numerical solutions to semidefinite programs with a low-rank matrix streaming model. This streaming model gives us an opportunity to integrate sketching as a new tool for developing storage-optimal convex optimization methods that go beyond semidefinite programming to more general convex templates. The resulting algorithms are expected to achieve unparalleled results for scalable matrix optimization problems in signal processing, machine learning, and computer science.
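
As a hedged illustration of how sketching can keep the working storage proportional to the sketch size rather than to p^2 (a generic Nystrom-type sketch with made-up dimensions, not the specific construction of the talk), consider a PSD iterate that only ever receives rank-one updates:

```python
import numpy as np

# We never store the p x p iterate X, only its sketch Y = X @ Omega, which can
# be updated whenever X receives a rank-one update
#   X <- (1 - eta) * X + eta * v v^T,
# as produced, for instance, by a conditional-gradient-type method.
rng = np.random.default_rng(1)
p, k, true_rank = 500, 10, 4

Omega = rng.standard_normal((p, k))    # fixed random test matrix, O(p k) storage
Y = np.zeros((p, k))                   # sketch of the implicit iterate, O(p k)
X = np.zeros((p, p))                   # dense copy kept ONLY to verify this toy

B = np.linalg.qr(rng.standard_normal((p, true_rank)))[0]  # low-dim subspace
for t in range(1, 101):
    v = B @ rng.standard_normal(true_rank)                # rank-one direction
    eta = 2.0 / (t + 2)
    Y = (1 - eta) * Y + eta * np.outer(v, v @ Omega)      # O(p k) sketch update
    X = (1 - eta) * X + eta * np.outer(v, v)              # verification only

# Nystrom reconstruction from the sketch alone: X_hat = Y (Omega^T Y)^+ Y^T.
X_hat = Y @ np.linalg.pinv(Omega.T @ Y) @ Y.T
print("relative error:", np.linalg.norm(X_hat - X) / np.linalg.norm(X))
```

Because the true iterate here has rank at most 4 and the sketch size is k = 10, the reconstruction recovers it to numerical precision while the algorithm itself only ever touches O(p k) numbers.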
