From Convolutional Sparse Coding to Deep Sparsity and Neural Networks

13 February 2018
14:30
Jeremias Sulam
Abstract

Within the wide field of sparse approximation, convolutional sparse coding (CSC) has gained considerable attention in the computer vision and machine learning communities. While several works have been devoted to the practical aspects of this model, a systematic theoretical understanding of CSC seems to have been left aside. In this talk, I will present a novel analysis of the CSC problem based on the observation that, while being global, this model can be characterized and analyzed locally. By imposing only local sparsity conditions, we show that uniqueness of solutions, stability to noise contamination, and success of pursuit algorithms are globally guaranteed. I will then present a multi-layer extension of this model and show its close relation to convolutional neural networks (CNNs). This connection brings a fresh view to CNNs, as one can attribute theoretical claims to this architecture under local sparsity assumptions, which sheds light on ways of improving the design and implementation of these networks. Last but not least, we will derive a learning algorithm for this model and demonstrate its applicability in unsupervised settings.
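The relation between sparse pursuit and CNNs mentioned in the abstract can be glimpsed in a small NumPy sketch (illustrative only, not the speaker's code): the simplest pursuit for a convolutional sparse model is a single thresholding step, and its nonnegative (one-sided) variant coincides exactly with a convolution followed by a ReLU activation whose bias encodes the threshold. The filter bank, signal, and threshold below are arbitrary placeholders.

```python
import numpy as np

def soft_threshold_nonneg(z, theta):
    # one-sided (nonnegative) soft-thresholding with threshold theta
    return np.maximum(z - theta, 0.0)

def relu(z, bias):
    # a standard CNN activation: ReLU applied after adding a bias
    return np.maximum(z + bias, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal(32)             # a toy 1-D input signal
filters = rng.standard_normal((4, 5))   # a small convolutional "dictionary"
theta = 0.5                             # sparsity-inducing threshold

# One layer of thresholding pursuit: correlate the signal with each
# filter (convolution with the flipped filter), then shrink.
codes_pursuit = np.stack(
    [soft_threshold_nonneg(np.convolve(x, f[::-1], mode="same"), theta)
     for f in filters])

# The same computation written as a CNN layer: convolution + bias + ReLU.
codes_cnn = np.stack(
    [relu(np.convolve(x, f[::-1], mode="same"), -theta) for f in filters])

assert np.allclose(codes_pursuit, codes_cnn)
```

Stacking such layers, so that each layer's sparse code becomes the next layer's input, yields the multi-layer model whose forward pass matches a deep CNN.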

  • Numerical Analysis Group Internal Seminar