Past Numerical Analysis Group Internal Seminar

9 May 2017
14:00
Amir Ali Ahmadi
Abstract


The joint spectral radius (JSR) of a set of matrices characterizes the maximal growth rate that can be achieved by multiplying them in arbitrary order. This concept, which essentially generalizes the notion of the "largest eigenvalue" from one matrix to many, was introduced by Rota and Strang in the early 1960s and has since arisen in many areas of application, such as the stability of switched linear systems, the computation of the capacity of codes, the convergence of consensus algorithms, the trackability of graphs, and many others. The JSR is notoriously difficult to compute, even for a pair of matrices. In this talk, we present optimization-based algorithms (e.g., via semidefinite programming or dynamic programming) that can either compute the JSR exactly in special cases or approximate it with arbitrary prescribed accuracy in the general case.

Based on joint work (in different subsets) with Raphael Jungers, Pablo Parrilo, and Mardavij Roozbehani.
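As a self-contained illustration (a brute-force sketch, not the semidefinite- or dynamic-programming methods of the talk): for any product length k, the largest spectral radius of a length-k product, raised to the power 1/k, lower-bounds the JSR, while the largest norm of a length-k product, raised to 1/k, upper-bounds it. The matrices A and B below are hypothetical examples.

```python
# Crude JSR bounds from all products of length k (exponential in k):
#   lower bound: max over products P of rho(P)^(1/k)
#   upper bound: max over products P of ||P||_2^(1/k)
import itertools
import numpy as np

def jsr_bounds(matrices, k=6):
    """Brute-force lower/upper bounds on the joint spectral radius."""
    lower, upper = 0.0, 0.0
    for combo in itertools.product(matrices, repeat=k):
        P = np.linalg.multi_dot(combo)
        lower = max(lower, np.max(np.abs(np.linalg.eigvals(P))) ** (1.0 / k))
        upper = max(upper, np.linalg.norm(P, 2) ** (1.0 / k))
    return lower, upper

# Hypothetical example pair of matrices.
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[1.0, 0.0], [1.0, 1.0]])
lo, hi = jsr_bounds([A, B])
print(f"JSR lies in [{lo:.4f}, {hi:.4f}]")
```

The gap between the two bounds shrinks as k grows, but the number of products grows exponentially, which is why the talk's optimization-based methods are needed in practice.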
 

  • Numerical Analysis Group Internal Seminar
2 May 2017
14:00
Abstract

The past few years have seen a surge of interest in nonconvex reformulations of convex optimization problems, obtained through nonlinear reparameterizations of the optimization variables. Compared with the convex formulations, the nonconvex ones typically involve far fewer variables, allowing them to scale to scenarios with millions of variables. However, one pays the price of having to solve nonconvex problems to global optimality, which is generally believed to be intractable. In this talk, I will characterize the nonconvex geometries of several low-rank matrix optimization problems. In particular, I will argue that under reasonable assumptions, each critical point of the nonconvex problem either corresponds to the global optimum of the original convex problem or is a strict saddle point at which the Hessian has a negative eigenvalue. This geometric structure ensures that many local search algorithms converge to the global optimum from random initializations. Our analysis is based on studying how the convex geometries are transformed under the nonlinear parameterizations.
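As a toy illustration of this kind of reparameterization (a sketch under simplifying assumptions, not the speaker's analysis): approximate a rank-r positive semidefinite matrix M by X = U U^T and run plain gradient descent from a random start. The strict-saddle geometry described above is what lets such local search reach a global optimum.

```python
# Minimal sketch: gradient descent on f(U) = ||U U^T - M||_F^2 / 4,
# the nonconvex reparameterization X = U U^T of a low-rank PSD problem.
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 3
U_true = rng.standard_normal((n, r))
M = U_true @ U_true.T                  # rank-r PSD target

U = rng.standard_normal((n, r))        # random initialization
step = 1e-3
for _ in range(5000):
    grad = (U @ U.T - M) @ U           # gradient of ||U U^T - M||_F^2 / 4
    U -= step * grad

print("relative error:", np.linalg.norm(U @ U.T - M) / np.linalg.norm(M))
```

Note that the factorization shrinks the variable count from n^2 (for X) to n*r (for U), which is the scalability advantage mentioned in the abstract.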

  • Numerical Analysis Group Internal Seminar
7 March 2017
14:00
Abstract

In various research fields, such as machine learning, compressed sensing, and operations research, optimization problems that promote sparsity of solutions via cardinality or rank constraints are studied. We formulate such problems as DC (Difference of two Convex functions) optimization problems and apply the DC Algorithm (DCA) to them. Although a subproblem must be solved in each DCA iteration, its closed-form solution is easily obtained by a soft-thresholding operation. Numerical experiments demonstrate the efficiency of the proposed DCA in comparison with existing methods.
This is joint work with J. Gotoh (Chuo Univ.) and K. Tono (U. Tokyo).
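As a minimal sketch of this flavor of algorithm (assuming a DC penalty of the form lambda * (||x||_1 - |||x|||_k), where |||x|||_k is the sum of the k largest absolute entries; the talk's exact formulation may differ): each proximal-DCA iteration linearizes the concave part and solves its subproblem in closed form by soft-thresholding.

```python
# Proximal DCA for sparse least squares with the DC penalty
# lambda * (||x||_1 - |||x|||_k). Each iteration takes a subgradient of
# the concave part and solves the remaining convex subproblem in closed
# form via soft-thresholding. Illustrative sketch only.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def dca_sparse_lsq(A, b, k, lam=0.1, iters=500):
    n = A.shape[1]
    rho = np.linalg.norm(A, 2) ** 2      # >= Lipschitz constant of the gradient
    x = np.zeros(n)
    for _ in range(iters):
        # Subgradient of the largest-k norm: signs on the k largest entries.
        g = np.zeros(n)
        top = np.argsort(-np.abs(x))[:k]
        g[top] = np.sign(x[top])
        # Closed-form subproblem solution: one soft-thresholding step.
        v = x - (A.T @ (A @ x - b) - lam * g) / rho
        x = soft_threshold(v, lam / rho)
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = rng.standard_normal(5)      # 5-sparse ground truth
b = A @ x_true
x_hat = dca_sparse_lsq(A, b, k=5)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```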
