14:00
Learning State-Space Models of Dynamical Systems from Data
Abstract
Learning dynamical models from data plays a vital role in engineering design, optimization, and prediction. Building models that describe the dynamics of complex processes (e.g., weather dynamics, reactive flows, brain/neural activity) from empirical knowledge or first principles is frequently onerous or infeasible. Therefore, system identification has evolved as a scientific discipline for this task since the 1960s. Given the natural analogy to approximating unknown functions with artificial neural networks, system identification was an early adopter of machine learning methods. In the first part of the talk, we will review the developments in this area up to the present.
For complex systems, identifying the full dynamics using system identification may still lead to high-dimensional models. For engineering tasks such as optimization and control synthesis, as well as in the context of digital twins, such learned models may remain computationally too expensive for these multi-query scenarios. It is therefore desirable to identify compact approximate models from the available data. In the second part of this talk, we will exploit the fact that the dynamics of high-fidelity models often evolve on low-dimensional manifolds. We will discuss approaches for learning representations of these low-dimensional manifolds using several ideas, including the lifting principle and autoencoders. In particular, we will focus on learning state-space representations that can be used in classical tools for computational engineering. Several numerical examples will illustrate the performance and limitations of the suggested approaches.
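To make the idea of learning a compact state-space representation concrete, the following is a minimal sketch, not the speaker's method: high-dimensional snapshot data is compressed with a linear encoder/decoder obtained from a truncated SVD (the simplest stand-in for an autoencoder), and a reduced linear state-space model z_{k+1} = A z_k is fitted by least squares. All names, dimensions, and the synthetic data generator are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic high-dimensional snapshots generated from a hidden 2-D rotation
# (a toy stand-in for data from a high-fidelity model).
n_full, n_red, n_steps = 50, 2, 200
theta = 0.1
A_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
lift = rng.standard_normal((n_full, n_red))   # hidden lifting map
z = np.array([1.0, 0.0])
snapshots = []
for _ in range(n_steps):
    snapshots.append(lift @ z)
    z = A_true @ z
X = np.column_stack(snapshots)                # shape (n_full, n_steps)

# Linear "autoencoder" from a truncated SVD: V.T encodes, V decodes.
U, _, _ = np.linalg.svd(X, full_matrices=False)
V = U[:, :n_red]
Z = V.T @ X                                   # reduced trajectory

# Least-squares fit of the reduced dynamics: Z[:, 1:] ~= A_red @ Z[:, :-1].
M, *_ = np.linalg.lstsq(Z[:, :-1].T, Z[:, 1:].T, rcond=None)
A_red = M.T

# One-step prediction error measured back in the full space.
X_pred = V @ (A_red @ Z[:, :-1])
err = np.linalg.norm(X_pred - X[:, 1:]) / np.linalg.norm(X[:, 1:])
print(f"relative one-step error: {err:.2e}")
```

Because the toy data is exactly low-dimensional and linear, the fitted reduced model reproduces it to machine precision; nonlinear manifolds would require a nonlinear encoder/decoder (e.g., an autoencoder network) or a lifting of the state.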
Ten years of Direct Multisearch
Abstract
Direct Multisearch (DMS) is a well-known class of multiobjective derivative-free optimization methods, with competitive computational implementations that are often used to benchmark new algorithms and in practical applications. As a directional direct search method, its structure is organized into a search step and a poll step, with the latter responsible for convergence. A first implementation of DMS was released in 2010. Since then, the algorithmic class has continued to be analyzed theoretically, and new improvements have been proposed for the numerical implementation. Worst-case complexity bounds have been derived, a search step based on polynomial models has been defined, and parallelization strategies have successfully improved the numerical performance of the code, which has also proven competitive on multiobjective derivative-based problems. In this talk we will survey the algorithmic structure of this class of optimization methods and its main theoretical properties, and report numerical experiments that validate its numerical competitiveness.
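To illustrate the poll-step idea behind DMS, here is a heavily simplified sketch, not the DMS implementation itself: around each point of the current nondominated list, trial points are polled along the positive and negative coordinate directions (a positive spanning set), and the evaluated points are filtered by Pareto dominance. The step-size update, the search step, and all convergence safeguards are omitted; the toy biobjective function and all names are invented.

```python
import numpy as np

def f(x):
    # Toy biobjective: distances to two conflicting targets (invented).
    return np.array([np.sum((x - 1.0) ** 2), np.sum((x + 1.0) ** 2)])

def nondominated(points, values):
    # Keep points whose objective vector is not strictly dominated.
    keep = []
    for i, fi in enumerate(values):
        dominated = any(
            np.all(fj <= fi) and np.any(fj < fi)
            for j, fj in enumerate(values) if j != i
        )
        if not dominated:
            keep.append(i)
    return [points[i] for i in keep], [values[i] for i in keep]

def poll_step(front, alpha):
    # Poll directions: +/- coordinate vectors form a positive spanning set.
    n = front[0].size
    dirs = np.vstack([np.eye(n), -np.eye(n)])
    cand = list(front)
    for x in front:
        cand.extend(x + alpha * d for d in dirs)
    # Remove duplicate trial points before evaluating.
    uniq = {tuple(np.round(x, 8)): x for x in cand}
    cand = list(uniq.values())
    vals = [f(x) for x in cand]
    return nondominated(cand, vals)

front = [np.zeros(2)]
alpha = 0.5
for _ in range(10):
    front, vals = poll_step(front, alpha)
print(f"{len(front)} nondominated points on the approximate front")
```

In DMS proper the whole list of nondominated points is iterated on, successful polls may expand the step size while unsuccessful ones shrink it, and it is this poll step that carries the convergence guarantees mentioned above.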
formulae in abelian categories