Speaker: Dr. Boumediene Hamzi (Department of Computing and Mathematical Sciences, Caltech)

Date / Time: Tuesday, November 12, 2024 / 14:00 (Ankara, Turkey)

Place: Online participation (please self-register in advance if necessary)

Abstract: Since its inception in the 19th century through the efforts of Poincaré and Lyapunov, the theory of dynamical systems has addressed the qualitative behavior of dynamical systems as understood from models. From this perspective, modeling dynamical processes in applications requires a detailed understanding of the processes to be analyzed. This deep understanding leads to a model, which approximates the observed reality and is often expressed by a system of Ordinary/Partial, Underdetermined (Control), Deterministic/Stochastic differential or difference equations. While models are very precise for many processes, for some of the most challenging applications of dynamical systems (such as climate dynamics, brain dynamics, biological systems, or the financial markets) the development of such models is notably difficult. On the other hand, the field of machine learning is concerned with algorithms designed to accomplish a certain task whose performance improves with the input of more data. Applications of machine learning methods include computer vision, stock market analysis, speech recognition, recommender systems, and sentiment analysis in social media. The machine learning approach is invaluable in settings where no explicit model is formulated but measurement data is available. This is the case for many systems of interest, and the development of data-driven technologies is becoming increasingly important in many applications. The intersection of the fields of dynamical systems and machine learning is largely unexplored, and the objective of this talk is to show that working in reproducing kernel Hilbert spaces offers tools for a data-based theory of nonlinear dynamical systems.

In the first part of the talk, we introduce simple methods for learning surrogate models of complex systems. We present variants of the method of Kernel Flows as simple approaches for learning the kernels that appear in the emulators we use in our work. First, we discuss parametric and nonparametric Kernel Flows for learning chaotic dynamical systems. We also discuss learning dynamical systems from irregularly sampled time series as well as from partial observations. We then introduce the methods of Sparse Kernel Flows and Hausdorff-metric based Kernel Flows (HMKFs) and apply them to learn 132 chaotic dynamical systems. Finally, we extend the method of Kernel Mode Decomposition to design kernels for detecting critical transitions in some fast-slow random dynamical systems.
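As a rough illustration of the kernel-learning idea behind Kernel Flows (a minimal sketch, not the speaker's code: the logistic-map toy data, the Gaussian kernel, and all parameter choices below are assumptions made here), one can tune a kernel so that an RKHS interpolant of the one-step map changes little when half of the training points are removed, and then use the tuned kernel in a kernel ridge regression emulator:

    # Minimal Kernel-Flows-style sketch for a one-step surrogate x_{n+1} ~ f(x_n).
    # Everything here (data, kernel family, parameters) is illustrative.
    import numpy as np

    rng = np.random.default_rng(0)

    def gaussian_kernel(X, Y, sigma):
        # Pairwise Gaussian kernel matrix between rows of X and rows of Y.
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))

    def rho(X, y, sigma, reg=1e-8):
        # Kernel-Flows loss: 1 - ||interpolant on half the data||^2 / ||interpolant on all data||^2,
        # with RKHS norms computed from the kernel matrices.
        n = len(X)
        idx = rng.choice(n, n // 2, replace=False)
        K = gaussian_kernel(X, X, sigma) + reg * np.eye(n)
        Kc = gaussian_kernel(X[idx], X[idx], sigma) + reg * np.eye(len(idx))
        num = y[idx] @ np.linalg.solve(Kc, y[idx])
        den = y @ np.linalg.solve(K, y)
        return 1.0 - num / den

    # Trajectory of the logistic map as toy "measurement data".
    traj = [0.2]
    for _ in range(200):
        traj.append(3.9 * traj[-1] * (1 - traj[-1]))
    traj = np.array(traj)
    X, y = traj[:-1, None], traj[1:]

    # Crude kernel learning: pick the bandwidth with the smallest average rho
    # over random half-samples (a gradient-based update is the usual choice).
    sigmas = np.logspace(-2, 1, 20)
    losses = [np.mean([rho(X, y, s) for _ in range(20)]) for s in sigmas]
    sigma = sigmas[int(np.argmin(losses))]

    # One-step emulator: kernel ridge regression with the learned bandwidth.
    alpha = np.linalg.solve(gaussian_kernel(X, X, sigma) + 1e-8 * np.eye(len(X)), y)
    def emulator(x):
        return gaussian_kernel(np.atleast_2d(x), X, sigma) @ alpha

    print(sigma, emulator([0.5]), 3.9 * 0.5 * (1 - 0.5))

The loss rho lies in [0, 1] because the RKHS norm of the interpolant on a subsample never exceeds that of the interpolant on the full sample; a kernel with small average rho is one under which the data can be interpolated consistently from fewer points.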

Then, we introduce a data-based approach to estimating key quantities that arise in the study of nonlinear autonomous, control, and random dynamical systems. Our approach hinges on the observation that much of the existing linear theory can be readily extended to nonlinear systems, with a reasonable expectation of success, once the nonlinear system has been mapped into a high- or infinite-dimensional reproducing kernel Hilbert space. We develop computable, nonparametric estimators approximating controllability and observability energies for nonlinear systems and apply this approach to the problem of model reduction of nonlinear control systems. We also show that the controllability energy estimator provides a key means of approximating the invariant measure of an ergodic, stochastically forced nonlinear system. Finally, we show how kernel methods can be used to approximate center manifolds, propose a data-based version of the center manifold theorem, and construct Lyapunov functions for nonlinear ODEs.
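As one heavily simplified illustration of kernel-based Lyapunov-function construction (a sketch under assumptions made here, not necessarily the construction presented in the talk), a candidate V can be written as a kernel expansion and its coefficients fitted so that the derivative of V along the vector field matches a prescribed negative-definite function at sample points; the toy vector field, Gaussian kernel, and sampling choices are all illustrative:

    # Sketch: fit V(x) = sum_j a_j k(x, c_j) so that grad V(x_i) . f(x_i) = -|x_i|^2
    # at collocation points x_i, then check the candidate numerically.
    import numpy as np

    def f(x):
        # Toy nonlinear ODE x' = f(x) with an asymptotically stable origin.
        x1, x2 = x
        return np.array([-x1 + x2 ** 2, -x2 - x1 * x2])

    sigma = 1.0
    def k(x, c):
        # Gaussian kernel and its gradient with respect to x.
        return np.exp(-np.sum((x - c) ** 2) / (2 * sigma ** 2))
    def grad_k(x, c):
        return -(x - c) / sigma ** 2 * k(x, c)

    rng = np.random.default_rng(1)
    centers = rng.uniform(-1, 1, size=(60, 2))    # kernel centers c_j
    points = rng.uniform(-1, 1, size=(120, 2))    # collocation points x_i

    # Linear least-squares system for the coefficients a_j.
    A = np.array([[grad_k(x, c) @ f(x) for c in centers] for x in points])
    b = -np.array([x @ x for x in points])
    a, *_ = np.linalg.lstsq(A, b, rcond=None)

    def V(x):
        return sum(ai * k(x, c) for ai, c in zip(a, centers))

    # Numerical sanity check on fresh samples: derivative of V along f is negative,
    # and V(x) - V(0) is positive away from the origin, on the sampled region.
    test = rng.uniform(-1, 1, size=(200, 2))
    vdot = np.array([sum(ai * grad_k(x, c) @ f(x) for ai, c in zip(a, centers)) for x in test])
    vals = np.array([V(x) - V(np.zeros(2)) for x in test])
    print((vdot < 0).mean(), (vals > 0).mean())

The printed fractions are only a pointwise numerical check; a genuine Lyapunov function would require verified positivity of V and negativity of its derivative over the whole region of interest, which is where the RKHS framework discussed in the talk becomes relevant.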


    Biography

    Boumediene Hamzi is currently a Senior Scientist in the Department of Computing and Mathematical Sciences, Caltech. He also co-leads the Research Interest Group on Machine Learning and Dynamical Systems at the Alan Turing Institute. Broadly speaking, his research is at the interface of Machine Learning and Dynamical Systems.