This seminar series will consist of several parts:

Learning Seminars:
We will be learning and understanding Reproducing Kernel Hilbert Spaces (RKHS); we also aim to equip ourselves and our students with the foundations and background needed for the applications that follow.

The Learning Seminars will mostly be based on A Primer on Reproducing Kernel Hilbert Spaces by Jonathan H. Manton and Pierre-Olivier Amblard (Manton & Amblard, 2015). We hope to start this part of the seminar series in early March 2024.

Expert Talks:
We will invite researchers and experts on RKHS to give talks. These talks may be theoretical or may address practical applications of RKHS in various fields, including machine learning.

The purpose of this seminar series is to introduce the theory and applications of RKHS in connection with machine learning.

Below is a list of possible contributors to this seminar series:

  • Aydın Aytuna (Department of Mathematics, Sabancı University and Institute of Applied Mathematics, METU)
  • Azize Hayfavi (Emeritus, Institute of Applied Mathematics, METU)
  • Bülent Karasözen (Emeritus, Department of Mathematics and Institute of Applied Mathematics, METU)
  • Baver Okutmuştur (Department of Mathematics, METU)
  • Ömür Uğur (Institute of Applied Mathematics, METU)

We wish to thank the entire audience of this seminar series!


Brief Info / History of RKHS

A Reproducing Kernel Hilbert Space (RKHS) is a Hilbert space of functions equipped with a reproducing kernel. The concept was introduced by Aronszajn (1950). RKHS remained largely within pure mathematics until it was first used in machine learning through the introduction of the kernel Support Vector Machine (SVM) (Boser et al., 1992).
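
For readers new to the subject, the defining property can be sketched as follows; this is the standard reproducing property, stated here informally in the usual notation (see Aronszajn, 1950; Manton & Amblard, 2015):

```latex
% Reproducing property of an RKHS \mathcal{H} of functions on a set X
% with kernel k : X \times X \to \mathbb{R} (standard definition, for reference).
\[
  k(\cdot, x) \in \mathcal{H} \quad \text{for every } x \in X,
  \qquad
  f(x) = \langle f,\, k(\cdot, x) \rangle_{\mathcal{H}}
  \quad \text{for every } f \in \mathcal{H} \text{ and } x \in X.
\]
```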

Currently, RKHS is used frequently in machine learning for nonlinear classification and dimensionality reduction (Ghojogh et al., 2023), model order reduction (Fujii & Kawahara, 2019), and nonlinear signal processing (Rojo-Álvarez et al., 2018).
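
As a small illustration of how an RKHS enters nonlinear classification in practice, the sketch below fits a kernel SVM on a toy dataset. This is only a minimal example assuming scikit-learn is available; the dataset, kernel, and hyperparameters are illustrative choices and are not taken from the references above.

```python
# Minimal sketch of nonlinear classification with a kernel SVM.
# Assumes scikit-learn is installed; data and hyperparameters are illustrative.
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# A toy two-class dataset that is not linearly separable in the input space.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

# The RBF (Gaussian) kernel corresponds to an RKHS; a linear separator in that
# space yields a nonlinear decision boundary in the original input space.
clf = SVC(kernel="rbf", gamma=1.0, C=1.0)
clf.fit(X, y)

print("Training accuracy:", clf.score(X, y))
```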

Furthermore, RKHS has many applications in linear equations and optimisation (Manton & Amblard, 2015), inverse problems (Yun & Panaretos, 2023), finance (LeFloch & Mercier, 2023), and statistics (Berlinet & Thomas-Agnan, 2004), as well as stochastic processes (Manton & Amblard, 2015).

Bibliography

  1. Aronszajn, N. (1950). Theory of reproducing kernels. Transactions of the American Mathematical Society, 68, 337–404. https://doi.org/10.2307/1990404
  2. Boser, B. E., Guyon, I. M., & Vapnik, V. N. (1992). A Training Algorithm for Optimal Margin Classifiers. Proceedings of the Fifth Annual Workshop on Computational Learning Theory, 144–152. https://doi.org/10.1145/130385.130401
  3. Ghojogh, B., Crowley, M., Karray, F., & Ghodsi, A. (2023). Elements of Dimensionality Reduction and Manifold Learning. Springer International Publishing. https://doi.org/10.1007/978-3-031-10602-6
  4. Fujii, K., & Kawahara, Y. (2019). Dynamic mode decomposition in vector-valued reproducing kernel Hilbert spaces for extracting dynamical structure among observables. Neural Networks, 117, 94–103. https://doi.org/10.1016/j.neunet.2019.04.020
  5. Rojo-Álvarez, J. L., Martínez-Ramón, M., Muñoz-Marí, J., & Camps-Valls, G. (2018). Kernel Functions and Reproducing Kernel Hilbert Spaces. In Digital Signal Processing with Kernel Methods (pp. 165–207). John Wiley & Sons, Ltd. https://doi.org/10.1002/9781118705810.ch4
  6. Manton, J. H., & Amblard, P.-O. (2015). A Primer on Reproducing Kernel Hilbert Spaces. Foundations and Trends in Signal Processing. https://doi.org/10.1561/2000000050
  7. Yun, H., & Panaretos, V. M. (2023). Computerized Tomography and Reproducing Kernels. https://doi.org/10.48550/arXiv.2311.07465
  8. LeFloch, P. G., & Mercier, J.-M. (2023). A Class of Mesh-Free Algorithms for Some Problems Arising in Finance and Machine Learning. Journal of Scientific Computing, 95(3), 75. https://doi.org/10.1007/s10915-023-02179-5
  9. Berlinet, A., & Thomas-Agnan, C. (2004). Reproducing Kernel Hilbert Spaces in Probability and Statistics. Springer Science+Business Media. https://doi.org/10.1007/978-1-4419-9096-9