Hilbert Space Kernel Methods for Machine Learning: Background and Foundations


QuantUniversity 2021 Winter School lecture
www.quantuniversity.com

With Daniel Duffy from Datasim Education BV & Jean-Marc Mercier from MPG Partners

In the first part of this talk, Daniel will give an overview of RKHS (Reproducing Kernel Hilbert Space) methods and some of their applications to statistics and machine learning. These methods have several attractive properties, such as solid mathematical foundations, computational efficiency and versatility, when compared to earlier machine learning methods such as artificial neural networks (ANNs). We can draw on the full power of (applied) functional analysis to give sharper a priori error estimates for classification and regression problems, and we have access to approaches driven by partial differential equations. We discuss how RKHS methods subsume and improve on traditional machine learning methods, and we discuss their advantages for two-sample problems for distributions, Support Vector Estimation and Regression Estimation.
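The regression setting mentioned above can be made concrete with kernel ridge regression, the textbook RKHS method: by the representer theorem, the minimizer of the regularized least-squares problem is a finite combination of kernel sections centred at the training points. The sketch below (not from the talk; a minimal NumPy illustration with a Gaussian kernel, where all names are ours) shows the idea:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_krr(X, y, lam=1e-3, sigma=1.0):
    # Representer theorem: the RKHS minimizer is f(.) = sum_i alpha_i k(x_i, .),
    # with coefficients solving the linear system (K + lam * n * I) alpha = y.
    n = len(X)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict_krr(X_train, alpha, X_new, sigma=1.0):
    # Evaluate f at new points via the kernel expansion.
    return gaussian_kernel(X_new, X_train, sigma) @ alpha
```

The a priori error estimates referred to in the talk bound the generalization error of exactly this kind of estimator in terms of the RKHS norm of the target and the regularization parameter.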

Jean-Marc will then present and discuss a Python library called codpy (curse of dimensionality - for Python), an application-oriented library supporting Support Vector Machines (SVMs) and implementing RKHS methods, providing tools for machine learning, statistical learning and numerical simulation. Over the last five years this library has served the internal algorithmic needs of his company, as the main tool and ingredient of proof-of-concept projects for institutional clients. He will also present a benchmark of this library against a more traditional neural network approach for two important, sometimes critical, classes of applications. The first is classification, illustrated with the benchmark MNIST pattern-recognition problem; the second is statistical learning, for which he will compare both approaches on methods computing conditional expectations.
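The conditional-expectation task in the second benchmark can be illustrated with the classical Nadaraya-Watson kernel estimator (this is a generic sketch, not codpy's actual API, and the function name and parameters are our own): it approximates E[Y | X = x] as a kernel-weighted average of the observed responses.

```python
import numpy as np

def nw_conditional_expectation(X, Y, x_query, sigma=0.1):
    # Nadaraya-Watson estimator:
    #   E[Y | X = x] ~= sum_i k(x, x_i) * y_i / sum_i k(x, x_i)
    # with a Gaussian kernel of bandwidth sigma.
    w = np.exp(-((x_query[:, None] - X[None, :]) ** 2) / (2.0 * sigma ** 2))
    return (w @ Y) / w.sum(axis=1)
```

A neural network regressor fitted to (X, Y) pairs estimates the same conditional expectation implicitly, which is what makes a head-to-head benchmark of the two approaches meaningful.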
