Many problems in real-world applications of machine learning can be formalized as classical statistical problems, e.g., pattern recognition, regression or dimension reduction, with the caveat that the data are often not vectors of numbers. For example, protein sequences and structures in computational biology, text and XML documents in web mining, segmented pictures in image processing, and time series in speech recognition and finance all have particular structures that contain information relevant to the statistical problem but are hard to encode as finite-dimensional vectors.
Kernel methods are a class of algorithms well suited to such problems. They extend the applicability of many statistical methods initially designed for vectors to virtually any type of data, without requiring an explicit vectorization of the data. The price to pay for this extension to non-vectors is the need to define a so-called positive definite kernel function between the objects, which is formally equivalent to an implicit vectorization of the data. The "art" of kernel design for various objects has seen important advances in recent years, resulting in state-of-the-art algorithms and successful applications in many domains.
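To make the idea of an implicit vectorization concrete, here is a minimal sketch (illustrative only, not part of the course material): it builds the Gram matrix of a Gaussian RBF kernel on random toy vectors with NumPy and checks numerically that the matrix is positive semidefinite, which is exactly what the positive definiteness of the kernel guarantees. The bandwidth `sigma` and the toy data are arbitrary choices.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    """Gaussian RBF kernel k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))."""
    # Squared Euclidean distances between all pairs of rows of X and Y.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))   # 20 toy points in R^5
K = rbf_kernel(X, X)           # 20 x 20 Gram matrix

# A positive definite kernel always yields a positive semidefinite Gram
# matrix: all eigenvalues are >= 0 (up to numerical error).
print(np.linalg.eigvalsh(K).min() >= -1e-10)  # True
```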
The goal of this course is to present the mathematical foundations of kernel methods, as well as the main approaches that have emerged so far in kernel design. We will start with a presentation of the theory of positive definite kernels and reproducing kernel Hilbert spaces, which will allow us to introduce several kernel methods including kernel principal component analysis and support vector machines. We will then return to the problem of defining the kernel itself. We will present the main results about Mercer kernels and semigroup kernels, as well as a few examples of kernels for strings and graphs, taken from applications in computational biology, text processing and image analysis. Finally, we will touch upon topics of active research, such as large-scale kernel methods and deep kernel machines.
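As a small preview of how a kernel lets standard algorithms run on non-vector data (a toy sketch under simplifying assumptions: the 2-mer spectrum kernel, the synthetic sequences and the labels are made up for illustration; the course presents more principled string kernels), the example below computes a kernel directly on strings and feeds it to scikit-learn's SVC through its precomputed-kernel interface.

```python
import numpy as np
from collections import Counter
from sklearn.svm import SVC

def spectrum_kernel(s, t, k=2):
    """Toy k-spectrum kernel: inner product of k-mer count vectors."""
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[m] * ct[m] for m in cs)

def gram(A, B, k=2):
    """Gram matrix between two lists of strings."""
    return np.array([[spectrum_kernel(a, b, k) for b in B] for a in A], dtype=float)

# Tiny synthetic sequences with two obvious classes (AC-rich vs GT-rich).
train = ["ACACAC", "ACACGT", "GTGTGT", "GTGTAC"]
y = [0, 0, 1, 1]
test = ["ACACAA", "GTGTGG"]

# The SVM never sees a vector representation of the strings, only the kernel.
clf = SVC(kernel="precomputed").fit(gram(train, train), y)
print(clf.predict(gram(test, train)))  # expected: [0 1]
```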
Lectures (in English) usually take place at ENS Paris, 29 rue d'Ulm, 75005 Paris, in amphi Jaures, 1:30-4pm.
On Feb 14, the lecture will take place at Institut Curie, 12 rue Lhomond, 75005 Paris, in amphi Burg, 1:30-4pm.
WARNING: DATE CHANGE FOR THE EXAM: we had to move the exam from March 21 to March 28 because no large room was available on March 21. If you cannot attend the March 28 exam, let us know ASAP.
| Date | Place | Lecturer | Topic | Slides | More material |
|------|-------|----------|-------|--------|---------------|
| Jan 17 | ENS Ulm, amphi Jaures | JPV | Positive definite kernels, RKHS, Aronszajn's theorem | 1-38 | Uniqueness of the RKHS; Aronszajn's theorem |
| Jan 24 | ENS Ulm, amphi Jaures | JM | Kernel trick, representer theorem, kernel ridge regression | 39-97 | |
| Jan 31 | ENS Ulm, amphi Jaures | JPV | Supervised classification, kernel logistic regression, large-margin classifiers, SVM | 98-159 | |
| Feb 7 | ENS Ulm, amphi Jaures | JM | Unsupervised analysis, kernel PCA, kernel CCA, kernel k-means | 162-198 | |
| Feb 14 | Institut Curie, amphi Burg | JPV | Green, Mercer, Herglotz and Bochner kernels | 199-273 | |
| Feb 21 | ENS Ulm, amphi Jaures | JM | Kernels from generative models, string kernels | 294-370 | review paper on string kernels |
| Feb 28 | | | Break (no lecture) | | |
| Mar 7 | ENS Ulm, amphi Jaures | JM | Large-scale kernel machines, deep kernel learning | 538-639 | |
| Mar 14 | ENS Ulm, amphi Jaures | JPV | Graph kernels, kernels on graphs | | |
| Mar 21 | MINES ParisTech, Room V112 | | Final exam, 1:30pm-3:30pm (only if you cannot attend the March 28 exam; in that case, contact JP Vert to explain why you cannot attend on March 28; you can only take one exam) | | |
| Mar 28 | MINES ParisTech, Room L108 | | Final exam, 2pm-4pm | | |