Machine learning with kernel methods, Spring 2021

Julien Mairal and Jean-Philippe Vert

MSc Mathematics, Vision, Machine Learning (MVA) (ENS Paris Saclay)

MSc Mathematics, Machine Learning, and the Humanities (MASH) (Dauphine, PSL University)


Slides

Slides are frequently updated. Please let us know if you spot typos!

Videos

Outline

Many problems in real-world applications of machine learning can be formalized as classical statistical problems, e.g., pattern recognition, regression or dimension reduction, with the caveat that the data are often not vectors of numbers. For example, protein sequences and structures in computational biology, text and XML documents in web mining, segmented pictures in image processing, or time series in speech recognition and finance, have particular structures which contain relevant information for the statistical problem but can hardly be encoded into finite-dimensional vector representations.

Kernel methods are a class of algorithms well suited for such problems. Indeed, they extend the applicability of many statistical methods initially designed for vectors to virtually any type of data, without the need for explicit vectorization of the data. The price to pay for this extension to non-vectors is the need to define a so-called positive definite kernel function between the objects, formally equivalent to an implicit vectorization of the data. The "art" of kernel design for various objects has witnessed important advances in recent years, resulting in state-of-the-art algorithms and successful applications in many domains.
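The following minimal sketch (not part of the course materials) illustrates the idea of the previous paragraph: rather than vectorizing strings explicitly, we define a positive definite kernel between them and pass the resulting Gram matrix to a standard learning algorithm. The 3-mer "spectrum" kernel and the toy sequences below are illustrative choices, and scikit-learn's precomputed-kernel SVM is just one possible backend.

```python
# A simple string kernel fed to a standard SVM via a precomputed Gram matrix.
from collections import Counter
import numpy as np
from sklearn.svm import SVC

def spectrum_kernel(s, t, k=3):
    """Count common k-mers between two strings (a simple p.d. string kernel)."""
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[m] * ct[m] for m in cs)

# Toy DNA-like sequences with binary labels (illustrative data).
seqs = ["ACGTACGTAC", "ACGTTGCAAC", "TTTTGGGGCC", "TTGGGGCCTT"]
y = np.array([0, 0, 1, 1])

# Gram matrix K[i, j] = k(x_i, x_j): the "implicit vectorization" of the data.
K = np.array([[spectrum_kernel(s, t) for t in seqs] for s in seqs], dtype=float)

clf = SVC(kernel="precomputed").fit(K, y)
print(clf.predict(K))  # predictions on the training sequences
```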

The goal of this course is to present the mathematical foundations of kernel methods, as well as the main approaches that have emerged so far in kernel design. We will start with a presentation of the theory of positive definite kernels and reproducing kernel Hilbert spaces, which will allow us to introduce several kernel methods including kernel principal component analysis and support vector machines. Then we will come back to the problem of defining the kernel. We will present the main results about Mercer kernels and semigroup kernels, as well as a few examples of kernels for strings and graphs, taken from applications in computational biology, text processing and image analysis. Finally, we will touch upon topics of active research, such as large-scale kernel methods and deep kernel machines.
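As a small taste of the methods covered in the lectures, here is a hedged numpy sketch of kernel ridge regression with a Gaussian (RBF) kernel on toy 1-D data; the data, bandwidth sigma and regularization parameter lam are assumptions for illustration. By the representer theorem, the solution can be written as f(x) = sum_i alpha_i k(x_i, x), where alpha solves a linear system involving the Gram matrix.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)

lam = 0.1
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)   # (K + lam I) alpha = y

X_test = np.linspace(-3, 3, 5)[:, None]
y_pred = rbf_kernel(X_test, X) @ alpha                 # f(x) = sum_i alpha_i k(x, x_i)
print(y_pred)
```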

References

Schedule and organization

Date | Lecturer | Topic | Slides | Video | Exercises | More material
Jan 13 | JM+JPV | Kernels, RKHS, examples | 1-41 | Lecture 1, Lecture 2 | Homework 1 | Uniqueness of the RKHS; Aronszajn's theorem
Jan 20 | JM | Smoothness functional, kernel trick, representer theorem | 42-82 | Lecture 3, Lecture 4, Lecture 5 | |
Jan 27 | JPV | Kernel ridge and logistic regression | 83-114 | Lecture 6 | Homework 2 (draft solution) |
Feb 3 | JM | Large-margin classifiers, SVMs | 115-165 | Lecture 7, Lecture 8 | | Bartlett et al. 2003
Feb 10 | JPV | Unsupervised learning, kernel PCA, K-means, CCA | 166-202 | Lecture 9, Lecture 10 | Homework 3 |
Feb 17 | Break | | | | |
Feb 24 | JM | Green, Mercer, Herglotz, Bochner and friends | 203-310 | Lecture 11a, Lecture 11b, Lecture 11d | | Lecture 11c
Mar 3 | JPV | Kernels for graphs, kernels on graphs | 436-549 | Lecture 12a, Lecture 12b | Homework 4 |
Mar 10 | JPV | MKL, large-scale learning with kernels | 551-621 | Lecture 13a, Lecture 13 | |
Mar 17 | JM | Deep kernel machines | 630-714 | Lecture 14a | Final Homework | Lecture 14b and Lecture 14c

Evaluation

The final grade will be a weighted average of a data challenge (40%), a final homework (40%), and weekly quizzes (20%).

