Learning Theory

Introduction to Machine Learning - 10-701/15-781


    • Application of McDiarmid's inequality

    • Infinitely many hypotheses

    • Uniform convergence bounds

    • PAC bound for estimation error

    • Structural risk minimization
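As a pointer to the type of result in the topics above, the standard uniform convergence bound for a finite hypothesis class (a consequence of Hoeffding's inequality and a union bound; the notation here is assumed, not taken from the slides) can be stated as follows. For a finite class \(\mathcal{H}\) and \(m\) i.i.d. samples, with probability at least \(1-\delta\):

```latex
% Finite-class uniform convergence bound (Hoeffding + union bound).
% R(h): true risk, \hat{R}_m(h): empirical risk on m i.i.d. samples.
\forall h \in \mathcal{H}:\quad
\left| R(h) - \hat{R}_m(h) \right|
  \le \sqrt{\frac{\ln\!\left(2|\mathcal{H}|/\delta\right)}{2m}}
```

The estimation-error PAC bound listed above follows by applying this simultaneously to the empirical risk minimizer and the best hypothesis in the class.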

  • Vapnik-Chervonenkis dimension

    • Shattering

    • Growth Function

    • Sauer's lemma

    • Vapnik-Chervonenkis inequality

    • Vapnik-Chervonenkis theorem
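The notions of shattering and VC dimension listed above can be made concrete with a small brute-force check. The sketch below (illustrative only; the hypothesis class of one-sided thresholds and the names `shatters`, `thresholds` are not from the slides) verifies that thresholds on the real line shatter one point but not two, so their VC dimension is 1:

```python
def shatters(points, hypotheses):
    """Return True if the hypothesis set shatters the points, i.e.
    every one of the 2^n possible labelings is realized by some hypothesis."""
    labelings = {tuple(h(x) for x in points) for h in hypotheses}
    return len(labelings) == 2 ** len(points)

# Illustrative class: one-sided thresholds h_t(x) = 1[x >= t] on a grid.
thresholds = [lambda x, t=t: int(x >= t) for t in range(0, 11)]

# A single point is shattered: both labels 0 and 1 are achievable.
print(shatters([5], thresholds))      # -> True

# Two points x1 < x2 are not: the labeling (1, 0) is impossible for
# monotone thresholds, so the VC dimension of this class is 1.
print(shatters([3, 7], thresholds))   # -> False
```

The same exhaustive check, maximized over point sets of size m, yields the growth function, whose polynomial bound in terms of the VC dimension is the content of Sauer's lemma.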

Supplementary material

  • Slides in PDF.

  • L. Devroye, L. Gyorfi, G. Lugosi: A Probabilistic Theory of Pattern Recognition. Springer, New York, 1996.

  • L. Gyorfi (editor): Principles of Nonparametric Learning. Springer-Verlag Wien New York, 2002.

  • L. Gyorfi, M. Kohler, A. Krzyzak, H. Walk: A Distribution-Free Theory of Nonparametric Regression. Springer-Verlag, New York, 2002.

  • A. Tsybakov: Introduction to Nonparametric Estimation. Springer Series in Statistics, 2008.

  • Robert D. Nowak: Lecture notes on Statistical Learning Theory.


This is unedited video straight from a Lumix GF2 with a 20mm lens, which should explain the sound quality (the camera does not have a dedicated audio input). It should nonetheless serve as a useful supplement to the slides (YouTube typically makes the 1080i version available within a week of the upload).