Basic Statistics

Introduction to Machine Learning - 10-701/15-781

Content

  • Probabilities

    • Dependence, independence, conditional probabilities

    • Bayes rule and chain rule

    • Paradoxes in measure theory

  • Parameter estimation

    • Maximum Likelihood Estimation (MLE)

    • Maximum a Posteriori Estimation (MAP)

  • Application I - Naive Bayes for spam filtering (see the first code sketch after this list)

    • Discrete features in Naive Bayes

    • Estimating parameters

    • Finite sample size problems

  • Application II - Naive Bayes for fMRI data processing (see the second code sketch after this list)

    • Continuous features in Naive Bayes
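
To make the outline above concrete, here is a minimal sketch (not taken from the lecture itself) of a Bernoulli Naive Bayes spam filter on hypothetical toy data; the function names and word list are made up for illustration. The `alpha` parameter switches between the MLE (alpha=0, raw counts) and add-one smoothing (alpha=1), which is the MAP estimate under a Beta(2, 2) prior and avoids the zero-probability problem that arises with finite sample sizes.

```python
# A minimal sketch of a Bernoulli Naive Bayes spam filter (toy example).
# Feature vectors are binary word-presence indicators; parameters are
# estimated by MLE (raw counts) or by a MAP estimate with a Beta(2, 2)
# prior, i.e. Laplace (add-one) smoothing, which avoids zero
# probabilities when a word never appears in one class.
import numpy as np

def fit_bernoulli_nb(X, y, alpha=1.0):
    """Estimate class priors and per-class word probabilities.

    alpha=0 gives the MLE; alpha=1 gives add-one (Laplace) smoothing,
    the MAP estimate under a Beta(2, 2) prior on each word probability.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    classes = np.unique(y)
    priors = np.array([(y == c).mean() for c in classes])
    # theta[c, j] = P(word j present | class c)
    theta = np.array([
        (X[y == c].sum(axis=0) + alpha) / ((y == c).sum() + 2 * alpha)
        for c in classes
    ])
    return classes, priors, theta

def predict(X, classes, priors, theta):
    """Pick the class maximizing log P(c) + sum_j log P(x_j | c)."""
    X = np.asarray(X, dtype=float)
    log_lik = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
    return classes[np.argmax(np.log(priors) + log_lik, axis=1)]

if __name__ == "__main__":
    # Toy data: columns = presence of the words ["viagra", "meeting", "free"].
    X_train = [[1, 0, 1],   # spam
               [1, 0, 0],   # spam
               [0, 1, 0],   # ham
               [0, 1, 1]]   # ham
    y_train = [1, 1, 0, 0]
    model = fit_bernoulli_nb(X_train, y_train, alpha=1.0)
    print(predict([[1, 0, 0], [0, 1, 1]], *model))   # expect [1 0]
```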

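For the continuous-feature case of Application II, here is a similar sketch of Gaussian Naive Bayes, again under assumptions of my own: synthetic two-dimensional data stands in for voxel activations, and each feature is modeled as a class-conditional Gaussian whose mean and variance are fit by MLE.

```python
# A minimal sketch of Gaussian Naive Bayes for continuous features,
# in the spirit of the fMRI application: each feature (e.g. a voxel
# activation) is modeled as a class-conditional Gaussian whose mean
# and variance are estimated by MLE.
import numpy as np

def fit_gaussian_nb(X, y, var_floor=1e-9):
    """MLE of per-class feature means and variances plus class priors."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    classes = np.unique(y)
    priors = np.array([(y == c).mean() for c in classes])
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    variances = np.array([X[y == c].var(axis=0) + var_floor for c in classes])
    return classes, priors, means, variances

def predict(X, classes, priors, means, variances):
    """Choose the class maximizing the log posterior under the naive model."""
    X = np.asarray(X, dtype=float)
    # log N(x_j; mu_cj, sigma_cj^2), summed over features j for each class c
    log_lik = np.stack([
        -0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var).sum(axis=1)
        for mu, var in zip(means, variances)
    ], axis=1)
    return classes[np.argmax(np.log(priors) + log_lik, axis=1)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two synthetic "voxels" whose mean activation differs between classes.
    X0 = rng.normal([0.0, 0.0], 1.0, size=(50, 2))
    X1 = rng.normal([2.0, 1.5], 1.0, size=(50, 2))
    X, y = np.vstack([X0, X1]), np.array([0] * 50 + [1] * 50)
    model = fit_gaussian_nb(X, y)
    print(predict([[0.1, -0.2], [2.2, 1.4]], *model))   # expect [0 1]
```
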
Supplementary material

  • Slides in PDF.

  • Alex Smola and S.V.N. Vishwanathan: Introduction to Machine Learning, Chapters I and II in PDF

  • Patrick Billingsley: Probability and Measure (Wiley Series in Probability and Statistics)

  • Larry Wasserman: All of Statistics: A Concise Course in Statistical Inference (Springer Texts in Statistics)

  • Tom Mitchell's 10-701 lectures (Lectures 2, 3, 4)

  • Aarti Singh's 10-701 lectures (Lectures 2, 3, 4)

  • Eric P. Xing's 10-701 lectures (Lectures 2, 3)

  • Tom Mitchell: Machine Learning, Chapter I in PDF

  • Andrew Moore's Basic Probability Tutorial slides in PDF

Videos

This is unedited video straight from a Lumix GF2 with a 14-42mm kit lens, which should explain the sound quality (the camera doesn't have a dedicated audio input) … but it should still be useful as a supplement to the slides.