Hong Kong, ICONIP’06, October 3
Alex Smola, RSISE, Machine Learning Group, Australian National University, Canberra
- Lecture 1: Exponential Families
We introduce exponential families and show how they can be used for modelling a large range of distributions important for supervised learning. In particular, we will discuss multinomial and Gaussian families. Moreover, we show how optimization problems are solved in the case of normal priors. Finally, we discuss connections to graphical models and message passing (see the first sketch after this list).
- Lecture 2: Conditional Models
By conditioning on location we extend exponential family models into state-of-the-art multiclass classification and regression estimators (see the second sketch below). In addition, we will discuss conditional random fields, which are used for document annotation and named-entity tagging.
- Lecture 3: Maximum Mean Discrepancy
Operator methods are useful for testing whether two distributions are identical. We will discuss a very simple and easily implementable criterion for such tests (see the third sketch below). Applications to data integration are discussed. We also discuss applications to covariate shift correction, that is, cases where training and test sets are drawn from different distributions.
- Lecture 4: Dependency Estimation
In a similar fashion to the two-sample test above, we can also use operator methods for dependency tests. More specifically, we can use them to obtain contrast functions for independent component analysis and feature selection. We will discuss simple algorithms which achieve this goal (see the final sketch below).
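As a concrete reference for Lecture 1, here is a minimal sketch, not part of the original abstract, of the exponential family form p(x; theta) = exp(<phi(x), theta> - g(theta)), instantiated for the univariate Gaussian with sufficient statistics phi(x) = (x, x^2). The function names and the numerical example are illustrative choices.

```python
# Minimal sketch: the univariate Gaussian as an exponential family,
# p(x; theta) = exp(<phi(x), theta> - g(theta) - 0.5 log(2 pi)),
# with phi(x) = (x, x^2). Names and numbers are illustrative.
import numpy as np

def gaussian_natural_params(mu, sigma2):
    """Map (mean, variance) to natural parameters (mu/sigma2, -1/(2 sigma2))."""
    return np.array([mu / sigma2, -1.0 / (2.0 * sigma2)])

def log_partition(theta):
    """Log-partition g(theta) = -theta1^2/(4 theta2) - 0.5 log(-2 theta2)."""
    t1, t2 = theta
    return -t1 ** 2 / (4.0 * t2) - 0.5 * np.log(-2.0 * t2)

def log_density(x, theta):
    """log p(x; theta) = <phi(x), theta> - g(theta) - 0.5 log(2 pi)."""
    phi = np.array([x, x ** 2])
    return phi @ theta - log_partition(theta) - 0.5 * np.log(2.0 * np.pi)

theta = gaussian_natural_params(mu=1.0, sigma2=4.0)
print(np.exp(log_density(1.0, theta)))  # about 0.1995, i.e. 1/sqrt(2 pi * 4)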
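For Lecture 2, the following sketch shows one standard instance of a conditional exponential family, multiclass logistic regression with p(y|x; W) proportional to exp(w_y . x), fitted by gradient descent on the negative log-likelihood penalized by a normal prior on W. The data, step size, and regularization constant are illustrative assumptions, not values from the lectures.

```python
# Minimal sketch: a conditional exponential family for multiclass
# classification, trained by gradient descent on the penalized negative
# log-likelihood (Gaussian prior on W). Data and constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))               # inputs (the "locations")
y = (X[:, 0] + X[:, 1] > 0).astype(int)     # two linearly separable classes
W = np.zeros((2, 2))                        # one weight vector per class

lam, lr = 0.1, 0.5                          # prior strength, step size
for _ in range(200):
    scores = X @ W.T                        # <phi(x, y), theta> for each y
    scores -= scores.max(axis=1, keepdims=True)
    P = np.exp(scores)
    P /= P.sum(axis=1, keepdims=True)       # conditional probabilities p(y|x)
    onehot = np.eye(2)[y]
    grad = (P - onehot).T @ X / len(X) + lam * W  # model minus data, plus prior
    W -= lr * grad

pred = (X @ W.T).argmax(axis=1)
print((pred == y).mean())                   # training accuracy, close to 1.0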
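For Lecture 3, a simple and easily implementable criterion of the kind described is the (biased) maximum mean discrepancy estimate with a Gaussian RBF kernel. The bandwidth and sample sizes below are arbitrary choices for illustration.

```python
# Minimal sketch: biased MMD^2 estimate with a Gaussian RBF kernel,
# MMD^2 = mean k(x,x') + mean k(y,y') - 2 mean k(x,y).
# Bandwidth and sample sizes are illustrative assumptions.
import numpy as np

def rbf(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd2(X, Y, sigma=1.0):
    return (rbf(X, X, sigma).mean() + rbf(Y, Y, sigma).mean()
            - 2 * rbf(X, Y, sigma).mean())

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 1))
Y_same = rng.normal(0.0, 1.0, size=(500, 1))
Y_shift = rng.normal(1.0, 1.0, size=(500, 1))
print(mmd2(X, Y_same))   # close to 0: same distribution
print(mmd2(X, Y_shift))  # clearly positive: distributions differ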
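For Lecture 4, the sketch below computes a kernel dependence measure of the kind usable as a contrast function, here the biased estimate of the Hilbert-Schmidt Independence Criterion, HSIC = trace(KHLH)/n^2 with centered Gram matrices. Kernels, bandwidths, and data are again illustrative.

```python
# Minimal sketch: biased HSIC estimate as a dependence measure between two
# samples, HSIC = trace(K H L H) / n^2. Kernels and data are illustrative.
import numpy as np

def rbf_gram(Z, sigma=1.0):
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    n = len(X)
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    K, L = rbf_gram(X, sigma), rbf_gram(Y, sigma)
    return np.trace(K @ H @ L @ H) / n ** 2

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 1))
print(hsic(X, rng.normal(size=(300, 1))))                   # independent: near 0
print(hsic(X, X ** 2 + 0.1 * rng.normal(size=(300, 1))))    # dependent: larger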
Prerequisites
Nothing beyond undergraduate mathematics is expected. More specifically, I assume:
- Basic linear algebra (matrix inverse, eigenvector, eigenvalue, etc.)
- Some numerical mathematics (beneficial but not required), such as matrix factorization, conditioning, etc.
- Basic statistics and probability theory (Normal distribution, conditional distributions).