ICML 2002. Theory Slides, Implementation Slides

Alex Smola, RSISE, Machine Learning Group, Australian National University, Canberra

## Abstract

The tutorial will introduce Gaussian Processes for both Classification and Regression. This includes a brief presentation of covariance functions, their connection to Support Vector Kernels, and an overview of recent optimization methods for Gaussian Processes.

Target Audience: Both novices and researchers already familiar with Gaussian Processes will benefit from the presentation. While self-contained, i.e., requiring little more than basic calculus and linear algebra, the presentation will advance to state-of-the-art results in optimization and adaptive inference. This means the course will cater for graduate students and senior researchers alike. In particular, I will not assume knowledge beyond undergraduate mathematics (see Prerequisites for further detail).

Expected Knowledge Gain: a working knowledge of Gaussian Processes which will enable the audience to apply Bayesian inference methods in their research without much further training.

## Prerequisites

Nothing beyond undergraduate knowledge in mathematics is expected. More specifically, I assume:

- Basic linear algebra (matrix inverse, eigenvector, eigenvalue, etc.)
- Some numerical mathematics (beneficial but not required), such as matrix factorization, conditioning, etc.
- Basic statistics and probability theory (Normal distribution, conditional distributions).
- (OPTIONAL:) Some knowledge in Bayesian methods
- (OPTIONAL:) Some knowledge in kernel methods
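These prerequisites come together in the basic Gaussian Process regression computation, where the posterior at test inputs is obtained by conditioning a joint Normal distribution on the observations. The sketch below (not from the tutorial itself; the RBF kernel, toy data, and all variable names are illustrative assumptions) shows the standard posterior mean and variance formulas via a Cholesky factorization:

```python
import numpy as np

def rbf(a, b, ell=1.0):
    # Squared-exponential (RBF) covariance function; ell is a length scale
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# Toy training data: noisy samples of sin(x) (purely illustrative)
rng = np.random.default_rng(0)
X = np.linspace(0.0, 2.0 * np.pi, 8)
y = np.sin(X) + 0.1 * rng.standard_normal(8)
Xs = np.linspace(0.0, 2.0 * np.pi, 50)      # test inputs

noise = 0.1 ** 2
K = rbf(X, X) + noise * np.eye(len(X))      # training covariance + noise
Ks = rbf(Xs, X)                             # test/train cross-covariance
Kss = rbf(Xs, Xs)                           # test covariance

# Conditioning a joint Gaussian on the observations gives
#   mean = Ks K^{-1} y,   cov = Kss - Ks K^{-1} Ks^T
L = np.linalg.cholesky(K)                   # factorize once for stable solves
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
mean = Ks @ alpha
v = np.linalg.solve(L, Ks.T)
cov = Kss - v.T @ v
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
```

The Cholesky-based solve is the standard numerically stable alternative to forming `K^{-1}` explicitly, which is why matrix factorization and conditioning appear among the prerequisites above.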