Information Theory
Content
Entropy
- Basic properties, decomposition
- Kullback-Leibler Divergence (with properties)
- Mutual Information
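
The three quantities above can be sketched for discrete distributions in a few lines; this is a minimal illustration (function names are my own, not from the lectures), using natural logarithms (nats):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p log p, in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 log 0 = 0
    return -np.sum(p * np.log(p))

def kl_divergence(p, q):
    """KL(p || q) = sum p log(p/q); assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def mutual_information(pxy):
    """I(X;Y) = KL(p(x,y) || p(x)p(y)) for a joint probability table."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y)
    return kl_divergence(pxy.ravel(), (px * py).ravel())

print(entropy([0.5, 0.5]))  # log 2, about 0.693 nats
```

Note that mutual information is written here directly as a KL divergence between the joint and the product of marginals, which is the decomposition the outline refers to.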
Examples
- Exponential families
- Gaussian mutual information as special case
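
For the bivariate Gaussian special case, the mutual information depends only on the correlation coefficient: I(X;Y) = -(1/2) log(1 - ρ²), which is zero exactly when ρ = 0 (independence) and diverges as |ρ| → 1. A small sketch:

```python
import numpy as np

def gaussian_mi(rho):
    """Mutual information (in nats) of a bivariate Gaussian
    with correlation coefficient rho: I = -1/2 log(1 - rho^2)."""
    return -0.5 * np.log(1.0 - rho ** 2)

print(gaussian_mi(0.0))  # uncorrelated Gaussians are independent: I = 0
print(gaussian_mi(0.9))  # strong correlation, large mutual information
```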
Application - cocktail party problem
- Independent Component Analysis
- JADE, RADICAL, InfoMax, and FastICA (time permitting)
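
As a toy version of the cocktail party problem, here is a minimal sketch of the FastICA fixed-point iteration (deflation scheme with a tanh nonlinearity). The two-source mixture and all variable names are my own illustration, not code from the course:

```python
import numpy as np

def fastica(X, n_iter=200, tol=1e-8, seed=0):
    """Minimal FastICA by deflation. X: (n_samples, n_signals) mixtures.
    Returns source estimates, up to sign, scale, and permutation."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)
    # Whiten: rotate to decorrelate, rescale to unit variance.
    d, E = np.linalg.eigh(np.cov(Xc, rowvar=False))
    Z = Xc @ E @ np.diag(d ** -0.5)
    n = Z.shape[1]
    W = np.zeros((n, n))
    for i in range(n):
        w = rng.normal(size=n)
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            g = np.tanh(Z @ w)
            # Fixed-point update: w <- E[z g(w'z)] - E[g'(w'z)] w
            w_new = (Z * g[:, None]).mean(axis=0) - (1.0 - g ** 2).mean() * w
            # Deflation: orthogonalize against components already found.
            w_new -= W[:i].T @ (W[:i] @ w_new)
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1.0) < tol
            w = w_new
            if converged:
                break
        W[i] = w
    return Z @ W.T

# Two "speakers" (square wave and sinusoid) recorded by two "microphones".
t = np.linspace(0, 8, 2000)
S = np.c_[np.sign(np.sin(3 * t)), np.sin(5 * t)]
A = np.array([[1.0, 0.5], [0.5, 1.0]])  # mixing matrix (the "room")
S_hat = fastica(S @ A.T)                # unmixed source estimates
```

The recovered signals match the originals only up to sign, scale, and ordering, which is the inherent ambiguity of ICA.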
## Supplementary material
Slides in PDF and Keynote coming soon. If you want to extract the equations from the slides, you can do so with LaTeXiT, simply by dragging the equation images into it.

## Videos
See the YouTube playlist for available lecture videos. This is unedited video straight from a Lumix GF2 with a 20mm lens, which should explain the sound quality (the camera doesn't have a dedicated audio input). It should still serve as a useful supplement to the slides; YouTube typically makes the 1080i version available within a week of upload.