The course addresses fundamental questions in quantitatively defining, evaluating, and estimating information. What is information? How many bits do we need to store some data? How many bits do we need to communicate some messages? How many bits do we need to produce some strings? How many bits do we need to learn some concepts? How many bits do we need to implement some formulas? The course answers these questions in a broad and connected manner, and gives students some starting points for understanding important areas like compression, error-correcting codes, machine learning, and neural computation.
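As a concrete taste of the "how many bits to store" question, here is a minimal sketch (illustrative only, not part of the course materials) that computes the empirical Shannon entropy of a string, i.e. the average number of bits per symbol needed to encode it:

```python
# Illustrative sketch: Shannon entropy H = -sum_x p(x) log2 p(x),
# the average number of bits per symbol needed to store the data.
from collections import Counter
from math import log2

def entropy_bits(data: str) -> float:
    """Empirical Shannon entropy of a string, in bits per symbol."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A fair-coin sequence needs 1 bit per symbol on average:
print(entropy_bits("HTHT"))  # 1.0
# A constant sequence carries no information:
print(entropy_bits("AAAA"))  # 0.0
```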
| date | syllabus | todo/done | suggested reading |
|------|----------|-----------|-------------------|
| 09/10 | course introduction; what is information? | | |
| 09/17 | information in storage: deterministic and probabilistic information | | |
| 09/24 | information in storage: entropy | | |
| 10/01 | information in storage: Shannon's first theorem | homework 1 released | |
| 10/08 | information in storage: source coding | | |
| 10/15 | information in storage: Huffman coding | | |
| 10/22 | information in storage: scalar quantization | | |
| 10/29 | information in storage: mutual information | | |
| 11/05 | information in storage: vector quantization | homework 1 due; homework 2 released | |
| 11/12 | information in storage: rate-distortion tradeoff | | |
| 11/19 | no class as instructor attends ACML | | |
| 11/26 | information in communication: channel capacity | | |
| 12/03 | information in computation: computational compression | homework 3 released | |
| 12/10 | information in computation: Kolmogorov complexity | homework 2 due | |
| 12/17 | information in computation: algorithmic entropy | | |
| 12/24 | information in computation/learning: universal probability | | |
| 12/31 | information in learning: Solomonoff inference and PAC | homework -log2 sqrt(Omega) due | |
| 01/07 | information in learning: more about PAC; summary | | |
| 01/14 | winter vacation begins (really?) | homework 3 due | |
Last updated at CST 13:08, October 04, 2023. Please feel free to contact me.