Probabilistic Graphical Model (Fall 2012, 3 credits)
Instructor: Prof. Shou-de Lin (sdlin@csie.ntu.edu.tw)
Classroom: CSIE 101
Meeting Time: Tu 2:20-5:20 pm
Office Hour: Tu after class or by appointment
TAs: Chung-yi Li (r00922051@csie.ntu.edu.tw), Ting-wei Lin (b97083@csie.ntu.edu.tw), En-hsu Yen (a061105@gmail.com)
Course Description:
Real-world events are full of uncertainty, and probability theory gives us a way to model it. However, probability theory by itself can hardly be exploited to deal with large-scale real-world problems that involve many correlated variables. Thanks to the development of a reasoning and inference framework called probabilistic graphical models, we are able to handle thousands of variables or more in an efficient manner. This course covers basic and advanced topics in probabilistic graphical models, including directed models such as Bayesian networks, undirected models such as Markov networks, and the corresponding inference and learning methods such as variable elimination, belief propagation, the EM algorithm, and Markov chain Monte Carlo methods.
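For a flavor of the core idea, the joint distribution of a Bayesian network factorizes into local conditional probabilities, and inference can exploit that factorization. The minimal Python sketch below (the chain network A -> B -> C and its probability tables are made-up illustrations, not course material) eliminates one variable at a time to compute a marginal:

    # Chain Bayesian network A -> B -> C with binary variables.
    # The joint factorizes as P(A,B,C) = P(A) P(B|A) P(C|B); all numbers are made up.
    P_A = {0: 0.6, 1: 0.4}
    P_B_given_A = {0: {0: 0.7, 1: 0.3},   # P_B_given_A[a][b] = P(B=b | A=a)
                   1: {0: 0.2, 1: 0.8}}
    P_C_given_B = {0: {0: 0.9, 1: 0.1},   # P_C_given_B[b][c] = P(C=c | B=b)
                   1: {0: 0.5, 1: 0.5}}

    # Variable elimination: sum out A to get a factor over B, then sum out B to get P(C).
    phi_B = {b: sum(P_A[a] * P_B_given_A[a][b] for a in (0, 1)) for b in (0, 1)}
    P_C = {c: sum(phi_B[b] * P_C_given_B[b][c] for b in (0, 1)) for c in (0, 1)}

    print(P_C)  # {0: 0.7, 1: 0.3}; the two probabilities sum to 1

Because each elimination step only touches a small local factor, the same idea scales to networks with far more variables than brute-force summation over the full joint could handle.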
Grading:
Three homework assignments (70%), done in teams of three
Final Project (30%)
Textbook:
Bayesian Reasoning and Machine Learning, David Barber, Cambridge University Press, 2012 (PDF version available online)
Probabilistic Graphical Models: Principles and Techniques, Daphne Koller and Nir Friedman (ISBN 0-262-01319-3)
Recommended Readings:
- The Elements of Statistical Learning by Trevor Hastie, Robert Tibshirani, and Jerome Friedman (ISBN 0387952845)
- Pattern Recognition and Machine Learning by Chris Bishop (ISBN 0387310738)
- Machine Learning by Tom Mitchell (ISBN 0070428077)
- The EM Algorithm and Related Statistical Models, edited by Michiko Watanabe and Kazunori Yamaguchi
Syllabus:
Date | Topic | First section (2:20-3:20) | Chapters | HW |
11-Sep | Introduction | Introduction | CH1~2 | |
18-Sep | Representation | Bayesian Networks, PGM tool | CH3 | HW1-1 |
25-Sep | Representation | BN, Case Study | CH3 | |
2-Oct | Inference | Markov Networks | CH4 | HW1-2 |
9-Oct | Inference | Exact Inference | CH5 | |
16-Oct | Inference | Exact Inference | CH6 | HW1 due, HW2-1 out |
23-Oct | Inference | Sampling as approximate inference | CH27 | HW2-2 out |
30-Oct | Learning | Deterministic Approximate Inference | CH28 | |
6-Nov | Learning | HW1 discussion, statistics for ML | CH8.6~8.8,T | HW2 due |
13-Nov | Learning | Learning as Inference (learning for BN), MLE, MAP | CH9.1~9.4 T | |
20-Nov | Learning | MLE for undirected models, Naive Bayes | CH9.6 CH10, C | HW3 out |
27-Nov | Learning | Learning with hidden variables (EM1) | CH11 | |
4-Dec | Learning | Learning with hidden variables (EM2) | CH11 | HW3 due |
11-Dec | Learning | Project Proposal | | |
18-Dec | Case study | Structure learning & Bayesian Model Selection | CH9.5, 12.1~12.5 C | |
25-Dec | Case study | Hidden Markov Model | CH23 | |
8-Jan | Final project | Final project presentation | | |