Machine learning allows computational systems to adaptively improve their performance with experience accumulated from observed data. This course introduces the basics of learning theory, the design and analysis of learning algorithms, and some applications of machine learning.

- instructor: Hsuan-Tien Lin (htlin AT csie . ntu . edu . tw) [office hour: after classes, or by appointment]
- TAs and TA hour: ml2015ta AT csie . ntu . edu . tw
  - You-Lin Tsou (CSIE R03): Mondays 14:00--15:00 in CSIE R536
  - Hong-Min Chu (CSIE R04): Mondays 14:00--15:00 in CSIE R536
  - Liang-Wei Chen (CSIE B01): Tuesdays 10:00--11:00 in CSIE R536
  - Yao-Yuan Yang (CSIE B01): Tuesdays 10:00--11:00 in CSIE R536
  - Yu-An Chung (CSIE B01): Thursdays 13:00--14:00 in CSIE R536
  - Meng-Yuan Yang (CSIE B01): Thursdays 14:00--15:00 in CSIE BASEMENT
  - Si-An Chen (CSIE B02): Thursdays 14:00--15:00 in CSIE BASEMENT
  - Kuan-Hao Huang (CSIE R03): Fridays 10:40--11:40 in CSIE R536
  - Hsien-Chun Chiu (CSIE R04): Fridays 16:00--17:00 in CSIE R536

- Time: Mondays 10:20--12:10; Wednesdays 10:20--12:10
- Room: **CSIE Building, R103**
- Textbook: *Learning from Data*, by Yaser Abu-Mostafa, Malik Magdon-Ismail, and Hsuan-Tien Lin
- Language: **Mandarin** teaching with **English** slides
- Grading: 70% homework, 30% project (tentative)
- Full Course Website: http://ceiba.ntu.edu.tw/1041mlearn

- homework 8 announced on 1/1/2016, due on 1/20/2016
- homework 7 announced on 12/25/2015, due on 1/6/2016
- final project announced on 11/26/2015 (lucky again), due on 01/20/2016
- homework 6 announced on 12/10/2015, due on 12/23/2015
- homework 5 announced on 11/26/2015 (the lucky day), due on 12/9/2015
- homework 4 announced on 11/12/2015, due on 11/25/2015
- homework 3 announced on 11/02/2015, due on 11/16/2015
- homework guidelines updated on 10/15/2015
- homework 2 announced on 10/15/2015, due on 11/02/2015
- homework 1 announced on 9/30/2015, due on 10/14/2015
- homework 0 announced on 9/13/2015, self-graded
- policy and homework guidelines announced on 9/13/2015

| date | syllabus | todo/done | suggested reading |
|---|---|---|---|
| 9/14 | course introduction | | course slides |
| 9/16 | topic 1: when can machines learn? the learning problem | | course slides; LFD 1.0, 1.1.1, 1.2.4 |
| 9/21 | learning to answer yes/no | | course slides; LFD 1.1.2, 3.1 |
| 9/23 | types of learning | | course slides; LFD 1.2 |
| 9/28 | no class because of Mid-Autumn Festival | | |
| 9/30 | feasibility of learning | homework 1 announced | course slides; LFD 1.3 |
| 10/5 | topic 2: why can machines learn? training versus testing | | course slides; LFD 2.0, 2.1.1 |
| 10/7 | theory of generalization | | course slides; LFD 2.1.2 |
| 10/12 | the VC dimension | | course slides; LFD 2.2 |
| 10/14 | noise and error | homework 1 due; homework 2 announced | course slides; LFD 1.4 |
| 10/19 | topic 3: how can machines learn? linear regression | | course slides; LFD 3.2 |
| 10/21 | logistic regression | | course slides; LFD 3.3 |
| 10/26 | linear models for classification | | course slides; LFD 3.3 (for SGD part only) |
| 10/28 | nonlinear transformation | homework 2 due; homework 3 announced | course slides; LFD 3.4 |
| 11/2 | topic 4: how can machines learn better? hazard of overfitting | | course slides; LFD 4.0, 4.1 |
| 11/4 | regularization | | course slides; LFD 4.2 |
| 11/9 | validation | | course slides; LFD 4.3 |
| 11/11 | three learning principles | homework 3 due; homework 4 announced | course slides; LFD 5 |
| 11/16 | topic 5: how can machines learn by embedding numerous features? linear support vector machine | | course slides; LFD e-8.1 |
| 11/18 | dual support vector machine | final project announced | course slides; LFD e-8.2 |
| 11/23 | kernel support vector machine | | course slides; LFD e-8.3 |
| 11/25 | soft-margin support vector machine | homework 4 due; homework 5 announced | course slides; LFD e-8.4 |
| 11/30 | kernel logistic regression | | course slides; extended reading: |
| 12/2 | support vector regression | | course slides; extended reading: |
| 12/7 | topic 6: how can machines learn by combining predictive features? blending and bagging | | course slides; extended reading: |
| 12/9 | adaptive boosting | homework 5 due; homework 6 announced | course slides; extended reading: |
| 12/14 | decision tree | | course slides; extended reading: Classification and regression trees (overview of decision tree by Loh); Classification and regression trees (book of CART by Breiman et al.) |
| 12/16 | random forest | | course slides; extended reading: |
| 12/21 | gradient boosted decision tree | | course slides; extended reading: |
| 12/23 | topic 7: how can machines learn by distilling hidden features? neural network | homework 6 due; homework 7 announced | course slides; LFD e-7.1, e-7.2, e-7.3, e-7.4 (selected parts) |
| 12/28 | deep learning | | course slides; LFD e-7.6 |
| 12/30 | radial basis function network | | course slides; LFD e-6.3 |
| 1/4 | no class, to better enjoy your holidays and homework/project (instructor office hour in R314) | | |
| 1/6 | matrix factorization | homework 7 due; homework 8 announced | course slides |
| 1/11 | no class, to fight for the final project in the last hours | | |
| 1/13 | finale and award ceremony | | course slides |
| 1/18 | no class because winter vacation started (really?) | | |
| 1/20 | no class because winter vacation started (really?) | homework 8 due; final project due | |

Last updated at CST 04:53, May 30, 2020. Please feel free to contact me.