Instructor: Dan Gildea, office hours T/Th 2-3pm, 3019 Wegmans
TAs:
- Zhenghong Zhou, office hours Fri 4-5pm, 3504 Wegmans
- Daoan Zhang, office hours Mon 9-10am, 3504 Wegmans
Prereqs: Probability, Linear Algebra, Vector Calculus.
Homeworks
Lecture notes
Required text: Christopher M. Bishop, Pattern Recognition and Machine Learning.
The following are useful references in addition to the reading material assigned for each class:
- Stuart Russell and Peter Norvig, Artificial Intelligence: A Modern Approach.
- Trevor Hastie, Robert Tibshirani, Jerome Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction.
- Larry Wasserman, All of Statistics, 2004.
Syllabus
On | we will cover | which means that after class you will understand | if before class you have read |
1/21 | Regression and Classification | perceptron, linear regression | Bishop 1.2, 1.4, appendix E |
1/23 | Logistic Regression | stochastic gradient descent | Bishop 4.3 |
1/28 | Convexity | positive semidefinite matrices | |
1/30 | Backpropagation | dynamic programming for gradient descent | Bishop 5.1, 5.2, 5.3, Krizhevsky 2012 |
2/4 | Support Vectors | max-margin | Bishop 3.1, 4.1 |
2/6 | Support Vectors | strong duality | Bishop 7.1 |
2/11 | Support Vectors | the kernel trick | Bishop 6.1, 6.2 |
2/13 | Hidden Markov Models | forward-backward | Bishop 13.2 |
2/18 | Graphical Models | Bayes ball | |
2/20 | Probabilistic Inference | message passing | Bishop 8.4 |
2/25 | Tree decomposition | cyclic graphs | Koller and Friedman |
2/27 | Tree decomposition cont'd | vertex elimination | |
3/4 | Review | | |
3/6 | Midterm | | |
3/18 | Midterm Solutions | | |
3/20 | Fairness in machine learning | | Kleinberg, Corbett-Davies |
3/25 | Expectation Maximization | mixture of Gaussians | Bishop 9 |
3/27 | Expectation Maximization | L = Q + H + D | Bishop 9 |
4/1 | EM for HMM | minimum Bayes risk | |
4/3 | Sampling | Markov Chain Monte Carlo | Bishop 11.2 |
4/8 | Metropolis-Hastings | detailed balance | Bishop 11.2 |
4/10 | Gibbs sampling | annealing | Bishop 11.3 |
4/15 | no class | | |
4/17 | Transformers | deep generative models | |
4/22 | Gradient Descent | SGD convergence | Ruder 2016 |
4/24 | Reinforcement Learning | Q-learning | Sutton ch. 3, 4.3, 4.4, 6.1, 6.5, 7.2, 11.1 |
4/29 | Something fun | VAE, Diffusion | Kingma and Welling 2014, Ho et al. 2020 |
5/1 | Review | come to class with questions! | |
Final exam: Th May 8, 7:15-10:15pm, in the classroom.
Grading
- Homeworks: 50%
- Final exam: 30%
- Midterm: 20%
gildea @ cs rochester edu
March 27, 2025