Computer Science 246/446
Homework
Spring 2012
No late homework!
Lecture notes are graded as a homework assignment and are due
by the beginning of the next lecture.
- Homework 7
- Due Wed 4/18 5pm
Implement EM to train an HMM for the data from Homework 6. The model should have four hidden states with Gaussian observation probabilities. Does the HMM model the data better than the original mixture of Gaussians? (A rough sketch of the Baum-Welch updates is given below.)
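The following is only an illustrative Python/NumPy sketch of one way to set up Baum-Welch (forward-backward EM) with Gaussian emissions, treating the Homework 6 data as a single observation sequence; the function name, iteration count, random initialization, and the small covariance regularizer are assumptions, not part of the assignment.

import numpy as np
from scipy.stats import multivariate_normal

def em_hmm_gauss(X, K=4, n_iter=50, seed=0):
    """Baum-Welch for an HMM with K hidden states and Gaussian emissions.
    X is an (n x d) observation sequence; scaling avoids underflow."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialize: uniform start/transition probabilities, means at random points.
    start = np.full(K, 1.0 / K)
    trans = np.full((K, K), 1.0 / K)
    mu = X[rng.choice(n, K, replace=False)]
    Sigma = np.stack([np.cov(X.T) + 1e-6 * np.eye(d)] * K)
    for _ in range(n_iter):
        # Emission likelihoods b[t, k] = p(x_t | z_t = k).
        b = np.stack([multivariate_normal.pdf(X, mu[k], Sigma[k])
                      for k in range(K)], axis=1)
        # Scaled forward pass.
        alpha = np.zeros((n, K)); c = np.zeros(n)
        alpha[0] = start * b[0]; c[0] = alpha[0].sum(); alpha[0] /= c[0]
        for t in range(1, n):
            alpha[t] = (alpha[t - 1] @ trans) * b[t]
            c[t] = alpha[t].sum(); alpha[t] /= c[t]
        # Scaled backward pass.
        beta = np.zeros((n, K)); beta[-1] = 1.0
        for t in range(n - 2, -1, -1):
            beta[t] = (trans @ (b[t + 1] * beta[t + 1])) / c[t + 1]
        # Posteriors over states (gamma) and expected transition counts (xi).
        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)
        xi = np.zeros((K, K))
        for t in range(n - 1):
            xi += (alpha[t][:, None] * trans) \
                  * (b[t + 1] * beta[t + 1])[None, :] / c[t + 1]
        # M-step: re-estimate start, transition, and emission parameters.
        start = gamma[0]
        trans = xi / xi.sum(axis=1, keepdims=True)
        Nk = gamma.sum(axis=0)
        mu = (gamma.T @ X) / Nk[:, None]
        for k in range(K):
            Xc = X - mu[k]
            Sigma[k] = (gamma[:, k, None] * Xc).T @ Xc / Nk[k] + 1e-6 * np.eye(d)
        # Data log-likelihood is sum(np.log(c)); useful for checking convergence.
    return start, trans, mu, Sigma

Comparing sum(np.log(c)) at convergence against the mixture-of-Gaussians log-likelihood from Homework 6 is one way to answer the "does the HMM model the data better" question.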
- Homework 6
- Due Wed 3/7 5pm
Implement EM fitting of a mixture of Gaussians on the two-dimensional
data set points.dat. You should
try different numbers of mixture components, as well as tied vs. separate
covariance matrices for each Gaussian. Which model seems to fit the data best? (A sketch of the EM updates is given below.)
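For reference, here is a minimal NumPy sketch of the EM updates for a mixture of Gaussians, with a flag for tying the covariance matrices; the file name points.dat comes from the assignment, while the choice of K, the initialization, and the iteration count are assumptions, and this is not meant as the required submission.

import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iter=100, tied=False, seed=0):
    """EM for a mixture of K Gaussians on data X (n x d)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialize means at random data points, covariances at identity.
    mu = X[rng.choice(n, K, replace=False)]
    Sigma = np.stack([np.eye(d)] * K)
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] = p(z_i = k | x_i).
        r = np.stack([pi[k] * multivariate_normal.pdf(X, mu[k], Sigma[k])
                      for k in range(K)], axis=1)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights, means, covariances.
        Nk = r.sum(axis=0)
        pi = Nk / n
        mu = (r.T @ X) / Nk[:, None]
        for k in range(K):
            Xc = X - mu[k]
            Sigma[k] = (r[:, k, None] * Xc).T @ Xc / Nk[k]
        if tied:
            # Tied covariance: one shared matrix, weighted by component mass.
            Sigma[:] = (Sigma * Nk[:, None, None]).sum(axis=0) / n
    return pi, mu, Sigma

# Example usage (K = 3 is only illustrative):
# X = np.loadtxt("points.dat")
# pi, mu, Sigma = em_gmm(X, K=3, tied=False)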
- Homework 5
- Due Tu 2/21 in class
- Bishop 6.2, 6.8, 6.9
- Bishop 7.2, 7.7
- Homework 4
- Due Wed 2/15 5pm by turn_in script
- Implement the raw (no learning rate) and gradient descent
versions of the perceptron algorithm for the voting data.
Experiment with different learning rate schedules.
How do the results compare with naive Bayes? (A sketch of both update rules is given below.)
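Below is a small sketch of the two perceptron variants, assuming the voting data has already been loaded into a feature matrix X (with a bias column) and labels y in {-1, +1}; this is a per-mistake (stochastic) update, and the learning-rate schedules shown are only examples of what could be compared, not the assignment's required setup.

import numpy as np

def perceptron(X, y, epochs=20, eta=None):
    """Perceptron training on features X (n x d) and labels y in {-1, +1}.

    With eta=None this is the "raw" update (add or subtract the full feature
    vector on each mistake); with eta a function of the update count it becomes
    the learning-rate (gradient descent style) version."""
    n, d = X.shape
    w = np.zeros(d)
    updates = 0
    for _ in range(epochs):
        for i in range(n):
            if y[i] * (w @ X[i]) <= 0:        # misclassified (or on the boundary)
                updates += 1
                step = 1.0 if eta is None else eta(updates)
                w += step * y[i] * X[i]       # move the weights toward the example
    return w

# Example schedules to compare (purely illustrative choices):
# w_raw   = perceptron(X, y)                          # no learning rate
# w_const = perceptron(X, y, eta=lambda t: 0.1)       # constant rate
# w_decay = perceptron(X, y, eta=lambda t: 1.0 / t)   # 1/t decay
# Dev-set accuracy: np.mean(np.sign(X_dev @ w) == y_dev)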
- Homework 3
- Due Wed 2/1 5pm by turn_in script
- Implement naive Bayes on the voting dataset
using MATLAB. Plot results on dev and test sets as
a function of the number of features used. (An illustrative sketch of the model is given below.)
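The assignment asks for MATLAB; purely to illustrate the model, here is a NumPy sketch of Bernoulli naive Bayes with add-alpha smoothing. It assumes the votes have been encoded as 0/1 features and the classes as 0/1 labels, and the feature ordering used in the accuracy-vs-features loop is an assumption (the assignment may intend a particular ranking).

import numpy as np

def train_nb(X, y, alpha=1.0):
    """Bernoulli naive Bayes with add-alpha smoothing.
    X is an (n x d) 0/1 feature matrix, y an (n,) vector of 0/1 class labels."""
    priors = np.array([np.mean(y == c) for c in (0, 1)])
    # Smoothed estimates of p(feature_j = 1 | class c).
    cond = np.array([(X[y == c].sum(axis=0) + alpha) / ((y == c).sum() + 2 * alpha)
                     for c in (0, 1)])
    return np.log(priors), np.log(cond), np.log(1 - cond)

def predict_nb(X, log_prior, log_p1, log_p0):
    # Log-posterior (up to a constant) for each class: prior plus feature log-likelihoods.
    scores = log_prior + X @ log_p1.T + (1 - X) @ log_p0.T
    return scores.argmax(axis=1)

# Sketch of the requested plot: accuracy as a function of the number of
# features used (features taken here in their original column order).
# accs = []
# for k in range(1, X_train.shape[1] + 1):
#     params = train_nb(X_train[:, :k], y_train)
#     accs.append(np.mean(predict_nb(X_dev[:, :k], *params) == y_dev))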
- Homework 2
- Due Thursday 1/26 in class
- Given a set of n i.i.d. observations x_1 to x_n,
let P be the empirical distribution P(k)=c(k)/n, where c(k) is the number
of observations equal to k.
Prove that the distribution Q that
minimizes the KL divergence with P, D(P||Q), also
maximizes the probability of the data Q(x_1, ..., x_n).
- Prove (step by step) that entropy for a discrete random variable is maximized by
the uniform distribution.
- Prove that if X and Y are independent, Var[X+Y] = Var[X] + Var[Y].
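For the last item, a sketch of the standard argument, assuming X and Y have finite variance:

\begin{aligned}
\operatorname{Var}[X+Y] &= E[(X+Y)^2] - (E[X+Y])^2 \\
 &= E[X^2] + 2E[XY] + E[Y^2] - (E[X])^2 - 2E[X]E[Y] - (E[Y])^2 \\
 &= \operatorname{Var}[X] + \operatorname{Var}[Y] + 2\bigl(E[XY] - E[X]E[Y]\bigr),
\end{aligned}

and independence gives E[XY] = E[X]E[Y], so the last term vanishes.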
- Homework 1
- Due Tuesday 1/24 in class
Bishop ex. 1.3, 1.11
gildea @ cs rochester edu
April 12, 2012