Computer Science 246/446
Homework
Spring 2013
No late homework!
- Homework 1, due Fri 1/18 3:25pm
- Make a comment on NB. If you have not received an email invitation for
NB, then 1) make sure you are registered for the class, 2) make sure that
your email address with the registrar is correct, and then 3) email the TA.
- Homework 2, due Wed 1/23 in class
- Murphy 2.3, 2.4, 2.6, 2.9
- Homework 3, due Fri 1/25 5pm by turn_in script
- Implement nearest neighbor and linear regression
for the voting dataset using Python.
Split the 435 examples into 348 train / 45 dev / 42 test and report your
classification accuracy on the test set.
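A minimal sketch of the nearest-neighbor half of this assignment (the
linear-regression half is analogous). The filename house-votes-84.data, the
comma-separated party,vote1,...,vote16 layout, and the y/n/? encoding are
assumptions about the data format, not part of the assignment.

    # 1-nearest-neighbor baseline for the voting data (format assumptions above)
    import csv

    def load(filename):
        enc = {'y': 1.0, 'n': 0.0, '?': 0.5}   # assumed encoding of votes
        data = []
        with open(filename) as f:
            for row in csv.reader(f):
                label = 1.0 if row[0] == 'democrat' else 0.0
                data.append((label, [enc[v] for v in row[1:]]))
        return data

    def nearest_neighbor(train, x):
        # label of the closest training point in squared L2 distance
        def dist(ex):
            return sum((a - b) ** 2 for a, b in zip(ex[1], x))
        return min(train, key=dist)[0]

    data = load('house-votes-84.data')
    train, dev, test = data[:348], data[348:393], data[393:]
    correct = sum(nearest_neighbor(train, x) == y for y, x in test)
    print('test accuracy: %.3f' % (correct / float(len(test))))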
- Homework 4, due Wed 1/30 in class
- Murphy 2.11, 2.12, 2.14
- Homework 5, due Mon 2/4 3:25pm by turn_in script
- Implement Naive Bayes on the voting dataset. Plot results on the dev and
test sets as a function of the Dirichlet smoothing parameter alpha.
(A sketch follows.)
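One possible shape for the model, assuming the (label, votes) pairs from
homework 3 with votes in {'y','n','?'} and a symmetric Dirichlet (add-alpha)
prior on each per-position vote distribution; the three-way vote alphabet is
an assumption.

    import math
    from collections import defaultdict

    def train_nb(train, alpha):
        class_count = defaultdict(float)
        vote_count = defaultdict(float)   # (label, position, vote) -> count
        for label, votes in train:
            class_count[label] += 1
            for i, v in enumerate(votes):
                vote_count[(label, i, v)] += 1
        total = sum(class_count.values())
        def log_posterior(label, votes):
            lp = math.log(class_count[label] / total)
            for i, v in enumerate(votes):
                # Dirichlet-smoothed P(vote | class); 3 = size of vote alphabet
                lp += math.log((vote_count[(label, i, v)] + alpha)
                               / (class_count[label] + 3 * alpha))
            return lp
        return log_posterior

Sweeping alpha and scoring accuracy on the dev and test sets with this gives
the requested plot.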
- Homework 6, due Mon 2/11 in class
- Murphy 10.2, 10.3, 10.5
- Homework 7, due Mon 2/18 3:25pm by turn_in script
- Implement the raw (no learning rate) and gradient descent versions of
the perceptron algorithm for the voting data. Experiment with different
learning rate schedules. How do the results compare with Naive Bayes?
(A sketch follows.)
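A sketch of the mistake-driven update with a pluggable learning rate,
assuming (y, x) pairs with y in {-1, +1} (e.g., democrat vs. republican) and
a constant bias feature appended to x; the schedules are illustrative
choices, not the required ones.

    def train_perceptron(train, epochs=10, schedule=lambda t: 1.0):
        w = [0.0] * len(train[0][1])
        t = 0
        for _ in range(epochs):
            for y, x in train:
                t += 1
                if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                    # schedule(t) == 1 gives the raw perceptron;
                    # try e.g. schedule=lambda t: 1.0 / t for a decaying rate
                    rate = schedule(t)
                    w = [wi + rate * y * xi for wi, xi in zip(w, x)]
        return w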
- Homework 8, due Wed 2/20 in class
- Find the dual! (See lecture notes.)
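The primal problem itself is in the lecture notes; purely as a reminder of
the general mechanics (f and the g_i below stand in for whatever the notes
specify, and are not the specific problem), the Lagrangian dual of a
constrained minimization is

    % generic primal/dual pair; f, g_i are placeholders
    \text{primal: } \min_x f(x) \ \text{s.t.}\ g_i(x) \le 0
    \qquad
    \text{dual: } \max_{\alpha \ge 0} \min_x\ f(x) + \sum_i \alpha_i g_i(x)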
- Homework 9, due Mon 3/4 in class
- Murphy 20.1 (ignore the reference to figure 10.14; draw the MRF from
the equation in the problem).
- Homework 10, due Fri 3/8 5pm by turn_in script
- Implement EM fitting of a mixture of Gaussians on the two-dimensional
data set points.dat. You should try different numbers of mixture
components, as well as tied vs. separate covariance matrices for each
Gaussian.
OR
Implement EM fitting of the aspect model on the discrete data set
pairs.dat. You should try different numbers of mixture components.
IN EITHER CASE
Use the final 1/10 of the data for dev. Plot the likelihood on train
and dev vs. iteration for different numbers of mixture components.
(A Gaussian sketch follows.)
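For the Gaussian option, a compact EM sketch using spherical (isotropic)
covariances as a simplification; the tied vs. separate full covariance
matrices the assignment asks for replace the variance update below. The
one-pair-per-line format of points.dat is an assumption.

    import numpy as np

    def em_gmm(X, K, iters=50):
        n, d = X.shape
        pi = np.full(K, 1.0 / K)                       # mixture weights
        mu = X[np.random.choice(n, K, replace=False)]  # init means from data
        var = np.full(K, X.var())                      # one variance per component
        for _ in range(iters):
            # E step: responsibilities r[i, k] = P(z_i = k | x_i)
            logp = (-0.5 * ((X[:, None] - mu) ** 2).sum(-1) / var
                    - 0.5 * d * np.log(2 * np.pi * var) + np.log(pi))
            logp -= logp.max(1, keepdims=True)
            r = np.exp(logp)
            r /= r.sum(1, keepdims=True)
            # M step: reestimate parameters from the soft counts
            Nk = r.sum(0)
            pi = Nk / n
            mu = (r.T @ X) / Nk[:, None]
            var = (r * ((X[:, None] - mu) ** 2).sum(-1)).sum(0) / (d * Nk)
        return pi, mu, var

    X = np.loadtxt('points.dat')   # assumed: one "x y" pair per line
    pi, mu, var = em_gmm(X[:len(X) * 9 // 10], K=3)   # final 1/10 held out for dev

Per-iteration train and dev log-likelihoods (the logsumexp over k of the
unshifted logp rows) give the requested curves.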
- Homework 11, due Wed 3/20 in class
- Review problems passed out on 3/6.
- Homework 12, due Mon 4/1 3:25pm by turn_in script
- Implement Gibbs sampling with a Dirichlet process prior for whichever
dataset you chose for assignment 10. Experiment with different values of
alpha and discuss how they affect the number of clusters found.
(A sketch follows.)
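A sketch of one sweep of collapsed Gibbs sampling under the Chinese
restaurant process view of the DP. Here log_pred(x, members) is a
placeholder for the log predictive probability of x under a cluster with
the given members (Gaussian or discrete, matching assignment 10); it is
assumed, not supplied.

    import math, random

    def gibbs_sweep(X, z, alpha, log_pred):
        n = len(X)
        for i in range(n):
            z[i] = None                      # remove point i from its cluster
            clusters = {}                    # cluster id -> member indices
            for j in range(n):
                if z[j] is not None:
                    clusters.setdefault(z[j], []).append(j)
            # CRP prior: existing clusters in proportion to size, new one to alpha
            ids = list(clusters)
            logw = [math.log(len(clusters[c]))
                    + log_pred(X[i], [X[j] for j in clusters[c]]) for c in ids]
            logw.append(math.log(alpha) + log_pred(X[i], []))
            m = max(logw)
            w = [math.exp(v - m) for v in logw]
            r = random.random() * sum(w)
            for k, wk in enumerate(w):
                r -= wk
                if r <= 0:
                    break
            z[i] = ids[k] if k < len(ids) else max(ids, default=-1) + 1
        return z

Starting from z = [0] * len(X) and sweeping repeatedly, the number of
distinct values in z is the cluster count to report as a function of alpha.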
- Homework 13, due Mon 4/8 in class
- Murphy 12.7
- Homework 14, due Mon 4/15 3:25pm by turn_in script
- Implement logistic regression for the voting data. Experiment with
various regularization parameters. Use the provided template for calling
lbfgs. (A sketch follows.)
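One standard way to wire this up, assuming X is an (n, d) numpy array of
encoded votes and y is a 0/1 label vector; scipy.optimize.fmin_l_bfgs_b is
used here in place of whatever the course template provides.

    import numpy as np
    from scipy.optimize import fmin_l_bfgs_b

    def fit_logistic(X, y, lam):
        def objective(w):
            # L2-regularized negative log-likelihood and its gradient
            z = X @ w
            nll = np.sum(np.logaddexp(0, z) - y * z) + 0.5 * lam * (w @ w)
            p = 1.0 / (1.0 + np.exp(-z))
            grad = X.T @ (p - y) + lam * w
            return nll, grad
        w0 = np.zeros(X.shape[1])
        # fmin_l_bfgs_b returns (weights, final value, info dict)
        w, _, _ = fmin_l_bfgs_b(objective, w0)
        return w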
- Homework 15, due Mon 4/22 3:25pm by turn_in script
- Implement EM to train an HMM for whichever dataset you used for
assignment 10. The observation probabilities should be as in
assignment 10: either Gaussian, or two discrete distributions
conditionally independent given the hidden state. Does the HMM model
the data better than the original non-sequence model? What is the best
number of states? (A sketch follows.)
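The E step reduces to the scaled forward-backward recursions sketched
below. B[t, s] stands for the emission probability of observation t under
state s (Gaussian or discrete, per the model above) and is assumed; pi and
A are the initial and transition distributions. The M step reestimating
pi, A, and the emissions from the posteriors is omitted.

    import numpy as np

    def forward_backward(pi, A, B):
        T, S = B.shape
        alpha = np.zeros((T, S)); beta = np.zeros((T, S)); c = np.zeros(T)
        alpha[0] = pi * B[0]
        c[0] = alpha[0].sum(); alpha[0] /= c[0]
        for t in range(1, T):                  # scaled forward pass
            alpha[t] = (alpha[t - 1] @ A) * B[t]
            c[t] = alpha[t].sum(); alpha[t] /= c[t]
        beta[T - 1] = 1.0
        for t in range(T - 2, -1, -1):         # backward pass, same scaling
            beta[t] = (A @ (B[t + 1] * beta[t + 1])) / c[t + 1]
        gamma = alpha * beta                   # P(state at time t | all data)
        loglik = np.log(c).sum()               # for comparing against HW 10
        return gamma, loglik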
- Homework 16, due Wed 5/1 in class
- Review problems handed out in class.
gildea @ cs rochester edu
April 25, 2013