This course introduces machine learning to students with a statistical background. Besides covering standard methods such as logistic and ridge regression, kernel density estimation, and random forests, it aims to offer a broader view of model building and optimization using probabilistic building blocks.
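As a taste of the kind of method covered, here is a minimal sketch of ridge regression via its closed-form solution, w = (XᵀX + λI)⁻¹Xᵀy. The data and the regularization strength λ are made up for illustration; this is not course material.

```python
import numpy as np

# Synthetic regression problem (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=50)

# Ridge regression closed form: solve (X^T X + lam * I) w = X^T y.
lam = 0.1  # regularization strength (assumed value for this demo)
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
print(w)
```

With λ > 0 the normal-equations matrix is always invertible, which is the practical payoff of the ridge penalty.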
Syllabus and Course Information
Readings: Chapter 2 of David MacKay’s textbook
January 23 and 24: Linear basis function models, decision theory, classification
January 30 and 31: Bayesian inference and kNN
February 5: Assignment 1 due.
February 13: Midterm exam for both sections. (No class on Feb 14.)
February 17 to 26: Reading week
February 27 and 28: Mixture models
March 6 and 7: Continuous latent variable models, neural networks
March 12: Assignment 2 due.
March 13 and 14: Sampling and Monte Carlo methods
March 20 and 21: Modelling sequential data
March 27 and 28: Stochastic variational inference and variational autoencoders
April 1: Assignment 3 due at 1 pm. The second question has been added.