S.No.

Topics

Lectures

Instructor

References

0

Introduction to Machine Learning

01–01

SDR



Flavours of Machine Learning: Unsupervised, Supervised,
Reinforcement, Hybrid models. Decision Boundaries: crisp and
non-crisp; optimisation problems. Examples of unsupervised learning.

02 Jan (Tue) {lecture#01}

SDR


1

Unsupervised Learning:
K-Means, Gaussian Mixture Models, EM

02–09

SDR

[Bishop Chap.9],
[Do: Gaussians],
[Do: More on Gaussians],
[Ng: K-Means],
[Ng: GMM],
[Ng: EM],
[Smyth: EM]


The K-Means algorithm: two flavours. Algorithms: history,
flavours. A mathematical formulation of the K-Means algorithm.

03 Jan (Wed) {lecture#02}

SDR



The objective function to minimise. The basic K-Means algorithm;
computational complexity issues: each step, and overall. An alternate
formulation with a distance threshold. Limitations of K-Means.

05 Jan (Fri) {lecture#03}

SDR
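The loop covered in these lectures — assign each point to its nearest centre, then move each centre to the mean of its cluster — is short enough to write out in full. A minimal sketch in plain Python on 1D points; the toy data, seed, and iteration cap are illustrative, not from the lectures:

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Basic K-Means on 1D points: alternate the assignment and
    mean-update steps until the centres stop moving."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: O(N*K) distance evaluations per pass.
        clusters = [[] for _ in range(k)]
        for x in points:
            j = min(range(k), key=lambda j: (x - centres[j]) ** 2)
            clusters[j].append(x)
        # Update step: each centre moves to the mean of its cluster.
        new_centres = [sum(c) / len(c) if c else centres[j]
                       for j, c in enumerate(clusters)]
        if new_centres == centres:  # converged: assignments cannot change
            break
        centres = new_centres
    return sorted(centres)

data = [0.0, 0.5, 1.0, 9.0, 9.5, 10.0]
print(kmeans(data, 2))  # → [0.5, 9.5]
```

Each pass costs O(NK) distance computations, which is the per-step complexity question raised in lecture #03.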



Gaussian Mixture Models. Bayes' rule, and Responsibilities.

09 Jan (Tue) {lecture#04}

SDR



Maximum Likelihood Estimation. Parameter estimation for a mixture
of Gaussians, starting from the simple case of one 1D Gaussian
and building up to the general case of K D-dimensional Gaussians.

10 Jan (Wed) {lecture#05}

SDR



The general case of K D-dimensional Gaussians.
Getting stuck, and using Lagrange Multipliers.
The EM Algorithm for Gaussian Mixtures.

12 Jan (Fri) {lecture#06}

SDR
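The E and M steps derived across lectures #04–#06 can be sketched for the simplest interesting case: a two-component 1D mixture. The initialisation, the toy data, and the variance floor below are illustrative practical choices, not part of the derivation:

```python
import math

def em_gmm_1d(xs, iters=50):
    """EM for a two-component 1D Gaussian mixture.
    E-step: responsibilities via Bayes' rule.
    M-step: responsibility-weighted ML estimates of pi, mu, var."""
    mu = [min(xs), max(xs)]          # crude but effective initialisation
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: r[n][k] proportional to pi_k * N(x_n | mu_k, var_k).
        r = []
        for x in xs:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 for k in range(2)]
            s = p[0] + p[1]
            r.append([p[0] / s, p[1] / s])
        # M-step: weighted maximum-likelihood updates.
        for k in range(2):
            nk = sum(rn[k] for rn in r)
            pi[k] = nk / len(xs)
            mu[k] = sum(rn[k] * x for rn, x in zip(r, xs)) / nk
            var[k] = sum(rn[k] * (x - mu[k]) ** 2 for rn, x in zip(r, xs)) / nk
            var[k] = max(var[k], 1e-6)  # floor: stops a component collapsing
    return mu, var, pi

xs = [-2.1, -1.9, -2.0, 3.9, 4.1, 4.0]
mu, var, pi = em_gmm_1d(xs)
```

On this well-separated toy set the means settle near -2 and 4, with mixing weights near 0.5 each.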



Application: Assignment 1:
The Stauffer and Grimson Adaptive Background Subtraction Algorithm.
An introduction to the basic set of interesting heuristics!

16 Jan (Tue) {lecture#07}

SDR

[CVPR'99], [PAMI'00]
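For orientation before the assignment, here is a heavily simplified single-pixel sketch of the kind of update the algorithm performs. It is not the full method — the published algorithm uses a second learning rate proportional to the matched Gaussian's likelihood, and ranks components by weight/sigma to select the background set — and every constant here is illustrative:

```python
def update_pixel(value, components, alpha=0.05, match_sigmas=2.5):
    """One adaptive-mixture update for a single grey-level pixel.
    components: list of [weight, mean, var] Gaussians."""
    # Find the first component within match_sigmas standard deviations.
    matched = None
    for c in components:
        if matched is None and (value - c[1]) ** 2 <= (match_sigmas ** 2) * c[2]:
            matched = c
    for c in components:
        c[0] *= 1 - alpha                      # every weight decays
    if matched is not None:
        matched[0] += alpha                    # the matched weight grows
        matched[1] += alpha * (value - matched[1])
        matched[2] += alpha * ((value - matched[1]) ** 2 - matched[2])
    else:
        # No match: replace the weakest component with a wide new Gaussian.
        components.sort(key=lambda c: c[0])
        components[0][:] = [alpha, value, 100.0]
    total = sum(c[0] for c in components)      # renormalise the weights
    for c in components:
        c[0] /= total
    return components
```

A pixel that keeps matching the same component reinforces it, which is how a stable background model emerges from the stream of frames.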


The Stauffer and Grimson algorithm (contd)

17 Jan (Wed) {lecture#08}

SDR



The Stauffer and Grimson algorithm (contd)

19 Jan (Fri) {lecture#09}

SDR


2

Unsupervised Learning:
Eigen-Analysis:
PCA, LDA and Subspaces

09–13

SDR

[Ng: PCA],
[Ng: ICA],
[Burges: Dimension Reduction],
[Bishop Chap.12]


Introduction to Eigenvalues and Eigenvectors

23 Jan (Tue) {lecture#10}

SDR



Properties of Eigenvalues and Eigenvectors

24 Jan (Wed) {lecture#11}

SDR
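As a bridge from these eigenvector properties to PCA, the sketch below finds the leading eigenvector of a 2×2 sample covariance matrix in closed form and checks that it recovers the direction along which some toy data vary. The closed-form symmetric 2×2 eigendecomposition is standard; the specific data are illustrative:

```python
import math

def top_eigpair_sym2x2(a, b, c):
    """Largest eigenvalue and unit eigenvector of [[a, b], [b, c]]."""
    lam = (a + c) / 2 + math.sqrt(((a - c) / 2) ** 2 + b ** 2)
    if abs(b) < 1e-12:                        # matrix already diagonal
        v = (1.0, 0.0) if a >= c else (0.0, 1.0)
    else:
        v = (b, lam - a)                      # satisfies the eigen-equation
    n = math.hypot(v[0], v[1])
    return lam, (v[0] / n, v[1] / n)

# Points spread along (1, 1): the first principal component
# should come out as (1, 1) / sqrt(2).
pts = [(-2, -2), (-1, -1), (0, 0), (1, 1), (2, 2)]
mx = sum(p[0] for p in pts) / len(pts)
my = sum(p[1] for p in pts) / len(pts)
a = sum((p[0] - mx) ** 2 for p in pts) / len(pts)
b = sum((p[0] - mx) * (p[1] - my) for p in pts) / len(pts)
c = sum((p[1] - my) ** 2 for p in pts) / len(pts)
lam, (v1, v2) = top_eigpair_sym2x2(a, b, c)
```

Here the covariance matrix is [[2, 2], [2, 2]], its top eigenvalue is 4, and the eigenvector is the 45° direction — exactly the PCA picture.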



OpenCV Introduction

30 Jan (Tue) {lecture#12}

SC



Gram-Schmidt Orthogonalisation, other properties

31 Jan (Wed) {lecture#13}

SDR



 No class 

02 Feb (Fri) {lecture#xx}






Minor I

08 Feb (Thu)






 No class 

09 Feb (Fri) {lecture#xx}





 Extra class in lieu of the 02 and 09 Feb lectures 
The SVD and its properties

11 Feb (Sun) {lecture#14, #15}
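One property from this session that is easy to verify numerically: the singular values of A are the square roots of the eigenvalues of AᵀA. A small plain-Python check, with an illustrative 2×2 matrix chosen so that det A = 15:

```python
import math

A = [[3.0, 0.0],
     [4.0, 5.0]]

# Form A^T A, which is symmetric positive semi-definite.
a = A[0][0] ** 2 + A[1][0] ** 2              # (A^T A)[0][0]
b = A[0][0] * A[0][1] + A[1][0] * A[1][1]    # (A^T A)[0][1]
c = A[0][1] ** 2 + A[1][1] ** 2              # (A^T A)[1][1]

# Closed-form eigenvalues of the symmetric 2x2 [[a, b], [b, c]],
# whose square roots are the singular values of A.
m = (a + c) / 2
d = math.sqrt(((a - c) / 2) ** 2 + b ** 2)
sigmas = [math.sqrt(m + d), math.sqrt(m - d)]
```

A related property worth checking by hand: the product of the singular values equals |det A| (here sqrt(45) * sqrt(5) = 15).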




3

Linear Models for Regression, Classification

14–16, 17–19

SDR

[Bishop Chap.3],
[Bishop Chap.4],
[Ng: Supervised, Discriminant Analysis],
[Ng: Generative]


General introduction to Regression and Classification. Linearity
and restricted linearity.

13 Feb (Tue) {lecture#16}





Maximum Likelihood and Least Squares

20 Feb (Tue) {lecture#17}





The Moore-Penrose Pseudoinverse.

21 Feb (Wed) {lecture#18}
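When the design matrix X has full column rank, the Moore-Penrose pseudoinverse reduces to X⁺ = (XᵀX)⁻¹Xᵀ, and the least-squares solution is w = X⁺y. For a straight-line fit the normal equations can be solved by hand; the data below are illustrative, chosen to lie exactly on a line:

```python
def fit_line(xs, ys):
    """Least-squares fit of y ≈ w0 + w1*x, i.e. w = (X^T X)^{-1} X^T y
    with design matrix X whose rows are [1, x_n]."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx      # det(X^T X); nonzero unless all xs coincide
    w0 = (sxx * sy - sx * sxy) / det
    w1 = (n * sxy - sx * sy) / det
    return w0, w1

# Data generated by y = 1 + 2x, so the fit should recover w0=1, w1=2.
w0, w1 = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

The 2×2 determinant guard is exactly the invertibility condition on XᵀX that motivates the pseudoinverse discussion.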





Regularised Least Squares. Classification.
Fisher's Linear Discriminant.

23 Feb (Fri) {lecture#19}
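Fisher's direction has the closed form w ∝ S_W⁻¹(m₁ − m₀), where S_W is the pooled within-class scatter. In 2D the inverse can be written out directly; the two toy classes below, separated purely along the x-axis, are illustrative:

```python
def fisher_direction(class0, class1):
    """Fisher's linear discriminant: w = Sw^{-1} (m1 - m0) for two
    classes of 2D points, with Sw the pooled within-class scatter."""
    def mean(pts):
        return (sum(p[0] for p in pts) / len(pts),
                sum(p[1] for p in pts) / len(pts))
    m0, m1 = mean(class0), mean(class1)
    # Pooled scatter Sw = sum over both classes of (x - m_k)(x - m_k)^T.
    a = b = c = 0.0
    for pts, m in ((class0, m0), (class1, m1)):
        for p in pts:
            dx, dy = p[0] - m[0], p[1] - m[1]
            a += dx * dx
            b += dx * dy
            c += dy * dy
    det = a * c - b * b
    dmx, dmy = m1[0] - m0[0], m1[1] - m0[1]
    # Apply the 2x2 inverse of [[a, b], [b, c]] to (m1 - m0).
    return ((c * dmx - b * dmy) / det, (a * dmy - b * dmx) / det)

# Classes separated along x only: w should point along (1, 0).
w = fisher_direction([(0, 0), (1, 0), (0, 1), (1, 1)],
                     [(3, 0), (4, 0), (3, 1), (4, 1)])
```

Since both classes have the same scatter and differ only in the x mean, the discriminant direction ignores y entirely.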





Fisher's Linear Discriminant: Continued.
Introduction to SVMs.

06 Mar (Tue) {lecture#20}




4

SVMs

20–xx

SDR

[Alex: SVMs],
[Ng: SVMs],
[Burges: SVMs],
[Khardon: Kernels]


SVMs: the concept of the margin, the optimisation problem, and
the physical significance of the y = +1 and y = -1 lines

07 Mar (Wed) {lecture#21}





The basic SVM optimisation: the primal and the dual problems

09 Mar (Fri) {lecture#22}





Lagrange Multipliers and the KKT Conditions

13 Mar (Tue) {lecture#23}



[Early class! 07:00am–08:00am, II-241 (EE Committee Room)]


Lagrange Multipliers and the KKT Conditions (contd)
Soft-Margin SVMs

14 Mar (Wed) {lecture#24}





Soft-Margin SVMs (contd)

16 Mar (Fri) {lecture#25}
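The soft-margin objective can also be minimised directly in its unconstrained hinge-loss form, λ/2 ‖w‖² + Σₙ max(0, 1 − yₙ(w·xₙ + b)), by stochastic subgradient descent. This is not the primal/dual route taken in lecture — just a compact way to see the same margin behaviour. The data, step size, and epoch count are illustrative:

```python
import random

def train_linear_svm(samples, lam=0.01, lr=0.01, epochs=2000, seed=0):
    """Soft-margin linear SVM in 2D via stochastic subgradient descent
    on the regularised hinge loss."""
    rng = random.Random(seed)
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        rng.shuffle(samples)
        for (x1, x2), y in samples:
            # Regularisation term: shrink w a little on every step.
            w[0] -= lr * lam * w[0]
            w[1] -= lr * lam * w[1]
            if y * (w[0] * x1 + w[1] * x2 + b) < 1:
                # Inside the margin: the hinge subgradient pushes w towards y*x.
                w[0] += lr * y * x1
                w[1] += lr * y * x2
                b += lr * y
    return w, b

# Linearly separable toy set with labels in {-1, +1}.
data = [((0.0, 0.0), -1), ((0.0, 1.0), -1), ((1.0, 0.0), -1),
        ((3.0, 3.0), +1), ((4.0, 3.0), +1), ((3.0, 4.0), +1)]
w, b = train_linear_svm(list(data))
```

On separable data like this, the learned hyperplane classifies every training point correctly, with the points nearest the boundary hovering around the y(w·x + b) = 1 margin lines.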





Recap: Lagrange Multipliers and the KKT Conditions
The hard-margin SVM

20 Mar (Tue) {lecture#26}





Recap: Lagrange Multipliers and the KKT Conditions
The soft-margin SVM

21 Mar (Wed) {lecture#27}






23 Mar (Fri) {lecture#28}






Minor II

28 Mar (Wed)





5

Feature Selection

xxxx

SDR


6

Logistic Regression

28–29

SDR

[Alex: LogReg]

7

Neural Networks

30–33

SDR

[Alex: NNRep],
[Alex: NNLearn]

xx

Mathematical Basics for Machine Learning

xxxx

xx

[Burges: Math for ML],
[Burges: Math Slides],
[Do, Kolter: Linear Algebra Notes]
