S.No. | Topics | Lectures | Instructor | References
0 | Introduction to Machine Learning | 01-01 | SDR |
  | Flavours of Machine Learning: unsupervised, supervised, reinforcement, and hybrid models. Decision boundaries: crisp and non-crisp; optimisation problems. Examples of unsupervised learning. | 02 Jan (Tue) {lecture#01} | SDR |
1 | Unsupervised Learning: K-Means, Gaussian Mixture Models, EM | 02-09 | SDR | [Bishop Chap.9], [Do: Gaussians], [Do: More on Gaussians], [Ng: K-Means], [Ng: GMM], [Ng: EM], [Smyth: EM]
  | The K-Means algorithm: history and two flavours. A mathematical formulation of the K-Means algorithm. | 03 Jan (Wed) {lecture#02} | SDR |
  | The objective function to minimise. The basic K-Means algorithm; computational complexity of each step and of the overall algorithm. An alternate formulation with a distance threshold. Limitations of K-Means. | 05 Jan (Fri) {lecture#03} | SDR |
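As a companion to the K-Means lectures above (#02, #03), a minimal pure-Python sketch of the two alternating steps; the function and toy data are illustrative, not course material, and a deterministic initialisation stands in for the usual random restarts:

```python
def kmeans(points, k, iters=20):
    """Basic K-Means: alternate assignment and mean-update steps.

    The assignment step costs O(N*k*D) per iteration for N points
    in D dimensions; the update step costs O(N*D).
    """
    # Deterministic initialisation for reproducibility; in practice
    # one uses random restarts or K-Means++.
    centres = list(points[:k])
    for _ in range(iters):
        # Assignment step: each point joins its nearest centre.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centres[c])))
            clusters[j].append(p)
        # Update step: each centre moves to the mean of its cluster.
        for j, cl in enumerate(clusters):
            if cl:
                centres[j] = tuple(sum(xs) / len(cl) for xs in zip(*cl))
    return centres

# Two well-separated 2-D blobs; the centres recover the blob means.
pts = [(0.0, 0.0), (0.1, 0.2), (-0.1, 0.1),
       (5.0, 5.0), (5.2, 4.9), (4.8, 5.1)]
centres = kmeans(pts, 2)
```

The alternate formulation with a distance threshold (lecture #03) would add a test on the point-to-centre distance before assignment; it is omitted here to keep the sketch minimal.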
  | Gaussian Mixture Models. Bayes' rule and responsibilities. | 09 Jan (Tue) {lecture#04} | SDR |
  | Maximum Likelihood Estimation. Parameter estimation for a mixture of Gaussians, from the simple case of a single 1-D Gaussian to the general case of K D-dimensional Gaussians. | 10 Jan (Wed) {lecture#05} | SDR |
  | The general case of K D-dimensional Gaussians. Getting stuck, and using Lagrange multipliers. The EM algorithm for Gaussian mixtures. | 12 Jan (Fri) {lecture#06} | SDR |
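The responsibilities of lecture #04 and the EM updates of lecture #06 can be illustrated on a 1-D two-component mixture; a hedged sketch (the function names, parameters, and toy data are mine), showing the E-step and the M-step for the means only:

```python
import math

def gauss(x, mu, var):
    """1-D Gaussian density N(x | mu, var)."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def e_step(xs, pis, mus, vars_):
    """E-step: responsibility gamma[n][k] = pi_k N(x_n|mu_k,var_k) / sum_j (...),
    i.e. Bayes' rule applied to the latent component label."""
    gammas = []
    for x in xs:
        num = [pi * gauss(x, mu, v) for pi, mu, v in zip(pis, mus, vars_)]
        s = sum(num)
        gammas.append([a / s for a in num])
    return gammas

def m_step_means(xs, gammas, k):
    """M-step for the means: mu_k = sum_n gamma[n][k] x_n / sum_n gamma[n][k]."""
    return [sum(g[j] * x for g, x in zip(gammas, xs)) / sum(g[j] for g in gammas)
            for j in range(k)]

# Four points in two tight clusters around -1 and +1.
xs = [-1.1, -0.9, 0.9, 1.1]
gammas = e_step(xs, pis=[0.5, 0.5], mus=[-1.0, 1.0], vars_=[0.1, 0.1])
mus = m_step_means(xs, gammas, 2)
```

The full M-step also re-estimates the mixing coefficients and (co)variances, exactly the quantities whose constrained maximisation motivates the Lagrange-multiplier step in lecture #06.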
  | Application (Assignment 1): the Stauffer and Grimson adaptive background subtraction algorithm. An introduction to the basic set of interesting heuristics. | 16 Jan (Tue) {lecture#07} | SDR | [CVPR'99], [PAMI'00]
  | The Stauffer and Grimson algorithm (contd.) | 17 Jan (Wed) {lecture#08} | SDR |
  | The Stauffer and Grimson algorithm (contd.) | 19 Jan (Fri) {lecture#09} | SDR |
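The heart of the Stauffer and Grimson algorithm (lectures #07-#09) is an online, learning-rate-weighted update of per-pixel Gaussians. A deliberately simplified single-Gaussian-per-pixel sketch follows; the real algorithm keeps a small mixture per pixel with weight updates and a matching heuristic, and the constants and names below are my own illustrative choices, not the paper's:

```python
def update_pixel(mu, var, x, rho=0.05, match_sigmas=2.5):
    """One online update for one pixel value x.

    If x matches the background Gaussian (within match_sigmas standard
    deviations), blend it in: mu <- (1-rho)*mu + rho*x, then
    var <- (1-rho)*var + rho*(x - mu)^2; otherwise flag it as foreground.
    """
    is_background = (x - mu) ** 2 <= (match_sigmas ** 2) * var
    if is_background:
        mu = (1 - rho) * mu + rho * x
        var = (1 - rho) * var + rho * (x - mu) ** 2
    return mu, var, is_background

# A stable pixel slowly tracks illumination; a sudden object is foreground.
mu, var = 100.0, 20.0
for frame in [101.0, 99.0, 102.0]:       # background-like values
    mu, var, bg = update_pixel(mu, var, frame)
_, _, bg = update_pixel(mu, var, 180.0)  # an object appears: bg is False
```

The small `rho` is what makes the model adaptive: the background mean drifts with slow illumination changes while abrupt deviations are flagged.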
2 | Unsupervised Learning: Eigenanalysis: PCA, LDA and Subspaces | 09-13 | SDR | [Ng: PCA], [Ng: ICA], [Burges: Dimension Reduction], [Bishop Chap.12]
  | Introduction to eigenvalues and eigenvectors | 23 Jan (Tue) {lecture#10} | SDR |
  | Properties of eigenvalues and eigenvectors | 24 Jan (Wed) {lecture#11} | SDR |
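A concrete companion to lectures #10-#11: power iteration finds the dominant eigenpair, the same quantity PCA extracts from a covariance matrix. A minimal pure-Python sketch (routine name and toy matrix are mine):

```python
import math

def leading_eigenvector(A, iters=100):
    """Power iteration: repeatedly apply A and renormalise.

    Converges to the eigenvector of A with the largest-magnitude
    eigenvalue, assuming that eigenvalue is unique.
    """
    n = len(A)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # Rayleigh quotient v^T A v gives the corresponding eigenvalue.
    lam = sum(v[i] * sum(A[i][j] * v[j] for j in range(n)) for i in range(n))
    return lam, v

# Symmetric 2x2 with eigenvalues 3 and 1, eigenvectors along (1,1) and (1,-1).
A = [[2.0, 1.0], [1.0, 2.0]]
lam, v = leading_eigenvector(A)
```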
  | OpenCV introduction | 30 Jan (Tue) {lecture#12} | SC |
  | Gram-Schmidt orthogonalisation; other properties | 31 Jan (Wed) {lecture#13} | SDR |
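Gram-Schmidt orthogonalisation (lecture #13) in a few lines; this is the modified (numerically stabler) variant, and the toy vectors are my own illustration:

```python
import math

def gram_schmidt(vectors):
    """Modified Gram-Schmidt: subtract from each working vector its
    projection onto each basis vector already produced, then normalise."""
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            dot = sum(wi * bi for wi, bi in zip(w, b))
            w = [wi - dot * bi for wi, bi in zip(w, b)]
        norm = math.sqrt(sum(wi * wi for wi in w))
        if norm > 1e-12:  # skip linearly dependent inputs
            basis.append([wi / norm for wi in w])
    return basis

# Two linearly independent vectors in R^2 -> an orthonormal basis.
basis = gram_schmidt([[3.0, 1.0], [2.0, 2.0]])
```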
  | --- No class --- | 02 Feb (Fri) {lecture#xx} | --- |
--- | Minor I | 08 Feb (Thu) | --- | ---
  | --- No class --- | 09 Feb (Fri) {lecture#xx} | --- |
  | --- Extra class in lieu of the 02 and 09 Feb lectures --- The SVD and its properties | 11 Feb (Sun) {lecture#14, #15} | --- |
3 | Linear Models for Regression, Classification | 14-16, 17-19 | SDR | [Bishop Chap.3], [Bishop Chap.4], [Ng: Supervised, Discriminant Analysis], [Ng: Generative]
  | General introduction to regression and classification. Linearity and restricted linearity. | 13 Feb (Tue) {lecture#16} | --- |
  | Maximum likelihood and least squares | 20 Feb (Tue) {lecture#17} | --- |
  | The Moore-Penrose pseudo-inverse | 21 Feb (Wed) {lecture#18} | --- |
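Lectures #17-#18 connect maximum likelihood under Gaussian noise to least squares and the pseudo-inverse. For the simplest basis [1, x], the pseudo-inverse solution reduces to solving the 2x2 normal equations in closed form; a minimal sketch with my own toy data:

```python
def least_squares_line(xs, ys):
    """Fit y = w0 + w1*x by solving the normal equations.

    For a full-rank design matrix Phi this is exactly the
    Moore-Penrose pseudo-inverse solution w = (Phi^T Phi)^{-1} Phi^T y.
    """
    n = len(xs)
    # Entries of Phi^T Phi and Phi^T y for the [1, x] basis.
    sx, sxx = sum(xs), sum(x * x for x in xs)
    sy, sxy = sum(ys), sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx
    w0 = (sxx * sy - sx * sxy) / det
    w1 = (n * sxy - sx * sy) / det
    return w0, w1

# Noise-free data on the line y = 1 + 2x: the fit recovers it exactly.
w0, w1 = least_squares_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

Regularised least squares (lecture #19) would add lambda to the diagonal of Phi^T Phi before inverting.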
  | Regularised least squares. Classification. Fisher's linear discriminant. | 23 Feb (Fri) {lecture#19} | --- |
  | Fisher's linear discriminant (contd.). Introduction to SVMs. | 06 Mar (Tue) {lecture#20} | --- |
4 | SVMs | 20-28 | SDR | [Bishop Chap.7], [Alex: SVMs], [Ng: SVMs], [Burges: SVMs], [Bishop Chap.6], [Khardon: Kernels]
  | SVMs: the concept of the margin; the optimisation problem; the physical significance of the y = +1 and y = -1 lines | 07 Mar (Wed) {lecture#21} | --- |
  | The basic SVM optimisation: the primal and dual problems | 09 Mar (Fri) {lecture#22} | --- |
  | Lagrange multipliers and the KKT conditions | 13 Mar (Tue) {lecture#23} | --- | [Early class! 07:00am-08:00am, II-241 (EE Committee Room)]
  | Lagrange multipliers and the KKT conditions (contd.). Soft-margin SVMs. | 14 Mar (Wed) {lecture#24} | --- |
  | Soft-margin SVMs (contd.) | 16 Mar (Fri) {lecture#25} | --- |
  | Recap: Lagrange multipliers and the KKT conditions. The hard-margin SVM. | 20 Mar (Tue) {lecture#26} | --- |
  | Recap: Lagrange multipliers and the KKT conditions. The soft-margin SVM. | 21 Mar (Wed) {lecture#27} | --- |
  | Recap: Lagrange multipliers and the KKT conditions. The soft-margin SVM (contd.). | 23 Mar (Fri) {lecture#28} | --- |
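The soft-margin lectures above derive the problem via the dual and the KKT conditions; an alternative way to see the same objective is to minimise the primal directly by subgradient descent on the hinge loss. A minimal sketch (the trainer, constants, and toy data are mine, not the course's derivation):

```python
def train_linear_svm(data, C=1.0, lr=0.01, epochs=200):
    """Soft-margin SVM primal via subgradient descent on
    (1/2)||w||^2 + C * sum_n max(0, 1 - y_n * (w.x_n + b))."""
    dim = len(data[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        gw, gb = list(w), 0.0  # gradient of the (1/2)||w||^2 term
        for x, y in data:
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) < 1:
                # Point inside the margin: the hinge term is active,
                # which mirrors the KKT complementary-slackness picture.
                gw = [gwi - C * y * xi for gwi, xi in zip(gw, x)]
                gb -= C * y
        w = [wi - lr * gwi for wi, gwi in zip(w, gw)]
        b -= lr * gb
    return w, b

# Linearly separable toy data: label +1 above the line x0 + x1 = 0.
data = [((1.0, 1.0), 1), ((2.0, 1.5), 1),
        ((-1.0, -1.0), -1), ((-2.0, -1.5), -1)]
w, b = train_linear_svm(data)
```

Only the margin-violating points contribute to the gradient, the primal-side reflection of the fact that only support vectors get non-zero Lagrange multipliers in the dual.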
--- | Minor II | 28 Mar (Wed) | --- | ---
  | --- No class --- | 03 Apr (Tue) {lecture#xx} | --- |
  | Introduction to kernels | 04 Apr (Wed) {lecture#29} | --- |
  | Kernels in regression | 06 Apr (Fri) {lecture#30} | --- |
  | Kernels in regression (contd.) | 10 Apr (Tue) {lecture#31} | --- |
  | Kernel functions: properties and construction | 11 Apr (Wed) {lecture#32} | --- |
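For the kernel lectures (#29-#32), the basic construction is the Gram matrix of a valid kernel; for the Gaussian (RBF) kernel it is symmetric with unit diagonal, and positive semi-definiteness is the Mercer condition. A minimal sketch (the toy points are mine; only symmetry is checked here):

```python
import math

def rbf(x, z, gamma=1.0):
    """Gaussian (RBF) kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def gram(points, kernel):
    """Gram matrix K[i][j] = kernel(x_i, x_j).

    For a valid (Mercer) kernel, K is symmetric positive semi-definite;
    sums and products of valid kernels are again valid kernels."""
    return [[kernel(p, q) for q in points] for p in points]

pts = [(0.0,), (1.0,), (3.0,)]
K = gram(pts, rbf)
```

Kernel regression (lectures #30-#31) then works entirely through K, never forming the implicit feature map.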
5 | Neural Networks | 33-40 | SDR | [Bishop Chap.5], [Alex: NNRep], [Alex: NNLearn]
  | Introduction to neural networks: the multi-layer perceptron: conventions, restricted non-linearity | 13 Apr (Fri) {lecture#33} | --- |
  | The basic perceptron; optimisation | 17 Apr (Tue) {lecture#34} | --- |
  | Optimisation: estimating network parameters for a regression problem | 18 Apr (Wed) {lecture#35} | --- |
  | Optimisation: estimating network parameters for a classification problem. The Bernoulli and Binomial distributions. Physical significance of the NN solution vis-a-vis the linear method done before. | 20 Apr (Fri) {lecture#36} | --- |
  | Optimisation basics: local quadratic approximation, geometric interpretation, computing the gradient | 24 Apr (Tue) {lecture#37} | --- |
  | The backpropagation algorithm | 25 Apr (Wed) {lecture#38} | --- |
  | Backpropagation vs numerical evaluation of the gradient. Invariance issues: linear transformations of the inputs and outputs. Four basic invariance-handling techniques. | 27 Apr (Fri) {lecture#39} | --- |
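The backpropagation-vs-numerical-gradient comparison of lecture #39 fits in a few lines for the smallest possible network, one input, one tanh hidden unit, one output; a sketch with my own parameter values:

```python
import math

def forward(w, v, x):
    """A minimal 1-input, 1-hidden-unit, 1-output MLP: y = v * tanh(w * x)."""
    return v * math.tanh(w * x)

def loss(w, v, x, t):
    """Squared-error loss for target t."""
    return 0.5 * (forward(w, v, x) - t) ** 2

def backprop_dw(w, v, x, t):
    """Exact gradient dL/dw via the chain rule (backpropagation)."""
    a = math.tanh(w * x)
    y = v * a
    return (y - t) * v * (1 - a * a) * x

def numerical_dw(w, v, x, t, h=1e-6):
    """Central finite difference: O(h^2) accurate, but needs two full
    forward passes per parameter, which is why backprop wins at scale."""
    return (loss(w + h, v, x, t) - loss(w - h, v, x, t)) / (2 * h)

w, v, x, t = 0.5, -1.2, 0.8, 0.3
exact = backprop_dw(w, v, x, t)
approx = numerical_dw(w, v, x, t)
```

The numerical gradient is the standard sanity check for a backprop implementation; agreement to several decimal places is expected.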
  | --- No class --- | 01 May (Tue) {lecture#xx} | --- |
  | Some details on tangent propagation; CNNs | 02 May (Wed) {lecture#40} | --- |
--- | Major | 08 May (Tue) | --- | ---
5 | Feature Selection | xx-xx | SDR |
6 | Logistic Regression | 28-29 | SDR | [Alex: LogReg]
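Logistic regression fits naturally after the maximum-likelihood material: the log-likelihood gradient has the same simple "target minus prediction" form. A minimal 1-D sketch via gradient ascent (function names, learning rate, and toy data are mine):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(data, lr=0.5, epochs=500):
    """Maximum-likelihood logistic regression via gradient ascent.

    The gradient of the log-likelihood w.r.t. (w, b) is
    sum over data of (t - sigmoid(w*x + b)) * (x, 1).
    """
    w, b = 0.0, 0.0
    for _ in range(epochs):
        gw = sum((t - sigmoid(w * x + b)) * x for x, t in data)
        gb = sum((t - sigmoid(w * x + b)) for x, t in data)
        w += lr * gw
        b += lr * gb
    return w, b

# 1-D toy data: targets switch from 0 to 1 around x = 0.
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]
w, b = train_logreg(data)
```

On separable data like this the weight grows without bound (the MLE does not exist), which is one standard motivation for regularisation.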
xx | Mathematical Basics for Machine Learning | xx-xx | xx | [Burges: Math for ML], [Burges: Math Slides], [Do, Kolter: Linear Algebra Notes]