Introduction to Machine Learning (ELL784)


General Information

No one shall be permitted to audit the course; people are welcome to sit through it, however. The course is open to all suitably inclined M.Tech., M.S.(R), and Ph.D. students of all disciplines. It is not open to B.Tech. and Dual Degree students, who should instead opt for ELL409 (Machine Intelligence and Learning). This is a Departmental Elective (DE), one of the 'essential electives' for the Cognitive and Intelligent Systems (CIS) stream of the Computer Technology Group, Department of Electrical Engineering. A general note for all EE Machine Learning courses: students will be permitted to take only one out of the following courses: ELL409 (Machine Intelligence and Learning), and the two CSE Machine Learning courses, COL341 (Machine Learning) and COL774 (Machine Learning).

Credits: 3 (LTP: 3-0-0) [Slot C]

Schedule for Classes:

Tuesday
08:00 - 09:00
IIA-305 (Bharti Building)
Wednesday
08:00 - 09:00
IIA-305 (Bharti Building)
Friday
08:00 - 09:00
IIA-305 (Bharti Building)

Schedule for Examinations:

Minor I: 08 February 2018 (Thursday), 02:30 pm - 03:30 pm, LH-418
Minor II: 28 March 2018 (Wednesday), 02:30 pm - 03:30 pm, LH-418
Major: 08 May 2018 (Tuesday), 10:30 am - 12:30 pm, LH-510

Teaching Assistants: 

Akash Nayak
Hemant Goyal

Books, Papers and other Documentation

Textbook:

Reference Books:

Papers:

Some Interesting Web Links:


Lecture Schedule, Links to Material

S.No.
Topics
Lectures
Instructor
References
0
Introduction to Machine Learning
01-01
SDR
Flavours of Machine Learning: Unsupervised, Supervised, Reinforcement, and Hybrid models. Decision boundaries: crisp and non-crisp; optimisation problems. Examples of unsupervised learning.
02 Jan (Tue) {lecture#01}
SDR
1
Unsupervised Learning:
K-Means, Gaussian Mixture Models, EM
02-09
SDR
[Bishop Chap.9], [Do: Gaussians], [Do: More on Gaussians], [Ng: K-Means], [Ng: GMM], [Ng: EM], [Smyth: EM]
The K-Means algorithm: history and flavours. A mathematical formulation of the K-Means algorithm.
03 Jan (Tue) {lecture#02}
SDR
The objective function to minimise. The basic K-Means algorithm; computational complexity: per step, and overall. An alternate formulation with a distance threshold. Limitations of K-Means.
05 Jan (Fri) {lecture#03}
SDR
Gaussian Mixture Models. Bayes' rule, and responsibilities.
09 Jan (Tue) {lecture#04}
SDR
Maximum Likelihood Estimation. Parameter estimation for a mixture of Gaussians, starting with the simple case of one 1-D Gaussian, to the general case of K D-dimensional Gaussians.
10 Jan (Wed) {lecture#05}
SDR
The general case of K D-dimensional Gaussians. Getting stuck, using Lagrange Multipliers. The EM Algorithm for Gaussian Mixtures.
12 Jan (Fri) {lecture#06}
SDR
Application: Assignment 1: The Stauffer and Grimson Adaptive Background Subtraction Algorithm. An introduction to the basic set of interesting heuristics!
16 Jan (Tue) {lecture#07}
SDR
[CVPR'99], [PAMI'00]
The Stauffer and Grimson algorithm (contd)
17 Jan (Wed) {lecture#08}
SDR
The Stauffer and Grimson algorithm (contd)
19 Jan (Fri) {lecture#09}
SDR
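As a quick illustration of the basic (Lloyd's) K-Means loop from lectures #02-#03, here is a minimal pure-Python sketch (the data and the first-k-points initialisation below are made up for the example; random initialisation is more common in practice):

```python
def kmeans(points, k, iters=100):
    """Basic (Lloyd's) K-Means: alternate assignment and mean-update steps.
    Initialises from the first k points (illustrative choice only)."""
    centres = list(points[:k])
    dim = len(points[0])
    for _ in range(iters):
        # Assignment step: each point joins its nearest centre
        # (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centres[c])))
            clusters[j].append(p)
        # Update step: each centre moves to the mean of its cluster.
        new_centres = []
        for j in range(k):
            if clusters[j]:
                new_centres.append(tuple(sum(p[d] for p in clusters[j]) / len(clusters[j])
                                         for d in range(dim)))
            else:
                new_centres.append(centres[j])  # empty cluster: keep the old centre
        if new_centres == centres:  # assignments can no longer change: converged
            break
        centres = new_centres
    return centres

# Two well-separated 2-D blobs: K-Means should recover the two group means,
# near (0.0, 0.1) and (10.0, 10.0).
data = [(0.0, 0.0), (0.1, 0.2), (-0.1, 0.1),
        (10.0, 10.0), (10.2, 9.9), (9.8, 10.1)]
centres = sorted(kmeans(data, k=2))
```

Note how each iteration decreases (or leaves unchanged) the objective function discussed in lecture #03, which is why the alternating scheme terminates.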
2
Unsupervised Learning: EigenAnalysis:
PCA, LDA and Subspaces
09-13
SDR
[Ng: PCA], [Ng: ICA], [Burges: Dimension Reduction], [Bishop Chap.12]
Introduction to Eigenvalues and Eigenvectors
23 Jan (Tue) {lecture#10}
SDR
Properties of Eigenvalues and Eigenvectors
24 Jan (Wed) {lecture#11}
SDR
OpenCV Introduction
30 Jan (Tue) {lecture#12}
SC
Gram-Schmidt Orthogonalisation, other properties
31 Jan (Wed) {lecture#13}
SDR
--- No class ---
02 Feb (Fri) {lecture#xx}
---
---
Minor I
08 Feb (Thu)
---
---
--- No class ---
09 Feb (Fri) {lecture#xx}
---
--- Extra class in lieu of 02,09 Feb lectures ---
The SVD and its properties
11 Feb (Sun) {lecture#14, #15}
---
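As a small illustration of the eigenanalysis material above, the following pure-Python sketch finds the first principal component of a 2-D point set by power iteration on the sample covariance matrix (illustrative only; real code would use a linear-algebra library, and the data here are made up):

```python
def power_iteration(mat, iters=200):
    """Leading eigenvalue/eigenvector of a symmetric 2x2 matrix by power iteration."""
    v = [1.0, 0.0]
    for _ in range(iters):
        w = [mat[0][0] * v[0] + mat[0][1] * v[1],
             mat[1][0] * v[0] + mat[1][1] * v[1]]
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = [w[0] / norm, w[1] / norm]
    # Rayleigh quotient v^T A v gives the corresponding eigenvalue.
    lam = (v[0] * (mat[0][0] * v[0] + mat[0][1] * v[1])
           + v[1] * (mat[1][0] * v[0] + mat[1][1] * v[1]))
    return lam, v

# Points spread mainly along the line y = x: the first principal component
# should point (up to sign) roughly along (1, 1)/sqrt(2).
pts = [(-2.0, -1.9), (-1.0, -1.1), (0.0, 0.1), (1.0, 0.9), (2.0, 2.0)]
n = len(pts)
mx = sum(p[0] for p in pts) / n
my = sum(p[1] for p in pts) / n
cxx = sum((p[0] - mx) ** 2 for p in pts) / n
cyy = sum((p[1] - my) ** 2 for p in pts) / n
cxy = sum((p[0] - mx) * (p[1] - my) for p in pts) / n
lam, v = power_iteration([[cxx, cxy], [cxy, cyy]])
```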
3
Linear Models for Regression, Classification
14-16, 17-19
SDR
[Bishop Chap.3], [Bishop Chap.4], [Ng: Supervised, Discriminant Analysis], [Ng: Generative]
General introduction to Regression and Classification. Linearity and restricted linearity.
13 Feb (Tue) {lecture#16}
---
Maximum Likelihood and Least Squares
20 Feb (Tue) {lecture#17}
---
The Moore-Penrose Pseudo-inverse.
21 Feb (Wed) {lecture#18}
---
Regularised Least Squares. Classification. Fisher's Linear Discriminant.
23 Feb (Fri) {lecture#19}
---
Fisher's Linear Discriminant: Continued. Introduction to SVMs.
06 Mar (Tue) {lecture#20}
---
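To illustrate the maximum-likelihood/least-squares connection above: for a line fit, applying the Moore-Penrose pseudo-inverse of the design matrix amounts to solving the normal equations, which for one input variable can be done in closed form. A minimal sketch (the data below are made up; with noise-free data on an exact line, the fit recovers the true coefficients):

```python
def least_squares_line(xs, ys):
    """Fit y = w0 + w1*x by solving the normal equations
    (Phi^T Phi) w = Phi^T y, i.e. applying the Moore-Penrose
    pseudo-inverse of the design matrix Phi = [1, x]."""
    n = len(xs)
    sx = sum(xs)
    sxx = sum(x * x for x in xs)
    sy = sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    # Normal equations: [[n, sx], [sx, sxx]] @ [w0, w1] = [sy, sxy]
    det = n * sxx - sx * sx
    w0 = (sy * sxx - sx * sxy) / det
    w1 = (n * sxy - sx * sy) / det
    return w0, w1

# Noise-free data on y = 2x + 1: the exact line should be recovered.
w0, w1 = least_squares_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```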
4
SVMs
20-28
SDR
[Bishop Chap.7], [Alex: SVMs], [Ng: SVMs], [Burges: SVMs], [Bishop Chap.6], [Khardon: Kernels]
SVMs: the concept of the margin, optimisation problem, getting the physical significance of the y = +1 and y = -1 lines
07 Mar (Wed) {lecture#21}
---
The basic SVM optimisation: the primal and the dual problems
09 Mar (Fri) {lecture#22}
---
Lagrange Multipliers and the KKT Conditions
13 Mar (Tue) {lecture#23}
---
[Early class! 07:00am-08:00am, II-241 (EE Committee Room)]
Lagrange Multipliers and the KKT Conditions (contd)
Soft-Margin SVMs
14 Mar (Wed) {lecture#24}
---
Soft-Margin SVMs (contd)
16 Mar (Fri) {lecture#25}
---
Recap: Lagrange Multipliers and the KKT Conditions
The hard-margin SVM
20 Mar (Tue) {lecture#26}
---
Recap: Lagrange Multipliers and the KKT Conditions
The soft-margin SVM
21 Mar (Wed) {lecture#27}
---
Recap: Lagrange Multipliers and the KKT Conditions
The soft-margin SVM (contd).
23 Mar (Fri) {lecture#28}
---
---
Minor II
28 Mar (Wed)
---
---
--- No Class ---
03 Apr (Tue) {lecture#xx}
---
Introduction to Kernels
04 Apr (Wed) {lecture#29}
---
Kernels in Regression
06 Apr (Fri) {lecture#30}
---
Kernels in Regression (contd)
10 Apr (Tue) {lecture#31}
---
Kernel Functions: properties, construction
11 Apr (Wed) {lecture#32}
---
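As a small illustration of kernels in regression (lectures #30-#32), here is a pure-Python sketch of kernel ridge regression with a Gaussian (RBF) kernel: the dual coefficients come from solving (K + lambda*I) a = y, and predictions are kernel expansions over the training points. The data, gamma, and regulariser below are made up for the example:

```python
import math

def rbf(x, z, gamma=1.0):
    """Gaussian (RBF) kernel k(x, z) = exp(-gamma * |x - z|^2)."""
    return math.exp(-gamma * (x - z) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def kernel_ridge_fit(xs, ys, lam=1e-6):
    """Dual coefficients a = (K + lam*I)^-1 y; predict with sum_i a_i k(x_i, x)."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (lam if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    return solve(K, ys)

xs = [0.0, 1.0, 2.0]
ys = [0.0, 1.0, 0.0]
a = kernel_ridge_fit(xs, ys)

def predict(x):
    return sum(ai * rbf(xi, x) for ai, xi in zip(a, xs))
```

With a tiny regulariser, the fit nearly interpolates the training targets, while larger values of lam trade training fit for smoothness.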
5
Neural Networks
33-40
SDR
[Bishop Chap.5], [Alex: NNRep], [Alex: NNLearn]
Introduction to Neural Networks: the Multi-Layer Perceptron: Conventions, restricted non-linearity
13 Apr (Fri) {lecture#33}
---
Basic Perceptron, Optimisation
17 Apr (Tue) {lecture#34}
---
Optimisation: Estimating network parameters for a regression problem
18 Apr (Wed) {lecture#35}
---
Optimisation: Estimating network parameters on a classification problem. The Binomial distribution, the Bernoulli distribution
Physical Significance of the NN solution vis-a-vis the linear method done before
20 Apr (Fri) {lecture#36}
---
Optimisation basics: local quadratic approximation, geometric interpretation, computing the gradient
24 Apr (Tue) {lecture#37}
---
The Backpropagation Algorithm
25 Apr (Wed) {lecture#38}
---
Backpropagation vs numerical evaluation of the gradient
Invariance issues: linear transformation of the input and outputs
Four basic invariance-handling techniques
27 Apr (Fri) {lecture#39}
---
--- No Class ---
01 May (Tue) {lecture#xx}
---
Some details on Tangent Propagation, CNNs
02 May (Wed) {lecture#40}
---
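The backpropagation-vs-numerical-gradient comparison from lecture #39 can be sketched for a single sigmoid unit with squared error: the analytic (chain-rule) gradient should agree with central finite differences up to the truncation error of the difference scheme. A minimal illustration (the weight, bias, input, and target values are made up):

```python
import math

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

def loss(w, b, x, t):
    """Squared error of a single sigmoid unit y = sigma(w*x + b)."""
    y = sigmoid(w * x + b)
    return 0.5 * (y - t) ** 2

def grad_backprop(w, b, x, t):
    """Analytic gradient via the chain rule (backpropagation for one unit)."""
    y = sigmoid(w * x + b)
    delta = (y - t) * y * (1.0 - y)   # dE/da: error signal at the unit
    return delta * x, delta           # dE/dw, dE/db

def grad_numeric(w, b, x, t, eps=1e-6):
    """Central finite differences: the slow check backprop is compared against."""
    gw = (loss(w + eps, b, x, t) - loss(w - eps, b, x, t)) / (2 * eps)
    gb = (loss(w, b + eps, x, t) - loss(w, b - eps, x, t)) / (2 * eps)
    return gw, gb

gw_a, gb_a = grad_backprop(0.5, -0.2, 1.5, 1.0)
gw_n, gb_n = grad_numeric(0.5, -0.2, 1.5, 1.0)
```

Backpropagation computes this gradient in one forward and one backward pass, whereas the numerical check needs two extra forward passes per parameter, which is why it is used only for verification.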
---
Major
08 May (Tue)
---
---
6
Feature Selection
xx-xx
SDR
7
Logistic Regression
28-29
SDR
[Alex: LogReg]
xx
Mathematical Basics for Machine Learning
xx-xx
xx
[Burges: Math for ML], [Burges: Math Slides], [Do, Kolter: Linear Algebra Notes]

[Internal Link: IIT Delhi]

The above list is (obviously!) not exhaustive. Other reference material will be announced in the class. The Web has a vast storehouse of tutorial material on AI, Machine Learning, and other related areas.



Assignments

... A combination of theoretical work as well as programming work.
Both will be scrutinised in detail for original work and thoroughness.
For programming assignments, there will be credit for good coding.
Spaghetti code will be penalised.
Program correctness or good programming alone will not fetch you full credit ... also required are the results of extensive experimentation with varying program parameters, and an explanation of the results thus obtained.
Assignments will have to be submitted on or before the due date and time.
Late submissions will not be considered at all.
Use of unfair means will result in both parties (un)concerned being assigned, as marks, the number said to have been discovered by the ancient Indians.
Assignment 1
Assignment 2
Assignment 3

Examinations and Grading Information

The marks distribution is as follows (out of a total of 100):
Minor I
25
Minor II
25
Assignments
25
Major
25
Grand Total
100

ELL784 Evaluation: Programming Assignment Groups, Assignment/Examination-related Information [Internal Link: IIT Delhi]

Attendance Requirements:

As per Institute rules for IIT Delhi students: a minimum of 75% attendance (i.e., a maximum of 11 absences permitted), else one grade less.
Illness policy: illness is to be certified by a registered medical practitioner.
Attendance in Examinations is Compulsory.

ELL784 Complete Attendance Records (on the moodle page) (02.01.2018-31.01.2018; 09.02.2018-23.03.2018; 03.04.2018-02.05.2018)


Course Feedback

Link to Course Feedback Form

Sumantra Dutta Roy, Department of Electrical Engineering, IIT Delhi, Hauz Khas,
New Delhi - 110 016, INDIA. sumantra@ee.iitd.ac.in