COL 865: Special Topics in Computer Applications - Deep Learning

General Information

Instructor: Parag Singla (email: parags AT

Class Timings/Venue:
  • Slot H. Time (Modified): Monday - 2:00 pm - 3:20 pm. Thursday - 12:00 noon - 1:20 pm.
  • Slot H. Time: Monday, Wednesday - 11:00 am - 11:50 am. Thursday - 12:00 noon - 12:50 pm.
  • Venue: Bharti 305.

Office Hours:

Teaching Assistants:
Ankit Anand (csz138105 AT cse), Akshay Gupta (cs5130275 AT cse)


Announcements

  • [October 2, 2017] Check out the updated grading policy!
  • [September 22, 2017] We are planning to use Microsoft Azure Credits for part of the assignments in the course. Thanks to Microsoft for the credits!
  • [September 22, 2017] Course Webpage is finally up! Yay!!

Course Objective and Content

Objective: This course is meant to be a first graduate-level course in deep learning. Deep learning is an emerging area of machine learning that has revolutionized progress in the field over the last few years, with applications in NLP, vision, and speech, to name a few domains. The course gives a basic overview of the mathematical foundations of the field and presents the standard techniques/architectures that form the basis for more advanced ones. About a third of the course will focus on the latest research in the area through research paper discussions. No deep learning course is complete without implementation: students will implement some of the architectures (CNNs, LSTMs, etc.) on a GPU and test them on large datasets. Students will also likely get some experience with cloud computing facilities such as Microsoft Azure and/or other HPC systems.

For the 2017-18 Semester I offering: We plan to use the Data Science Virtual Machine (DSVM) service provided by Microsoft Azure so that students can collaborate with each other on building deep learning models on large amounts of data.

Content: Basics: Introduction. Multi-layered Perceptrons. Backpropagation. Regularization: L1/L2 Norms. Dropout. Optimization: Challenges. Stochastic Gradient Descent. Advanced Optimization Algorithms. Convolutional Neural Networks (CNNs). Advanced Architectures for Vision. Recurrent Neural Networks. Long Short-Term Memory (LSTMs). Gated Recurrent Units (GRUs). Attention. Word Vectors. Generative Adversarial Networks (GANs). Deep Reinforcement Learning. More recent advances in the field.
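The first three topics (multi-layered perceptrons, backpropagation, stochastic gradient descent) can be illustrated with a minimal NumPy sketch. The layer sizes, learning rate, and toy XOR data below are illustrative choices of ours, not course material:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR problem: 4 inputs, binary targets (illustrative only)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 units (an arbitrary choice for this sketch)
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: gradients of mean squared error w.r.t. each parameter
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent update (full batch here for brevity;
    # SGD would sample a mini-batch per step)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
```

The two assignments implement the same forward/backward pattern at scale (CNNs/LSTMs on a GPU) using a deep learning framework rather than raw NumPy.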

Week-Wise Schedule

Week     Topic                                         Book Chapters                   Class Notes/Supplementary Material
1        Introduction, Motivation
2        Multi-layered Perceptrons, Backpropagation    Goodfellow et al., Chapter 6
3        Regularization Techniques                     Goodfellow et al., Chapter 7
4, 4.5   Optimization                                  Goodfellow et al., Chapter 8
5        Convolutional Neural Networks (CNNs)          Goodfellow et al., Chapter 9
6        CNNs - Advanced Architectures                                                 Slides
7        Recurrent Neural Networks (RNNs)              Goodfellow et al., Chapter 10
8        LSTMs, GRUs                                   Goodfellow et al., Chapter 10
8.5      Word2Vec                                                                      Slides
9        Generative Adversarial Networks (GANs)
10       Research Paper Discussions
11       Deep Reinforcement Learning
12 - 14  Research Paper Discussions

Paper Presentation

Guidelines & Detailed Schedule

Additional Reading

Review Material

Chapters 1 - 5, Goodfellow et al.


Reference Book

  1. Deep Learning. Ian Goodfellow, Yoshua Bengio, and Aaron Courville. MIT Press, 2016.

Assignment Submission Instructions

  1. You are free to discuss the problems with other students in the class, but the final solution/code you submit must come through your individual efforts.
  2. All required code should be submitted via the course Moodle page.
  3. Honor Code: Any case of copying will be awarded a zero on the assignment. Additional penalties will be imposed based on the severity of the copying. Copying cases may also be escalated to the Department/DISCO.
  4. Late policy: You will lose 20% of the score for every day a submission is late. A maximum of two late days is allowed for any given assignment. You are allowed a total of 2 buffer days across the two programming assignments; there is no penalty if your submissions stay within this limit of 2 buffer days (total). Any delay beyond 2 days will result in a zero on your submission(s).
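As a worked reading of the late policy above (our own interpretation — in particular, how remaining buffer days interact with the 20%-per-day penalty is an assumption, not an official rule):

```python
def late_score(base_score, days_late, buffer_days_left):
    """Sketch of the late policy: more than two late days scores zero;
    otherwise buffer days absorb lateness penalty-free, and each
    remaining (chargeable) late day costs 20% of the score.
    This interpretation is ours, not the official policy text."""
    if days_late > 2:
        return 0.0  # beyond two late days the submission gets zero
    chargeable = max(0, days_late - buffer_days_left)
    return base_score * (1 - 0.2 * chargeable)
```

For example, under this reading a submission two days late with no buffer days left keeps 60% of its score, while one late day fully covered by a buffer day incurs no penalty.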


  • Assignment 1: Implement AlexNet on a subset of classes from the ImageNet dataset. Implement an LSTM cell (tentative). [Weight: 10%]. Due: Sunday, October 15, 11:50 pm.
  • Assignment 2: Open-ended implementation of a deep learning system. Details to be decided. [Weight: 15%].
We are planning to use Microsoft Azure Credits for part of the above assignments. Thanks to Microsoft for the credits!

Grading Scheme (Tentative)

Assignment 1                     10%
Assignment 2                     15%
Class Presentation + Reviewing   15% (Presentation: 6% + Reviewing: 9%)
Minor 1 / Assignment 0            6%
Minor 2                          20%
Major                            34%