
COL 776: Learning Probabilistic Graphical Models
Instructor: Parag Singla (email: parags AT cse.iitd.ac.in)
Class Timings/Venue:
 Slot H. Time: Mon, Wed: 11:00 am - 11:50 am. Thu: 12:00 noon - 12:50 pm.
 Venue: LH 416.
Office Hours:
Teaching Assistants:
Ankit Anand (csz138105 AT cse), Dhruvin Patel (cs5120293 AT cse)
Happy Mittal (csz138233 AT cse), Prachi Jain (csz148211 AT cse)
Shubhankar Singh (csz168113 AT cse)
TA Assignment
Announcements
 [November 12, 2016] Check out the updated Assignment 3 below.
As was already announced, due date is Tuesday Nov 15, 11:50 pm.
 [October 27, 2016] Assignment 3 is out on the web! Due Date: Sunday
November 13, 11:50 pm.
 [September 30, 2016] Assignment 2 is out on the web! Due Date: Thursday
October 20, 11:50 pm.
 [August 18, 2016] Assignment 1 is out on the web! Due Date: Tuesday September 13, 11:50 pm.
Objective:
This course is meant to be the first graduate-level course in the area of Probabilistic
Graphical Models (PGMs). PGMs have emerged as a very important research field over the
last decade or so, with a wide range of applications including Computer Vision, Information
Retrieval, Natural Language Processing, Biology and Robotics. This course aims to provide
students with a comprehensive overview of PGMs. The course content includes an introduction
to Probabilistic Graphical Models, directed and undirected representations, inference and
learning algorithms, and practical applications. The course is also meant to provide the
background required for pursuing research in this area.
Content: Basics: Introduction. Undirected and Directed Graphical Models. Bayesian
Networks. Markov Networks. Exponential Family Models. Factor Graph Representation. Hidden
Markov Models. Conditional Random Fields. Triangulation and Chordal Graphs. Other
Special Cases: Chains, Trees. Inference: Variable Elimination (Sum-Product and Max-Product).
Junction Tree Algorithm. Forward-Backward Algorithm (for HMMs). Loopy Belief Propagation.
Markov Chain Monte Carlo. Metropolis-Hastings. Importance Sampling. Gibbs Sampling. Variational
Inference. Learning: Discriminative vs. Generative Learning. Parameter Estimation in Bayesian
and Markov Networks. Structure Learning. EM: Handling Missing Data. Applications in Vision,
Web/IR, NLP and Biology. Advanced Topics: Statistical Relational Learning, Markov Logic Networks.
Note: All the topics above may not be covered in the course.
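To give a concrete flavor of the sampling-based inference topics listed above, here is a minimal Gibbs sampler for a toy two-variable pairwise Markov network p(x0, x1) proportional to exp(w * x0 * x1) with x_i in {-1, +1}. This toy model and all names in it are illustrative only, not part of the course material; for this model the exact agreement probability P(x0 == x1) is sigmoid(2w), so the sampler's estimate can be checked directly.

```python
import math
import random

def gibbs_agreement(w=1.0, n_samples=20000, burn_in=1000, seed=0):
    """Estimate P(x0 == x1) under p(x0, x1) ~ exp(w * x0 * x1), x_i in {-1, +1}."""
    rng = random.Random(seed)
    sigmoid = lambda t: 1.0 / (1.0 + math.exp(-t))
    x = [1, 1]          # arbitrary initial state
    agree = 0
    for it in range(n_samples + burn_in):
        for i in range(2):
            other = x[1 - i]
            # Full conditional: p(x_i = +1 | x_other) = sigmoid(2 * w * x_other)
            x[i] = 1 if rng.random() < sigmoid(2.0 * w * other) else -1
        if it >= burn_in:                # discard burn-in sweeps
            agree += (x[0] == x[1])
    return agree / n_samples

# With w = 1, the exact value is sigmoid(2) ~ 0.88; the estimate should be close.
print(gibbs_agreement())
```

The resampling of each variable from its full conditional, with an initial burn-in, is exactly the Gibbs sampling scheme covered in weeks 11-12; larger models differ only in having more variables and bigger neighborhoods.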
Week-wise Schedule
Note: KF below stands for Koller and Friedman Book (2009 ed.) on Probabilistic Graphical Models.
Week | Topic | Book Chapters | Class Notes / Supplementary Material
1 | Introduction, Basics | KF Chapters 1, 2 | Introduction
2 | Bayesian Networks | KF Chapter 3 | Bayes Net 1; Bayes Net 2
3 | Markov Networks | KF Chapter 4 | Markov Network 1; Markov Network 2
4, 5 | Factor Graph Representation, HMMs, CRFs, Exponential Family | KF Chapters 4, 8 | Factor Graphs / Log-Linear Models; Hidden Markov Models; Conditional Random Fields
6, 7 | Exact Inference: Variable Elimination | KF Chapter 9 | Variable Elimination 1; Variable Elimination 2
8, 9, 10 | Exact Inference: Junction Tree Algorithm, Belief Propagation (Loopy or Not) | KF Chapters 10, 11 | Clique Tree Message Passing 1; Clique Tree Message Passing 2; Clique Tree Message Passing 3; Loopy Belief Propagation; Max-Product Belief Propagation
11, 12 | Sampling-Based Approximate Inference: MCMC, Metropolis-Hastings, Gibbs Sampling, Importance Sampling | KF Chapter 12 | Sampling-Based Inference: Basics; Forward Sampling, Likelihood Weighting, Importance Sampling; Importance Sampling; Markov Chain Monte Carlo; Markov Chain Monte Carlo 1; Gibbs Sampling, Metropolis-Hastings; Gibbs Sampling: Additional Notes
13 | Learning: Overview, Learning in Bayesian Networks, Learning in Markov Networks | KF Chapters 16, 17, 19 | Parameter Estimation, Expectation Maximization
14 | Markov Logic, Contextual Symmetries, Revision | - | Markov Logic Presentation; Contextual Symmetries
* | Additional Notes | - | Learning Parameters; Expectation Maximization; Revision
Topic | Notes
Probability | prob.pdf (borrowed from Andrew Ng's Machine Learning course at Stanford)
References
 Probabilistic Graphical Models: Principles and Techniques. Daphne Koller and Nir
Friedman. First Edition, MIT Press, 2009.
 Learning in Graphical Models. Michael Jordan (ed.). MIT Press, 1998. Collection
of Papers.
 Probabilistic Reasoning in Intelligent Systems. Judea Pearl. Morgan Kaufmann, 1988.
Other Places where a Similar Course is Offered
Assignment Submission Instructions
 You are free to discuss the problems with other students in the class. You should include the
names of the people you had a significant discussion with in your submission.
 All your solutions should be produced independently, without referring to any
discussion notes or code written by someone else.
 All the non-programming solutions should be submitted as a hard copy. If you are writing
by hand, write legibly.
 Required code should be submitted using Moodle Page.
 You should archive your entire submission (code, plots, etc.) in one single zip file,
named "yourentrynumber_firstname_lastname.zip". For example, if your entry number is
"2008anz7535" and your name is "Nilesh Pathak", your submission should be named
"2008anz7535_nilesh_pathak.zip". Inside your zip folder, you should create a separate
subdirectory for each question in the assignment, named "qk", where k is the
question number. All the code (and plots etc.) for a question should be placed in the
corresponding subdirectory.
 Honor Code: Any cases of copying will be awarded a zero on the assignment. An
additional penalty of 5 points will also be imposed (on the total course points
out of 100). More severe penalties may follow.
 Late policy: You are allowed a total of 5 late days across all the assignments. You
are free to decide how you would like to use them. You will get a zero on an assignment once
you exceed the (total) allowed exemption of 5 days.
Practice Questions
Assignments

 Assignment 3. [Weight : 10% (tentative)]. Due: Sunday November 13, 2016. 11:50 pm.
Version updated on: Nov 12, 2016.
 Assignment 2. [Weight : 10%]. Due: Thursday October 20, 2016. 11:50 pm.
 Assignment 1. [Weight : 8%]. Due: Tuesday September 13, 2016. 11:50 pm.
 Question 1 Dataset: bayes.zip
 Question 2 Dataset: ocr.zip (link accessible only from the IIT Delhi network)
Grading Scheme (Tentative)
Assignment 1 | 8%
Assignment 2 | 10%
Assignment 3 | 10% (tentative)
Quiz 1 | 2%
Quiz 2 (Tentative) | TBD
Minor 1 | 15%
Minor 2 | 15%
Major | 36-40%
