COL 776: Learning Probabilistic Graphical Models
Instructor: Parag Singla (email: parags AT cse.iitd.ac.in)
- Slot H. Time: Mon, Wed: 11:00 am - 12:00 noon. Thu: 12:00 noon - 1:00 pm.
- Venue: LH 310.
Teaching Assistants: Ankit Anand (csz138105 AT cse), Happy Mittal (csz138233 AT cse),
Prachi Jain (csz148211 AT cse), Shashank Sharma (anz138012 AT cse)
- [Sun Nov 1]: Assignment 3 (Part B) is out! Due Date: Sunday Nov 8, 11:50 pm.
- [Sun Oct 26]: Assignment 3 (Part A) is out! Due Date: Sunday Nov 1, 11:50 pm.
- [Wed Oct 7]: A set of practice questions for minor 2 has been posted!
- [Wed Oct 7]: Assignment 2 (A and B) deadline postponed to Wednesday Oct 21, 12 noon.
- [Wed Sep 30]: Assignment 2 (Part B) is out. Due Date: Fri Oct 16, 12:00 noon.
- [Fri Sep 25]: Assignment 2 (Part A) is out. Due Date: Fri Oct 16, 12:00 noon.
- [Fri Sep 4]: Assignment 1 (Part B) is out. Due Date: Fri Sep 18, 11:50 pm.
- [Tue Sep 1]: A set of practice questions has been posted (see below)!
- [Sun Aug 23]: Assignment 1: Assignment file, display script, readme.txt and sample-output.txt updated (see below)
- [Fri Aug 21]: Extra Class on Monday, Aug 24, 2:00 pm - 3:30 pm. Venue: Bharti 501
- [Fri Aug 21]: Assignment 1: Samples files and display script uploaded (see Assignment Section)
- [Fri Aug 14]: Extra Class on Monday, Aug 17, 2:00 pm - 3:30 pm. Venue: Bharti 501
- [Fri Aug 14]: Assignment 1 (Part A) is out. Due Date: Thu Aug 27, 11:50 pm.
- [Sat Jul 18]: No Classes in the week of July 27 - July 31 (Make up classes to be held later).
This course is meant to be a first graduate-level course in the area of Probabilistic
Graphical Models (PGMs). PGMs have emerged as a very important research field during the
last decade or so, with a wide range of applications including Computer Vision, Information
Retrieval, Natural Language Processing, Biology and Robotics. This course aims to provide
students with a comprehensive overview of PGMs. The course content includes an introduction
to Probabilistic Graphical Models, directed and undirected representations, inference and
learning algorithms, and practical applications. The course is also meant to provide the
background required for pursuing research in this area.
Content: Basics: Introduction. Undirected and Directed Graphical Models. Bayesian
Networks. Markov Networks. Exponential Family Models. Factor Graph Representation. Hidden
Markov Models. Conditional Random Fields. Triangulation and Chordal Graphs. Other
Special Cases: Chains, Trees. Inference: Variable Elimination (Sum-Product and Max-Product).
Junction Tree Algorithm. Forward Backward Algorithm (for HMMs). Loopy Belief Propagation.
Markov Chain Monte Carlo. Metropolis Hastings. Importance Sampling. Gibbs Sampling. Variational
Inference. Learning: Discriminative Vs. Generative Learning. Parameter Estimation in Bayesian
and Markov Networks. Structure Learning. EM: Handling Missing Data. Applications in Vision,
Web/IR, NLP and Biology. Advanced Topics: Statistical Relational Learning, Markov Logic Networks.
Note: All the topics above may not be covered in the course.
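As a toy illustration of the sum-product variable elimination topic listed above, the sketch below computes a marginal in a three-node chain Bayesian network A → B → C by summing out one variable at a time. All CPT numbers are made up for illustration; this is a minimal sketch, not course-provided code.

```python
import numpy as np

# Toy chain Bayes net A -> B -> C over binary variables.
# All probabilities below are invented for illustration only.
p_a = np.array([0.6, 0.4])            # P(A)
p_b_given_a = np.array([[0.7, 0.3],   # P(B | A=0)
                        [0.2, 0.8]])  # P(B | A=1)
p_c_given_b = np.array([[0.9, 0.1],   # P(C | B=0)
                        [0.5, 0.5]])  # P(C | B=1)

# Variable elimination (sum-product), eliminating A first, then B:
#   P(C) = sum_b P(C|b) * [ sum_a P(a) P(b|a) ]
tau_b = p_a @ p_b_given_a   # intermediate factor over B (A summed out)
p_c = tau_b @ p_c_given_b   # marginal over C (B summed out)

print(p_c, p_c.sum())       # ≈ [0.7 0.3], sums to 1
```

The same idea scales to general networks: each elimination step multiplies the factors mentioning a variable and sums that variable out, producing a new intermediate factor.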
Note: KF below stands for Koller and Friedman Book (2009 ed.) on Probabilistic Graphical Models.
|Week|Topic|Book Chapters|Class Notes|
|1|Introduction, Basics|KF Chapters 1, 2| |
|2|Bayesian Networks|KF Chapter 3|Bayes Net-1|
|3|Markov Networks|KF Chapter 4|Markov Network-1|
|4, 5|Factor Graph Representation, HMMs, CRFs|KF Chapters 4, 8|Factor Graphs; Log Linear Models; Hidden Markov Models; Conditional Random Fields|
|6, 7|Exact Inference: Variable Elimination|KF Chapter 9|Variable Elimination-1|
|8, 9, 10|Exact Inference: Junction Tree Algorithm, Belief Propagation (Loopy or not)|KF Chapters 10, 11|Clique Tree Message Passing-1; Clique Tree Message Passing-2; Clique Tree Message Passing-3; Loopy Belief Propagation; Max-Product Belief Propagation|
|11, 12|Sampling-Based Approximate Inference: MCMC, Metropolis Hastings|KF Chapter 12|Sampling Based Inference - Basics; Forward Sampling, Likelihood Weighting, Importance Sampling; Markov Chain Monte Carlo; Markov Chain Monte Carlo-1; Gibbs Sampling, Metropolis Hastings; Gibbs Sampling - Additional Notes|
|13|Learning: Overview, Learning in Bayesian and Markov Networks|KF Chapters 16, 17, 19|Parameter Estimation, Expectation Maximization|
|14|Markov Logic/Revision| |Markov Logic Presentation|
|New!|Additional Notes| |Borrowed from Andrew Ng's Machine Learning Course at Stanford|
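The sampling-based inference topics in the schedule (weeks 11-12) can be illustrated with a minimal Gibbs sampler over a two-variable binary Markov network with a single pairwise potential. The potential values are made up for illustration, and this is only a sketch of the technique, not course-provided code.

```python
import random

# A single pairwise potential phi(x, y) over two binary variables.
# The values below are invented; they favor the agreeing states.
phi = {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}

def sample_given(other):
    # Gibbs conditional: P(X = 1 | other) proportional to phi[(1, other)].
    p1 = phi[(1, other)] / (phi[(0, other)] + phi[(1, other)])
    return 1 if random.random() < p1 else 0

random.seed(0)
x, y = 0, 0
agree = 0
burn, iters = 1000, 20000
for t in range(burn + iters):
    x = sample_given(y)   # resample X | Y
    y = sample_given(x)   # resample Y | X (model is symmetric)
    if t >= burn:
        agree += (x == y)

# Exact answer: P(X = Y) = (3 + 3) / (3 + 1 + 1 + 3) = 0.75;
# the Monte Carlo estimate should be close to it.
print(agree / iters)
```

After a burn-in period, averaging over the chain's samples approximates expectations under the joint distribution proportional to phi, which is exactly how Gibbs sampling is used for approximate inference in larger networks.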
Reference Books
- Probabilistic Graphical Models: Principles and Techniques. Daphne Koller and Nir Friedman. First Edition, MIT Press, 2009.
- Learning in Graphical Models. Michael Jordan (ed.). MIT Press, 1998. Collection of papers.
- Probabilistic Reasoning in Intelligent Systems. Judea Pearl. Morgan Kaufmann, 1988.
Assignment Submission Instructions
- You are free to discuss the problems with other students in the class. You should include the
names of the people with whom you had a significant discussion in your submission.
- All your solutions should be produced independently, without referring to any
discussion notes or to code written by someone else.
- All non-programming solutions should be submitted as a hard copy. If you are writing
by hand, write legibly.
- Required code should be submitted via the Moodle page.
- You should archive your entire submission (code) in one single zip file. This zip file
should be named "yourentrynumber_firstname_lastname.zip". For example, if your entry number is
"2008anz7535" and your name is "Nilesh Pathak", your submission should be named "2008anz7535_nilesh_pathak.zip".
- Honor Code: Any cases of copying will be awarded a zero on the assignment. An
additional penalty of 5 points will also be imposed (on the total course points
out of 100). More severe penalties may follow.
- Late Policy: You will lose 20% for each day your submission is late. At most 2 days of late submission are allowed.
Grading
|Assignment 1|10%|
|Assignment 2|12%|
|Assignment 3|10% (tentative)|
|Minor 1|15%|
|Minor 2|15%|