In the Holi Term 2021, the CSE Department is offering a very large number of Special Topics Courses. These may not have listed pre-requisites in the Courses of Study, but if you wish to register for any of these courses, you are expected to have the necessary background and maturity. In particular, these courses are meant for those seeking to do research (PhD, MS Research, and M Tech level students), so the expectation is that you have completed 120 earned credits.

All students wishing to register for any Special Topics Course are required to fill out the following form:

**Choice for Special Topics Courses (Holi Term 2021)**

Students are not expected to register for more than one special topics course. An exception is made for those who are applying for a specialisation, in which case at most two courses are permitted. Instructors reserve the right to deregister those who do not fill out this form. Also, if you opt for a Special Topics course, you will be required to take it for credit (an exception is given only to PhD students), and you will be expected not to withdraw (the only concession being on grounds of medical or family emergencies).

**Instructor:** Prof. M. Balakrishnan (since this will be an online course, I plan to invite other researchers for some of the specific topics)

**About the course:** Processors, custom hardware, firmware, and software form a single continuum today, driven by the goal of obtaining the highest system performance. With the massive growth in embedded devices, competing solutions are often compared not just on speed (MIPS, MFLOPS, FPS, etc.) but, more critically, on performance per watt. In this course we will cover the advances, primarily from a hardware viewpoint, that meet these objectives both at the component level (e.g., CPU, memory) and at the system level (e.g., accelerators).

**Prerequisites:** Two basic courses in digital circuits/systems and computer architecture/organization

**Who can benefit:** Anyone who is interested in understanding the big picture of system performance in relation to advances in architecture, technology, and the design process

**Detailed course outline:** A detailed course outline will be posted on my website by 20 Jan 2021

**Instructor:** Prof. Amitabha Bagchi

**Course objectives**

At the end of the course the student is expected to develop a working familiarity with the mathematical foundations of most of the techniques used in data science, machine learning and AI.

**Background required:** Basics of Probability, Graph Theory, and Linear Algebra.

**Topics**

Geometry of High-dimensional space including dimensionality reduction; Singular Value Decomposition and applications; Random walks and Markov Chains; Sketching and sampling; Clustering.

We will closely follow the book by Blum et al. (2018) cited below; specifically, we will go through Chapters 2, 3, 4, 6, and 7.
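As a small taste of the SVD material, here is a sketch in NumPy (the matrix `A` and the helper `rank_k_approx` are made up for illustration, not from the book) showing that the top two singular triples reconstruct a rank-2 matrix exactly, as the Eckart–Young theorem predicts:

```python
import numpy as np

# A hypothetical 4x3 data matrix of rank 2 (third column = first + second).
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.],
              [3., 4., 7.]])

# Thin SVD: A = U @ diag(s) @ Vt, singular values in decreasing order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

def rank_k_approx(U, s, Vt, k):
    """Best rank-k approximation of A in the Frobenius norm,
    built from the top k singular triples (Eckart-Young)."""
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

A2 = rank_k_approx(U, s, Vt, 2)  # recovers A exactly since rank(A) = 2
```

Truncating to `k` triples is the basic dimensionality-reduction step the course builds on.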

**Texts**

- A. Blum, J. Hopcroft, and R. Kannan,
*Foundations of Data Science*, 2018. Download here.

**Refresher texts**

- Linear algebra can be refreshed from Andrei Antonenko’s course notes. Available here (local download only).
- Graph theory can be reviewed from Reinhard Diestel’s book. Available here as an ebook.

**Instructor:** Prof. Rohan Paul

**Description**

Planning and estimation are central to modern autonomous systems. This course will cover the concepts, principles and methods for intelligent decision-making with imperfect or uncertain knowledge. Students will develop an understanding of how different planning and learning techniques are useful in problem domains where robots or other embodied-AI agents are deployed. Previous coursework in artificial intelligence or machine learning is required.

**Topic list (tentative)**

- Introduction: The AI view of autonomous systems and the centrality of estimation and planning
- State Estimation: Graphical models (review), Bayes Filter and Kalman Filter
- Task Planning: STRIPS-planning, PDDL planning, Graph Plan
- Planning under Uncertainty: MDPs (review), Tree Search for MDPs
- Reinforcement learning for Robot control: Monte-carlo planning, Deep RL applications, Imitation Learning
- POMDPs: Belief states and Policy trees
- Information Gathering and Exploration: Gaussian Processes and exploration algorithms.
- Others (if time permits): Human-robot interaction, applications of neural models, scene understanding etc.
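To make the Bayes filter topic concrete, here is a minimal histogram-filter sketch in plain Python; the five-cell corridor world, door sensor model, and motion probabilities are all invented for illustration, not course material:

```python
# Discrete Bayes (histogram) filter on a 5-cell circular corridor.

def normalize(b):
    s = sum(b)
    return [p / s for p in b]

def predict(belief, p_stay=0.2, p_move=0.8):
    """Motion update: the robot tries to move one cell right,
    but stays put with probability p_stay (wrap-around world)."""
    n = len(belief)
    return [p_stay * belief[i] + p_move * belief[(i - 1) % n]
            for i in range(n)]

def update(belief, doors, z, p_hit=0.9, p_miss=0.1):
    """Measurement update: z = 1 means a door was sensed;
    cells listed in `doors` actually have a door."""
    likelihood = [(p_hit if (i in doors) == (z == 1) else p_miss)
                  for i in range(len(belief))]
    return normalize([l * b for l, b in zip(likelihood, belief)])

belief = [0.2] * 5               # uniform prior over the 5 cells
belief = update(belief, {2}, 1)  # sense a door (door is at cell 2)
belief = predict(belief)         # move one cell right
belief = update(belief, {2}, 0)  # sense no door
```

After seeing the door at cell 2, moving right, and then sensing no door, the belief concentrates on cell 3, which is exactly the predict/update cycle the Kalman filter performs in the Gaussian case.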

**Course Components**

Minor and major exams. Programming assignments (tentatively 1-2). Study of contemporary works on planning and learning techniques relevant to autonomous systems (details in due course).

**Pre-requisites**

Introduction to Artificial Intelligence (COL333-671) or Introduction to Machine Learning (COL774 or equivalent). Programming proficiency and knowledge of probabilistic models, basic deep learning, basic search algorithms, logic and probability will be an advantage.

**Learning outcomes**

At the end of the course, students will be able to model a robotic system (e.g., a ground robot or manipulator) as a decision-making AI agent. Students will be able to formulate and solve relevant planning and estimation problems in this domain and understand how to incorporate recent learning-based methods into decision-making algorithms.

**Other Information**

This course will focus on the AI aspects of autonomous systems. A robotic system (ground/air vehicle or manipulator) will be modeled as an AI agent capable of sensing and taking simple actions in the environment. The detailed control/physical aspects of the system will be abstracted to a certain degree in the course. In future offerings, an experimental component with a real system is likely to be added, but it is beyond the scope of the current offering.

**References**

- Mykel Kochenderfer, Decision Making Under Uncertainty
- Stuart Russell and Peter Norvig. Artificial Intelligence: A Modern Approach. 3rd Edition
- Sebastian Thrun, Wolfram Burgard and Dieter Fox. Probabilistic Robotics. MIT Press, 2005.
- Rich Sutton and Andrew Barto. Reinforcement Learning. MIT Press
- Steven LaValle. Planning Algorithms. Cambridge University Press, 2006.
- Dimitri P. Bertsekas. Dynamic Programming and Optimal Control. Athena Scientific, 3rd edition, 2007.
- Paper references will be added in due course.

**Instructor:** Prof. Abhijnan Chakraborty

**Course content**

- Social network analysis
  - Introduction, applications and challenges
  - Structural properties of large social networks
  - Measures of network centrality
  - Identifying popular users/experts
  - Community detection algorithms
  - Information diffusion models
  - Influence maximization
  - Link prediction

- Fundamentals of Social Data Analytics
  - Measurement and collection of social media data
  - Basics of text processing over social data
  - Entity linking and entity resolution
  - Topic models

- Social search and recommendation algorithms
- Crowdsourcing
- Harmful users/content on social media – hate speech, fake news, spammers in social networks, etc.
- Different types of social platforms – anonymous social networks, e-commerce sites
- Basics of computational social choice – voting, allocation algorithms
- Social media and digital health
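As a small illustration of the network-centrality material, here is a PageRank power-iteration sketch in plain Python; the four-page `web` graph and the damping factor 0.85 are illustrative defaults, not course content:

```python
# Toy PageRank by power iteration on a small directed graph.

def pagerank(graph, d=0.85, iters=100):
    """graph: dict mapping node -> list of out-neighbours."""
    nodes = list(graph)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - d) / n for v in nodes}   # teleportation term
        for u, outs in graph.items():
            if outs:
                share = d * rank[u] / len(outs)  # split rank over out-links
                for v in outs:
                    new[v] += share
            else:                                # dangling node: spread uniformly
                for v in nodes:
                    new[v] += d * rank[u] / n
        rank = new
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
scores = pagerank(web)
```

Node `c`, which collects the most in-links, ends up with the highest score, which is the intuition behind using PageRank-style centrality to identify popular users.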

**Pre-requisites**

- Data structures and algorithms
- Basics of machine learning
- The course will involve understanding of several research papers, and every student will have to present at least one research paper in the class.
- There will be a few programming assignments as well.

**Course objectives**

- The ability to connect socially relevant technical problems with their experience of social media usage.
- Knowledge of the state of the art in social media related research.
- The ability to gather social media data, process them and implement state-of-the-art techniques to get insights on important problems.

**Instructor:** Prof. Keerti Choudhary

**Course objectives:** On completion of this course, students will gain familiarity with recent developments in the domain of compact graph structures and their efficient maintenance. Additionally, students will learn algorithm design methodologies for dynamic and fault-tolerant settings, and will understand the power of randomization in algorithm design.

**Prerequisites:** Data Structures and Algorithms (COL106), basics of probability and statistics

**Course Content**

- Incremental and Decremental Shortest-Path-Tree Algorithms
- Dynamic Depth First Search (over adversarial and random inputs)
- Fault-tolerant Connectivity and Reachability Structures
- Pair-wise Reachability Structures
- Gomory-Hu Trees, Submodularity of cuts
- Compact representation for all Steiner min-cuts
- Graph Spanners, Greedy Construction, Girth Conjecture
- Clustering approach to efficient Spanner Computation
- Thorup and Zwick Distance Oracle
- Additive Spanners
- Dynamic and Fault-tolerant Spanners
- Diameter and Eccentricity Spanners
- Probabilistic Tree Embedding (FRT Algorithm)
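To give a flavour of the spanner topics, here is a sketch of the classic greedy t-spanner construction for unweighted graphs (the helper names `bfs_dist` and `greedy_spanner` are my own; this is an illustrative sketch, not the course's reference implementation):

```python
from collections import deque

def bfs_dist(adj, src, dst, cap):
    """Shortest-path length from src to dst in `adj`,
    or None if it exceeds `cap` (bounded BFS)."""
    if src == dst:
        return 0
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        if dist[u] >= cap:
            break  # BFS pops nodes in nondecreasing distance
        for v in adj.get(u, ()):
            if v not in dist:
                dist[v] = dist[u] + 1
                if v == dst:
                    return dist[v]
                q.append(v)
    return None

def greedy_spanner(n, edges, t=3):
    """Greedy t-spanner of an unweighted graph: keep an edge only if
    its endpoints are currently more than t hops apart in the spanner."""
    adj = {v: set() for v in range(n)}
    kept = []
    for u, v in edges:
        if bfs_dist(adj, u, v, t) is None:  # no path of length <= t yet
            adj[u].add(v)
            adj[v].add(u)
            kept.append((u, v))
    return kept
```

On the complete graph K5 with t = 3, the construction keeps only a 4-edge star, yet every skipped edge still has a spanner path of length at most 3 between its endpoints.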

**Instructor:** Prof. Sayan Ranu

**Objectives**

- The ability to formulate machine learning problems on graphs using Graph neural networks.
- Knowledge of the state of the art in GNNs
- The ability to analyze the suitability of a GNN architecture for a given problem.
- The ability to implement new GNN architectures and fine-tune existing GNNs for a given application.

**Prerequisites**

- Students should have taken either Machine Learning or Data Mining

**Contents**

- Basics of Graph Theory
- Node Characterization: Centrality Measures: Degree, Betweenness, etc., Clustering Coefficient, Random walks and PageRank
- Graph Similarity functions (Graph Isomorphism, Subgraph Isomorphism, Edit Distance, Subgraph Edit Distance, Maximum Common Subgraph Similarity, Fingerprints, Random walk based similarity functions)
- Architectures:
- DeepWalk
- Node2Vec
- Struct2Vec
- Graph Convolutional Network
- GraphSage
- Graph Recurrent Networks
- Graph Attention Networks
- Position-aware graph neural networks
- Survey of the most recent architectures published in the last 2 years.

- Surveying GNNs for applications such as
- Learning algorithms
- Adversarial Attacks, defence mechanisms and Interpretability
- Social Network Analysis
- Graph Generative Models
- Scalability

**Instructor:** Prof. Parag Singla

**Pre-requisites:**

A foundational course in AI or ML.

**Overview:**

This course is meant to be a first graduate-level course in deep learning. Deep learning is an emerging area of machine learning that has revolutionized progress in the field over the last few years, with applications in NLP, vision, and speech, to name a few domains. This course is intended to give a basic overview of the mathematical foundations of the field and to present the standard techniques/architectures that form the basis for more advanced ones. About a third of the course will focus on the latest research topics in the area. No deep learning class is complete without an implementation: students will get to implement some of the architectures on a GPU and test them on large datasets.

**Content:**

Basics: Introduction. Multi-layered Perceptrons. Backpropagation. Regularization: L1/L2 Norms, Dropout. Optimization: Challenges. Stochastic Gradient Descent. Advanced Optimization Algorithms. Convolutional Networks (CNNs). Recurrent Architectures. Batch Normalization. Generative Architectures. Advanced Architectures for Vision. Advanced Architectures for NLP. More Recent Advances in the field.
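To make the backpropagation topic concrete, here is a gradient-check sketch in NumPy: the analytic gradient of a tiny one-layer sigmoid network is compared against a finite-difference estimate (the data, shapes, and function names here are toy assumptions, not course material):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_and_grad(W, x, y):
    """Forward pass, then backpropagate dL/dW by the chain rule
    for L = 0.5 * ||sigmoid(Wx) - y||^2."""
    z = W @ x                       # pre-activation
    a = sigmoid(z)                  # activation
    loss = 0.5 * np.sum((a - y) ** 2)
    delta = (a - y) * a * (1 - a)   # dL/dz via the chain rule
    grad = np.outer(delta, x)       # dL/dW, since dz_i/dW_ij = x_j
    return loss, grad

rng = np.random.default_rng(1)
W = rng.standard_normal((2, 3))
x = rng.standard_normal(3)
y = np.array([0.0, 1.0])

loss, grad = loss_and_grad(W, x, y)

# Central finite-difference estimate for every entry of W.
eps = 1e-6
num_grad = np.zeros_like(W)
for i in range(W.shape[0]):
    for j in range(W.shape[1]):
        Wp, Wm = W.copy(), W.copy()
        Wp[i, j] += eps
        Wm[i, j] -= eps
        num_grad[i, j] = (loss_and_grad(Wp, x, y)[0]
                          - loss_and_grad(Wm, x, y)[0]) / (2 * eps)
```

The two gradients agree to numerical precision; this kind of gradient check is a standard debugging tool when implementing backpropagation by hand.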

**Course webpage:** http://www.cse.iitd.ac.in/~parags/teaching/col870/

**Instructor:** Prof. Venkata Koppula

**Course Objectives:**

On completion of this course, the students will be able to design basic post-quantum secure cryptosystems, and prove security based on the hardness of lattice problems.

**Prerequisites:**

This is a theoretical course, and therefore mathematical maturity will be necessary. Prerequisite for this course: COL351 - Analysis and Design of Algorithms. In particular, students should be comfortable with reductions in computer science. Familiarity with cryptography will be useful, but is not a prerequisite for this class.

**Course Description:**

A lattice (for this course) is a discrete additive subgroup of the n-dimensional Euclidean space. Lattices have been used extensively in computer science and mathematics. Recently (over the last two/three decades), they have found numerous applications in cryptography - both for cryptanalysis, and more recently, for building (quantum) secure cryptosystems.

In this course, we will first study some basic properties of n dimensional lattices, and discuss some problems on lattices that are believed to be hard. Next, we will see why these problems are believed to be hard. Following this, we will study some applications of lattices in cryptanalysis. Finally, we will discuss how to use lattice-based hardness assumptions to build cryptography.

Based on the class interest, we will cover (a subset of) the following topics:

- mathematical preliminaries and some basic properties of lattices
- hard problems on lattices (shortest vector problem, closest vector problem)
- some hardness results/reductions between the different problems
- the LLL algorithm and its applications in cryptanalysis
- Babai’s nearest plane algorithm
- an exponential time algorithm for shortest vector problem
- duality and transference theorems
- Short Integer Solution (SIS) problem (and its relation to lattice problems)
- Learning with Errors (LWE) problem (and its relation to lattice problems)
- building basic cryptography using SIS/LWE
- Homomorphic encryption using LWE

**Instructor:** Prof. Srikanta Bedathur

**Prerequisites:**

It is strongly recommended to have completed either COL764 or COL772 as a preparation for this course.

**Overview:**

This will primarily be a paper reading and student presentation-driven course with very few lectures.

**Contents:**

The topics that will be covered include:

- Neural Information Retrieval
- neural retrieval architectures
- retrieval using pretrained embeddings - BERT, RoBERTa, etc.

- Complex question answering
- Multimodal retrieval
- Issues of fairness and bias, counterfactual reasoning
- Modeling information propagation in social networks, combating fake content

**Evaluation parameters:**

There will be a report-writing assignment, and an open-book exam based on topics covered in the course.