The class will cover topics such as directed and undirected graphical models, template models, exact inference (variable elimination and sum-product message passing), learning (maximum likelihood estimation, generalized linear models, and learning over fully and partially observed data), and approximate inference (MCMC methods and Gibbs sampling). We will also dive into more research-oriented topics, such as scalable implementations of graphical models, connections between graphical models and relational representation learning, and applications of graphical models to problems in data management (such as data integration and data cleaning).
Textbooks
The textbooks we will use are the following two:
Prerequisites
You should have taken an introductory machine learning course. You should understand basic probability and statistics, as well as college-level algebra and calculus. For example, you are expected to know standard probability distributions (e.g., Gaussian, Poisson) and how to calculate derivatives.
Assignments
Misc
| # | Date | Topic | Lecture Materials | Reading Material | Assignments |
|---|------|-------|-------------------|------------------|-------------|
|   |      | **Introduction and Class Overview** | | | |
| 1 | 9/6  | Introduction to Graphical Models | Lecture 1 | | |
|   |      | **Representation** | | | |
| 2 | 9/11 | Directed Graphical Models: Bayesian Networks | Lecture 2 | | |
| 3 | 9/13 | Undirected Graphical Models | Lecture 3 | | |
|   |      | **Exact Inference** | | | |
| 4 | 9/18 | Variable Elimination | Lecture 4 | | |
| 5 | 9/20 | Clique Trees and Message Passing | Lecture 5 | | |
|   |      | **Learning** | | | |
| 6 | 9/25 | Learning over Generalized Linear Models | Lecture 6 | | Homework 1: due Oct 2nd by 2:30 p.m. (beginning of class) |
| 7 | 9/27 | Learning BNs | Lecture 7 | | |
| 8 | 10/2 | Learning Undirected Graphical Models | Lecture 8 | | |
| 9 | 10/4 | Structure Learning | Lecture 9 | | |
| 10 | 10/9 | Learning with Partially Observed Data: The Expectation-Maximization Algorithm | Lecture 10 | | |
|   |      | **Approximate Inference** | | | |
| 11 | 10/11 | Loopy Belief Propagation | Lecture 11 | | Homework 2: due Oct 25th by 2:30 p.m. (beginning of class) |
| 12 | 10/16 | Mean Field Approximation | Lecture 12 | | |
| 13 | 10/18 | Variational Inference Continued | Lecture 13 | | |
| 14 | 10/23 | Sampling Methods for Approximate Inference | Lecture 14 | | |
| 15 | 10/25 | Review before Midterm | Review | | |
| 16 | 10/30 | Midterm | Midterm | | |
|   |      | **Advanced Graphical Models** | | | |
| 17 | 11/01 | Spectral Learning for GMs | Lecture 16 | | |
| 18 | 11/06 | Markov Logic Networks | Lecture 17 | | |
|   |      | **Deep Learning** | | | |
| 19 | 11/08 | Deep Learning and Graphical Models | Lecture 18 | | Project proposal due |
| 20 | 11/13 | Deep Learning Models: Autoencoders and Variational Autoencoders | Lecture 19 | | |
| 21 | 11/15 | Deep Learning Models: Generative Adversarial Networks | Lecture 20 | | |
| 22 | 11/20 | Deep Learning Models: CNNs and RNNs | Lecture 21 | | |
| 23 | 11/27 | Deep Learning Models: Attention and Transformers | Lecture 22 | | Project mid-report due |
|   |      | **Applications** | | | |
| 24 | 11/29 | Knowledge Base Construction | Lecture 23 | | |
| 25 | 12/04 | Data Cleaning | Lecture 24 | | |
| 26 | 12/06 | No Class (Theo at DARPA) | | | |
|   |      | **Projects** | | | |
| 27 | 12/11 | Project Presentations | | | |
You are encouraged to discuss the homework assignments with other students; it's fine to discuss overall strategy and collaborate with a partner or in a small group, as both giving and receiving advice will help you learn.
However, you must write your own solutions to all of the assignments, and you must cite all people you worked with. If you consult any resources outside of the materials provided in class, you must cite these sources.
If you do not do so, we will consider this a violation of the University of Wisconsin Honor Code.