CMPUT 652 - Probability Models for Artificial Intelligence

Fall 2003
Department of Computing Science
University of Alberta

Instructor: Dale Schuurmans, Ath409, x2-4806, dale@cs.ualberta.ca
Room: CSC B43
Time: TR 11:00-12:30
Office hours: TR 12:30-1:15 (or by appointment)


In the past decade, probability models have revolutionized several areas of artificial intelligence research, including expert systems, computer perception (vision and speech), natural language interpretation, automated decision making, and robotics. In each of these areas, the fundamental challenge is to draw plausible interpretations from inputs that are uncertain and noisy. As models of uncertainty, probability models are unparalleled in their ability to combine heterogeneous sources of evidence effectively. Until recently, however, their use was limited by the inherent complexity of exact probabilistic inference. Recent advances in computing science have now made many probabilistic inference tasks practical.

This course will cover the fundamentals of graphical probability models, focusing on the key representations, algorithms and theories that have facilitated much recent progress in artificial intelligence research.

There are no formal prerequisites for this course: all that is required is basic programming ability and a rudimentary knowledge of probability and statistics. Some prior exposure to optimization methods, algorithms and complexity, or a previous course on artificial intelligence, would be advantageous but is not essential.


Textbook: An Introduction to Probabilistic Graphical Models, by Michael Jordan
(Chapters from his preliminary draft will be distributed by the instructor.)

Supplementary text: Bayesian Networks and Beyond: Probabilistic Models for Reasoning and Learning, by Daphne Koller and Nir Friedman
(Chapters from their preliminary draft will be distributed by the instructor as needed.)

Course work:
2 Assignments (25% each): Assignment 1, Assignment 2, Assignment 2a
(Data files: walkingby2.mat, walkingby1.mat, walkingby.mat, face.mat, facesnow.mat)
Project (50%): Project handout


Course outline

Lecture 1 Introduction Thur Sep 4
Part 1 Representation
Lecture 2 Joint distributions, random variables Tues Sep 9
Lecture 3 Graphical model representations Thur Sep 11
Part 2 Inference
Lecture 4 Basic inference algorithm Tues Sep 16
Lecture 5 Independence properties Thur Sep 18
Lecture 6 Efficient tree-based inference Tues Sep 23 Normalization property of junction trees
Part 3 Famous examples
Lecture 7 Multivariate Gaussian Tues Sep 30
Lecture 8 Naive Bayes, mixtures, hierarchies Thur Oct 2
Lecture 9 Hidden Markov models Tues Oct 7
Lecture 10 Kalman filters Thur Oct 9
Lecture 11 Exponential family Tues Oct 14
Lecture 12 Stochastic context-free grammars Thur Oct 16
Part 4 Estimation
Lecture 13 Types of estimation/learning problems Tues Oct 21
Lecture 14 Maximum likelihood Thur Oct 23
Lecture 15 Maximum conditional likelihood Tues Oct 28
Lecture 16 Bayesian estimation Thur Oct 30
Lecture 17 Expectation-maximization Tues Nov 4
Lecture 18 Factor and principal component analysis Thur Nov 6
Lecture 19 Estimating hidden Markov models Thur Nov 13
Lecture 20 Learning structure Tues Nov 18
Part 5 Approximation
Lecture 21 Sampling Thur Nov 20
Lecture 22 Variational, loopy approximation Tues Nov 25
Part 6 Decision making
Lecture 23 Decision theory Thur Nov 27
Lecture 24 Markov decision processes Tues Dec 2

Project due: Monday, December 15


Thanks for a great term!

A special thanks to our wonderful guest lecturers, Pascal Poupart and Gal Elidan!