CS 295: Deep Generative Models

Instructor: Prof. Stephan Mandt
Teaching Assistant: Aodong Li
Fall 2021
Day/Time: Monday and Wednesday 11:00-12:20pm
Location: Subject to change, check Canvas website for updates
Course Code: 34770

Attention: the main course page with all relevant and updated information is on Canvas (accessible with your UCI credentials).

Course Summary


Generative models are an important class of machine learning models that aim to learn the data distribution. Deep generative models build on recent advances in the field of deep learning and make it possible to sample data that closely resembles the structure of the data on which these models were trained. Recent success stories of deep generative models include Google’s WaveNet, which set a new state of the art for voice synthesis; Transformer networks for highly accurate machine translation; CycleGAN for weakly supervised style transfer between images or videos; neural compression algorithms that outperform their classical counterparts; and deep generative models for molecular design. This course will introduce students to the probabilistic foundations of deep generative models, with an emphasis on variational autoencoders (VAEs), generative adversarial networks (GANs), autoregressive models, and normalizing flows. Advanced topics will include black-box variational inference, variational dropout, disentangled representations, deep sequential models, alternative variational bounds, and information-theoretic perspectives on VAEs. We will discuss diverse applications from the domains of computer vision, speech, NLP, and data compression.
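As a taste of the probabilistic foundations covered in the course, the variational autoencoder is trained by maximizing the evidence lower bound (ELBO) on the data log-likelihood, a standard formulation where \(q_\phi(z \mid x)\) denotes the approximate posterior (encoder), \(p_\theta(x \mid z)\) the likelihood (decoder), and \(p(z)\) the prior over latent variables:

```latex
\log p_\theta(x)
\;\geq\;
\mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right]
\;-\;
\mathrm{KL}\!\left(q_\phi(z \mid x)\,\|\,p(z)\right)
```

The first term rewards accurate reconstruction of the data, while the KL term regularizes the approximate posterior toward the prior; both terms will be derived and analyzed in detail in the lectures.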

Prerequisites


At minimum:

Course Project

As part of the course, students will work on a research project in small groups. This project may deal with topics such as applications of deep generative models to a new domain or dataset, or improvements on the inference of deep generative models.

Suggested Reading


There is no book on the course topic. For deep learning, the textbook by Goodfellow, Bengio, and Courville is recommended (free online version). Links to relevant research papers will be provided.

Tentative Content


  • Introduction
  • Generative Models and Variational Inference
  • Autoregressive Models
  • Variational Autoencoders
  • Normalizing Flows
  • Generative Adversarial Networks
  • Structured Generative Models
  • Discrete Latent Variables in Deep Models
  • Information Theoretic Perspectives
  • Deep Sequential Models
  • Structured VAEs and Image/Video Compression

Academic Integrity


All students are expected to be familiar with the policy below. Failure to adhere to this policy can result in a student receiving a failing grade in the class.

Academic integrity is taken seriously. For homework problems or programming assignments you are allowed to discuss the problems or assignments verbally with other class members, but under no circumstances can you look at or copy anyone else's written solutions or code relating to homework problems or programming assignments. All problem solutions and code submitted must be material you have personally written during this quarter, except for (a) material that you clearly indicate and reference as coming from another source, or (b) code provided to you by the TA/reader or instructor.

It is the responsibility of each student to be familiar with UCI's Academic Integrity Policies and UCI's definitions and examples of academic misconduct.