CS 295: Deep Generative Models

Instructor: Prof. Stephan Mandt
Teaching Assistant: Aodong Li
Fall 2022
Day/Time: Tuesday and Thursday 12:30-1:50pm
Location: PCB 1200
Piazza: link

This website is just a placeholder; the course's main website is on Canvas (accessible with your UCI credentials).

Course Summary

Generative models are an important class of machine learning models that aim to learn the data distribution. Deep generative models build on recent advances in the field of deep learning and make it possible to sample data that closely resembles the training data. Recent success stories of deep generative models include Google’s WaveNet for voice synthesis, Transformer networks for highly accurate machine translation, CycleGAN for weakly supervised style transfer between images or videos, diffusion probabilistic models for generating artificial photos, videos, or art, neural compression algorithms that outperform their classical counterparts, and deep generative models for molecular design. This course will introduce students to the probabilistic foundations of deep generative models, with an emphasis on variational autoencoders (VAEs), generative adversarial networks (GANs), autoregressive models, normalizing flows, and diffusion models. Advanced topics will include black-box variational inference, variational dropout, disentangled representations, deep sequential models, alternative variational bounds, and information-theoretic perspectives on VAEs. We will discuss diverse applications from the domains of computer vision, speech, NLP, climate science, and data compression.



Course Project

As part of the course, students will work on a research project in small groups. This project may deal with topics such as applications of deep generative models to a new domain or dataset, or improvements on the inference of deep generative models.

Suggested Reading

There is no required book on the course topic. For deep learning, the textbook by Goodfellow, Bengio, and Courville is recommended (free online version). Links to relevant research papers will be provided.

Tentative Content

  • Introduction
  • Generative Models and Variational Inference
  • Autoregressive Models
  • Variational Autoencoders
  • Normalizing Flows
  • Generative Adversarial Networks
  • Structured Generative Models
  • Discrete Latent Variables in Deep Models
  • Information Theoretic Perspectives
  • Deep Sequential Models
  • Diffusion Probabilistic Models
  • Structured VAEs and Image/Video Compression

FAQ

  • Can I attend remotely? This course is primarily an in-person course. However, asynchronous remote participation is possible: lecture videos will be published, and homework will be submitted via Gradescope. That said, the lecture recordings may not be perfect, as they depend on the local recording conditions in the lecture hall, and there may be a delay of 1-2 days before lectures are uploaded. Questions via Zoom will not be answered.

Academic Integrity

All students are expected to be familiar with the policy below. Failure to adhere to this policy can result in a student receiving a failing grade in the class.

Academic integrity is taken seriously. For homework problems or programming assignments, you are allowed to discuss the problems or assignments verbally with other class members, but under no circumstances may you look at or copy anyone else's written solutions or code relating to homework problems or programming assignments. All problem solutions and code submitted must be material you have personally written during this quarter, except for (a) material that you clearly indicate and reference as coming from another source, or (b) code provided to you by the TA/reader or instructor.

It is the responsibility of each student to be familiar with UCI's Academic Integrity Policies and UCI's definitions and examples of academic misconduct.