CS 274E: Deep Generative Models

Instructor: Prof. Stephan Mandt
Teaching Assistant: N/A
Fall 2025
Day/Time: Tuesday and Thursday 11:00-12:20pm
Location: PCB 1200

This website is just a placeholder; the course's main website is on Canvas (accessible with your UCI credentials).

Announcements

  • This course will not cover ChatGPT or large language models; rather, it focuses on generative models in the visual domain (images and video).
  • Unfortunately, I can no longer accept enrollment requests. We have no TA support and limited physical space. Thanks for your understanding.

Course Summary

Generative models are an important class of machine learning models that aim to learn the data distribution. Deep generative models build on recent advances in deep learning and make it possible to sample data that closely resembles the training data. Recent deep generative models include autoregressive Transformer networks used in LLMs, CycleGAN for style transfer between images or videos, diffusion probabilistic models for generating artificial photos and videos, neural compression algorithms that outperform their classical counterparts, and deep generative models for molecular design. This course will introduce students to the probabilistic foundations of deep generative models, with an emphasis on variational autoencoders (VAEs), generative adversarial networks (GANs), autoregressive models, normalizing flows, and diffusion models. Advanced topics include black-box variational inference, disentangled representations, deep sequential models, various Bayesian approximation techniques, and information-theoretic considerations. We will also discuss applications from the domains of computer vision, speech, NLP, climate science, and data compression.
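
To give a flavor of the probabilistic foundations covered, a central quantity in the variational inference and VAE lectures is the evidence lower bound (ELBO), which bounds the intractable log-likelihood of a latent-variable model from below (the notation below is the standard one and is only illustrative; lectures will introduce their own conventions):

    \log p_\theta(x) = \log \int p_\theta(x \mid z)\, p(z)\, dz
    \;\ge\; \mathbb{E}_{q_\phi(z \mid x)}\!\left[ \log p_\theta(x \mid z) \right] - \mathrm{KL}\!\left( q_\phi(z \mid x) \,\|\, p(z) \right)

Here q_\phi(z \mid x) is an approximate posterior parameterized by a neural network; maximizing the bound jointly over \theta and \phi is the training principle behind VAEs and, in modified form, behind diffusion models.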

Prerequisites

At minimum:

Course Project

As part of the course, students will work on a research project in small groups. The project may deal with topics such as applying deep generative models to a new domain or dataset, or improving inference in deep generative models.

Suggested Reading

There is no required textbook for this course. For deep learning, the textbook by Goodfellow, Bengio, and Courville is recommended (a free online version is available). Links to relevant research papers will be provided.

Tentative Content

  • Introduction
  • Generative Models and Variational Inference
  • Autoregressive Models
  • Variational Autoencoders
  • Normalizing Flows
  • Generative Adversarial Networks
  • Structured Generative Models
  • Discrete Latent Variables in Deep Models
  • Information Theoretic Perspectives
  • Deep Sequential Models
  • Diffusion Probabilistic Models
  • Structured VAEs and Image/Video Compression

Academic Integrity

All students are expected to be familiar with the policy below. Failure to adhere to this policy can result in a student receiving a failing grade in the class.

Academic integrity is taken seriously. For homework problems or programming assignments, you are allowed to discuss the problems or assignments verbally with other class members, but under no circumstances can you look at or copy anyone else's written solutions or code relating to homework problems or programming assignments. All problem solutions and code submitted must be material you have personally written during this quarter, except for (a) material that you clearly indicate and reference as coming from another source, or (b) code provided to you by the TA/reader or instructor.

It is the responsibility of each student to be familiar with UCI's Academic Integrity Policies and UCI's definitions and examples of academic misconduct.