CS 295: Deep Generative Models

Prof. Stephan Mandt
Spring 2019
Day/Time: Tuesday and Thursday 11:00-12:20pm
Location: PCB 1200
Course Code: 34875

All relevant information about the course can be found on Canvas.

Course Summary


Generative models are an important class of machine learning models due to their ability to produce artificial data. Deep generative models build on recent advances in deep learning and approximate inference, and make it possible to create structured data that closely resembles the data on which they were trained, such as images, audio, text, or video. This course will introduce students to the probabilistic foundations of deep generative models, with an emphasis on variational autoencoders (VAEs), generative adversarial networks (GANs), and the training paradigm of black-box variational inference. Advanced topics include normalizing flows, variational dropout, disentangled representations, deep sequential models, alternative variational bounds, and information-theoretic perspectives on VAEs. We will discuss diverse applications from the domains of computer vision, speech, NLP, and compression.
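As an illustrative taste of the course's central training objective (not part of the official syllabus), VAEs and black-box variational inference both revolve around maximizing the evidence lower bound (ELBO) on the data log-likelihood:

log p_θ(x) ≥ E_{q_φ(z|x)}[ log p_θ(x|z) ] − KL( q_φ(z|x) ‖ p(z) )

Here q_φ(z|x) is the encoder (approximate posterior), p_θ(x|z) is the decoder, and p(z) is the prior over latent variables; the first term rewards reconstruction and the KL term regularizes the latent code.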

Prerequisites


At minimum:

Tentative Content


  • Introduction
  • Generative Models and Variational Inference
  • Autoregressive Models
  • Variational Autoencoders
  • Normalizing Flows
  • Generative Adversarial Networks
  • Structured Generative Models
  • Discrete Latent Variables in Deep Models
  • Information Theoretic Perspectives
  • Deep Sequential Models
  • Video Prediction, Text Generation, and Forecasting
  • Structured VAEs and Image/Video Compression