[ Bio | Research | News | Teaching | Group | Publications | Github ]

Short Bio

Stephan Mandt is an Associate Professor of Computer Science and Statistics at the University of California, Irvine. His research centers on deep generative modeling, uncertainty quantification, neural data compression, and AI for science. Previously, he led the machine learning group at Disney Research in Pittsburgh and Los Angeles and held postdoctoral positions at Princeton University and Columbia University. Stephan holds a Ph.D. in Theoretical Physics from the University of Cologne, where he received the German National Merit Scholarship. He is furthermore a recipient of the NSF CAREER Award, the UCI ICS Mid-Career Excellence in Research Award, and the German Research Foundation's Mercator Fellowship; he is also a Kavli Fellow of the U.S. National Academy of Sciences, a member of the ELLIS Society, and a former visiting researcher at Google Brain. His research is currently supported by NSF, DARPA, IARPA, DOE, Disney, Intel, and Qualcomm. Stephan is an Action Editor of the Journal of Machine Learning Research and Transactions on Machine Learning Research, has held tutorials at NeurIPS, AAAI, and UAI, and regularly serves as (Senior) Area Chair for NeurIPS, ICML, AAAI, and ICLR. He currently serves as Program Chair for AISTATS 2024 and General Chair for AISTATS 2025.


Research Interests

The following research areas are of interest to the group:

  • Deep Generative Models: We have a broad interest in deep generative models such as variational autoencoders and diffusion models, aiming to improve both their scope (e.g., video diffusion, factorized VAEs, point process models) and their inference efficiency (e.g., augmented spaces, iterative inference).
  • Uncertainty Quantification: Our group focuses on teaching neural networks to "know what they don't know". To this end, proper uncertainty quantification and calibration are crucial (e.g., through variational inference, ensemble methods, Bayesian neural networks, and approaches inspired by statistical physics); see the short sketch below this list.
  • Neural Data Compression: We explore the potential of deep learning-based approaches as alternatives to conventional image and video codecs.
  • Machine Learning and Science: Our research explores applications of machine learning in physics, chemistry, climate science, and related domains. We also investigate physics-inspired machine learning algorithms and theories.
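
For readers unfamiliar with these methods, the toy sketch below illustrates one of the simplest routes to predictive uncertainty mentioned above: a deep ensemble, where several independently initialized networks are trained and epistemic uncertainty is read off from their disagreement. It is illustrative only (not code from the group); the architecture, data, and hyperparameters are arbitrary.

    # Illustrative deep-ensemble sketch (PyTorch); all choices here are arbitrary.
    import torch
    import torch.nn as nn

    def make_net():
        # Small regression MLP.
        return nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))

    def train(net, x, y, steps=500):
        opt = torch.optim.Adam(net.parameters(), lr=1e-2)
        for _ in range(steps):
            opt.zero_grad()
            loss = nn.functional.mse_loss(net(x), y)
            loss.backward()
            opt.step()
        return net

    # Toy 1D regression data: y = sin(x) + noise.
    x = torch.linspace(-3, 3, 128).unsqueeze(-1)
    y = torch.sin(x) + 0.1 * torch.randn_like(x)

    # Train an ensemble of independently initialized networks.
    ensemble = [train(make_net(), x, y) for _ in range(5)]

    # Predictive mean and (epistemic) uncertainty from ensemble disagreement.
    x_test = torch.linspace(-5, 5, 200).unsqueeze(-1)
    with torch.no_grad():
        preds = torch.stack([net(x_test) for net in ensemble])  # shape [5, 200, 1]
    mean, std = preds.mean(0), preds.std(0)  # std grows outside the training range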

    PhD student applicants: Unfortunately, I will not be able to respond to most inquiries regarding PhD openings or comment on your application to my group. If you indicate your interest in working with me in the application questions, I will make sure to review your application carefully. Online Application page.
    UCI undergraduate students, read this first: Thank you for your interest in working on research projects with us. Due to the high demand for generative AI opportunities, we can only accommodate a limited number of students each year. When reaching out, kindly include your resume and UCI transcript and describe the kind of research that interests you most. You should have already excelled in CS 178 with a top grade (A or A+) and ideally have taken additional courses in AI/ML. Your understanding of these constraints is greatly appreciated.

News

Teaching

Group Members

    Current:
  • Yibo Yang, PhD Student (2019--)
  • Ruihan Yang, PhD Student (2019--)
  • Eliot Wong-Toi, PhD Student (2020--)
  • Prakhar Srivastava, PhD Student (2020--)
  • Kushagra Pandey, PhD Student (2022--)
  • Tuan Pham Anh, PhD Student (2022--)
  • Justus Will, PhD Student (2023--)
  • Yang Meng, PhD Student (2023--)
  • Noah Benamin, PhD Student (2024--)

    Former:
  • Aodong Li, PhD Student (2019--2024), now at Amazon
  • Alex Boyd, PhD Student (2019--2024), now Research Scientist at GE Research
  • Robert Bamler, former postdoc, now Professor at the University of Tübingen
  • Fabian Jirasek, former postdoc, now Assistant Professor at TU Kaiserslautern
  • Cheng Zhang, former postdoc, now Principal Researcher at Microsoft Research
  • Florian Wenzel, former co-advised PhD student, now Researcher at Google Brain
  • Salvator Lombardo, former postdoc, now Associate Research Scientist at Disney Research
  • Yingzhen Li, former intern, now Assistant Professor at Imperial College
  • Chen Qiu, former PhD student, now Research Scientist at Bosch Research, Pittsburgh
  • Jens Tuyls, former undergraduate student, now PhD student at Princeton
  • Harshini Mangipudi, former undergraduate student, now Software Engineer at Microsoft

Preprints (selected)

Publications

      2024

    • Fast Samplers for Inverse Problems in Iterative Refinement Models
      K. Pandey, R. Yang, S. Mandt
      Neural Information Processing Systems (NeurIPS 2024)   PDF
    • Precipitation Downscaling with Spatiotemporal Video Diffusion
      P. Srivastava, R. Yang, G. Kerrigan, G. Dresdner, J. McGibbon, C. Bretherton, S. Mandt
      Neural Information Processing Systems (NeurIPS 2024)   PDF
    • Unity by Diversity: Improved Representation Learning for Multimodal VAEs
      T. Sutter, Y. Meng, A. Agostini, D. Chopard, N. Fortin, J. Vogt, B. Shahbaba, S. Mandt
      Neural Information Processing Systems (NeurIPS 2024)   PDF
    • Understanding Pathologies of Deep Heteroskedastic Regression
      E. Wong-Toi, A. Boyd, V. Fortuin, S. Mandt
      Uncertainty in Artificial Intelligence (UAI 2024, oral)   PDF
    • Anytime-Valid Confidence Sequences for Consistent Uncertainty Estimation in Early-Exit Neural Networks
      M. Jazbec, P. Forré, S. Mandt, D. Zhang, E. Nalisnick
      Uncertainty in Artificial Intelligence (UAI 2024)   PDF
    • Neural NeRF Compression
      T. Pham and S. Mandt
      International Conference on Machine Learning (ICML 2024)   PDF
    • Position Paper: Bayesian Deep Learning in the Age of Large-Scale AI
      T. Papamarkou et al.
      International Conference on Machine Learning (ICML 2024)   PDF
    • Efficient Integrators for Diffusion Generative Models
      K. Pandey, M. Rudolph, and S. Mandt
      International Conference on Learning Representations (ICLR 2024)   PDF
    • Understanding Precipitation Changes through Unsupervised Machine Learning
      G. Mooers, T. Beucler, M. Pritchard, and S. Mandt
      Environmental Data Science Vol. 3, 2024   PDF

      2023

    • Comparing Storm Resolving Models and Climates via Unsupervised Machine Learning
      G. Mooers, M. Pritchard, T. Beucler, P. Srivastava, H. Mangipudi, L. Peng, P. Gentine, and S. Mandt
      Nature Scientific Reports, 2023   PDF
    • Estimating the Rate-Distortion Function by Wasserstein Gradient Descent
      Y. Yang, S. Eckstein, M. Nutz, and S. Mandt
      Neural Information Processing Systems (NeurIPS 2023)   PDF
    • Lossy Image Compression with Conditional Diffusion Models
      R. Yang and S. Mandt
      Neural Information Processing Systems (NeurIPS 2023)   PDF
    • Zero-Shot Batch-Level Anomaly Detection
      A. Li, C. Qiu, M. Kloft, P. Smyth, M. Rudolph, and S. Mandt
      Neural Information Processing Systems (NeurIPS 2023)   PDF
    • ClimSim: An open large-scale dataset for training high-resolution physics emulators in hybrid multi-scale climate simulators
      S. Yu et al.
      Neural Information Processing Systems (NeurIPS 2023)   PDF
    • Diffusion Probabilistic Modeling for Video Generation
      R. Yang, P. Srivastava, and S. Mandt
      Entropy 25 (10), 1469   PDF
    • Computationally Efficient Neural Image Compression with Shallow Decoders
      Y. Yang and S. Mandt
      International Conference on Computer Vision (ICCV 2023)   PDF
    • Generative Diffusions in Augmented Spaces: A Complete Recipe
      K. Pandey and S. Mandt
      International Conference on Computer Vision (ICCV 2023) (Oral)   PDF
    • Inference for Mark-Censored Temporal Point Processes
      A. Boyd, Y. Chang, S. Mandt, and P. Smyth
      Uncertainty in Artificial Intelligence (UAI 2023) (Spotlight)   PDF
    • SC2 Benchmark: Supervised Compression for Split Computing
      Y. Matsubara, M. Levorato, and S. Mandt
      Transactions on Machine Learning Research, 2023   PDF
    • An Introduction to Neural Data Compression
      Y. Yang, S. Mandt, and L. Theis
      Foundations and Trends in Computer Graphics and Vision, 2023   PDF
    • Insights from Generative Modeling for Neural Video Compression
      R. Yang, Y. Yang, J. Marino, and S. Mandt
      IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023   preprint
    • Fully Bayesian Autoencoders with Latent Sparse Gaussian Processes
      B. Tran, B. Shahbaba, S. Mandt, M. Filippone
      International Conference on Machine Learning (ICML 2023)   PDF
    • Deep Anomaly Detection under Labeling Budget Constraints
      A. Li, C. Qiu, P. Smyth, M. Kloft, S. Mandt, M. Rudolph
      International Conference on Machine Learning (ICML 2023)   PDF
    • Probabilistic Querying of Continuous-Time Event Sequences
      A. Boyd, Y. Chang, S. Mandt, and P. Smyth
      Artificial Intelligence and Statistics (AISTATS 2023)   PDF

      2022

    • Predictive Querying for Autoregressive Neural Sequence Models
      A. Boyd, S. Showalter, S. Mandt, and P. Smyth
      Neural Information Processing Systems (NeurIPS 2022) (Oral)   PDF
    • An Unsupervised Learning Perspective on the Dynamic Contribution to Extreme Precipitation Changes
      G. Mooers, T. Beucler, M. Pritchard, and S. Mandt
      NeurIPS 2022 Workshop on Tackling Climate Change with Machine Learning   PDF
    • Latent Outlier Exposure for Anomaly Detection with Contaminated Data
      C. Qiu, A. Li, M. Kloft, M. Rudolph, and S. Mandt
      International Conference on Machine Learning (ICML 2022)   PDF
    • Structured Stochastic Gradient MCMC
      A. Alexos, A. Boyd, and S. Mandt
      International Conference on Machine Learning (ICML 2022)   PDF poster code
    • Raising the Bar in Graph-level Anomaly Detection
      C. Qiu, M. Kloft, S. Mandt, and M. Rudolph
      International Joint Conference on Artificial Intelligence (IJCAI 2022)   PDF
    • Learning to Simulate High Energy Particle Collisions from Unlabeled Data
      J. Howard, S. Mandt, D. Whiteson, Y. Yang
      Nature Scientific Reports 12, 7567 (2022)   PDF
    • Making Thermodynamic Models of Mixtures Predictive by Machine Learning: Matrix Completion of Pair Interactions
      F. Jirasek, R. Bamler, S. Fellenz, M. Bortz, M. Kloft, S. Mandt, and H. Hasse
      Chemical Science 13, 4854-4862 (2022)   PDF
    • Towards Empirical Sandwich Bounds on the Rate-Distortion Function
      Y. Yang and S. Mandt
      International Conference on Learning Representations (ICLR 2022)   PDF
    • Lossless Compression with Probabilistic Circuits
      A. Liu, S. Mandt, and G. van den Broeck
      International Conference on Learning Representations (ICLR 2022)   PDF
    • Supervised Compression for Resource-Constrained Edge Computing Systems
      Y. Matsubara, R. Yang, M. Levorato, and S. Mandt
      Winter Conference on Applications of Computer Vision (WACV 2022)   arXiv

      2021

    • Improving Sequential Latent Variable Models with Autoregressive Flows
      J. Marino, J. He, L. Chen, and S. Mandt
      Machine Learning (2021)   article.
    • History Marginalization Improves Forecasting in Variational Recurrent Neural Networks
      C. Qiu, S. Mandt, and M. Rudolph
      Entropy 23, 1563 (2021)   article.
    • Detecting and Adapting to Irregular Distribution Shifts in Bayesian Online Learning
      A. Li, A. Boyd, P. Smyth, S. Mandt
      Neural Information Processing Systems (NeurIPS 2021).   PDF.
    • Neural Transformation Learning for Deep Anomaly Detection Beyond Images
      C. Qiu, T. Pfrommer, M. Kloft, S. Mandt, and M. Rudolph
      International Conference on Machine Learning (ICML 2021).   PDF
    • Hierarchical Autoregressive Modeling for Neural Video Compression
      R. Yang, Y. Yang, J. Marino, and S. Mandt
      International Conference on Learning Representations (ICLR 2021).   PDF
    • Scale Space Flow with Autoregressive Priors
      R. Yang, Y. Yang, J. Marino, and S. Mandt
      ICLR Workshop on Neural Compression: from Information Theory to Applications, 2021 (Spotlight).   PDF
    • Lower Bounding Rate-Distortion from Samples
      Y. Yang and S. Mandt
      ICLR Workshop on Neural Compression: from Information Theory to Applications, 2021 (Spotlight).   PDF
    • Scalable Gaussian Process Variational Autoencoders
      M. Jazbec, M. Ashman, V. Fortuin, M. Pearce, S. Mandt, and G. Rätsch
      Artificial Intelligence and Statistics (AISTATS 2021).   PDF

      2020

    • Improving Inference for Neural Image Compression
      Y. Yang, R. Bamler, and S. Mandt
      Neural Information Processing Systems (NeurIPS 2020).   PDF code
    • User-Dependent Neural Sequence Models for Continuous-Time Event Data
      A. Boyd, R. Bamler, S. Mandt, and P. Smyth
      Neural Information Processing Systems (NeurIPS 2020).   PDF
    • Hybridizing Physical and Data-Driven Prediction Methods for Physicochemical Properties
      F. Jirasek, R. Bamler, and S. Mandt
      Chemical Communications 56 12407, 2020   article
    • Generative Modeling for Atmospheric Convection
      G. Mooers, J. Tuyls, S. Mandt, M. Pritchard, and T. Beucler
      Climate Informatics, 2020   PDF
    • Variational Bayesian Quantization
      Y. Yang, R. Bamler, and S. Mandt
      International Conference on Machine Learning (ICML 2020)   PDF Video
    • How Good is the Bayes Posterior in Deep Neural Networks Really?
      F. Wenzel, K. Roth, B. Veeling, J. Swiatkowski, L. Tran, S. Mandt, J. Snoek, T. Salimans, R. Jenatton, and S. Nowozin
      International Conference on Machine Learning (ICML 2020)   PDF
    • The k-tied Normal Distribution: A Compact Parameterization of Gaussian Mean Field Posteriors in Bayesian Neural Networks
      J. Swiatkowski, K. Roth, B. Veeling, L. Tran, J. Dillon, S. Mandt, J. Snoek, T. Salimans, R. Jenatton, and S. Nowozin
      International Conference on Machine Learning (ICML 2020)   PDF
    • Hydra: Preserving Ensemble Diversity for Model Distillation
      L. Tran, B. Veeling, K. Roth, J. Swiatkowski, J. Dillon, J. Snoek, S. Mandt, T. Salimans, S. Nowozin, R. Jenatton
      arXiv:2001.04694   PDF
    • Machine Learning in Thermodynamics: Prediction of Activity Coefficients by Matrix Completion
      F. Jirasek, R. Alves, J. Damay, R. Vandermeulen, R. Bamler, M. Bortz, S. Mandt, M. Kloft, and H. Hasse
      The Journal of Physical Chemistry Letters, 11, 2020.   article free PDF
    • GP-VAE: Deep Probabilistic Time Series Imputation
      V. Fortuin, D. Baranchuk, G. Rätsch, and S. Mandt
      Artificial Intelligence and Statistics (AISTATS 2020).   PDF
    • Extreme Classification via Adversarial Softmax Approximation
      R. Bamler and S. Mandt
      International Conference on Learning Representations (ICLR 2020).   PDF

      2019

    • Tightening Bounds for Variational Inference by Revisiting Perturbation Theory
      R. Bamler, C. Zhang, M. Opper, and S. Mandt
      Journal of Statistical Mechanics (2019), 124004.   PDF
    • Deep Generative Video Compression
      J. Han, S. Lombardo, C. Schroers, and S. Mandt
      Neural Information Processing Systems (NeurIPS 2019).   PDF poster
    • Autoregressive Text Generation Beyond Feedback Loops
      F. Schmidt, S. Mandt, and T. Hofmann
      Conference on Empirical Methods in Natural Language Processing (EMNLP 2019).   PDF
    • Augmenting and Tuning Knowledge Graph Embeddings
      R. Bamler, F. Salehi, and S. Mandt
      Conference on Uncertainty in Artificial Intelligence (UAI 2019).   PDF
    • Advances in Variational Inference
      C. Zhang, J. Bütepage, H. Kjellström, and S. Mandt
      IEEE Transactions on Pattern Analysis and Machine Intelligence.   PDF arXiv
    • Active Mini-Batch Sampling using Repulsive Point Processes
      C. Zhang, C. Öztireli, S. Mandt, and G. Salvi
      Conference on Artificial Intelligence (AAAI 2019).   article PDF
    • Mobile Robotic Painting of Texture
      M. Helou, S. Mandt, A. Krause, and P. Beardsley
      International Conference on Robotics and Automation (ICRA 2019).   article PDF
    • A Quantum Field Theory of Representation Learning
      R. Bamler and S. Mandt
      ICML Workshop on Physics for Deep Learning (2019).   PDF

      2018

    • Disentangled Sequential Autoencoder
      Y. Li and S. Mandt
      International Conference on Machine Learning (ICML 2018).   PDF
    • Iterative Amortized Inference
      J. Marino, Y. Yue, and S. Mandt
      International Conference on Machine Learning (ICML 2018).   PDF
    • Quasi Monte Carlo Variational Inference
      A. Buchholz, F. Wenzel, and S. Mandt
      International Conference on Machine Learning (ICML 2018).   PDF
    • Improving Optimization for Models With Continuous Symmetry Breaking
      R. Bamler and S. Mandt
      International Conference on Machine Learning (ICML 2018), long talk.   PDF
    • Continuous Word Embedding Fusion via Spectral Decomposition
      T. Fu, C. Zhang, and S. Mandt
      The SIGNLL Conference on Natural Language Learning (CoNLL 2018).   PDF
    • Scalable Generalized Dynamic Topic Models
      P. Jähnichen, F. Wenzel, M. Kloft, and S. Mandt
      Artificial Intelligence and Statistics (AISTATS 2018).   PDF
    • Image Anomaly Detection with Generative Adversarial Networks
      L. Deecke, R. Vandermeulen, L. Ruff, S. Mandt, and M. Kloft
      European Conference on Machine Learning (ECML PKDD 2018).   PDF
    • Learning to Infer
      J. Marino, Y. Yue, and S. Mandt
      International Conference on Learning Representations (Workshop Track).   PDF
    • Quasi Monte Carlo Flows
      F. Wenzel, A. Buchholz, and S. Mandt
      NeurIPS Bayesian Deep Learning Workshop.  PDF
    • Video Compression through Deep Bayesian Learning
      S. Lombardo, J. Han, C. Schroers, and S. Mandt
      NeurIPS Bayesian Deep Learning Workshop.  PDF

      2017

    • Stochastic Gradient Descent as Approximate Bayesian Inference
      S. Mandt, M. Hoffman, and D. Blei
      Journal of Machine Learning Research, vol 18(134):1-35, 2017.   PDF code
    • Perturbative Black Box Variational Inference
      R. Bamler, C. Zhang, M. Opper, and S. Mandt
      Neural Information Processing Systems (NIPS 2017).   PDF poster
    • Dynamic Word Embeddings
      R. Bamler and S. Mandt
      International Conference on Machine Learning (ICML 2017).   PDF poster
    • Determinantal Point Processes for Mini-batch Diversification
      C. Zhang, H. Kjellström, and S. Mandt
      Uncertainty in Artificial Intelligence (UAI 2017) (plenary talk).   PDF
    • Factorized Variational Autoencoders for Modeling Audience Reactions to Movies
      Z. Deng, R. Navarathna, P. Carr, S. Mandt, Y. Yue, I. Matthews, and G. Mori
      Computer Vision and Pattern Recognition (CVPR 2017).   PDF
    • Iterative Inference Models
      J. Marino, Y. Yue, and S. Mandt
      NIPS 2017 Workshop on Bayesian Deep Learning.   PDF
    • Bayesian Paragraph Vectors
      G. Ji, R. Bamler, E. Sudderth, and S. Mandt
      NIPS 2017 Workshop on Approximate Bayesian Inference.   PDF
    • Structured Black Box Variational Inference for Latent Time Series Models
      R. Bamler and S. Mandt
      ICML 2017 Time Series Workshop (oral).   PDF
    • Diversified Mini-Batch Sampling using Repulsive Point Processes
      C. Zhang, C. Öztireli, and S. Mandt
      NIPS 2017 Workshop on Advances in Approximate Bayesian Inference.   PDF

      2016

    • Exponential Family Embeddings
      M. Rudolph, F.J.R. Ruiz, S. Mandt, and D. Blei
      Neural Information Processing Systems (NIPS 2016). PDF
    • Balanced Population Stochastic Variational Inference
      C. Zhang, S. Mandt, and H. Kjellström
      NIPS 2016 Workshop on Advances in Approximate Bayesian Inference. PDF
    • Huber-Norm Regularization for Linear Prediction Models
      O. Zadorozhnyi, G. Benecke, S. Mandt, T. Scheffer, M. Kloft
      European Conference on Machine Learning (ECML 2016). PDF
    • A Variational Analysis of Stochastic Gradient Algorithms
      S. Mandt, M. Hoffman, and D. Blei
      International Conference on Machine Learning (ICML 2016)   PDF poster video code
    • Variational Tempering
      S. Mandt, J. McInerney, F. Abrol, R. Ranganath, and D. Blei
      Artificial Intelligence and Statistics (AISTATS 2016).   PDF
    • Separating Sparse Signals from Correlated Noise in Binary Classification
      S. Mandt, F. Wenzel, S. Nakajima, C. Lippert, and M. Kloft.
      UAI 2016 Workshop on Causation: Foundation to Application. (oral)   PDF

      2015

    • Sparse Probit Linear Mixed Model
      S. Mandt, F. Wenzel, S. Nakajima, J. P. Cunningham, C. Lippert, and M. Kloft
      Machine Learning, 106(9), 1621-1642.   PDF
    • Continuous-Time Limit of Stochastic Gradient Descent Revisited
      S. Mandt, M. Hoffman, and D. Blei
      NIPS Workshop on Optimization for Machine Learning (OPT 2015)   PDF
    • Finding Sparse Features in Strongly Confounded Medical Binary Data
      S. Mandt, F. Wenzel, S. Nakajima, J. P. Cunningham, C. Lippert, and M. Kloft
      NIPS Workshop on Machine Learning in Healthcare (2015). (oral)   PDF
    • Stochastic Differential Equations for Quantum Dynamics of Spin-Boson Networks
      S. Mandt, D. Sadri, A. Houck, and H. Tureci
      New Journal of Physics 17 (2015) 053018.   PDF

      2014

    • Smoothed Gradients for Stochastic Variational Inference
      S. Mandt and D. Blei
      Neural Information Processing Systems (NIPS 2014)   PDF
    • Probit Regression with Correlated Label Noise: An EM-EP approach
      S. Mandt, F. Wenzel, J. Cunningham, and M. Kloft
      NIPS Workshop on Advances in Variational Inference (2014)   PDF
    • Comment on "Consistent thermostatistics forbids negative absolute temperatures"
      U. Schneider, S. Mandt, A. Rapp, S. Braun, H. Weimer, I. Bloch, and A. Rosch
      arXiv (2014).   PDF
    • Damping of Bloch oscillations: Variational solution of the Boltzmann equation beyond linear response
      S. Mandt
      Physical Review A 90, 053624 (2014).   PDF

      Before 2014

    • Relaxation towards negative temperatures in bosonic systems: Generalized Gibbs ensembles and beyond integrability
      S. Mandt, A. Feiguin, S. Manmana
      Physical Review A 88, 043643 (2013).   PDF
    • Ultrakalt und doch heißer als unendlich heiß. Erstmals gelang es, ein Quantengas bei negativen absoluten Temperaturen herzustellen (Ultracold, yet hotter than infinitely hot: the first realization of a quantum gas at negative absolute temperatures)
      S. Mandt
      Popular article on negative temperatures in the monthly membership journal of the German Physical Society.
      Physik Journal 12, March edition (2013)   PDF
    • Transport and Non-Equilibrium Dynamics in Optical Lattices. From Expanding Atomic Clouds to Negative Absolute Temperatures
      S. Mandt
      PhD thesis, University of Cologne (2012)   PDF
    • Fermionic transport in a homogeneous Hubbard model: Out-of-equilibrium dynamics with ultracold atoms
      U. Schneider, L. Hackermueller, J.P. Ronzheimer, S. Will, S. Braun, T. Best, I. Bloch, E. Demler, S. Mandt, D. Rasch, A. Rosch
      Nature Physics 8, 213-218 (2012).   PDF
      Press: SciTechDaily, Pro-Physik (in German)
    • Interacting Fermionic Atoms in Optical Lattices Diffuse Symmetrically Upwards and Downwards in a Gravitational Potential
      S. Mandt, A. Rapp, A. Rosch
      Physical Review Letters 106, 250602 (2011).   arXiv
      Press: Nature
    • Equilibration rates and negative absolute temperatures for ultracold atoms in optical lattices
      A. Rapp, S. Mandt, A. Rosch
      Physical Review Letters 105, 220405 (2010).   arXiv
      Press: Nature, New Scientist, Science News
      Experimental realization of T<0 based on our theory: Braun et al., Science 2013
    • Zooming in on local level statistics by supersymmetric extension of free probability
      S. Mandt, M.R. Zirnbauer
      J. Phys. A 43 (2010) 025201.   arXiv
    • Symmetric Spaces Toolkit
      H. Sebert and S. Mandt
      Lecture notes, SFB/TR 12, Langeoog (2007)   PDF