Faster Convergence for Diffusion-Based Generative Models
Conference
Regional Statistics Conference 2026
Format: IPS Abstract - Malta 2026
Keywords: convergence, diffusion, generative ai, sampling
Session: IPS 1271 - Advances in Deep Learning for Statistical Inference and Generative Modeling
Wednesday 3 June 2:30 p.m. - 4:10 p.m. (Europe/Malta)
Abstract
Diffusion models, which generate new data instances by learning to reverse a Markov diffusion process that transforms data into noise, have become a cornerstone of contemporary generative modeling. Despite their remarkable empirical success, the theoretical foundations of these models remain relatively underexplored, particularly with regard to convergence guarantees. In this talk, I will present a sharp convergence theory for both continuous-time and discrete diffusion models, offering the first rigorous justification of their efficiency and a characterization of their fundamental limitations. Building on this framework, I will further show how harnessing intrinsic structure in the data can lead to refined convergence guarantees and deeper theoretical insights into diffusion-based generation.
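For background, a standard continuous-time formulation of this forward/reverse construction (which may differ in detail from the specific setup analyzed in the talk) pairs an Ornstein–Uhlenbeck forward process with Anderson's reverse-time SDE driven by the score function:

```latex
% Forward process: data X_0 \sim p_{\mathrm{data}} is gradually noised
% toward a standard Gaussian via an Ornstein--Uhlenbeck diffusion.
\mathrm{d}X_t = -X_t \,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t,
\qquad X_0 \sim p_{\mathrm{data}},

% Reverse process: running time backward from T, generation follows the
% reverse-time SDE, where p_t denotes the marginal density of X_t and
% \nabla \log p_t is the score, estimated in practice by a neural network.
\mathrm{d}Y_t = \bigl(Y_t + 2\,\nabla \log p_{T-t}(Y_t)\bigr)\,\mathrm{d}t
  + \sqrt{2}\,\mathrm{d}B_t,
\qquad Y_0 \sim \mathcal{N}(0, I).
```

Convergence analyses of the kind described above typically bound the distance between the law of $Y_T$ and $p_{\mathrm{data}}$ in terms of the score-estimation error and the discretization of this reverse dynamic.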