Abstract:
Score-based diffusion models, a powerful and versatile generative AI technology, have achieved tremendous success in numerous applications. Alongside this empirical success, a growing body of work seeks to provide clearer theoretical support for these models.
The performance of a score-based diffusion model typically depends on the discretization of the forward and backward SDEs and the approximation of the score functions. We are interested in the following points:
1. How should we measure the distance between the distribution of the generated samples and that of the real data?
2. What conditions must the real data distribution satisfy to ensure that the score-based generative model can approximate the real data distribution?
3. Given a target accuracy ϵ for the distance between these two distributions, what iteration complexity is required of the discretization in the diffusion model, and what accuracy is needed for score matching?
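To make the objects in these questions concrete, here is a minimal sketch of the sampling pipeline the analysis concerns: a variance-preserving forward SDE, its reverse-time SDE, and an Euler–Maruyama discretization. All specifics are illustrative assumptions, not from the talk: a constant noise schedule `beta`, toy data distributed as N(0, 1) (which makes the true score exactly −x at every time), and a `score` function standing in for a learned score network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed VP-SDE forward process: dX = -0.5*beta*X dt + sqrt(beta) dW,
# with a constant schedule beta for simplicity.
beta, T, n_steps, n_samples = 1.0, 1.0, 100, 100_000
dt = T / n_steps

def score(x, t):
    # Stand-in for a learned score network. Under the toy assumption
    # that the data distribution is N(0, 1), which is stationary for
    # this VP-SDE, the exact score of the marginal p_t is -x for all t.
    return -x

# Reverse-time SDE: dX = [f(X,t) - g(t)^2 * score(X,t)] dt + g(t) dW',
# integrated from t = T down to 0, discretized by Euler-Maruyama.
# The convergence analyses bound how the step size dt and the score
# approximation error propagate into the error of the final samples.
x = rng.standard_normal(n_samples)  # initialize from the prior N(0, 1)
for i in range(n_steps):
    t = T - i * dt
    drift = -0.5 * beta * x - beta * score(x, t)
    x = x - drift * dt + np.sqrt(beta * dt) * rng.standard_normal(n_samples)

# Since data = prior = N(0, 1) here, the samples should stay close to N(0, 1).
print(x.mean(), x.std())
```

Replacing the exact score with an approximate one, or enlarging dt, is precisely what introduces the two error sources (score-matching error and discretization error) that the questions above ask to quantify.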
In this talk, we will begin by introducing the training process of score-based diffusion models, laying out the basic steps for analyzing their convergence rates.
Then, I will present a series of works that study the convergence rates of score-based diffusion models, each offering different answers to the three questions above. Finally, I will introduce some more efficient algorithms designed on the basis of these theoretical insights.
Forum introduction: This online forum is organized by Professor Zhihua Zhang's machine learning lab and meets biweekly (except on public holidays). Each session invites a PhD student to give a systematic, in-depth introduction to a frontier topic; topics include, but are not limited to, machine learning, high-dimensional statistics, operations research and optimization, and theoretical computer science.