Machine Learning and Data Science PhD Student Forum (Session 91) — Shifted Composition for Bounding Information-Theoretic Divergences
Speaker: 忻宇辰 (色控)
Time: 2025-09-25, 16:00–17:00
Venue: Tencent Meeting: 989-3593-2097
Abstract:
Bounding the divergence between the laws of two stochastic processes is a classical topic with many applications in sampling and beyond. Standard approaches, such as the Girsanov method and the interpolation method, can control the error in KL divergence for some basic sampling algorithms. However, it is not clear how to extend these methods to more general algorithms or more general settings.
In this talk, we introduce the shifted composition rule, based on several works by Altschuler et al. This information-theoretic principle yields a user-friendly framework for bounding the long-time discretization error of sampling algorithms. We also present its application to proving reverse transport inequalities for diffusions.
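As background (this formula is not part of the abstract), the ordinary composition (chain) rule for KL divergence, which the shifted composition rule refines, reads:

```latex
\[
\mathrm{KL}\big(P_{XY} \,\|\, Q_{XY}\big)
  = \mathrm{KL}\big(P_X \,\|\, Q_X\big)
  + \mathbb{E}_{x \sim P_X}\!\left[
      \mathrm{KL}\big(P_{Y \mid X = x} \,\|\, Q_{Y \mid X = x}\big)
    \right]
\]
```

Applied iteratively along a Markov chain, this decomposes the divergence between two process laws into per-step conditional divergences; roughly speaking, the shifted version additionally allows the comparison points to be shifted (coupled) at each step, which is what enables the long-time error bounds discussed in the talk.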
About the forum: This online forum is organized by Prof. 张志华's machine learning lab and is held biweekly (except during public holidays). Each session invites a PhD student to give a systematic and in-depth introduction to a frontier topic; topics include, but are not limited to, machine learning, high-dimensional statistics, operations research and optimization, and theoretical computer science.