kTULA: A Langevin sampling algorithm with improved KL bounds under super-linear log-gradients

Dear colleagues and students, please find below an announcement for an academic talk. All are welcome to attend. Thank you!

Title: kTULA: A Langevin sampling algorithm with improved KL bounds under super-linear log-gradients

Speaker: Ying Zhang, Assistant Professor, The Hong Kong University of Science and Technology (Guangzhou)

Time: 10:00 a.m., April 22, 2026

Venue: Room 723, 红瓦楼

Abstract: Motivated by applications in deep learning, where the global Lipschitz continuity condition is often not satisfied, we examine the problem of sampling from distributions with super-linearly growing log-gradients. We propose a novel tamed Langevin dynamics-based algorithm, called kTULA, to solve this sampling problem, and provide a theoretical guarantee for its performance. More precisely, we establish a non-asymptotic convergence bound in Kullback-Leibler (KL) divergence with the best-known rate of convergence, equal to 2−ϵ for ϵ > 0, which significantly improves upon relevant results in the existing literature. This enables us to obtain an improved non-asymptotic error bound in Wasserstein-2 distance, which can in turn be used to derive a non-asymptotic guarantee for kTULA applied to the associated optimization problems. To illustrate the applicability of kTULA, we apply the proposed algorithm to the problem of sampling from a high-dimensional double-well potential distribution and to an optimization problem involving a neural network, and we show that our main results provide theoretical guarantees for the performance of kTULA in these settings.
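For readers unfamiliar with tamed Langevin schemes, the minimal sketch below illustrates the general idea on a one-dimensional version of the double-well example mentioned in the abstract. Note that this announcement does not specify kTULA's actual update rule: the taming function, step size, and the helper grad_log_pi used here are illustrative assumptions drawn from the broader tamed-Langevin literature, not the speaker's algorithm.

```python
import numpy as np

def grad_log_pi(x):
    """Gradient of the log-density for a 1D double-well potential
    U(x) = x^4/4 - x^2/2, so grad log pi(x) = -U'(x) = -x^3 + x.
    Note the cubic (super-linear) growth of the log-gradient."""
    return -(x ** 3) + x

def tamed_langevin_step(x, step, rng):
    """One tamed unadjusted Langevin step (illustration only).
    The polynomial taming below is one common choice; kTULA's
    precise taming is not given in this announcement."""
    g = grad_log_pi(x)
    # Taming: rescale the drift so a single step stays bounded
    # even where the log-gradient explodes.
    g_tamed = g / (1.0 + step * np.abs(g))
    return x + step * g_tamed + np.sqrt(2.0 * step) * rng.standard_normal()

rng = np.random.default_rng(0)
x, step, n_steps, burn_in = 0.0, 1e-2, 200_000, 10_000
samples = []
for k in range(n_steps):
    x = tamed_langevin_step(x, step, rng)
    if k >= burn_in:
        samples.append(x)

# The target is bimodal with modes near +/-1; inspect empirical moments.
print("sample mean:", np.mean(samples))
print("sample second moment:", np.mean(np.square(samples)))
```

The taming factor 1/(1 + λ|∇log π(x)|) shrinks the drift precisely where the gradient is large, which is what keeps the discretized dynamics from diverging despite the cubic growth of the log-gradient; a plain (untamed) Euler discretization of the same dynamics can blow up for exactly this reason.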

Speaker bio: Ying Zhang is an Assistant Professor at The Hong Kong University of Science and Technology (Guangzhou). She received her PhD from the University of Edinburgh and was previously a postdoctoral researcher at Nanyang Technological University, Singapore. Her research focuses on the theoretical properties of machine learning algorithms and their applications in areas such as finance. Her work has appeared in leading journals in the field, including Mathematics of Operations Research, IMA Journal of Numerical Analysis, Bernoulli, SIAM Journal on Mathematics of Data Science, and Applied Mathematics and Optimization.

Host: Sizhou Wu