Seminar series
Date
Mon, 22 Nov 2021
Time
14:00 - 15:00
Location
Virtual
Speaker
Murat Erdogdu
Organisation
University of Toronto

We study sampling from a target distribution $e^{-f}$ using the unadjusted Langevin Monte Carlo (LMC) algorithm. For any potential $f$ whose tails grow like $\|x\|^\alpha$ for $\alpha \in [1,2]$ and whose gradient is $\beta$-Hölder continuous, we derive the number of steps sufficient to reach an $\epsilon$-neighborhood of a $d$-dimensional target distribution as a function of $\alpha$ and $\beta$. In terms of its $\epsilon$ dependency, our rate estimate is not directly influenced by the tail growth rate $\alpha$ of the potential, so long as the growth is at least linear; it depends only on the order of smoothness $\beta$.
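For readers unfamiliar with the algorithm under analysis, the unadjusted LMC iteration is $x_{k+1} = x_k - \eta \nabla f(x_k) + \sqrt{2\eta}\,\xi_k$ with i.i.d. standard Gaussian noise $\xi_k$. The sketch below illustrates this update; the step size, iteration count, and Gaussian-potential example are illustrative choices, not parameters from the talk:

```python
import numpy as np

def lmc_sample(grad_f, x0, step, n_steps, rng):
    """Run unadjusted Langevin Monte Carlo and return the final iterate.

    Update rule: x_{k+1} = x_k - step * grad_f(x_k) + sqrt(2 * step) * xi_k,
    where xi_k is standard Gaussian noise.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

# Toy example: f(x) = ||x||^2 / 2, so grad_f(x) = x and the target e^{-f}
# is a standard Gaussian in d = 2 dimensions.
samples = np.array([
    lmc_sample(lambda x: x, np.zeros(2), step=0.1, n_steps=500,
               rng=np.random.default_rng(seed))
    for seed in range(200)
])
```

Since the chain is unadjusted (no Metropolis correction), the samples carry a discretization bias that shrinks with the step size; the talk's results quantify how many steps are needed, as a function of the tail exponent $\alpha$ and smoothness $\beta$, to bring the law of the iterate within $\epsilon$ of the target.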

In terms of $\epsilon$ dependency, our rate recovers the best known rate, previously established for strongly convex potentials with Lipschitz gradient; moreover, we show that the same rate is achievable for a wider class of potentials that are only degenerately convex at infinity.
