We study sampling from a target distribution $e^{-f}$ using the unadjusted Langevin Monte Carlo (LMC) algorithm. For any potential function $f$ whose tails behave like $\|x\|^{\alpha}$ for $\alpha \in [1,2]$, and whose gradient is $\beta$-H\"older continuous, we derive the number of steps sufficient to reach an $\epsilon$-neighborhood of a $d$-dimensional target distribution as a function of $\alpha$ and $\beta$. In terms of its $\epsilon$ dependency, our rate estimate is not directly influenced by the tail growth rate $\alpha$ of the potential as long as the growth is at least linear; it relies only on the order of smoothness $\beta$.
In terms of $\epsilon$ dependency, our rate recovers the best known rate, which was established for strongly convex potentials with Lipschitz gradient; moreover, we show that the same rate is achievable for a wider class of potentials that are degenerately convex at infinity.
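The unadjusted LMC iteration discussed above can be sketched as follows. This is a minimal illustration, not the paper's experimental setup: the potential $f(x) = \|x\|^2/2$ (standard Gaussian target, i.e. $\alpha = 2$ with Lipschitz gradient), the step size, and the iteration count are all illustrative choices.

```python
# Minimal sketch of unadjusted Langevin Monte Carlo (LMC).
# Update rule: x_{k+1} = x_k - eta * grad_f(x_k) + sqrt(2 * eta) * N(0, I).
# The potential here, f(x) = ||x||^2 / 2 (standard Gaussian target e^{-f}),
# and the hyperparameters eta, n_steps are illustrative assumptions.
import numpy as np

def lmc_sample(grad_f, x0, eta=0.01, n_steps=5000, rng=None):
    """Run n_steps of LMC starting from x0 and return the final iterate."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - eta * grad_f(x) + np.sqrt(2.0 * eta) * noise
    return x

# Example: draw approximate samples from the d-dimensional standard Gaussian,
# whose potential f(x) = ||x||^2 / 2 has gradient grad_f(x) = x.
d = 3
samples = np.array(
    [lmc_sample(lambda x: x, np.zeros(d), rng=seed) for seed in range(200)]
)
```

With a small step size the iterates approximately follow the Langevin diffusion, so the empirical mean and variance of `samples` should be close to those of the standard Gaussian target; the discretization introduces a small bias that the paper's analysis quantifies in terms of $\alpha$ and $\beta$.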