Volume 18 (2022) Article 9 pp. 1-18
APPROX-RANDOM 2019 Special Issue
Optimal Convergence Rate of Hamiltonian Monte Carlo for Strongly Logconcave Distributions
Received: February 29, 2020
Revised: September 23, 2020
Published: April 30, 2022
Keywords: logconcave distribution, sampling, Hamiltonian Monte Carlo, spectral gap, strong convexity
ACM Classification: Theory of computation--Random walks and Markov chains, Design and analysis of algorithms
AMS Classification: 60J05, 60J25, 68W20

Abstract:


We study the Hamiltonian Monte Carlo (HMC) algorithm for sampling from a strongly logconcave density proportional to $\mathrm{e}^{-f}$, where $f:\mathbb{R}^d \to \mathbb{R}$ is $\mu$-strongly convex and $L$-smooth (the condition number is $\kappa = L/\mu$). We show that the relaxation time (inverse of the spectral gap) of ideal HMC is $O(\kappa)$, improving on the previous best bound of $O(\kappa^{1.5})$ (Lee et al., 2018); we complement this with an example where the relaxation time is $\Omega(\kappa)$, for any step size. When implemented with an ODE solver, HMC returns an $\varepsilon$-approximate point in 2-Wasserstein distance using $\tilde{O}((\kappa d)^{0.5} \varepsilon^{-1})$ gradient evaluations per step and $\tilde{O}((\kappa d)^{1.5}\varepsilon^{-1})$ total time.
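
For readers who want a concrete picture of the algorithm discussed in the abstract, the following is a minimal Python sketch of HMC with leapfrog integration and a Metropolis correction, applied to a strongly convex quadratic $f$ whose Hessian eigenvalues lie in $[\mu, L]$. It is an illustration only: the paper analyzes ideal HMC (exact Hamiltonian dynamics) and a discretization via an ODE solver, and the step size, number of leapfrog steps, and quadratic target below are arbitrary choices, not parameters from the paper.

import numpy as np

def hmc_sample(f, grad_f, x0, step_size=0.1, n_leapfrog=20, n_samples=1000, rng=None):
    """Sample from the density proportional to exp(-f) using leapfrog HMC."""
    rng = np.random.default_rng() if rng is None else rng
    x = x0.astype(float)
    samples = []
    for _ in range(n_samples):
        # Refresh the momentum from a standard Gaussian.
        p = rng.standard_normal(x.shape[0])
        x_new, p_new = x.copy(), p.copy()
        # Leapfrog integration of the Hamiltonian H(x, p) = f(x) + |p|^2 / 2.
        p_new -= 0.5 * step_size * grad_f(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += step_size * p_new
            p_new -= step_size * grad_f(x_new)
        x_new += step_size * p_new
        p_new -= 0.5 * step_size * grad_f(x_new)
        # Metropolis accept/reject step compensates for discretization error.
        dH = (f(x_new) + 0.5 * p_new @ p_new) - (f(x) + 0.5 * p @ p)
        if np.log(rng.uniform()) < -dH:
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

# Example target: f(x) = 0.5 * x^T A x with A diagonal, eigenvalues in [mu, L], so kappa = L / mu.
mu, L, d = 1.0, 10.0, 5
diag = np.linspace(mu, L, d)
f = lambda x: 0.5 * x @ (diag * x)
grad_f = lambda x: diag * x
chain = hmc_sample(f, grad_f, x0=np.ones(d))
print(chain.mean(axis=0))  # close to 0, the mode (and mean) of the Gaussian exp(-f)

In the idealized setting analyzed in the paper, the Hamiltonian dynamics is integrated exactly and no Metropolis correction is needed; the accept/reject step above only compensates for leapfrog discretization error.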

----------------------

A conference version of this paper appeared in the Proceedings of the 23rd International Conference on Randomization and Computation (RANDOM 2019).