# American Institute of Mathematical Sciences

doi: 10.3934/fods.2021034
Online First


## Constrained Ensemble Langevin Monte Carlo

* Corresponding author: Qin Li

Received: September 2021. Revised: October 2021. Early access: December 2021.

Fund Project: Q.L. acknowledges support from the Vilas Early Career Award. The research of Z.D. and Q.L. is supported in part by NSF grants DMS-1750488 and DMS-2023239, and by the Office of the Vice Chancellor for Research and Graduate Education at the University of Wisconsin–Madison with funding from the Wisconsin Alumni Research Foundation.

The classical Langevin Monte Carlo (LMC) method draws samples from a target distribution by moving the samples along the gradient of the log-density of the target, perturbed by Gaussian noise. The method enjoys a fast convergence rate. However, the numerical cost can be high because each iteration requires the computation of a gradient. One approach to eliminate the gradient computation is to employ the concept of an "ensemble": a large number of particles are evolved together so that neighboring particles provide gradient information to each other. In this article, we discuss two algorithms that integrate the ensemble feature into LMC, along with their associated properties.
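The classical LMC iteration can be sketched as follows. This is a minimal illustration assuming a standard Gaussian target $\pi(x)\propto e^{-U(x)}$ with $U(x)=x^2/2$; the potential, step size, and iteration count are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_U(x):
    # Gradient of the potential U(x) = x^2 / 2 for the Gaussian target
    return x

h = 0.05                     # step size (illustrative)
x = rng.normal(size=1000)    # a batch of samples, evolved independently
for _ in range(2000):
    # Gradient-descent step on U, plus Gaussian noise of variance 2h
    x = x - h * grad_U(x) + np.sqrt(2.0 * h) * rng.normal(size=x.shape)
```

After many iterations the samples are approximately distributed according to the target; note that each step requires one evaluation of `grad_U`, which is the cost the ensemble methods below aim to avoid.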

In particular, we find that if one directly replaces the gradient with the ensemble approximation, the resulting algorithm, termed Ensemble Langevin Monte Carlo (EnLMC), is unstable due to a high-variance term. If instead the gradients are replaced by ensemble approximations only in a constrained manner, to protect against the unstable points, the resulting algorithm, termed Constrained Ensemble Langevin Monte Carlo (CEnLMC), resembles classical LMC up to an ensemble error while removing most of the gradient computations.
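The constrained idea can be sketched as below: use a derivative-free ensemble surrogate for $\nabla U$ where it is reliable, and fall back to the exact gradient otherwise. The surrogate here is a statistical-linearization (regression) estimator of the kind used in ensemble Kalman-type samplers; the paper's precise estimator and constraint may differ, and all names (`U`, `surrogate_grad`, `radius`, `min_neighbors`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def U(x):
    return 0.5 * x ** 2          # standard Gaussian potential (illustrative)

def grad_U(x):
    return x                     # exact gradient, used only as a fallback

def surrogate_grad(x, ensemble, radius=1.0, min_neighbors=5):
    """Regression-slope estimate of grad U at x from nearby particles.

    Uses only evaluations of U (no analytic gradient).  Returns None when
    too few particles are nearby, i.e. when the estimator is unreliable --
    a stand-in for the 'constraint' that protects against unstable points.
    """
    nearby = ensemble[np.abs(ensemble - x) < radius]
    if nearby.size < min_neighbors:
        return None
    xbar, Ubar = nearby.mean(), U(nearby).mean()
    cov_xx = ((nearby - xbar) ** 2).mean()
    cov_xU = ((nearby - xbar) * (U(nearby) - Ubar)).mean()
    return cov_xU / cov_xx       # slope of U against x approximates grad U

ensemble = rng.normal(size=200)  # neighboring particles supply the gradient
g = surrogate_grad(0.1, ensemble)
g = g if g is not None else grad_U(0.1)   # fall back when constrained out
```

The fallback preserves stability at the cost of an occasional exact gradient evaluation, which is the trade-off the abstract describes: CEnLMC behaves like LMC up to an ensemble error while skipping most gradient computations.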

Citation: Zhiyan Ding, Qin Li. Constrained Ensemble Langevin Monte Carlo. Foundations of Data Science, doi: 10.3934/fods.2021034
##### Figures:

* Example 1: Evolution of samples using CEnLMC, $N = 10^4$
* Example 1: Evolution of samples using LMC and MALA, $N = 10^4$
* Example 1: Evolution of $\mathcal{R}_m$ when $N = 2\times10^3, 6\times10^3$, or $10^4$
* Example 2: Evolution of samples using CEnLMC when $N = 10^4$
* Example 2: Evolution of samples using LMC and MALA when $N = 10^4$
* Example 2: Evolution of $\mathcal{R}_m$ with $m$ when $N = 2\times10^3, 6\times10^3$, or $10^4$