Functional Central Limit Theorem and Strong Law of Large Numbers for Stochastic Gradient Langevin Dynamics

A. Lovas, Miklós Rásonyi

2022 · DOI: 10.1007/s00245-023-10052-y
Applied Mathematics and Optimization · 3 citations

TLDR

A strong law of large numbers and a functional central limit theorem are derived for SGLD, the stochastic gradient Langevin dynamics with a fixed step size.

Abstract

We study the mixing properties of an important optimization algorithm of machine learning: the stochastic gradient Langevin dynamics (SGLD) with a fixed step size. The data stream is not assumed to be independent, hence the SGLD is not a Markov chain, merely a Markov chain in a random environment, which complicates the mathematical treatment considerably. We derive a strong law of large numbers and a functional central limit theorem for SGLD.
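To fix ideas, the fixed-step-size SGLD recursion driven by a dependent data stream can be sketched as follows. This is a minimal illustration, not the paper's setup: the AR(1) data model, the quadratic loss, and all parameter values are assumptions chosen for the example. The time average at the end illustrates the kind of quantity the strong law of large numbers concerns.

```python
import numpy as np

rng = np.random.default_rng(0)

lam = 0.01       # fixed step size (lambda), assumed value
beta = 1.0       # inverse temperature, assumed value
n_steps = 20_000

# Dependent data stream: an AR(1) process, so the samples are NOT i.i.d.
# and the SGLD iterates form a Markov chain in a random environment
# rather than a Markov chain.
x = 0.0
theta = 0.0
running_sum = 0.0

for k in range(n_steps):
    x = 0.9 * x + rng.normal()           # next data point from the AR(1) stream
    grad = theta - x                     # gradient of the (assumed) loss (theta - x)**2 / 2
    xi = rng.normal()                    # injected Gaussian noise
    # SGLD update: gradient step plus Langevin noise scaled by sqrt(2*lam/beta)
    theta = theta - lam * grad + np.sqrt(2.0 * lam / beta) * xi
    running_sum += theta

# The strong law of large numbers concerns time averages like this one,
# which converge almost surely as the horizon n_steps grows.
time_average = running_sum / n_steps
print(time_average)
```

Here the injected noise makes the iterates sample from (approximately) a Gibbs measure rather than converge to a point, which is why ergodic averages, rather than the last iterate, are the natural object of study.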