I wrote a free, proof-complete monograph on stochastic gradient methods in infinite-dimensional Hilbert spaces. It starts from the functional-analysis foundations (Riesz, Radon–Nikodym), gives five equivalent definitions of stochastic gradients, proves existence/uniqueness of the dynamics, connects the continuum limit to gradient-flow PDEs, and derives explicit convergence rates under multiple assumption sets (convex, strongly convex, PL/KL, heavy-tailed noise, proximal). It also covers special cases (Gaussian/RKHS), extends to Hilbert manifolds and Banach spaces, analyzes discretizations (stability/consistency), and includes four applications (QM ground states, elasticity, optimal control, inverse problems) with pseudocode and explicit constants. Open problems at the end invite feedback. The PDF is free to redistribute if cited.
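To give a flavor of the objects the rates are about, here's a toy sketch (much simpler than anything in the PDF, and not its pseudocode): SGD on a diagonal quadratic in a truncated Hilbert basis. The eigenvalue decay lam[j] = j^(-alpha), the step schedule, and the noise level are hypothetical choices made just to show the spectral effect:

```python
# Toy sketch: SGD on f(x) = (1/2)<Ax, x> in a separable Hilbert space,
# truncated to the first N eigendirections. A is diagonal in this basis
# with eigenvalues lam_j = j^(-alpha) -> 0, so there is no spectral gap.
import numpy as np

rng = np.random.default_rng(0)
N, alpha, sigma = 200, 1.5, 0.1           # truncation level, decay, noise
lam = np.arange(1, N + 1) ** (-alpha)     # spectrum of A, decays to 0
x = rng.standard_normal(N)                # initial coefficients

for k in range(1, 5001):
    eta = 1.0 / (lam[0] * k)              # classic 1/k schedule (a stand-in)
    grad = lam * x                        # exact gradient in the eigenbasis
    noise = sigma * rng.standard_normal(N)
    x -= eta * (grad + noise)             # stochastic gradient step

# Tail modes (small lam_j) contract very slowly: the no-gap effect that
# separates the infinite-dimensional rates from the finite-dimensional ones.
print(f"f(x_k) after 5000 steps: {0.5 * np.sum(lam * x**2):.3e}")
```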
Happy to answer questions about which assumptions are actually needed in infinite dimensions, how the spectral picture influences rates, and what breaks outside the Hilbert structure. If you want a quick start, skim the convergence summary and the applications sections.
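Since the spectral question comes up a lot, a one-line computation (standard, and much cruder than the versions in the PDF) shows where the subtlety lives. For gradient flow on a diagonal quadratic, each mode decays independently:

```latex
\[
\dot x(t) = -Ax(t), \qquad
x_j(t) = e^{-\lambda_j t}\, x_j(0), \qquad
f(x(t)) = \tfrac12 \sum_{j\ge 1} \lambda_j e^{-2\lambda_j t}\, x_j(0)^2 .
\]
% With a spectral gap (\inf_j \lambda_j > 0) the decay is exponential; with
% \inf_j \lambda_j = 0, e.g. \lambda_j \asymp j^{-\alpha}, it is only
% polynomial in t, with the exponent set by the interplay between \alpha
% and the decay of the initial coefficients x_j(0).
```

The discrete, noisy analogues of this trade-off are what the rate theorems in the monograph make precise.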