Yulong Lu
A Bayesian level set method for geometric inverse problems
MA Iglesias, Y Lu, AM Stuart
Interfaces and free boundaries 18 (2), 181-217, 2016
Scaling limit of the Stein variational gradient descent: The mean field regime
J Lu, Y Lu, J Nolen
SIAM Journal on Mathematical Analysis 51 (2), 648-671, 2019
A mean field analysis of deep ResNet and beyond: Towards provably optimization via overparameterization from depth
Y Lu, C Ma, Y Lu, J Lu, L Ying
International Conference on Machine Learning, 6426-6436, 2020
A universal approximation theorem of deep neural networks for expressing probability distributions
Y Lu, J Lu
Advances in neural information processing systems 33, 3094-3105, 2020
A priori generalization analysis of the deep Ritz method for solving high dimensional elliptic partial differential equations
Y Lu, J Lu, M Wang
Conference on Learning Theory, 3196-3241, 2021
Gaussian Approximations for Probability Measures on R^d
Y Lu, A Stuart, H Weber
SIAM/ASA Journal on Uncertainty Quantification 5 (1), 1136-1165, 2017
The factorization method for inverse elastic scattering from periodic structures
G Hu, Y Lu, B Zhang
Inverse Problems 29 (11), 115005, 2013
Quantitative propagation of chaos in a bimolecular chemical reaction-diffusion model
TS Lim, Y Lu, JH Nolen
SIAM Journal on Mathematical Analysis 52 (2), 2098-2133, 2020
Gaussian approximations for transition paths in Brownian dynamics
Y Lu, AM Stuart, H Weber
SIAM Journal on Mathematical Analysis 49 (4), 3005-3047, 2017
Geometric ergodicity of Langevin dynamics with Coulomb interactions
Y Lu, JC Mattingly
Nonlinearity 33 (2), 675, 2019
Uniform-in-time weak error analysis for stochastic gradient descent algorithms via diffusion approximation
Y Feng, T Gao, L Li, JG Liu, Y Lu
arXiv preprint arXiv:1902.00635, 2019
Accelerating Langevin sampling with birth-death
Y Lu, J Lu, J Nolen
arXiv preprint arXiv:1905.09863, 2019
On the Bernstein-von Mises theorem for high dimensional nonlinear Bayesian inverse problems
Y Lu
arXiv preprint arXiv:1706.00289, 2017
Exponential decay of Rényi divergence under Fokker–Planck equations
Y Cao, J Lu, Y Lu
Journal of Statistical Physics 176 (5), 1172-1184, 2019
A priori generalization error analysis of two-layer neural networks for solving high dimensional Schrödinger eigenvalue problems
J Lu, Y Lu
Communications of the American Mathematical Society 2 (1), 1-21, 2022
On the representation of solutions to elliptic PDEs in Barron spaces
Z Chen, J Lu, Y Lu
Advances in neural information processing systems 34, 6454-6465, 2021
Continuum limit and preconditioned Langevin sampling of the path integral molecular dynamics
J Lu, Y Lu, Z Zhou
Journal of Computational Physics 423, 109788, 2020
An operator splitting scheme for the fractional kinetic Fokker–Planck equation
MH Duong, Y Lu
arXiv preprint arXiv:1806.06127, 2018
On the rate of convergence of empirical measure in Wasserstein distance for unbounded density function
A Liu, JG Liu, Y Lu
arXiv preprint arXiv:1807.08365, 2018
Gradient flow structure and exponential decay of the sandwiched Rényi divergence for primitive Lindblad equations with GNS-detailed balance
Y Cao, J Lu, Y Lu
Journal of Mathematical Physics 60 (5), 052202, 2019