People
Education
Ph.D., Computational Mathematics, Peking University, 2019
B.S., Pure Mathematics, Nankai University, 2012
Research Interests
Mathematical theory of deep learning: neural network approximation, implicit regularization of SGD, non-convex optimization
Selected Publications
1. Weinan E, Chao Ma, Lei Wu. Machine Learning from a Continuous Viewpoint. Science China Mathematics, 2020
2. Weinan E, Chao Ma, Lei Wu. Barron spaces and flow-induced function spaces for neural network models. Constructive Approximation, 2021
3. Weinan E, Chao Ma, Lei Wu. A comparative analysis of the optimization and generalization properties of two-layer neural network and random feature models under gradient descent dynamics. Science China Mathematics, 2020
4. Weinan E, Chao Ma, Lei Wu. A priori estimates of the population risk for two-layer neural networks. Communications in Mathematical Sciences, 2019
5. Lei Wu, Chao Ma, Weinan E. How SGD selects the global minima in over-parameterized learning: A dynamical stability perspective. Advances in Neural Information Processing Systems (NeurIPS), 2018