Yiping Lu
Assistant Professor of Industrial Engineering and Management Sciences
Contact
2145 Sheridan Road, Tech M237
Evanston, IL 60208-3109
Departments
Industrial Engineering and Management Sciences
Education
Ph.D., Applied and Computational Mathematics, Stanford University
B.S., Computational Mathematics, Peking University
Research Interests
Statistical Machine Learning, Numerical Analysis, Optimization and Control, Machine Learning for Physical and Manufacturing Applications, Experimental Design
The long-term goal of Yiping’s research is to develop a hybrid scientific research discipline that combines domain knowledge (differential equations, stochastic processes, control, …), machine learning, and (randomized) experiments. To this end, Yiping and his group pursue an interdisciplinary research program spanning probability and statistics, machine learning, numerical algorithms, control theory, signal processing and inverse problems, and operations research. Yiping is among the first to work at the interface of deep learning and differential equations. He received the Conference on Parsimony and Learning (CPAL) Rising Star Award in 2024, the Rising Star in Data Science award from the University of Chicago in 2022, the Stanford Interdisciplinary Graduate Fellowship in 2021, and the SenseTime Scholarship for undergraduates in AI in 2019. Yiping has also served as an Area Chair for top machine learning conferences such as NeurIPS and AISTATS.
Selected Publications
- Lu Y, Lin D, Du Q. Which Spaces can be Embedded in $L_p$-type Reproducing Kernel Banach Space? A Characterization via Metric Entropy[J]. arXiv preprint arXiv:2410.11116, 2024.
- Xu R, Lu Y. Randomized Iterative Solver as Iterative Refinement: A Simple Fix Towards Backward Stability[J]. arXiv preprint arXiv:2410.11115, 2024.
- Jin J, Lu Y, Blanchet J, et al. Minimax optimal kernel operator learning via multilevel training[C]//The Eleventh International Conference on Learning Representations. 2023.
- Lu Y, Chen H, Lu J, et al. Machine Learning For Elliptic PDEs: Fast Rate Generalization Bound, Neural Scaling Law and Minimax Optimality[C]//International Conference on Learning Representations.
- Ji W, Lu Y, Zhang Y, et al. An Unconstrained Layer-Peeled Perspective on Neural Collapse[C]//International Conference on Learning Representations.
- Lu Y, Ma C, Lu Y, et al. A mean field analysis of deep resnet and beyond: Towards provably optimization via overparameterization from depth[C]//International Conference on Machine Learning. PMLR, 2020: 6426-6436.
- Zhang D, Zhang T, Lu Y, et al. You only propagate once: Accelerating adversarial training via maximal principle[J]. Advances in Neural Information Processing Systems, 2019, 32.
- Lu Y, Zhong A, Li Q, et al. Beyond finite layer neural networks: Bridging deep architectures and numerical differential equations[C]//International Conference on Machine Learning. PMLR, 2018: 3276-3285.
- Long Z, Lu Y, Ma X, et al. PDE-Net: Learning PDEs from data[C]//International Conference on Machine Learning. PMLR, 2018: 3208-3216.