Cheng Lu (路橙)
Research Scientist
Meta Super Intelligence TBD Lab
Biography
I am a research scientist at Meta Super Intelligence TBD Lab, working on pretraining architecture and optimization algorithms.
I was a research scientist at OpenAI, where I was a core research contributor to GPT-4o Image Generation and a research contributor to Sora 2. I'm interested in large-scale deep generative models and reinforcement learning algorithms. I love finding a sweet balance between mathematical theory and practical tricks.
I received my Ph.D. degree from the TSAIL Group at Tsinghua University in December 2023, advised by Prof. Jun Zhu. During my Ph.D., I also worked closely with Jianfei Chen and Chongxuan Li. Before that, I received my B.E. degree from the Department of Computer Science and Technology, Tsinghua University, in July 2019.
I have research experience with consistency models, diffusion models, normalizing flows, and energy-based models, and their applications in image generation, 3D generation, and reinforcement learning.
My Doctoral Dissertation (in Chinese): Research on Invertible Generative Models and Efficient Algorithms, supervised by Prof. Jun Zhu.
Research Highlight
-
Simplifying, Stabilizing and Scaling Continuous-Time Consistency Models
[blog]
- Continuous-time consistency models with sample quality comparable to leading diffusion models in just two sampling steps.
-
ProlificDreamer & Variational Score Distillation
[code] [project]
- High-fidelity text-to-3D generation with solely 2D diffusion models.
-
DPM-Solver, DPM-Solver++ and DPM-Solver-v3
[code] [demo]
- Training-free fast samplers for diffusion models (such as Stable Diffusion).
- Widely applied in various text-to-image libraries and applications, such as DreamStudio, StableBoost, Stable-Diffusion-WebUI, Diffusers, the official code of Stable Diffusion v2, the official code of Stable Diffusion v1, the online demo of Stable Diffusion v2, the online demo of Stable Diffusion v1.5, and Apple's ML-Stable-Diffusion.
Publications
-
Towards Efficient and Exact Optimization of Language Model Alignment
Haozhe Ji, Cheng Lu, Yilin Niu, Pei Ke, Hongning Wang, Jun Zhu, Jie Tang, Minlie Huang
International Conference on Machine Learning (ICML), 2024
[code] -
Score Regularized Policy Optimization through Diffusion Behavior
Huayu Chen, Cheng Lu, Zhengyi Wang, Hang Su, Jun Zhu
International Conference on Learning Representations (ICLR), 2024
[code] -
The Blessing of Randomness: SDE Beats ODE in General Diffusion-based Image Editing
Shen Nie, Hanzhong Allan Guo, Cheng Lu, Yuhao Zhou, Chenyu Zheng, Chongxuan Li
International Conference on Learning Representations (ICLR), 2024
[code] [project] -
ProlificDreamer: High-Fidelity and Diverse Text-to-3D Generation with Variational Score Distillation
Spotlight
Zhengyi Wang*, Cheng Lu*, Yikai Wang, Fan Bao, Chongxuan Li, Hang Su, Jun Zhu
Conference on Neural Information Processing Systems (NeurIPS), 2023
[code] [project] -
DPM-Solver-v3: Improved Diffusion ODE Solvers with Empirical Model Statistics
Kaiwen Zheng*, Cheng Lu*, Jianfei Chen, Jun Zhu
Conference on Neural Information Processing Systems (NeurIPS), 2023
[code] [project] -
On Calibrating Diffusion Probabilistic Models
Tianyu Pang, Cheng Lu, Chao Du, Min Lin, Shuicheng Yan, Zhijie Deng
Conference on Neural Information Processing Systems (NeurIPS), 2023
[code] -
Gaussian Mixture Solvers for Diffusion Models
Hanzhong Allan Guo, Cheng Lu, Fan Bao, Tianyu Pang, Shuicheng Yan, Chao Du, Chongxuan Li
Conference on Neural Information Processing Systems (NeurIPS), 2023
[code] -
Improved Techniques for Maximum Likelihood Estimation for Diffusion ODEs
Kaiwen Zheng*, Cheng Lu*, Jianfei Chen, Jun Zhu
International Conference on Machine Learning (ICML), 2023
[code] -
Contrastive Energy Prediction for Exact Energy-Guided Diffusion Sampling in Offline Reinforcement Learning
Cheng Lu*, Huayu Chen*, Jianfei Chen, Hang Su, Chongxuan Li, Jun Zhu
International Conference on Machine Learning (ICML), 2023
[code] -
Offline Reinforcement Learning via High-Fidelity Generative Behavior Modeling
Huayu Chen, Cheng Lu, Chengyang Ying, Hang Su, Jun Zhu
International Conference on Learning Representations (ICLR), 2023
[code] -
DPM-Solver: A Fast ODE Solver for Diffusion Probabilistic Model Sampling in Around 10 Steps
Oral (acceptance rate ~1.7%)
Cheng Lu, Yuhao Zhou, Fan Bao, Jianfei Chen, Chongxuan Li, Jun Zhu
Conference on Neural Information Processing Systems (NeurIPS), 2022
[code] -
Maximum Likelihood Training for Score-Based Diffusion ODEs by High Order Denoising Score Matching
Cheng Lu, Kaiwen Zheng, Fan Bao, Chongxuan Li, Jianfei Chen, Jun Zhu
International Conference on Machine Learning (ICML), 2022
[code] -
Implicit Normalizing Flows
Spotlight (acceptance rate ~5.5%)
Cheng Lu, Jianfei Chen, Chongxuan Li, Qiuhao Wang, Jun Zhu
International Conference on Learning Representations (ICLR), 2021
[code] -
VFlow: More Expressive Generative Flows with Variational Data Augmentation
Jianfei Chen, Cheng Lu, Biqi Chenli, Jun Zhu, Tian Tian
International Conference on Machine Learning (ICML), 2020
[code]
Preprints
-
Simplifying, Stabilizing and Scaling Continuous-Time Consistency Models
Cheng Lu, Yang Song
[blog]
-
DPM-Solver++: Fast Solver for Guided Sampling of Diffusion Probabilistic Models
Cheng Lu, Yuhao Zhou, Fan Bao, Jianfei Chen, Chongxuan Li, Jun Zhu
[code]
Selected Honors & Awards
- Outstanding Doctoral Thesis, Tsinghua University, 2024.06
- Beijing Outstanding Graduates, 2024.01
- Zhong Shimo Scholarship, 2023.12
- China National Scholarship, 2023.12
- ByteDance Scholarship, 2023.10
- '84' Future Innovation Scholarship, Tsinghua University, 2020.12
- Top Ten Campus Singers Competition, Tsinghua University, 2019.10
- Outstanding Graduates, Department of Computer Science and Technology, Tsinghua University, 2019.06
- The Mathematical Contest in Modeling, Meritorious Winner, 2018
- Chinese Mathematical Olympiad (CMO), Silver Medal, 2014
Services
Reviewer
NeurIPS (2021, 2022 with Top Reviewer, 2022 workshop); ICML (2021-2024); ICLR (2021, 2023-2024); CVPR (2023-2024); ICCV 2023; ECCV 2024; IJCV
Contributor
huggingface/diffusers, the most widely used library for diffusion models.
Teaching
2021 Spring, Head TA for Statistical Learning Theory and Applications, instructed by Prof. Jun Zhu
2021 Spring, TA for Deep Learning, instructed by Prof. Xiaolin Hu and Prof. Jun Zhu
