Yangyi Chen
Research Scientist @ NVIDIA
[Google Scholar] · [Twitter] · [Github]
Contact: yangyic@NVIDIA.com
I'm a research scientist at NVIDIA. I received my Ph.D. in Computer Science from the University of Illinois at Urbana-Champaign (advisor: Prof. Heng Ji). At NVIDIA, I work on LLM post-training and LLM-based coding agents.
Experience / Education
- NVIDIA, Research Scientist; 2025-Present
- University of Illinois Urbana-Champaign, Ph.D. in Computer Science; 2022-2025
- Huazhong University of Science and Technology, B.S. in Software Engineering; 2018-2022
Recent Publications
* indicates equal contribution
- (LLM RL Post-Training) Nemotron-Cascade: Scaling Cascaded Reinforcement Learning for General-Purpose Reasoning Models [paper]
  Boxin Wang*, Chankyu Lee*, Nayeon Lee*, Sheng-Chieh Lin*, Wenliang Dai*, Yang Chen*, Yangyi Chen*, Zhuolin Yang*, Zihan Liu*, Mohammad Shoeybi, Bryan Catanzaro, Wei Ping* (*ordered alphabetically by first name, with equal technical contribution)
  arXiv 2025
- (Vision-Language Pre-Training) Prioritizing Image-Related Tokens Enhances Vision-Language Pre-Training [paper]
  Yangyi Chen, Hao Peng, Tong Zhang, Heng Ji
  arXiv 2025
- (LLM Pre-Training Scaling Laws) Scaling Laws for Predicting Downstream Performance in LLMs [paper]
  Yangyi Chen, Binxuan Huang, Yifan Gao, Zhengyang Wang, Jingfeng Yang, Heng Ji
  TMLR 2025
- (MLLM Architecture) A Single Transformer for Scalable Vision-Language Modeling [paper]
  Yangyi Chen*, Xingyao Wang*, Hao Peng, Heng Ji
  TMLR 2024
- (Coding Agent) Executable Code Actions Elicit Better LLM Agents [paper]
  Xingyao Wang, Yangyi Chen, Lifan Yuan, Yizhe Zhang, Yunzhu Li, Hao Peng, Heng Ji
  ICML 2024