Welcome to my homepage
I’m a Senior Research Scientist on NVIDIA’s Post-Training Applied Research team, where I push the boundaries of code generation by fine-tuning large language models. Before NVIDIA, I worked at AWS AI Labs, where I focused on code generation for Amazon Q Developer.
I obtained my PhD in Computer Science at the University of California, Los Angeles, supervised by Dr. Kai-Wei Chang. During my PhD, I was fortunate to work as a research intern at Meta AI, Yahoo Research, Microsoft Research, and Walmart Labs.
News and Announcements
- [12.2025] We released the Nemotron 3 family of models.
- [11.2025] Presented tutorial “NLP+Code: Code Intelligence in Language Models” at EMNLP 2025.
- [10.2025] We introduced GenCluster, achieving IOI Gold with open-weight LLMs.
- [10.2025] We released BigCodeArena; check it out!
- [08.2025] We released Nemotron-Nano-v2.
- [06.2025] Co-organizing Deep Learning for Code workshop at NeurIPS 2025.
- [04.2025] We released Nemotron-H, a family of Mamba-Transformer models.
- [04.2025] We released OpenCodeInstruct and OpenCodeReasoning.
- [03.2025] I will serve as a senior area chair for EMNLP 2025 and IJCNLP-AACL 2025.
