Welcome to my homepage

I’m a Senior Research Scientist on NVIDIA’s Post-Training Applied Research team, where I push the boundaries of code generation by fine-tuning large language models. Before NVIDIA, I worked at AWS AI Labs, where I focused on code generation for Amazon Q Developer.

I obtained my PhD in Computer Science at the University of California, Los Angeles, advised by Dr. Kai-Wei Chang. During my PhD, I was fortunate to work as a research intern at Meta AI, Yahoo Research, Microsoft Research, and Walmart Labs.

News and Announcements

  1. [12.2025] We released the Nemotron 3 family of models.
  2. [11.2025] Presented tutorial “NLP+Code: Code Intelligence in Language Models” at EMNLP 2025.
  3. [10.2025] We introduced GenCluster, achieving IOI Gold with open-weight LLMs.
  4. [10.2025] We released BigCodeArena; check it out!
  5. [08.2025] We released Nemotron-Nano-v2.
  6. [06.2025] Co-organizing the Deep Learning for Code workshop at NeurIPS 2025.
  7. [04.2025] We released Nemotron-H, a family of Mamba-Transformer models.
  8. [04.2025] We released OpenCodeInstruct and OpenCodeReasoning.
  9. [03.2025] I will serve as a senior area chair for EMNLP 2025 and IJCNLP-AACL 2025.