Ting-Yun (Charlotte) Chang
Hi, I'm Ting-Yun Chang 張婷雲. I am a final-year PhD student at USC CS, co-advised by Robin Jia and Jesse Thomason. I am interested in improving large language models post-training based on scientific insights, e.g., understanding the internal mechanisms of in-context learning to make LLMs less sensitive to prompt designs. Recently, I have been working on LLM quantization, studying the root causes of performance degradation to minimize errors after low-bit quantization.
Previously, I did my bachelor's and master's degrees in Taiwan, both in Computer Science. I was advised by Yun-Nung (Vivian) Chen at National Taiwan University and Chi-Jen Lu at Academia Sinica.
Experience
Research Assistant
Fall 2021 - Spring 2026
University of Southern California
Advisors: Robin Jia and Jesse Thomason
Research Intern
Summer 2025
Google DeepMind
Applied Scientist Intern
Summer 2024
Amazon AWS AI
Applied Scientist Intern
Spring 2020
Amazon Alexa AI
Publications
TA
- USC CS544 Applied Natural Language Processing (Fall 2024)
- USC CS467 Introduction to Machine Learning (Spring 2023)
- NTU CSIE Applied Deep Learning (Spring 2019)
Other
- Women's 4 × 100 metres relay record holder of National Wu-Ling Senior High School (Since 2012)
- Varsity table tennis team of National Tsing Hua University (2014 – 2017)
- 2015 Mei-Chu Tournament – relay race
- I love Classical Chinese poetry. My favorite poem is 蝶戀花·憶掛孤帆東海畔
- My dream is to play table tennis in different cities with different people :)
- Unlocked Regions: LA, Bay Area, NYC, Honolulu, Beijing, Hangzhou, Taipei, Hsinchu