William Timkey
PhD Student
Department of Linguistics
New York University
About
Hello! I am a third-year PhD student in the Department of Linguistics at New York University advised by Tal Linzen.
I completed a BA in Linguistics at Cornell in May 2020, and then spent a year as a Post-Baccalaureate researcher at Cornell with Marten van Schijndel.
My research interests lie broadly within the field of computational psycholinguistics. Specifically, I am interested in how advances in deep learning can inform theories of online sentence processing. Lately, I’ve been investigating how we can model humans’ finite memory capacity in neural language models, with the long-term goal of integrating expectation-based and memory-based theories of online sentence processing.
News
- 11/13/2025 - Check out the preprint of our new work revealing a dissociation between prediction and structural processing in eye movements during reading.
- 3/2025 - Presented a poster at HSP 2025 about whether language model surprisal can better capture garden path effects when syntactic parallelism is limited. Here is the abstract.
- 3/8/2024 - Presented a poster at HSP 2024 about which measures of eye-tracking while reading are captured by LM surprisal. Here is the abstract.
- 12/6/2023 - Presented our new EMNLP Findings paper on modeling similarity-based interference effects with neural self-attention in Singapore at EMNLP 2023.
- 9/1/2022 - Started my PhD at NYU!
- 9/23/2021 - Our EMNLP paper was selected for an oral presentation!
- 8/26/2021 - Our paper on similarity measures in Transformer language models was accepted to EMNLP!
- 8/7/2021 - Presented our ACL Findings paper at ACL 2021.
- 5/6/2021 - Our paper exploring the abstractive capabilities of neural summarization models was accepted to Findings of ACL 2021.
- 6/2020 - Joined the C.Psyd group at Cornell
- 5/2020 - Graduated (with distinction!) from Cornell University with a BA in Linguistics
Publications
William Timkey and Tal Linzen. “A Language Model with Limited Memory Capacity Captures Interference in Human Sentence Processing” In Findings of the Association for Computational Linguistics: EMNLP. 2023.
William Timkey and Marten van Schijndel. “All Bark and No Bite: Rogue Dimensions in Transformer Language Models Obscure Representational Quality” In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP). 2021.
Matt Wilber, William Timkey, and Marten van Schijndel. “To Point or Not to Point: Understanding How Abstractive Summarizers Paraphrase Text” In Findings of the Association for Computational Linguistics: ACL. 2021.
Teaching
I also have a passion for teaching! Before coming to NYU, I was a Teaching Fellow in the Department of Statistics at Harvard University.
Fall 2023: LING-UA 27 - Language and Mind @ New York University (Teaching Assistant)
Fall 2021: STAT 104 - Introduction to Quantitative Methods for Economics @ Harvard University (Teaching Fellow)
Summer 2021: STAT 100 - Introduction to Quantitative Methods @ Harvard University (Teaching Fellow)
Spring 2021: STAT 102 - Introduction to Quantitative Methods for Life Sciences @ Harvard University (Teaching Fellow)
Fall 2020: STAT 104 - Introduction to Quantitative Methods for Economics @ Harvard University (Teaching Fellow)
Summer 2020: STAT 100 - Introduction to Quantitative Methods @ Harvard University (Teaching Fellow)
Spring 2020: INFO 2950 - Introduction to Data Science @ Cornell University (Teaching Assistant)
Fall 2017: CSE 199 - First Year Seminar in Computer Science @ SUNY Buffalo (Teaching Assistant)