Nathaniel Imel
I am a Ph.D. student in Cognition and Perception at NYU, working with Noga Zaslavsky in the InfoCog Lab.
My research broadly explores how ideas about information, learning and evolution can help us understand behavior in intelligent systems. Recent projects have focused on characterizing the evolutionary dynamics of efficient semantic representations in humans and machines.
Before coming to NYU, I spent time at UC Irvine as a Ph.D. student in the departments of Logic and Philosophy of Science and Language Science. Before that, I earned an M.S. in computational linguistics at the University of Washington, working with Shane Steinert-Threlkeld in the CLMBR lab, and a B.A. in philosophy at UC San Diego.
Papers
* = equal contribution
- An efficient communication analysis of modal typology - Imel, N., Guo, Q., & Steinert-Threlkeld, S. (accepted). Open Mind.
- Culturally transmitted color categories in LLMs reflect a learning bias toward efficient compression - Imel, N. & Zaslavsky, N. (2025). NeurIPS 2025 Workshop on Interpreting Cognition in Deep Learning Models. Best paper award.
- Iterated language learning is shaped by a drive for optimizing lossy compression - Imel, N., Culbertson, J., Kirby, S., & Zaslavsky, N. (2025). Proceedings of the 47th Annual Meeting of the Cognitive Science Society.
- Bilinguals exhibit semantic convergence in categorization while remaining optimally efficient - Taliaferro, M.*, Imel, N.*, Blanco-Elorrieta, E., & Zaslavsky, N. (2025). Proceedings of the 47th Annual Meeting of the Cognitive Science Society.
- Density, asymmetry and citation dynamics in scientific literature - Imel, N., & Hafen-Saavedra, Z. (2025). arXiv.
- The Unnatural Language ToolKit (ULTK) - Imel, N., Haberland, C., & Steinert-Threlkeld, S. (2025). Proceedings of the 8th Annual Society for Computation in Linguistics.
- Optimal compression in human concept learning - Imel, N. & Zaslavsky, N. (2024). Proceedings of the 46th Annual Meeting of the Cognitive Science Society.
- Citation-similarity relationships in astrophysics literature - Imel, N., & Hafen-Saavedra, Z. (2023). NeurIPS 2023 Workshop on AI for Scientific Discovery: From Theory to Practice.
- Noisy population dynamics lead to efficiently compressed semantic systems - Imel, N., Futrell, R., Franke, M., & Zaslavsky, N. (2023). NeurIPS 2023 Workshop on Information-Theoretic Principles in Cognitive Systems.
- The evolution of efficient compression in signaling games - Imel, N. (2023). Proceedings of the 45th Annual Meeting of the Cognitive Science Society.
- Deontic priority in the lexicalization of impossibility modals - Uegaki, W., Mucha, A., Imel, N., & Steinert-Threlkeld, S. (2023). PsyArXiv.
- A semantic universal for modality - Steinert-Threlkeld, S., Imel, N., & Guo, Q. (2023). Semantics and Pragmatics, 16(1).
- A database for modal semantic typology - Guo, Q., Imel, N., & Steinert-Threlkeld, S. (2022). Proceedings of the 4th Workshop on Computational Typology and Multilingual NLP (SIGTYP 2022), pp. 42-51.
- Modal semantic universals optimize the simplicity/informativeness trade-off - Imel, N. & Steinert-Threlkeld, S. (2022). Proceedings of Semantics and Linguistic Theory 32.