Harvard Machine Learning Foundations Group
We are a research group focused on some of the foundational questions in modern machine learning. We are interested in both experimental and theoretical approaches that advance our understanding. Our group contains ML practitioners, theoretical computer scientists, statisticians, and neuroscientists, all sharing the goal of placing machine and natural learning on firmer foundations, and elucidating their fundamental capabilities and limitations.
Our group organizes the Kempner Seminar Series - a research seminar on the foundations of both natural and artificial learning. See the mailing list, Google calendar, and list of talks.
Opportunities: We are looking for graduate students and postdocs; see the Opportunities section below. Announcements of open positions will also be posted on social media.
People
Professors
Boaz Barak
Faculty
Sham Kakade
Faculty
David Alvarez-Melis
Faculty
Postdocs
Samy Jelassi
Postdoctoral Fellow
Alex Damian
Kempner Research Fellow
Bingbin Liu
Kempner Research Fellow
Nihal Nayak
Postdoctoral Fellow
Students
Gustaf Ahdritz
PhD Student
Yang Hu
PhD Student
Gal Kaplun
PhD Student
Anat Kleiman
PhD Student
Depen Morwani
PhD Student
Costin-Andrei Oncescu
PhD Student
Aayush Karan
PhD Student
Alex Meterez
PhD Student
Chloe Su
PhD Student
Rachit Bansal
PhD Student
Yonadav Shavit
PhD Student
Sunny Qin
PhD Student
Sara Kangaslahti
PhD Student
Roy Rinberg
PhD Student
Natalie Abreu
PhD Student
Clara Mohri
PhD Student
Hanlin Zhang
PhD Student
Mary Letey
PhD Student
Jonathan Geuter
PhD Student
Rosie Zhao
PhD Student
Affiliated Faculty
Demba Ba
Faculty
Lucas Janson
Faculty
Seth Neel
Faculty
Cengiz Pehlevan
Faculty
Finale Doshi-Velez
Faculty
Hima Lakkaraju
Faculty
Yue Lu
Faculty
Na Li
Faculty
Michael Mitzenmacher
Faculty
Morgane Austern
Faculty
Sitan Chen
Faculty
Emeritus
Preetum Nakkiran
PhD Student
Dimitris Kalimeris
PhD Student
Alex Atanasov
PhD Student
Yamini Bansal
PhD Student
Blake Bordelon
PhD Student
Chi-Ning Chou
PhD Student
Ben Edelman
PhD Student
Eran Malach
Kempner Research Fellow
Sharon Qian
PhD Student
David Brandfonbrener
Postdoctoral Fellow
Nikhil Vyas
Postdoctoral Fellow
Tristan Yang
Undergraduate
Fred Zhang
PhD Student
Sheng Yang
Masters Student
Jacob Zavatone-Veth
PhD Student
Runyu (Cathy) Zhang
PhD Student
Recent Publications
By our group and its members.
(This list is not comprehensive, and we are sometimes slow to update it; see individual homepages and the arXiv for the latest publications.)
Hidden Progress in Deep Learning: SGD Learns Parities Near the Computational Limit
NeurIPS 2022
Contrasting random and learned features in deep Bayesian linear regression
Manuscript 2022
Deconstructing Distributions: A Pointwise Framework of Learning
Manuscript 2022
Depth induces scale-averaging in overparameterized linear Bayesian neural networks
55th Asilomar Conference 2021
Neural Networks as Kernel Learners: The Silent Alignment Effect
ICLR 2022
Inductive Biases and Variable Creation in Self-Attention Mechanisms
ICML 2022
Capacity of Group-invariant Linear Readouts from Equivariant Representations: How Many Objects can be Linearly Classified Under All Possible Views?
ICLR 2022
Revisiting Model Stitching to Compare Neural Representations
NeurIPS 2021
Learning Curves for SGD on Structured Features
ICLR 2022
Out-of-Distribution Generalization in Kernel Regression
NeurIPS 2021
Asymptotics of Representation Learning in Finite Bayesian Neural Networks
NeurIPS 2021
For Self-supervised Learning, Rationality Implies Generalization, Provably
ICLR 2021
The Deep Bootstrap: Good Online Learners are Good Offline Generalizers
ICLR 2021
Distributional Generalization: A New Kind of Generalization
Manuscript 2020
Learning From Strategic Agents: Accuracy, Improvement, and Causality
ICML 2020
Deep Double Descent: Where Bigger Models and More Data Hurt
ICLR 2020
SGD on Neural Networks Learns Functions of Increasing Complexity
NeurIPS 2019 spotlight talk (top 15% of accepted papers)
More Data Can Hurt for Linear Regression: Sample-wise Double Descent
Manuscript 2019
Computational Limitations in Robust Classification and Win-Win Results
COLT 2019
Minnorm training: an algorithm for training over-parameterized deep neural networks
Manuscript 2019
Adversarial Robustness May Be at Odds With Simplicity
Manuscript 2019
On the Information Bottleneck Theory of Deep Learning
ICLR 2018
Recent & Upcoming Talks
The ML Foundations Talks are now the Kempner Seminar Series organized by the ML Foundations Group. For more information about the series, see the line-up of speakers or visit the Kempner Institute events page.
Brian DePasquale
Kim Stachenfeld - Predictive models for representation learning and simulation
Stefano Ermon - Score Entropy Discrete Diffusion Models
Andrea Montanari - Solving overparametrized systems of nonlinear equations
Tom Griffiths - Using the Tools of Cognitive Science to Understand the Behavior of Large Language Models
Larry Abbott - Modeling the Navigational Circuitry of the Fly
Rajesh Rao - Active Predictive Coding: A Sensory-Motor Theory of the Neocortex and a Unifying Framework for AI
Noam Brown - CICERO: Human-Level Performance in the Game of Diplomacy by Combining Language Models with Strategic Reasoning
Carsen Stringer - Unsupervised pretraining in biological neural networks
Emmanuel Abbe - Logic Reasoning and Generalization on the Unseen
Denny Zhou - Teach language models to reason
Tom Goldstein - Dataset security issues in generative AI
Yann LeCun - Towards Machines that can Learn, Reason, and Plan
Timothy Lillicrap - Model-based reinforcement learning and the future of language models
Yejin Choi - Common Sense: the Dark Matter of Language and Intelligence
Seminar Calendar
Below is the calendar of events in the Kempner ML Foundations seminar. Join the mailing list for talk announcements.
Opportunities
We are looking for undergraduate researchers, graduate students, and postdocs in the ML Foundations group.
For undergraduate students, we are only able to work with students at Harvard or MIT (with preference to the former). If you are a Harvard or MIT student interested in collaborating with us, formally or informally, please fill out the following Google form. Students might also be interested in taking Boaz's Spring 2023 seminar on the foundations of deep learning.
For graduate students, we have openings in the Computer Science, Electrical Engineering, Applied Mathematics, and Statistics degree programs. New: Kempner Institute Graduate Fellowship; see more details here.
If you are applying for graduate studies in CS and are interested in machine learning foundations, please mark both “Machine Learning” and “Theory of Computation” as areas of interest. Please also list the names of faculty you want to work with on your application. ML foundations group faculty include Demba Ba (Electrical Engineering and Bioengineering), David Alvarez-Melis, Boaz Barak, Sitan Chen, Jonathan Frankle, Sham Kakade (Computer Science), Cengiz Pehlevan (Applied Mathematics), and Lucas Janson (Statistics). There are also ML foundations affiliated faculty in all of the above departments and more. All of us are also open to the possibilities of co-advising students, including across different departments and schools.
Postdoc opportunities for 2024-2025 Academic year:
There are a number of opportunities at Harvard for postdoc positions. Applying to multiple positions is not just allowed but encouraged, and we urge you to apply to any of those that are of interest to you.
Kempner Institute Fellows - a prestigious three-year position for postdocs in AI/ML and related areas interested in “fundamentally advancing our understanding of natural and artificial intelligence.” Apply by October 9, 2023.
Computer Science postdocs - positions in ML foundations, the Rabin Fellowship, Privacy Tools, and Theory of Society. Coming soon.
Postdoctoral positions at the Harvard Data Science Initiative
The George F. Carrier Postdoctoral Fellowship in Applied Mathematics.
Postdoctoral fellowship in theoretical and computational neuroscience at the Swartz Program
Center for Research on Computation and Society (CRCS) postdoc position
Postdoc positions at the Materials Intelligence Group
Follow us on social media for announcements of more opportunities.