Tselil Schramm
tselil AT stanford DOT edu
Office: CoDa E254
I am an assistant professor at Stanford in the Department of Statistics (and in Computer Science and Mathematics, by courtesy).
My research is at the intersection of theoretical computer science and statistics.
I study algorithms for high-dimensional estimation problems, and I work to characterize and explain information-computation tradeoffs.
Before joining Stanford,
I received my PhD from
U.C. Berkeley, where I was lucky to be advised by
Prasad Raghavendra and
Satish Rao.
After that I was a postdoc at
Harvard and
MIT, hosted by the wonderful quadrumvirate of
Boaz Barak,
Jon Kelner,
Ankur Moitra, and
Pablo Parrilo.
Here is a tutorial for pronouncing my name.
Teaching:
Winter 2026: Theory of Statistics II (STATS300B)
Spring 2025: Intro to Statistics (precalculus) (STATS 60)
Winter 2025: Theory of Statistics II (STATS 300B)
Fall 2024: Machine Learning Theory (STATS 214 / CS 228M)
Winter 2024: Theory of Statistics II (STATS 300B)
Fall 2023: Machine Learning Theory (STATS 214 / CS 228M)
Spring 2023: Probability Theory (STATS 116)
Winter 2023: Intro to Stochastic Processes 1 (STATS 217)
Fall 2022: Machine Learning Theory (STATS 214 / CS 228M)
Spring 2022: The Sum-of-Squares Algorithmic Paradigm in Statistics (STATS 314a)
Winter 2022: Random Processes on Graphs and Lattices (STATS 221)
Spring 2021: Probability Theory (STATS 116)
Winter 2021: The Sum-of-Squares Algorithmic Paradigm in Statistics (STATS 319)
Selected and Recent Papers [all papers]:
Polynomial-time sampling despite disorder chaos
[arXiv]
with
Eric Ma,
in submission.
Some easy optimization problems have the overlap-gap property
[arXiv]
with
Shuangping Li,
in COLT 2025.
Discrepancy Algorithms for the Binary Perceptron
[arXiv]
with
Shuangping Li and Kangjie Zhou,
in STOC 2025.
Fast, robust approximate message passing
[arXiv]
with
Misha Ivkov,
in STOC 2025.
Semidefinite programs simulate approximate message passing robustly
[arXiv]
with
Misha Ivkov,
in STOC 2024.
Spectral clustering in the Gaussian mixture block model
[arXiv]
with
Shuangping Li,
in submission.
Local and global expansion in random geometric graphs
[arXiv]
with
Siqi Liu,
Sidhanth Mohanty, and
Elizabeth Yang,
in STOC 2023.
Testing thresholds for high-dimensional sparse random geometric graphs
[arXiv]
with
Siqi Liu,
Sidhanth Mohanty, and
Elizabeth Yang,
in STOC 2022.
Invited to the STOC 2022 special issue of SICOMP.
Statistical query algorithms and low-degree tests are almost equivalent
[arXiv]
with
Matthew Brennan,
Guy Bresler,
Sam Hopkins and
Jerry Li,
in COLT 2021; runner-up for the Best Paper award.
Computational barriers to estimation from low-degree polynomials
[arXiv]
with
Alex Wein,
in The Annals of Statistics, 2022.
Subexponential LPs approximate max-cut
[arXiv]
with
Sam Hopkins and
Luca Trevisan,
in FOCS 2020.
On the power of sum-of-squares for detecting hidden structures
[arXiv]
with
Sam Hopkins,
Pravesh Kothari,
Aaron Potechin,
Prasad Raghavendra, and
David Steurer,
in FOCS 2017.
Strongly refuting random CSPs below the spectral threshold
[arXiv]
with
Prasad Raghavendra and
Satish Rao,
in STOC 2017.
Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors
[arXiv]
with
Sam Hopkins,
Jonathan Shi, and
David Steurer,
in STOC 2016.