Alex Renda
- Email: [email protected]
- Twitter: @alex_renda_
- GitHub: @alexrenda
- LinkedIn: @alexrenda
- Google Scholar: Alex Renda
About
I graduated from MIT with a PhD in EECS in 2024. My research studied machine learning as an abstraction to help programmers develop complex systems, with the goal of making it easier to write programs that are hard or even impossible to write by hand. Here is a full CV.
Education
- Ph.D. in EECS, MIT CSAIL. 2018-2024.
  Thesis: Programming with Neural Surrogates of Programs.
  Worked on learning-based systems and efficient neural networks.
  Advised by Michael Carbin.
- S.M. in Electrical Engineering and Computer Science, MIT. 2020.
  Thesis: Comparing Rewinding and Fine-tuning in Neural Network Pruning.
  Worked on efficient neural networks.
  Advised by Michael Carbin.
- B.S. (Summa Cum Laude) in Computer Science with Honors, with a minor in Linguistics, Cornell University. 2018.
  Worked on programming abstractions for natural language and intelligent systems as an undergraduate member of the Capra group.
  Advised by Adrian Sampson.
Publications
- CoMEt: x86 Cost Model Explanation Framework.
  Isha Chaudhary, Alex Renda, Charith Mendis, and Gagandeep Singh.
  MLSys, 2024. Paper. Bibtex.
- Turaco: Complexity-Guided Data Sampling for Training Neural Surrogates of Programs.
  Alex Renda, Yi Ding, and Michael Carbin.
  OOPSLA, 2023. Paper. Bibtex. Code. Presentation.
- Programming with Neural Surrogates of Programs.
  Alex Renda, Yi Ding, and Michael Carbin.
  Onward!, 2021. Paper. Bibtex. Code. Presentation.
- DiffTune: Optimizing CPU Simulator Parameters with Learned Differentiable Surrogates.
  Alex Renda, Yishen Chen, Charith Mendis, and Michael Carbin.
  MICRO, 2020. Paper. Bibtex. Code. Presentation.
- Comparing Rewinding and Fine-tuning in Neural Network Pruning.
  Alex Renda, Jonathan Frankle, and Michael Carbin.
  ICLR, 2020. Paper. Bibtex. Code. Presentation.
  Oral presentation (<2% of submitted papers).
- BHive: A Benchmark Suite and Measurement Framework for Validating x86-64 Basic Block Performance Models.
  Yishen Chen, Ajay Brahmakshatriya, Charith Mendis, Alex Renda, Eric Atkinson, Ondřej Sýkora, Saman Amarasinghe, and Michael Carbin.
  IISWC, 2019. Paper. Bibtex. Code.
- Ithemal: Accurate, Portable and Fast Basic Block Throughput Estimation using Deep Neural Networks.
  Charith Mendis, Alex Renda, Saman Amarasinghe, and Michael Carbin.
  ICML, 2019. Paper. Bibtex. Code.
  Best Paper award at the ML for Systems workshop at ISCA 2019.
- Programming Language Support for Natural Language Interaction.
  Alex Renda, Harrison Goldstein, Sarah Bird, Chris Quirk, and Adrian Sampson.
  SysML, 2018. Paper. Extended Draft. Bibtex. Code.
Workshop Papers
- Can LLMs Generate Random Numbers? Evaluating LLM Sampling in Controlled Domains.
  Alex Renda*, Aspen Hopkins*, and Michael Carbin.
  Sampling and Optimization in Discrete Space Workshop, ICML, 2023. Paper. Bibtex. Code.
- Renamer: A Transformer Architecture Invariant to Variable Renaming.
  Zachary Ankner, Alex Renda, and Michael Carbin.
  Machine Learning for Systems Workshop, NeurIPS, 2023. Bibtex.
- The Effect of Data Dimensionality on Neural Network Prunability.
  Zachary Ankner, Alex Renda, Gintare Karolina Dziugaite, Jonathan Frankle, and Tian Jin.
  I Can’t Believe It’s Not Better Workshop, NeurIPS, 2022. Paper. Bibtex.
- TIRAMISU: A Polyhedral Compiler for Dense and Sparse Deep Learning.
  Riyadh Baghdadi, Abdelkader Nadir Debbagh, Kamel Abdous, Fatima Zohra Benhamida, Alex Renda, Jonathan Elliott Frankle, Michael Carbin, and Saman Amarasinghe.
  Workshop on Systems for ML, NeurIPS, 2019. Paper. Bibtex.
Drafts
- A Theory of Equivalence-Preserving Program Embeddings.
  Logan Weber, Jesse Michel, Alex Renda, Saman Amarasinghe, and Michael Carbin.
  2023. Paper.
- Cello: Efficient Computer Systems Optimization with Predictive Early Termination and Censored Regression.
  Yi Ding, Alex Renda, Ahsan Pervaiz, Michael Carbin, and Henry Hoffmann.
  2022. Paper.
Honors
- NSF GRFP Honorable Mention, 2020
- Best Paper award for Ithemal at the ML for Systems workshop at ISCA 2019
- MIT Great Educators Fellowship, 2018-2019
- Cornell University: Summa Cum Laude with Honors, 2018
Academic Service
- NeurIPS 2023 — Reviewer
- ICML 2023 — Reviewer
- ICLR 2023 — Reviewer
- PLDI 2023 — Social Events Co-Chair
- OOPSLA 2022 — Artifact Evaluator / External Review Committee
- ECOOP 2022 — Artifact Evaluator / External Review Committee
- ICLR 2022 — Reviewer
- POPL 2022 — Artifact Evaluator
- OOPSLA 2021 — Artifact Evaluator
- NeurIPS 2021 — Reviewer
- ICML 2021 — Reviewer
- ASPLOS 2021 — Artifact Evaluator
- ICLR 2021 — Reviewer (Outstanding Reviewer)
- AAAI 2021 — Emergency Reviewer
- NeurIPS 2020 — Reviewer
- ICML 2020 — Reviewer (Top 33% Reviewer)
Institutional Service
Invited Talks
- January 2022 — NEC Labs Europe — Programming with Neural Surrogates of Programs
- July 2021 — OctoML — DiffTune: Optimizing CPU Simulator Parameters with Learned Differentiable Surrogates
- March 2021 — MIT PLSE Seminar — Learned x86 Cost Models: Steps Towards a Learned Compiler Backend
- November 2020 — Facebook AI Compiler Group — Learned x86 Cost Models: Steps Towards a Learned Compiler Backend
Teaching
- 6.1100 (formerly 6.035) - Computer Language Engineering. Teaching Assistant. MIT. Spring 2023.
- CS 4120 - Introduction to Compilers. Teaching Assistant. Cornell University. Spring 2018.
- CS 2112 - Object-Oriented Programming and Data Structures - Honors. Consultant. Cornell University. Fall 2015, Fall 2016.
Industry Experience
- Spring 2021: Consultant at ReadySet
- Summer 2020: MLSys Research Intern at OctoML
- Summer 2018: Software Engineering Intern at Two Sigma
- Summer 2017: Software Engineering Intern at Two Sigma
- Summer 2016: Software Engineering Intern at Facebook
- Summer 2014: System Validation Intern at Tesla