Harvard Machine Learning Foundations Group

We are a research group focused on foundational questions in modern machine learning. We are interested in both experimental and theoretical approaches that advance our understanding. Our group includes ML practitioners, theoretical computer scientists, statisticians, and neuroscientists, all sharing the goal of placing machine and natural learning on firmer foundations and elucidating their fundamental capabilities and limitations.

Our group organizes the Kempner Seminar Series, a research seminar on the foundations of both natural and artificial learning. See the mailing list, Google calendar, and list of talks.

Opportunities: We are looking for graduate students and postdocs; see the Opportunities section below. Announcements of open positions will also be posted on social media.

People

Researchers

  • Gustaf Ahdritz (PhD Student)
  • Alex Atanasov (PhD Student)
  • Demba Ba (Faculty)
  • Boaz Barak (Faculty)
  • Blake Bordelon (PhD Student)
  • Chi-Ning Chou (PhD Student)
  • Ben Edelman (PhD Student)
  • Yang Hu (PhD Student)
  • Samy Jelassi (Postdoctoral Fellow)
  • Seth Neel (Faculty)
  • Gal Kaplun (PhD Student)
  • Anat Kleiman (PhD Student)
  • Depen Morwani (PhD Student)
  • David Brandfonbrener (Postdoctoral Fellow)
  • Yonadav Shavit (PhD Student)
  • Sunny Qin (PhD Student)
  • Nikhil Vyas (Postdoctoral Fellow)
  • Sara Kangaslahti (PhD Student)
  • Roy Rinberg (PhD Student)
  • Natalie Abreu (PhD Student)
  • Clara Mohri (PhD Student)
  • Hanlin Zhang (PhD Student)
  • Mary Letey (PhD Student)
  • Jonathan Geuter (PhD Student)
  • Sitan Chen (Faculty)
  • Sheng Yang (Master's Student)
  • Rosie Zhao (PhD Student)

Affiliated

  • Yue Lu (Faculty)
  • Na Li (Faculty)

Emeritus

  • Preetum Nakkiran (PhD Student)
  • Yamini Bansal (PhD Student)
  • Sharon Qian (PhD Student)
  • Tristan Yang (Undergraduate)
  • Fred Zhang (PhD Student)

Recent Publications

By our group and its members.

(This list is not comprehensive, and we are sometimes slow to update it; see individual homepages and the arXiv for the latest publications.)

  • Hidden Progress in Deep Learning: SGD Learns Parities Near the Computational Limit
  • Contrasting random and learned features in deep Bayesian linear regression
  • Deconstructing Distributions: A Pointwise Framework of Learning
  • Depth induces scale-averaging in overparameterized linear Bayesian neural networks
  • Neural Networks as Kernel Learners: The Silent Alignment Effect
  • Inductive Biases and Variable Creation in Self-Attention Mechanisms
  • Capacity of Group-invariant Linear Readouts from Equivariant Representations: How Many Objects can be Linearly Classified Under All Possible Views?
  • Revisiting Model Stitching to Compare Neural Representations
  • Learning Curves for SGD on Structured Features
  • Out-of-Distribution Generalization in Kernel Regression
  • Asymptotics of Representation Learning in Finite Bayesian Neural Networks
  • For Self-supervised Learning, Rationality Implies Generalization, Provably
  • The Deep Bootstrap: Good Online Learners are Good Offline Generalizers
  • Distributional Generalization: A New Kind of Generalization
  • Learning From Strategic Agents: Accuracy, Improvement, and Causality
  • Deep Double Descent: Where Bigger Models and More Data Hurt
  • SGD on Neural Networks Learns Functions of Increasing Complexity
  • More Data Can Hurt for Linear Regression: Sample-wise Double Descent
  • Computational Limitations in Robust Classification and Win-Win Results
  • Minnorm training: an algorithm for training over-parameterized deep neural networks
  • Adversarial Robustness May Be at Odds With Simplicity
  • On the Information Bottleneck Theory of Deep Learning

Recent & Upcoming Talks

The ML Foundations Talks are now the Kempner Seminar Series, organized by the ML Foundations Group. For more information about the series, see the line-up of speakers or visit the Kempner Institute events page.

Excerpts from recent talk abstracts:

  • Navigation requires orienting oneself relative to landmarks in the environment, evaluating relevant sensory data, remembering goals, …
  • Recent neurophysiological experiments indicate that almost all cortical areas, even those traditionally labelled as primary sensory …
  • In this talk I will describe CICERO, the first AI agent to achieve human-level performance in Diplomacy, a strategy game involving both …
  • Representation learning in neural networks may be implemented with supervised or unsupervised algorithms, distinguished by the presence …
  • Transformers have become the dominant neural network architecture in deep learning. While they are state of the art in language and …
  • Over the past decades, the machine learning community has developed tons of data-driven techniques aimed at enhancing learning …
  • Machine learning systems are built using large troves of training data that may contain private or copyrighted content. In this talk, …
  • How could machines learn as efficiently as humans and animals? How could machines learn how the world works and acquire common sense? …
  • Large language models are capable of an incredible array of tasks. Language models are pre-trained on large amounts of text data from …

Seminar Calendar

Below is the calendar of events in the Kempner ML Foundations seminar. Join the mailing list for talk announcements.

Opportunities

We are looking for undergraduate researchers, graduate students, and postdocs in the ML Foundations group.

For undergraduate students, we are only able to work with students at Harvard or MIT (with preference to the former). If you are a Harvard or MIT student interested in collaborating with us, formally or informally, please fill out the following Google form. Students may also be interested in taking Boaz's Spring 2023 seminar on the foundations of deep learning.

For graduate students, we have openings in the Computer Science, Electrical Engineering, Applied Mathematics, and Statistics degree programs. New: the Kempner Institute Graduate Fellowship; see more details here.

If you are applying for graduate studies in CS and are interested in machine learning foundations, please mark both “Machine Learning” and “Theory of Computation” as areas of interest, and list on your application the names of the faculty you would like to work with. ML Foundations group faculty include Demba Ba (Electrical Engineering and Bioengineering), David Alvarez-Melis, Boaz Barak, Sitan Chen, Jonathan Frankle, Sham Kakade (Computer Science), Cengiz Pehlevan (Applied Mathematics), and Lucas Janson (Statistics). There are also ML Foundations affiliated faculty in all of the above departments and more. All of us are also open to the possibility of co-advising students, including across departments and schools.

Postdoc opportunities for the 2024-2025 academic year:

There are a number of postdoc openings at Harvard. Applying to multiple positions is not just allowed but encouraged; we urge you to apply to any that interest you.

  • Kempner Institute Fellows - a prestigious three-year position for postdocs in AI/ML and related areas who are interested in “fundamentally advancing our understanding of natural and artificial intelligence.” Apply by October 9, 2023.

  • Computer Science postdocs: positions in the ML Foundations group, the Rabin Fellowship, Privacy Tools, and Theory of Society. Coming soon.

  • Postdoctoral positions at the Harvard Data Science Initiative

  • The George F. Carrier Postdoctoral Fellowship in Applied Mathematics.

  • Postdoctoral fellowship in theoretical and computational neuroscience at the Swartz Program

  • Center for Research on Computation and Society (CRCS) postdoc position

  • Postdoc positions at the Materials Intelligence Group

Follow us on social media for announcements of more opportunities.