Nina's Page

Maria-Florina (Nina) Balcan
Cadence Design Systems Professor of Computer Science
School of Computer Science (MLD and CSD)
Carnegie Mellon University

Office: GHC 8205
Phone: 412-268-5295
Email: ninamf AT cs DOT cmu DOT edu
I am the Cadence Design Systems Professor of Computer Science
at Carnegie Mellon University. My main research interests are in
machine learning, artificial intelligence, theoretical computer
science, algorithmic game theory, and novel connections between
learning theory and other scientific fields. Current research focus
includes:
- Foundations for modern machine learning. This includes new
theoretical analyses and principled algorithms for topics of
current high interest in machine learning (deep learning,
learning from limited supervision, learning representations, and
lifelong learning), as well as the learnability of much more
complex objects and processes (including algorithms).
- Algorithm design and analysis, including the use of machine
learning for algorithm design and beyond the worst-case analysis
of algorithms.
- Computational and data-driven approaches in game theory and
economics.
- Computational, learning theoretic, and game theoretic aspects
of multi-agent systems.
I am an ACM Fellow, an AAAI Fellow, a Simons Investigator, and a
recipient of the 2019 ACM Grace Murray Hopper Award (awarded to the
outstanding young computer professional of the year). I was a
Program Committee Co-chair for
NeurIPS 2020,
ICML 2016, and
COLT 2014.
For more information, see my resume.
Research
List of all my papers here.
Selected recent papers
- Learning Accurate and Interpretable Decision Trees. With
Dravyansh Sharma. UAI 2024 (Oral). Winner of the Outstanding
Student Paper Award.
- Learning to Relax:
Setting Solver Parameters Across a Sequence of Linear System
Instances. With Mikhail Khodak, Edmond Chow, and Ameet
Talwalkar. ICLR 2024 (Spotlight).
- Regret Minimization
in Stackelberg Games with Side Information. With Keegan
Harris and Zhiwei Steven Wu. NeurIPS 2024.
- How
much data is sufficient to learn high-performing algorithms?
With Dan DeBlasio, Travis Dick, Carl Kingsford, Tuomas Sandholm,
and Ellen Vitercik. Journal of the ACM 2024.
Earlier version in STOC 2021.
- Learning
to Branch: Generalization Guarantees and Limits of
Data-Independent Discretization. With Travis Dick, Tuomas
Sandholm, and Ellen Vitercik. Journal of the ACM 2024.
Earlier version in ICML 2018.
- Generalization Guarantees for Multi-Item Profit Maximization:
Pricing, Auctions, and Randomized Mechanisms. With Tuomas
Sandholm and Ellen Vitercik. Operations Research (OR) 2023
(accepted).
Earlier version in EC 2018.
- Reliable learning in challenging environments. With Steve
Hanneke, Rattana Pukdee, and Dravyansh Sharma. NeurIPS 2023.
- Structural
Analysis of Branch-and-Cut and the Learnability of Gomory
Mixed Integer Cuts. With Siddharth Prasad, Tuomas
Sandholm, and Ellen Vitercik. NeurIPS 2022 (Oral).
- Robustly-reliable learners under poisoning attacks. With Avrim
Blum, Steve Hanneke, and Dravyansh Sharma. COLT 2022.
- Data driven
semi-supervised learning. With Dravyansh Sharma. NeurIPS
2021 (Oral).
- Data-driven algorithm design. Chapter 29 in the Beyond the
Worst-Case Analysis of Algorithms book, Cambridge University
Press, 2020.
- Noise in Classification. With Nika Haghtalab. Chapter 16 in the
Beyond the Worst-Case Analysis of Algorithms book, Cambridge
University Press, 2020.
- Semi-bandit
Optimization in the Dispersed Setting. With Travis Dick
and Wesley Pegden. UAI 2020.
- Learning to Link. With Travis Dick and Manuel Lang. ICLR 2020.
- Adaptive Gradient-Based Meta-Learning Methods. With Mikhail
Khodak and Ameet Talwalkar. NeurIPS 2019.
- Envy-free Classification. With Travis Dick, Ritesh Noothigattu,
and Ariel D. Procaccia. NeurIPS 2019.
- Estimating
Approximate Incentive Compatibility. With Tuomas Sandholm
and Ellen Vitercik. ACM EC 2019. Winner of the Exemplary
Artificial Intelligence Track Paper Award.
- Dispersion for
Data-Driven Algorithm Design, Online Learning, and Private
Optimization. With Travis Dick and Ellen Vitercik. FOCS
2018.
- Data-Driven Clustering via Parameterized Lloyd's Families. With
Travis Dick and Colin White. NeurIPS 2018 (Spotlight).
- Submodular Functions: Learnability, Structure, and
Optimization. With Nick Harvey. SIAM Journal on Computing 2018.
Earlier version titled Learning Submodular Functions in STOC 2011.
Also a NECTAR track paper at ECML-PKDD 2012 (for "significant
machine learning results"). See short abstract.
- Learning-Theoretic Foundations of Algorithm Configuration for
Combinatorial Partitioning Problems. With Vaishnavh Nagarajan,
Ellen Vitercik, and Colin White. COLT 2017.
- The Power of Localization for Learning Linear Separators with
Noise. With Pranjal Awasthi and Phil Long. Journal of the ACM
2017.
Earlier version in STOC 2014.
- Data Driven Resource Allocation for Distributed Learning. With
Travis Dick, Mu Li, Venkata Krishna Pillutla, Colin White, and
Alex Smola. AISTATS 2017.
- Nash Equilibria in Perturbation-Stable Games. With Mark
Braverman. Theory of Computing Journal 2017.
- Clustering under Perturbation Resilience. With Yingyu Liang.
SIAM Journal on Computing 2016.
- Efficient Algorithms for Learning and 1-bit Compressed Sensing
under Asymmetric Noise. With Pranjal Awasthi, Nika Haghtalab,
and Hongyang Zhang. COLT 2016.
- Statistical Active Learning Algorithms for Noise Tolerance and
Differential Privacy. With Vitaly Feldman. Algorithmica 2015
(special issue, invited).
Earlier version
in NIPS 2013.
- Robust
Hierarchical Clustering. With Yingyu Liang and Pramod
Gupta. Journal of Machine Learning Research 2014.
Earlier
version in COLT 2010.
Teaching
- 10-701 Machine Learning (at CMU): Spring 2021 (with Henry
Chai), Spring 2022.
- 10-315 Machine Learning (at CMU): Spring 2021 (with Leila
Wehbe), Spring 2019.
- 10-715 Advanced Introduction to Machine Learning (at CMU):
Fall 2018, Fall 2017.
- 10-401 Machine Learning (at CMU): Spring 2018.
- 10-601 Machine Learning (at CMU): Fall 2016 (with Matt
Gormley), Spring 2016 (with William Cohen), Spring 2015 (with
Tom Mitchell).
- 10-806 Foundations of Machine Learning and Data Science (at
CMU): Fall 2015 (with Avrim Blum).
- Machine Learning Theory (at Georgia Tech): Fall 2013, Fall
2011, Spring 2010.
- Design and Analysis of Algorithms (at Georgia Tech): Spring
2014, Spring 2013, Fall 2012, Spring 2011.
- Connections between Learning, Game Theory, and Optimization
(at Georgia Tech) Fall 2010.
- Machine Learning Theory (at CMU): Spring 200 (with Avrim
Blum).
- Artificial Intelligence (at University of Bucharest): Spring 2002.