Contact
E-mail:
pilanci[at]stanford.edu
Address:
350 Jane Stanford Way
Packard Building, Room 255
Stanford, CA 94305-9510
Teaching
Spring 2025 - Stanford University
Spring 2024 - Stanford University
Autumn 2023 - Stanford University
Winter 2021 - Stanford University
Research group
News
May 1. We are excited to announce that 2 papers from our group have been accepted to ICML 2025!
Jan 22. Happy to announce that 2 papers from our group have been accepted to ICLR 2025!
Dec 20. Delighted to announce that 4 papers from our group have been accepted to ICASSP 2025!
Sep 25. We are excited to announce that 4 papers have been accepted to NeurIPS 2024!
Selected Publications
E. Zeger, Y. Wang, A. Mishkin, T. Ergen, E. J. Candès, M. Pilanci —
A Library of Mirrors: Deep Neural Nets in Low Dimensions are Convex Lasso Models with Reflection Features —
preprint, 2024 —
PDF | arXiv
B. Bartan, M. Pilanci —
Neural Spectrahedra and Semidefinite Lifts: Global Convex Optimization of Polynomial Activation Neural Networks —
Mathematical Programming, 2024 —
PDF | arXiv | code
R. Saha, N. Sagan, V. Srivastava, A. J. Goldsmith, M. Pilanci —
Compressing Large Language Models using Low Rank and Low Precision Decomposition —
NeurIPS 2024 —
arXiv | code
F. Zhang, M. Pilanci —
Spectral Adapter: Fine-Tuning in Spectral Space —
NeurIPS 2024 —
arXiv
S. Kim, M. Pilanci —
Convex Relaxations of ReLU Neural Networks Approximate Global Optima in Polynomial Time —
ICML 2024 (spotlight) —
PDF
M. Pilanci —
From Complexity to Clarity: Analytical Expressions of Deep Neural Network Weights via Clifford's Geometric Algebra and Convexity —
TMLR 2024 —
PDF | arXiv
M. Pilanci —
Computational Polarization: An Information-Theoretic Method for Resilient Computing —
IEEE Transactions on Information Theory, 2022 —
DOI | arXiv
M. Pilanci —
Information-Theoretic Bounds on Sketching —
Book chapter, Cambridge University Press, 2021 —
PDF
M. Pilanci, T. Ergen —
Neural Networks are Convex Regularizers: Exact Polynomial-time Convex Optimization Formulations for Two-Layer Networks —
ICML 2020 —
PDF | arXiv | code