Sanjeev Arora
(2 articles)
2015
Overparameterization Regime
A regime in ML where the model has more parameters than training samples. Classical theory predicts a high-variance, overfitted model, yet such models can interpolate the training data exactly and still generalize well in practice.
Generality: 550
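A minimal sketch of the defining property, assuming a linear model with Gaussian features: when parameters outnumber samples, the minimum-norm least-squares solution fits the training labels exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 20, 100                     # 20 training samples, 100 parameters
X = rng.standard_normal((n, p))    # overparameterized design: p > n
y = rng.standard_normal(n)

# Minimum-norm least-squares solution: with p > n and X full row rank,
# the model can interpolate, i.e. fit every training label exactly.
w = np.linalg.pinv(X) @ y

train_error = np.mean((X @ w - y) ** 2)
print(f"train MSE: {train_error:.2e}")   # essentially zero
```

The training error is zero up to floating-point precision, which is what makes the regime interesting: the model has exhausted the training signal, so classical bias-variance reasoning no longer distinguishes good fits from bad ones.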
2019
Double Descent
Phenomenon in ML where the prediction error on test data first decreases, rises to a peak near the interpolation threshold (where parameter count roughly matches sample count), and then decreases again as model complexity keeps growing.
Generality: 715
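The curve can be reproduced in a small simulation, assuming minimum-norm linear regression on the first p of d Gaussian features (the setup, sample sizes, and noise level below are illustrative choices, not from the source): test error peaks when the number of used features p equals the number of samples n, and drops again as p grows past n.

```python
import numpy as np

def min_norm_test_error(p, n=30, d=200, noise=0.5, trials=40, seed=0):
    """Average test MSE of the minimum-norm least-squares fit that uses
    only the first p of d features, trained on n noisy samples."""
    rng = np.random.default_rng(seed)
    errs = []
    for _ in range(trials):
        beta = rng.standard_normal(d) / np.sqrt(d)        # true coefficients
        Xtr = rng.standard_normal((n, d))
        ytr = Xtr @ beta + noise * rng.standard_normal(n)
        Xte = rng.standard_normal((500, d))
        yte = Xte @ beta + noise * rng.standard_normal(500)
        w = np.linalg.pinv(Xtr[:, :p]) @ ytr              # min-norm solution
        errs.append(np.mean((Xte[:, :p] @ w - yte) ** 2))
    return float(np.mean(errs))

under = min_norm_test_error(p=10)    # classical, underparameterized regime
peak  = min_norm_test_error(p=30)    # interpolation threshold: p == n
over  = min_norm_test_error(p=200)   # heavily overparameterized regime

print(f"p=10: {under:.2f}   p=30: {peak:.2f}   p=200: {over:.2f}")
```

The spike at p = n comes from near-singular design matrices, which inflate the variance of the fit; past the threshold, the minimum-norm constraint implicitly regularizes and the error descends a second time.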