# Tag: learning-theory

- [OPML#7] BLN20 & BS21: Smoothness and robustness of neural net interpolators (22 Sep 2021)

- [OPML#6] XH19: On the number of variables to use in principal component regression (11 Sep 2021)

- How many neurons are needed to approximate smooth functions? A summary of our COLT 2021 paper (15 Aug 2021)

- [OPML#5] BL20: Failures of model-dependent generalization bounds for least-norm interpolation (30 Jul 2021)

- [OPML#4] HMRT19: Surprises in high-dimensional ridgeless least squares interpolation (23 Jul 2021)

- Orthonormal function bases: what they are and why we care (16 Jul 2021)

- [OPML#3] MVSS19: Harmless interpolation of noisy data in regression (16 Jul 2021)

- [OPML#2] BLLT19: Benign overfitting in linear regression (11 Jul 2021)

- [OPML#1] BHX19: Two models of double descent for weak features (05 Jul 2021)

- [OPML#0] A series of posts on over-parameterized machine learning models (04 Jul 2021)
