
Having too many parameters in your model, so that all of the sample/training data is preserved perfectly, is usually considered a bad thing (overfitting).

But you're right - if the dataset is exhaustive and finite, and the model is large enough to preserve it perfectly, such an overfitted model would work just fine, even if it's unlikely to be a particularly efficient way to build one.
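A minimal sketch of the point, using a made-up example: if the input domain is finite and the training set covers it exhaustively, then a "model" that simply memorizes every sample is a perfect overfit - and also a perfect predictor, because no input it will ever see is outside the training data.

```python
def train_lookup_model(samples):
    """'Train' by memorizing every (input, label) pair verbatim."""
    return dict(samples)

# Hypothetical exhaustive, finite dataset: the full truth table of XOR.
dataset = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

model = train_lookup_model(dataset)

# Training accuracy is 100% (the data is memorized), and since every
# possible input appears in the dataset, "generalization" is also 100% -
# any test set can only repeat training points.
assert all(model[x] == y for x, y in dataset)
```

The inefficiency the comment mentions shows up in the size of the lookup table: it grows with the number of distinct inputs, whereas a smaller parametric model could compress the same mapping.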


