I Used to Hate Overfitting, But Now I'm Grokking It | by Laurin Heilmeyer | Jul, 2024 – Towards Data Science

The surprising generalisation beyond overfitting

As someone who has spent considerable time with various computer science topics, where the mathematics can sometimes be dry and abstract, I find the practical, hands-on nature of data science to be a breath of fresh air. It never fails to amaze me how even the simplest ideas can lead to fascinating results.

This article explores one of those surprising revelations I recently stumbled upon.

I'll never forget how the implementation of my Bachelor's thesis went. While it was not about machine learning, it had a formative effect on me, and I constantly remind myself of it when tinkering with neural networks. It was an intense time: the thesis concerned an analytical model of sound propagation that was meant to run in the browser, so performance was very limited and simulations ran for a long time. They constantly failed to complete after running for many hours. But the worst experience was interpreting wrongly configured simulations with confusing results, which often made me think the whole model was nonsense.

The same happens from time to time when I actually train neural networks myself. It can be exhausting to
