My first time on Kaggle, I was ✨intimidated✨.

All these smart people, big companies, and serious prize money. Competitors have years of experience ahead of you, and each competition comes with its own unique challenges.

Even worse, the amazing custom neural networks were trained on beefy GPUs. So much knowledge was required across validation, data loading, network architectures, and coding.

How was I even supposed to try?

But there was a competition type that took difficult neural networks out of the equation!

Tabular data competitions can be won entirely without neural networks. Many grandmasters still rely on LightGBM or XGBoost to win this type of competition.

Then I knew that if I learned about Boosting, I would add an important tool to my machine learning belt. XGBoost often works well out of the box, so it was a unique opportunity to experiment with validation workflows and data cleaning techniques.
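To show what "out of the box" looks like in practice, here is a minimal sketch of an XGBoost baseline with a simple hold-out validation split. The file name, the target column, and the hyperparameters are hypothetical placeholders, assuming a binary-classification tabular dataset:

```python
import pandas as pd
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load a tabular competition dataset (path and target column are placeholders).
df = pd.read_csv("train.csv")
X = df.drop(columns=["label"])
y = df["label"]

# Hold out 20% of the rows for validation.
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Near-default hyperparameters already give a reasonable baseline.
model = xgb.XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1)
model.fit(X_train, y_train)

print("validation accuracy:", accuracy_score(y_valid, model.predict(X_valid)))
```

From a baseline like this, the interesting work is in the validation scheme and the data cleaning, not the model itself.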

In the end, I built a satellite image segmentation model that worked better with XGBoost than with a convolutional neural network.

NASA image of satellite over Earth

I wrote about Boosting in my Newsletter here.