Why AdaBoost Is My Fav This Week

Loving AdaBoost in the Age of XGBoost

Seems like every data science blog out there extols the virtues of XGBoost, and many a Kaggle competition has been won with it. But many folks use XGBoost as a black box, and I'm not down with that all the time. So, I'm going to tell you all about my love for adaptively boosted decision trees.

This week, we were tasked with finding the best algorithm to detect West Nile Virus in Chicago. So, while everybody else was gradient boosting and XGBoosting their hearts out, we went with AdaBoost.

AdaBoost is a boosting technique that lets you combine multiple "weak classifiers" (in our case, shallow decision trees) into one strong classifier. It fits the weak learners one after another, upweighting the examples the earlier learners got wrong, so each new tree focuses on the hard cases. Our model came out with the highest Kaggle score in the class. Why? Because of weak-ass learning. Our training and testing data were a little messy and imbalanced, so we needed an algorithm that would turn weak into strong.
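Here's a minimal sketch of what that looks like with scikit-learn. The synthetic data, class-imbalance ratio, and hyperparameters below are hypothetical stand-ins, not our actual competition setup:

```python
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.datasets import make_classification

# Hypothetical stand-in for messy, imbalanced data:
# ~5% positives to mimic a rare-event problem like West Nile detection.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.95, 0.05], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=42)

# Each "weak classifier" is a one-split decision tree (a stump).
# AdaBoost fits them sequentially, upweighting the rows the previous
# stumps misclassified, then combines them into one strong classifier.
# (On scikit-learn versions before 1.2, the keyword is `base_estimator`.)
model = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),  # the weak learner
    n_estimators=200,
    learning_rate=0.5,
    random_state=42,
)
model.fit(X_train, y_train)

# Score with ROC AUC, which handles imbalanced classes better than accuracy.
probs = model.predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, probs))
```

The key knob is the weak learner itself: keeping `max_depth=1` forces each tree to stay weak, which is exactly what lets the reweighting scheme do the heavy lifting.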
