Is XGBoost really all we need?
If you have experience building machine learning models on tabular data, you will have noticed that gradient-boosting algorithms like CatBoost, LightGBM and XGBoost are almost always superior.
It's not for nothing that Bojan Tunguz (a quadruple Kaggle Grandmaster employed by Nvidia) states:
XGBoost Is All You Need
— Bojan Tunguz (@tunguz) March 30, 2022
Deep Neural Networks and Tabular Data: A Survey
... but aren't we all fooling ourselves?