Difference Between Boosting Trees: Updates to Classics With CatBoost, XGBoost and LightGBM

by Valentine Shkulov, April 17th, 2023

Too Long; Didn't Read

This publication discusses the differences between the popular boosting-tree algorithms CatBoost, XGBoost, and LightGBM. It traces the historical development of boosting, from AdaBoost to Gradient Boosting Machines (GBM) and on to XGBoost, LightGBM, and CatBoost. Each library has distinct strengths: CatBoost excels at handling categorical features, XGBoost offers high performance and strong regularization, and LightGBM focuses on speed and efficiency. The best choice depends on the problem and dataset, so it's worth trying all three to find the best fit.
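
As a minimal sketch of how such a side-by-side comparison might look (assuming the catboost, xgboost, and lightgbm packages are installed; the synthetic dataset, parameters, and metric below are illustrative and not taken from the article), all three classifiers can be fit on the same split and scored with the same metric:

```python
# Illustrative comparison of the three gradient-boosting libraries on one dataset.
# Dataset, hyperparameters, and metric are placeholder choices, not the article's.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

from catboost import CatBoostClassifier
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier

# Synthetic binary-classification data for a like-for-like comparison.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

models = {
    "CatBoost": CatBoostClassifier(verbose=0, random_state=42),
    "XGBoost": XGBClassifier(eval_metric="logloss", random_state=42),
    "LightGBM": LGBMClassifier(random_state=42),
}

# Fit each booster with default settings and report ROC AUC on the hold-out set.
for name, model in models.items():
    model.fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"{name}: ROC AUC = {auc:.4f}")
```

On real data with many categorical columns, CatBoost's native categorical handling typically removes the need for manual one-hot encoding, which is where it tends to stand out in such a comparison.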
About Author

Valentine Shkulov (@teenl0ve)
Data Science expert with a desire to help companies advance by applying AI for process improvements.
