Boosting Algorithms: AdaBoost, Gradient Boosting and XGBoost

by Rohith Gandhi (@grohith327), May 5th, 2018
Too Long; Didn't Read

Neural networks and genetic algorithms are our naive attempts at imitating nature. They work well for a class of problems, but they come with hurdles of their own, such as overfitting, local minima, and vanishing gradients. There is another family of algorithms that, in my opinion, does not get as much recognition: boosting algorithms.

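As a quick orientation, here is a minimal sketch (not from the original article) of how the three boosting algorithms named in the title can be tried side by side, assuming scikit-learn and the xgboost package are installed; the dataset and hyperparameters are illustrative only.

```python
# Illustrative comparison of AdaBoost, Gradient Boosting, and XGBoost
# on a synthetic binary classification problem.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier  # assumes the xgboost package is installed

# Toy dataset so the example is self-contained.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

models = {
    "AdaBoost": AdaBoostClassifier(n_estimators=100, random_state=42),
    "Gradient Boosting": GradientBoostingClassifier(n_estimators=100, random_state=42),
    "XGBoost": XGBClassifier(n_estimators=100, random_state=42),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    preds = model.predict(X_test)
    print(f"{name}: accuracy = {accuracy_score(y_test, preds):.3f}")
```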