How to Use Approximate Leave-one-out Cross-validation to Build Better Models
by @ryanburn
3,787 reads

by Ryan Burn, July 20th, 2021

Too Long; Didn't Read

Leave-one-out cross-validation (LOOCV) is one of the most accurate ways to estimate how well a model will perform on out-of-sample data. For the specialized cases of ridge regression, logistic regression, Poisson regression, and other generalized linear models, approximate leave-one-out cross-validation (ALOOCV) gives us a much more efficient estimate of out-of-sample error that's nearly as good as LOOCV. In this post, I'll cover what ALOOCV is and how to use it to identify outliers in a training data set.
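To make the efficiency gain concrete, here is a minimal sketch (my own illustration, not code from the article) for the simplest case, ridge regression. With a fixed penalty, the leave-one-out residuals can be recovered exactly from a single fit using the diagonal of the hat matrix, rather than refitting the model n times; ALOOCV generalizes this idea approximately to other GLMs. The data and penalty value below are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 50, 3, 1.0
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(size=n)

# One full-data ridge fit: beta = (X'X + lam*I)^{-1} X'y
A = X.T @ X + lam * np.eye(p)
beta = np.linalg.solve(A, X.T @ y)
H = X @ np.linalg.solve(A, X.T)   # hat matrix
resid = y - X @ beta              # in-sample residuals

# Leverage shortcut: exact LOO residuals from the single fit
loo_fast = resid / (1 - np.diag(H))

# Brute-force LOOCV: refit n times, once per held-out point
loo_slow = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    Xi, yi = X[mask], y[mask]
    bi = np.linalg.solve(Xi.T @ Xi + lam * np.eye(p), Xi.T @ yi)
    loo_slow[i] = y[i] - X[i] @ bi

# For ridge (quadratic loss) the shortcut is exact
assert np.allclose(loo_fast, loo_slow)
```

For non-quadratic losses such as logistic or Poisson regression there is no exact identity, which is where the "approximate" in ALOOCV comes in: a similar one-fit correction gives a close estimate at a fraction of the cost.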

Ryan Burn (@ryanburn)
Mathematical Engineer | Building better models: buildingblock.ai


