How to Use Approximate Leave-one-out Cross-validation to Build Better Models by@ryanburn

Leave-one-out cross-validation (LOOCV) is one of the most accurate ways to estimate how well a model will perform on out-of-sample data. For the specialized cases of ridge regression, logistic regression, Poisson regression, and other generalized linear models, approximate leave-one-out cross-validation (ALOOCV) gives us a much more efficient estimate of out-of-sample error that's nearly as good as LOOCV. In this post, I'll cover what ALOOCV is and how to use it to identify outliers in a training data set.
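To make the efficiency gain concrete, here is a minimal sketch (using NumPy; the function names are my own, not from the article) for the ridge regression case. Brute-force LOOCV refits the model n times, whereas the hat-matrix shortcut computes every leave-one-out residual from a single fit via e_loo_i = e_i / (1 - H_ii); for ridge regression with a fixed penalty this identity is exact, and the resulting residuals are exactly what you would scan to flag outliers.

```python
import numpy as np

def ridge_hat_matrix(X, lam):
    # Hat ("smoother") matrix for ridge regression: H = X (X'X + lam*I)^{-1} X'
    p = X.shape[1]
    return X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)

def loo_residuals_fast(X, y, lam):
    # Leave-one-out residuals from ONE fit, via the shortcut
    # e_loo_i = e_i / (1 - H_ii). Exact for ridge regression.
    H = ridge_hat_matrix(X, lam)
    residuals = y - H @ y
    return residuals / (1.0 - np.diag(H))

def loo_residuals_brute(X, y, lam):
    # Brute-force LOOCV for comparison: refit the model n times.
    n, p = X.shape
    out = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        Xi, yi = X[mask], y[mask]
        beta = np.linalg.solve(Xi.T @ Xi + lam * np.eye(p), Xi.T @ yi)
        out[i] = y[i] - X[i] @ beta
    return out
```

Points whose leave-one-out residual is far larger in magnitude than the rest are candidate outliers; for logistic or Poisson regression the same idea applies but the leverage correction is only approximate, which is where the "A" in ALOOCV comes from.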

Ryan Burn

Mathematical Engineer | Building better models: buildingblock.ai
