Optuna vs. Hyperopt: Which Hyperparameter Optimization Library Should You Choose?

Written by neptuneAI_jakub | Published 2020/01/13
Tech Story Tags: machine-learning | supervised-learning | learn-machine-learning | hyperparameter-optimization | model-tuning | software-development | machine-learning-uses | tech-review

TLDR: Optuna comes out slightly ahead thanks to its flexibility, its imperative (define-by-run) approach to sampling parameters, and a bit less boilerplate. Both libraries offer plenty of sampling options for each hyperparameter type (float, integer, categorical), but Optuna makes it easier to define complex, conditional search spaces. In this article I will show you an example of using Optuna and Hyperopt on a real problem, compare the two on API, documentation, functionality, and more, and give you my overall score and a recommendation on which library you should use.
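To make the "imperative approach to sampling" concrete, here is a minimal sketch of how the two styles differ on a toy objective. This is an illustrative assumption of typical usage with recent versions of both libraries, not code from the article itself: Optuna samples parameters inside the objective function (define-by-run), while Hyperopt declares the search space up front and passes it to `fmin`.

```python
# Optuna: parameters are sampled imperatively inside the objective.
import optuna

def objective(trial):
    # The search space is defined at the point of use.
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=100)

# Hyperopt: the search space is declared separately and handed to fmin.
from hyperopt import fmin, tpe, hp

space = {"x": hp.uniform("x", -10, 10)}
best = fmin(
    fn=lambda params: (params["x"] - 2) ** 2,
    space=space,
    algo=tpe.suggest,
    max_evals=100,
)
```

Because Optuna's space is built as the objective runs, conditional or nested parameters can be expressed with ordinary Python control flow, which is part of the flexibility argument made in the TLDR above.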


Written by neptuneAI_jakub | Senior data scientist building experiment tracking tools for ML projects at https://neptune.ai