Serve Data Models with MLFlow in Production

Too Long; Didn't Read

MLFlow allows serving data models as a REST API without a complicated setup. For organizations looking to ‘democratize’ data science, making data models accessible across the enterprise is a must. Serving data models is a very common problem for data scientists, and while other solutions exist, MLFlow keeps the process simple. We used anaconda3 to set up the environment, and at least 1GB of RAM is needed to get R running with MLFlow on AWS LightSail. For Python-based models, MLFlow supports deploying to SageMaker.
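As a rough illustration of the workflow the summary describes, here is a minimal Python sketch. It assumes MLFlow and scikit-learn are installed; the iris model, artifact path, and port are illustrative choices, not details from the article.

# Minimal sketch: log a scikit-learn model with MLFlow so it can be
# served as a REST API (model, paths, and port are hypothetical).
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=200).fit(X, y)

with mlflow.start_run() as run:
    # Log the trained model as an MLFlow artifact.
    mlflow.sklearn.log_model(model, artifact_path="model")
    model_uri = "runs:/{}/model".format(run.info.run_id)
    print("Model logged at:", model_uri)

# The logged model can then be exposed as a REST endpoint from the shell:
#   mlflow models serve -m runs:/<run_id>/model -p 5000
# and scored with POST requests to http://127.0.0.1:5000/invocations

A Python model logged this way can also be pushed to SageMaker using MLFlow's deployment tooling, which is the route the summary mentions for Python-based models.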

@rick-bahague

Rick Bahague

Free & Open Source Advocate. Data Geek - Big or Small.

