Serve Data Models with MLFlow in Production

by Rick Bahague · 2 min read · August 3rd, 2019

Too Long; Didn't Read

MLFlow allows serving data models as REST APIs without a complicated setup. For organizations looking for a way to 'democratize' data science, it is a must that data models are accessible across the enterprise. Serving data models is a very common problem for data scientists, and while other solutions exist, MLFlow keeps the setup simple. We used anaconda3 to set up the environment, and at least 1GB of RAM is needed to get R running with MLFlow on AWS LightSail. For Python-based models, MLFlow supports deploying to SageMaker.
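As a rough illustration of what the REST serving looks like in practice, the sketch below assumes a model has already been logged and is being served locally with the MLFlow CLI (e.g. mlflow models serve -m runs:/<run-id>/model -p 1234), and then scores two rows against the server's /invocations endpoint from Python. The host, port, and feature names are placeholders, and the exact JSON payload format depends on the MLFlow version in use.

```python
import requests

# MLFlow's built-in scoring server exposes a POST endpoint at /invocations.
# Host and port below assume the model was started locally, e.g.:
#   mlflow models serve -m runs:/<run-id>/model -p 1234
SCORING_URL = "http://127.0.0.1:1234/invocations"

# Two example rows with placeholder feature names, in pandas split orientation.
# MLFlow 1.x (current when this article was written) accepts this payload with
# the content type below; newer MLFlow versions expect the payload wrapped as
# {"dataframe_split": {...}} with a plain application/json content type.
payload = {
    "columns": ["feature_1", "feature_2"],
    "data": [[1.0, 2.0], [3.5, 4.2]],
}

response = requests.post(
    SCORING_URL,
    json=payload,
    headers={"Content-Type": "application/json; format=pandas-split"},
)
response.raise_for_status()
print(response.json())  # model predictions, one per input row
```

The same endpoint works regardless of whether the underlying model was written in Python or R, which is what makes this a convenient way to expose models to the rest of the enterprise.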

About Author

Rick Bahague (@rick-bahague)
Free & Open Source Advocate. Data Geek - Big or Small.
