Use PlaidML to do Machine Learning on macOS with an AMD GPU

by Alex Wulff · December 16th, 2020 · 4 min read

Too Long; Didn't Read

Want to train machine learning models on your Mac’s integrated AMD GPU or an external graphics card? Look no further than PlaidML.
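In practice, using PlaidML comes down to installing the plaidml-keras package, running plaidml-setup once to pick your AMD GPU, and then pointing Keras at the PlaidML backend before Keras is imported. The sketch below shows the general idea; the MNIST model and training settings are illustrative placeholders, not taken from the article.

```python
# Assumes `pip install plaidml-keras` and that `plaidml-setup` has already
# been run to select the AMD GPU as the default PlaidML device.
import os

# Point Keras at PlaidML instead of TensorFlow *before* importing Keras.
os.environ["KERAS_BACKEND"] = "plaidml.keras.backend"

import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense, Flatten

# Small illustrative model; the dataset and architecture are placeholders.
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.astype("float32") / 255.0
x_test = x_test.astype("float32") / 255.0

model = Sequential([
    Flatten(input_shape=(28, 28)),
    Dense(128, activation="relu"),
    Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training now runs on the GPU chosen in plaidml-setup.
model.fit(x_train, y_train, epochs=2, batch_size=128,
          validation_data=(x_test, y_test))
```

Because the backend is selected through an environment variable, the rest of the Keras code stays the same whether it runs on PlaidML or on TensorFlow.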

About the Author

Alex Wulff (@Alex_Wulff) is a 22-year-old maker and Harvard student.
