Use PlaidML to do Machine Learning on macOS with an AMD GPU
Written by Alex_Wulff (22-Year-Old Maker and Harvard Student) | Published by HackerNoon on 2020/12/16
Tech Story Tags: machine-learning | ml | apple | amd | gpu | data-science | deep-learning | artificial-intelligence
TLDR: Want to train machine learning models on your Mac’s integrated AMD GPU or an external graphics card? Look no further than PlaidML.
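The article body itself is not included here, but as a rough illustration of the workflow PlaidML enables, the sketch below shows how Keras can be pointed at PlaidML so training runs on an AMD GPU. It assumes `plaidml-keras` is installed (`pip install plaidml-keras`) and that `plaidml-setup` has already been run to select the GPU; the MNIST model is purely an example, not code from the original post.

```python
# Minimal sketch: run Keras on an AMD GPU through PlaidML.
# Assumes `pip install plaidml-keras` and `plaidml-setup` were done beforehand.
import plaidml.keras
plaidml.keras.install_backend()  # route Keras through PlaidML instead of TensorFlow

import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense, Flatten

# Small example dataset; any Keras workload would do.
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A tiny model; training runs on whichever device was chosen in plaidml-setup.
model = Sequential([
    Flatten(input_shape=(28, 28)),
    Dense(128, activation="relu"),
    Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, batch_size=128,
          validation_data=(x_test, y_test))
```

Because PlaidML plugs in as a Keras backend, existing Keras code generally needs no changes beyond the two lines that install the backend.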