
How GPUs are Beginning to Displace Clusters for Big Data & Data Science

by Dan Voyce · 6 min read · January 5th, 2020

Too Long; Didn't Read

Many big-data-driven companies are turning to GPUs in place of a traditional cluster, and I think we will see a leap to a more 'GPU first' mindset over the coming years. I have been using a low-grade consumer GPU (an NVIDIA GeForce 1060) to accomplish things that were previously only realistic on a cluster. Here is why I think this is the direction data science will go in the next 5 years. The GTX 1080 Ti, illustrated below, has 3584 CUDA cores that can process data in parallel.
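
To give a concrete sense of what that parallelism buys you, the sketch below shows a GPU-side aggregation using the RAPIDS cuDF library. Note that cuDF and the file and column names are my own illustrative choices here, not prescribed tooling.

import cudf  # RAPIDS GPU DataFrame library (used here purely for illustration)

# Load a CSV straight into GPU memory ("events.csv" and its columns are placeholders)
df = cudf.read_csv("events.csv")

# The group-by aggregation is spread across the card's thousands of CUDA cores,
# the kind of job that would traditionally be handed off to a Spark or Hadoop cluster
summary = df.groupby("user_id").agg({"duration": "mean", "bytes_sent": "sum"})

print(summary.head())

A single consumer card with enough VRAM can work through a dataset like this interactively, which is exactly the shift from 'cluster first' to 'GPU first' described above.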

About Author

Dan Voyce (@dan-voyce)
Director of Technology Solutions, Tech Nerd and lover of Tea!
