The Singapore AI Ecosystem

When do we really need GPU for AI?  



Hi all,

Good morning

1) May I know when we really need a GPU for AI?

2) Are there any guidelines on GPU selection for AI applications?

3) What can a GPU do that a CPU cannot?

Hope to hear from you soon. Thank you.



Hi Vin,

1) A GPU is very much required during the training phase, especially if you have lots of data. Once trained, the model can be deployed to production either in the cloud or on the edge (e.g. IoT sensors, CCTV cameras, smart speakers, self-driving cars). On the edge, a CPU, a GPU, or a specialized ASIC chip can all handle the final inference job.
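To give a feel for why training is the GPU-hungry phase, here is a back-of-envelope sketch. The figures (model size, dataset size, epochs) and the FLOP rules of thumb are my own illustrative assumptions, not from the thread:

```python
# Back-of-envelope: why training needs far more compute than a single inference.
# Assumed rules of thumb: forward pass ~ 2 * params FLOPs per sample,
# and one training step (forward + backward) ~ 3x the forward cost.

params = 25_000_000        # assumption: a ResNet-50-sized model
dataset = 1_200_000        # assumption: an ImageNet-sized dataset
epochs = 90                # assumption: a typical training schedule

forward_flops = 2 * params                          # FLOPs for one sample
train_flops = 3 * forward_flops * dataset * epochs  # full training run
infer_flops = forward_flops                         # one prediction

ratio = train_flops / infer_flops
print(f"training : single inference = {ratio:.0e} : 1")
```

The exact numbers don't matter; the point is that training repeats the forward and backward passes over the whole dataset many times, so it costs hundreds of millions of times more compute than serving one prediction. That is why inference can often run fine on a CPU or a small edge chip.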

See here for how much speed-up a GPU gives over a CPU:

Inference: The Next Step in GPU-Accelerated Deep Learning

2) Unfortunately, the industry predominantly uses NVIDIA GPUs for now... hopefully AMD and Intel (with the Nervana processor) can catch up. Once you've settled on a GPU vendor, the other key criterion is how much VRAM your AI application needs, rather than just raw GPU TFLOPS.
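A quick way to sanity-check the VRAM criterion is to estimate memory from the parameter count. This is a rough sketch under an assumed rule of thumb (FP32 training with Adam keeps roughly four copies of the weights: weights, gradients, and two optimizer moments); activations and framework overhead come on top:

```python
# Rough lower-bound VRAM estimate for FP32 training with Adam.
# Assumption: ~4 value-sized copies per parameter (weights, grads, 2 moments).

def training_vram_gb(params, bytes_per_value=4, copies=4):
    """Return an estimated lower bound in GB (activations not included)."""
    return params * bytes_per_value * copies / 1024**3

# Example: a BERT-base-sized model (~110M parameters, assumption)
print(f"{training_vram_gb(110_000_000):.1f} GB")  # prints "1.6 GB"
```

So even before activations, a 110M-parameter model wants a couple of GB just for weights and optimizer state; activations for large batch sizes usually dominate, which is why cards are often chosen by VRAM first and TFLOPS second.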

3) See the NVIDIA link above.

Disclaimer: I don't work for or get paid by NVIDIA. 

