Nvidia released its new RTX 30 series, and the top-of-the-line consumer graphics cards sold out in a matter of seconds. Nvidia stock is at an all-time high. Demand for graphics processing units (GPUs) keeps growing for specialized computation in gaming and AI development. The latest wave of AI has been spurred on by the rapid growth of GPU compute capability: A100s and V100s with High Bandwidth Memory (HBM), packed into systems like the DGX and Cray supercomputers, power many of the foremost Deep Learning research groups and labs. Does this mean that to practice Deep Learning you need the latest and greatest GPU? What should you do if you want to break into Deep Learning without breaking the bank?
This article details various places to access GPU compute and how I turned an old Linux machine into a decent system for learning Deep Learning.