My Deep Learning Rig Build Journey...

I’ve been on quite a journey planning and building my own deep learning machine. I wanted to share my thought process, the decisions I made along the way, and my final setup. Hopefully, this can help anyone else considering a similar project!

Why Build Your Own Deep Learning Rig?

Let me quickly recap why I decided to build my own machine:

  1. It’s a fun challenge and learning experience.
  2. As a data scientist/engineer, I wanted to understand the hardware better.
  3. It’s cost-effective in the long run if you use it frequently.
  4. Flexibility: I can use it for deep learning, gaming, and even as a personal cloud.

Key Decisions Based on Research

I spent a lot of time researching and reading articles about building deep learning rigs. One particularly helpful resource was Tim Dettmers’ blog post on deep learning hardware. Here are some key decisions I made based on that research:

1. GPU Choice

This was probably the most crucial decision. I went with an RTX 3090, even though it’s not the latest model. Why?

  1. Its 24GB of VRAM, which matters far more for deep learning than raw speed
  2. Much better price-to-performance than the newer cards, especially second-hand

I decided to buy a second-hand 3090 to save some money. It’s a bit risky, but the potential savings were worth it for me.

[Image: RTX 3090 GPU]

2. CPU Selection

I chose the Intel Core i7-13700K. For deep learning, you don’t need the absolute top-of-the-line CPU. What mattered more was:

  1. Enough cores and threads for data loading and preprocessing
  2. Enough PCIe lanes to feed the GPU without bottlenecks
  3. A modern platform with DDR5 support and room to upgrade later

The i7-13700K ticks all these boxes and leaves room for future expansion.

3. RAM Considerations

I went with 64GB of Corsair Vengeance RGB DDR5. The article pointed out that RAM clock speed doesn’t matter much for deep learning, so I didn’t chase the fastest kit, but I did go with DDR5 for future-proofing. 64GB comfortably exceeds the 3090’s 24GB of VRAM (the usual rule of thumb is to have at least as much system RAM as GPU memory) and leaves plenty of headroom for large datasets.

4. Storage Strategy

Following the advice from the article, I’m using a two-pronged approach:

  1. A fast SSD for the operating system, frameworks, and the datasets I’m actively working with
  2. A second, larger SSD for bulk storage of datasets and checkpoints

This setup gives me the speed where I need it and plenty of storage for large datasets.

5. Power Supply

I chose the EVGA 1300 G+. The article stressed the importance of having enough power and PCIe connectors. I calculated my power needs and added some buffer, as suggested.
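To make that concrete, here’s roughly the back-of-the-envelope math I did, sketched in Python. The wattage figures are approximate board-power/TDP numbers I’m assuming for illustration (check the spec sheets for your exact parts), and the 1.5x headroom factor is just one common rule of thumb for transient GPU spikes:

```python
# Rough power-budget estimate (approximate figures; check your parts' spec sheets)
components_watts = {
    "RTX 3090 (board power)": 350,
    "i7-13700K (peak package power)": 253,
    "motherboard, RAM, SSDs, fans": 100,
}

peak_draw = sum(components_watts.values())      # ~703 W
headroom_factor = 1.5                           # buffer for transient spikes and upgrades
recommended_psu = peak_draw * headroom_factor   # ~1055 W

print(f"Estimated peak draw: {peak_draw} W")
print(f"PSU size with buffer: {recommended_psu:.0f} W")
# With these assumptions the target lands around 1050 W,
# so a 1300 W unit leaves comfortable headroom for future upgrades.
```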

6. Cooling Considerations

For CPU cooling, I went with the DeepCool LE520 Liquid Cooling Kit. For the GPU, I’m sticking with air cooling for now, but I made sure to get a case with good airflow.
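Once everything is running, I also want a quick way to check that the airflow is actually keeping the GPU cool under load. Here’s a minimal monitoring sketch using the pynvml bindings for NVIDIA’s management library; I’m assuming they’re installed (e.g. via `pip install nvidia-ml-py`) and that the 3090 is GPU index 0:

```python
import time
import pynvml

# Poll GPU temperature and utilization every few seconds.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first (and only) GPU

try:
    for _ in range(10):
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
        print(f"GPU temp: {temp} C, utilization: {util}%")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()
```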

My Final Build

Here’s the complete list of components for my deep learning rig:

  1. GPU: NVIDIA GeForce RTX 3090 (second-hand)
  2. CPU: Intel Core i7-13700K
  3. RAM: 64GB Corsair Vengeance RGB DDR5
  4. Storage: a fast SSD plus a second, larger SSD
  5. PSU: EVGA 1300 G+
  6. CPU cooler: DeepCool LE520 liquid cooling kit
  7. Case: chosen primarily for good airflow, with extra case fans

Building Process and Challenges

Here are some challenges I faced:

  1. Installing the case fans was more fiddly than I expected.
  2. Cable management with the power supply took some time to get right.
  3. I encountered some boot-up issues initially; they took a while to figure out, but with some googling and ChatGPT I was able to solve them.

Steps After the Build Is Ready

Once the hardware is set up, my next steps are:

  1. Install Ubuntu (I chose this over Windows for better compatibility with deep learning frameworks)
  2. Set up CUDA for GPU acceleration
  3. Mount the second SSD
  4. Run some benchmarks and tests to ensure everything’s working correctly (a quick sanity check is sketched below)
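
For step 4, here’s the kind of minimal sanity check I have in mind once CUDA and PyTorch are installed. I’m assuming a standard PyTorch build with CUDA support; it just confirms the GPU is visible and times a batch of large matrix multiplications:

```python
import time
import torch

# Confirm PyTorch can see the GPU and CUDA is working.
assert torch.cuda.is_available(), "CUDA not available - check the driver/toolkit install"
print("GPU:", torch.cuda.get_device_name(0))

# Rough throughput check: time a batch of large matrix multiplications.
x = torch.randn(8192, 8192, device="cuda")
torch.cuda.synchronize()
start = time.time()
for _ in range(10):
    y = x @ x
torch.cuda.synchronize()
print(f"10 matmuls of 8192x8192 took {time.time() - start:.2f} s")
```

Nothing fancy, but if this runs cleanly and the 3090 shows up by name, the core of the software stack is working.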

Conclusion

Building my own deep learning rig has been an incredible learning experience. It’s given me a much deeper understanding of the hardware that powers the algorithms we work with every day. Plus, I now have a powerful, flexible machine that I can use for all sorts of projects.

If you’re considering building your own deep learning rig, I say go for it! Yes, it takes time and research, but the knowledge you gain and the customized setup you end up with are well worth it.

References

  1. Tim Dettmers’ blog post on deep learning hardware (timdettmers.com)