Want to run the Wan 2.1 AI text-to-video generator locally but you're GPU poor? DeepBeepMeep has you covered! I check out the performance of four GPUs, from 24GB down to 8GB of VRAM, at the highest and lowest resolutions so we can see how they stack up. FULL COST ANALYSIS AT THE END!
GPUs Tested
3060 12GB https://geni.us/3060_GPU_12GB
3070 8GB https://geni.us/RTX_3070
3090 24GB https://geni.us/GPU3090
4090 24GB https://geni.us/4090_24GB_GPU
GitHub https://github.com/deepbeepmeep/Wan2GP
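If you want to try this yourself, setup is a standard clone-and-run. Here is a minimal sketch (the exact entry-point script and launch flags change between Wan2GP versions, so check the repo README before running):
git clone https://github.com/deepbeepmeep/Wan2GP.git
cd Wan2GP
pip install -r requirements.txt   # assumes Python 3.10+ and a CUDA-enabled PyTorch build
python wgp.py   # launches the local Gradio web UI; script name taken from the repo README and may differ in older releases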
Quad GPU AI Training Rig Writeup https://digitalspaceport.com/ollama-gpu-3090-home-server-quad-gpu-ai-training-rig/
LOCAL AI SERVER BUILD WRITEUPS
Super Low Budget Ai Server https://digitalspaceport.com/local-ai-home-server-at-super-low-150-budget-price/
Low Budget Ai Server https://digitalspaceport.com/local-ai-home-server-at-low-350-budget-price/
Mid Range Ai Server https://digitalspaceport.com/local-ai-home-server-build-at-mid-range-750-price/
High End AI Server https://digitalspaceport.com/local-ai-home-server-build-at-high-end-5000-price/
QUAD 3090 AI HOME SERVER BUILD
https://youtu.be/JN4EhaM7vyw
PARTS
GPU Rack Frame https://geni.us/GPU_Rack_Frame
Supermicro H12SSL-i MOBO (better option vs the MZ32-AR0) https://geni.us/MBD_H12SSL-I-O
Gigabyte MZ32-AR0 MOBO https://geni.us/mz32-ar0_motherboard
AMD EPYC 7V13 CPU (newer and faster than the 7702) https://geni.us/EPYC_7V13_CPU
RTX 3090 24GB GPU (x4) https://geni.us/GPU3090
256GB (8x32GB) DDR4 2400 RAM https://geni.us/256GB_DDR4_RAM
PCIe4 Risers (x4) https://geni.us/PCIe4_Riser_Cable
AMD SP3 Air Cooler (easier to install than a water cooler) https://geni.us/EPYC_SP3_COOLER
iCUE H170i water cooler https://geni.us/iCUE_H170i_Capellix
(the sTRX4 retention kit included with the CAPELLIX fits SP3 sockets)
CORSAIR HX1500i PSU https://geni.us/Corsair_HX1500iPSU
4i SFF-8654 to 4i SFF-8654 (x4, not needed for H12SSL-i) https://geni.us/SFF8654_to_SFF8654
ARCTIC MX4 Thermal Paste https://geni.us/Arctic_ThermalPaste
Kritical Thermal GPU Pads https://geni.us/Kritical-Thermal-Pads
HDD Rack Screws for Fans https://geni.us/HDD_RackScrews
Running Deepseek R1 671b on $2000 EPYC Local Ai Server
https://youtu.be/Tq_cmN4j2yY
Deepseek R1 671b Local Ai Server Writeup https://digitalspaceport.com/how-to-run-deepseek-r1-671b-fully-locally-on-2000-epyc-rig/
Chapters
0:00 Local AI Video Generation on Low VRAM GPUs
1:30 4090 Wan 2.1 14B 720p Test Setup
6:10 4090 14B 720p Result
6:31 4090 1.3B 480p Test
7:30 3090 Wan 2.1 14B 720p Test
9:15 3090 1.3B 480p Test
10:22 3060 1.3B 480p Test
13:25 3060 14B 720p Test
15:58 3070 14B 720p Test
18:52 3070 1.3B 480p Test
20:25 Conclusion and Data Points
Be sure to 👍✅Subscribe✅👍 for more content like this!
Join this channel https://www.youtube.com/@DigitalSpaceport/join
Digital Spaceport Website https://digitalspaceport.com
Patreon https://www.patreon.com/digitalspaceport
Buy Me a Coffee https://buymeacoffee.com/digitalspaceport
Please share this video to help spread the word and drop a comment below with your thoughts or questions. Thanks for watching!
*****
As an Amazon Associate I earn from qualifying purchases.
When you click on links to various merchants on this site and make a purchase, this can result in this site earning a commission. Affiliate programs and affiliations include, but are not limited to, the eBay Partner Network.
*****