Hello all,
Looking to run some local AI to learn more about the technology,
I recently acquired 3 Nvidia RTX A4000 cards – 16 GB VRAM each. I also have 3 Quadro P4000s, and my understanding is I can mix them, but performance will basically be bottlenecked as if I had 6 of the lower-spec cards.
So my thought is that if I can run the three A4000s together, I will have a decent amount of VRAM to run most LLMs and things like Wan 2.1 – but my question is: how much system RAM would I need to pair with it? Anything over about 128 GB pushes me to something like an EPYC server board and gets expensive quick. I have some money to spend on the project but just want to put it in the right place.
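For anyone doing the same sizing math: a common back-of-envelope rule is weights ≈ (parameters × bits-per-weight ÷ 8), plus an overhead allowance for KV cache and activations, and then enough system RAM to at least hold the model file while it loads. This is a rough sketch, not a definitive formula – the 20% overhead factor and the "RAM ≥ model size" rule of thumb are assumptions, and real usage varies with context length and backend.

```python
def model_vram_gb(params_billions, bits_per_weight, overhead_frac=0.2):
    """Rough VRAM estimate in GB for an LLM's weights.

    overhead_frac is an assumed fudge factor for KV cache and
    activations; real overhead depends on context length and backend.
    """
    weights_gb = params_billions * bits_per_weight / 8  # e.g. 70B @ 4-bit = 35 GB
    return weights_gb * (1 + overhead_frac)


# Three A4000s pool roughly 3 x 16 = 48 GB of VRAM.
pooled_vram_gb = 3 * 16

# A 70B model at 4-bit quantization lands around 42 GB with overhead,
# so it should squeeze into the pooled 48 GB.
estimate = model_vram_gb(70, 4)
print(f"~{estimate:.0f} GB needed vs {pooled_vram_gb} GB available")
```

By this estimate, system RAM mostly needs to cover loading the model from disk plus the OS, so something in the 64–128 GB range is usually plenty when the model itself runs fully in VRAM; you'd only need EPYC-class RAM capacity if you planned to offload layers to CPU.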
Thanks!
submitted by /u/McShotCaller to r/learnmachinelearning