3090 or 4090

I have an RTX 2080ti.

I still play at 1080p/60 Hz, and the 2080 Ti is plenty for that. But I'm looking to train some ML models, and its 11 GB of VRAM is limiting there.
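For context on why 11 GB runs out quickly: training with Adam keeps roughly 16 bytes of state per parameter in fp32 (4 for the weight, 4 for the gradient, 8 for the two optimizer moments), before counting activations. A rough back-of-envelope sketch (the byte counts are the standard fp32+Adam assumption, not something from this thread):

```python
def training_mem_gib(n_params, bytes_per_param=16):
    """Rough lower bound on training memory in GiB.

    16 bytes/param = fp32 weight (4) + gradient (4) + Adam moments (8).
    Activations, framework overhead, and the CUDA context come on top.
    """
    return n_params * bytes_per_param / 1024**3

# Even a ~1B-parameter model blows past an 11 GiB card before activations:
print(f"{training_mem_gib(1_000_000_000):.1f} GiB")  # ~14.9 GiB
```

Mixed precision and smaller batches help, but the weight/gradient/optimizer floor is why people chase 24 GB cards.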

So I plan to buy a new card. I also don't want an ML-only GPU, since I don't want to maintain two GPUs.

Since I'm upgrading, I need to think about future compatibility. At some point I will move to at least 2K, though I'm still not sold on 4K offering any perceivable benefit.

Given all this, I wanted to check with folks who have either card: should I consider the 4090?

  • TheTrueLinuxDev@beehaw.org · 1 year ago

    The 7900 XTX recently got support for Stable Diffusion and LLMs. On paper it's faster than the RTX 4090 for FP16 compute, and it also seemed faster in my experience comparing a rented RTX 4090 on RunPod against my own 7900 XTX: 14 seconds (RTX 4090) vs 6 seconds (7900 XTX).
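    A quick sanity check on what those timings imply (pure arithmetic on the two numbers quoted above; the workload itself isn't specified in the comment):

    ```python
    # Reported generation times (seconds) from the comment above.
    t_4090 = 14.0     # rented RTX 4090 on RunPod
    t_7900xtx = 6.0   # local 7900 XTX

    # Implied speedup of the 7900 XTX over the 4090 for this one run.
    speedup = t_4090 / t_7900xtx
    print(f"{speedup:.2f}x")  # ~2.33x
    ```

    A single timing on a rented instance can be skewed by cold starts, drivers, or batch settings, so treat this as one data point rather than a general benchmark.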

    The 7900 XTX is an option if you want something around $1000 cheaper than the RTX 4090 with similarly sized VRAM (24 GB on both) and comparable performance.