RTX 50 series opinions?

Is there any GPU that stands out to you guys as a good value, or do you believe that everybody should skip this generation?

I’m liking the 5070 Ti: 16GB on a 256-bit bus with 896 GB/s of bandwidth for $750 USD. The 5080, at $1,000 USD, also has 16GB on a 256-bit bus, at 960 GB/s. I don’t see the value in the extra $250.

Both have 2x 9th-gen NVENC encoders. The 5080 has 2x 6th-gen decoders, while the 5070 Ti has 1x. I can use that for OBS recording while watching other videos.
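
Those bandwidth figures follow directly from the memory spec; here’s a quick sanity check, assuming the commonly reported GDDR7 per-pin rates (28 Gbps on the 5070 Ti, 30 Gbps on the 5080):

```python
# Peak bandwidth = per-pin data rate (Gb/s) x bus width (bits) / 8.
# 28 and 30 Gbps are the commonly reported GDDR7 rates, not measured values.

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(28, 256))  # 896.0 -> 5070 Ti
print(bandwidth_gb_s(30, 256))  # 960.0 -> 5080
```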

  • NIB@lemmy.world · 6 hours ago
    1. The MSRP is good, maybe too good; we just need to wait and see the actual prices and availability.

    2. I don’t care about frame generation, but it might be a decent last resort once a GPU is old. A little latency and some visual fuckery in a “playable” game beats not being able to play the game at all.

    3. The biggest advantage of DLSS 4 is the new ray reconstruction (transformer model), which will improve image quality, but that feature is coming to older GPUs too.

    4. We need to wait for benchmarks. Any card can be good or bad; it just depends on price and performance. If the 5070 is 20% faster than a 4070 Super but also costs 20% more, then it isn’t really that relevant, is it? I expect we’ll see something like that, as sketched below.
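
    A quick perf-per-dollar sketch of that scenario (the $599 baseline price is an assumption for illustration):

    ```python
    def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
        """Higher is better; the baseline card is defined as 1.0 performance."""
        return relative_perf / price_usd

    baseline = perf_per_dollar(1.00, 599)          # 4070 Super at an assumed $599
    candidate = perf_per_dollar(1.20, 599 * 1.20)  # 20% faster, 20% pricier
    print(candidate / baseline)                    # 1.0 -> zero gain in value
    ```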

    • Murvel@lemm.ee · 5 hours ago

      Not at all, since they’re developing Reflex to be AI-boosted.

            • Murvel@lemm.ee · 4 hours ago

              So rasterised vector graphics are ‘real’ to you and DLSS is ‘fake’. How very technical of you…

              • vrighter@discuss.tchncs.de · 3 hours ago

                One calculates what the pixel’s color should be. That’s the actual color that pixel should be. The real one.

                The other one doesn’t try to calculate it; it makes an educated guess. That, by definition, is “not rendering the pixel”.

                It has nothing to do with how realistic it looks. And yes, having written both software ray tracers and rasterizers, I am technical about this stuff a bit beyond just calling Vulkan and having it do its magic. I actually dove in all the way.
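
                To make that concrete, here’s a toy contrast between the two (shade() is a stand-in for real per-pixel work; the “guess” is a plain nearest-neighbor upscale from already-computed pixels):

                ```python
                import numpy as np

                def shade(x, y):
                    """Stand-in for real per-pixel work (ray casting, shading, etc.);
                    just a cheap analytic pattern here so the example runs."""
                    return 0.5 + 0.5 * np.sin(10 * x) * np.cos(10 * y)

                # "Rendering": every pixel's color is actually calculated.
                h, w = 64, 64
                ys, xs = np.mgrid[0:h, 0:w] / h
                rendered = shade(xs, ys)

                # "Guessing": compute only a quarter of the pixels, then upscale.
                low = rendered[::2, ::2]
                guessed = np.kron(low, np.ones((2, 2)))  # nearest-neighbor fill

                print("mean abs error of the guessed pixels:",
                      np.abs(rendered - guessed).mean())
                ```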

  • terraborra@lemmy.nz · 14 hours ago

    Assuming you’re primarily interested in gaming performance, wait for reliable third-party non-DLSS benchmarks.

    From the Nvidia presentation, the 5070 Ti looks great, but the performance uplift over the previous gen in their slides pretty much only applies to games with frame generation. Not every game implements DLSS at all, let alone DLSS 4. You may still need the better rasterisation of the 5080, depending on your monitor resolution and desired fps.
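
    As a rough illustration of why those slides need care: generated frames raise the displayed fps, but responsiveness still tracks the rendered frames underneath (illustrative numbers, not benchmarks):

    ```python
    def framegen_summary(rendered_fps: float, generated_per_rendered: int) -> None:
        """Displayed fps scales with generated frames; responsiveness doesn't."""
        displayed_fps = rendered_fps * (1 + generated_per_rendered)
        render_interval_ms = 1000 / rendered_fps  # still tied to real frames
        print(f"{rendered_fps:.0f} rendered fps + {generated_per_rendered} generated"
              f" -> {displayed_fps:.0f} displayed fps,"
              f" ~{render_interval_ms:.0f} ms per real frame")

    framegen_summary(30, 3)   # 4x mode: looks like 120 fps, feels like 30
    framegen_summary(120, 0)  # actually rendering 120 fps
    ```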

    • FeelzGoodMan420 · 14 hours ago

      Non-DLSS performance isn’t looking much stronger. Like it or not, games are going to rely more and more on this type of upscaling tech (and frame generation). It kind of sucks, because it gives studios an excuse not to spend the time and money to properly optimize their games. On the other hand, these technologies are quite interesting, and the new DLSS model looks good (I’ll wait for a proper review before forming an opinion).

      All that being said, I’ll upgrade next generation. I need to see a lot more good new games come out to justify upgrading my PC parts. I don’t think dropping $1,000+ just to replay Cyberpunk for the third time at a higher frame rate is worth the expense.

      • Mettled@reddthat.com (OP) · 13 hours ago

        I am of the opinion that DLSS and FSR are an admission of failure by GPU engineers: they are not capable, so far, of designing a GPU that does 4K 160fps with psycho ray tracing on, zero upscaling, zero frame generation.

        I do believe they are working on it, but Nvidia/AMD need gimmicks in the meantime to keep selling GPUs.

        I suspect the 5090 will be the first card to do 1440p with psycho ray tracing at 144fps without DLSS enabled.

        There’s something about Reflex 2 that bothers or concerns me, but I have no clue what it is.

        • Poopfeast420@discuss.tchncs.de · 8 hours ago

          I am of the opinion that DLSS and FSR are an admission of failure by GPU engineers: they are not capable, so far, of designing a GPU that does 4K 160fps with psycho ray tracing on, zero upscaling, zero frame generation.

          How is it an admission of failure? They probably could design a GPU for that, but do you want to pay hundreds of thousands of dollars because the chip uses a full silicon wafer?

          Do you think NVIDIA or AMD should have sat on this technology for decades until it was good enough for 4K 144fps? Then you would probably say it’s not good enough because it can’t do 8K 144fps. And why is 4K your arbitrary limit? Most people are still on 1080p, so why not just say the hardware is good enough once it can do 1080p 60fps?
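
          To put rough numbers on the wafer point (every figure here is an assumption for illustration, not actual foundry pricing), a simple Poisson yield model shows how fast the cost per good die explodes with area:

          ```python
          import math

          WAFER_COST = 20_000.0    # assumed price of one leading-edge wafer, USD
          WAFER_AREA = 70_000.0    # usable area of a 300 mm wafer, mm^2 (approx.)
          DEFECT_DENSITY = 0.001   # assumed defects per mm^2 (0.1 per cm^2)

          def cost_per_good_die(die_area_mm2: float) -> float:
              """Poisson yield model: yield = exp(-defect_density * die_area)."""
              dies_per_wafer = WAFER_AREA / die_area_mm2
              yield_rate = math.exp(-DEFECT_DENSITY * die_area_mm2)
              return WAFER_COST / (dies_per_wafer * yield_rate)

          # Mid-range die, 5090-class die, and a hypothetical full-wafer die.
          # Without built-in redundancy the last one is effectively unbuildable.
          for area in (300, 750, 35_000):
              print(f"{area:>6} mm^2 -> ${cost_per_good_die(area):,.2f} per good die")
          ```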

          I suspect the 5090 will be the first card to do 1440p with psycho ray tracing at 144fps without DLSS enabled.

          Definitely not, since it can’t even do 30fps at 4K with all the bells and whistles and no DLSS. 1440p is probably not even going to hit 60fps.

          There’s something about Reflex 2 that bothers or concerns me, but I have no clue what it is.

          What? I’m pretty sure the technology they’re using, Frame Warping, has been around for years; it’s used in VR, so you can just look it up and see what it does.
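
          The core trick is late reprojection: shift the most recent rendered frame by the camera motion that arrived after rendering started, then in-paint the uncovered edges. A minimal 2D sketch (ignoring depth and disocclusion, which is where the hard part lives):

          ```python
          import numpy as np

          def warp_frame(frame: np.ndarray, dx: int, dy: int) -> np.ndarray:
              """Shift a rendered frame by (dx, dy) pixels to match the newest
              input sample; the zeroed edges are where in-painting happens."""
              warped = np.zeros_like(frame)
              h, w = frame.shape[:2]
              warped[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)] = \
                  frame[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
              return warped

          frame = np.random.rand(720, 1280, 3)  # stand-in for a rendered frame
          latest_mouse_delta = (8, -3)          # camera movement since render began
          warped = warp_frame(frame, *latest_mouse_delta)
          ```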

        • vrighter@discuss.tchncs.de · 9 hours ago

          Anyone could have told them that. Real-time path tracing is a pipe dream, even now. The raw ray-tracing output is a very noisy, incomplete image, and halving the noise requires 4x the compute. We won’t get true real-time ray tracing anytime this decade for sure, if ever.
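
          The 4x figure falls out of Monte Carlo statistics: the noise of an N-sample estimate scales as 1/sqrt(N), so halving it takes four times the samples. A quick demonstration, treating one path-traced pixel as an average of random samples:

          ```python
          import numpy as np

          rng = np.random.default_rng(0)

          def pixel_rms_noise(n_samples: int, trials: int = 10_000) -> float:
              """RMS error of averaging n_samples uniform random samples
              whose true mean is 0.5 (a stand-in for one path-traced pixel)."""
              estimates = rng.random((trials, n_samples)).mean(axis=1)
              return float(np.sqrt(((estimates - 0.5) ** 2).mean()))

          for n in (16, 64, 256):  # 4x the samples at each step
              print(f"{n:>3} samples -> RMS noise {pixel_rms_noise(n):.4f}")
          # Each 4x jump in samples roughly halves the noise: 1/sqrt(N) scaling.
          ```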

    • Mettled@reddthat.com (OP) · 14 hours ago

      I don’t use DLSS; I have never tried a game with DLSS enabled. I like to max out path tracing/ray tracing but disable DLSS.

      I would guess the 5070 Ti is at the very least 15% better than the 4070 Ti; maybe 12% in some games, 20% in others.

      The new NVENC in the 50 series is also a strong point of interest for me, since I frequently use OBS.

  • Ashtear@lemm.ee · 14 hours ago

    Way too early to speculate. Until the cards are independently benchmarked, there’s no way to assess value.

  • mrfriki@lemmy.world · 9 hours ago

    I’m planning on getting the 5090 FE, provided scalpers don’t ruin it (which is the most likely scenario, by the way). I have a 3080 and recently upgraded to a 4K OLED; going from 3K to 4K dealt a serious blow to my FPS, and I’d like to play some of the newer games at high settings and high FPS, so I’m due for an upgrade.

    I have no problem with DLSS in the games I’ve played so far. Whatever artifacts it has seem to “disappear” during gameplay; you aren’t counting pixels while you’re playing. I know this is not the solution we want, but we need to be realistic here: there is a reason CGI can’t be rendered in real time. We are still far from that level of quality, yet the games we play render in real time. We can’t afford the size and price of CGI workstations, so we have to rely on these “gimmicks” to make up for it.

    • Mettled@reddthat.com (OP) · 9 hours ago

      I understand your reasoning about DLSS. I don’t agree, but all is well. For 4K you will need a 6090, then a 7090, etc.

      Why the FE over an AIB card?

      • mrfriki@lemmy.world · 2 hours ago

        I don’t think a 6090 or any future card will be enough for 4K without some sort of DLSS or frame-generation shenanigans, because by the time those cards release, graphics will have “evolved” to the point where they are, once again, no longer enough. The eternal obsolescence cycle…

        I prefer the FE because it’s the only 2-slot card that will be available at launch, and I don’t need the extra fans and size. FE cards are normally on par with, if not better than, their AIB counterparts, as long as you stick to air cooling.