If it ain't broke - eviltoast
  • BmeBenji@lemm.ee · 135 points · 10 months ago

    4K is overkill enough. 8K is a waste of energy. Let’s see optimization be the trend in the next generation of graphics hardware, not further waste.

    • Zink@programming.dev · 52 points · 10 months ago

      Yeah. Once games are rendering 120fps at a native 6K downscaled to an amazing looking 4K picture, then maybe you could convince me it was time to get an 8K TV.

      Honestly most people sit far enough from the TV that 1080p is already good enough.
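The "sit far enough away" point can be sanity-checked with the standard one-arcminute visual-acuity rule (20/20 vision resolves about 1/60 of a degree). This is a rough sketch using that rule of thumb, not a figure from the thread; the function name and the 16:9 assumption are mine:

```python
import math

def max_useful_distance_ft(screen_diagonal_in, horizontal_px, aspect=16/9):
    """Farthest viewing distance (feet) at which a 20/20 eye can still
    resolve individual pixels; beyond it, more resolution is invisible."""
    # Screen width derived from the diagonal and aspect ratio
    width_in = screen_diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / horizontal_px  # width of a single pixel
    # A pixel narrower than one arcminute (1/60 degree) is below the
    # classic 20/20 acuity limit at that distance.
    distance_in = pixel_in / math.tan(math.radians(1 / 60))
    return distance_in / 12

print(max_useful_distance_ft(65, 1920))  # 1080p on a 65" TV: ~8.5 ft
print(max_useful_distance_ft(65, 3840))  # 4K on the same TV: ~4.2 ft
```

By this estimate, a 65" 1080p panel is already pixel-perfect from about 8.5 feet, which is roughly a typical couch distance, supporting the comment above.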

      • frezik@midwest.social · 12 points · 10 months ago

        I find 4K is nice on computer monitors because you can turn off anti-aliasing entirely without leaving jagged edges behind. 1440p isn’t quite enough to get there.

        Also, there are some interesting ideas among emulator writers about using those extra pixels to create more accurate CRT-like effects.

        • Zink@programming.dev · 5 points · 10 months ago

          Oh yeah, I have read some very cool things about emulators being able to simulate the individual phosphors at 4K resolution. I have always been a sucker for clean, crisp pixels (that’s what I was trying to achieve on the shitty old CRT I had for my SNES), so I haven’t jumped into the latest CRT shaders myself.

        • Holzkohlen@feddit.de · 1 point · 10 months ago

          But anti-aliasing costs far less performance. And you need to mess about with scaling on a 4K monitor, which is always a pain. 1440p for life, IMHO.

      • minibyte@sh.itjust.works · 3 points · edited · 10 months ago

        I’m set up to THX spec: 10 feet from an 85-inch screen. That puts me right between where 1440p and 4K are optimal, but my eyes see little difference between the two.

        I’d settle for 4K @ 120 FPS, locked.

        • Zink@programming.dev · 2 points · 10 months ago

          I’m 6-8 feet from a 65-inch, depending on seating position and posture. It seems to be a pretty sweet spot for 4K (I have used the viewing-distance calculators in the past, but not recently enough to remember the numbers). I do wear my glasses while watching TV too, so I see things pretty clearly.

          With games that render at a native 4K at 60fps over an uncompressed signal, it is absolutely stunning. If I try to sit around 4 feet from the screen to get more immersion, it starts to look more like a computer monitor than a razor-sharp HDR picture painted on the OLED.

          There is a lot of quality yet to be packed into 4K. As long as “TV in the living room” stays a format similar to today’s, I don’t think 8K will benefit people. It will be interesting to see whether all nice TVs simply become 8K one day, though, like with 4K now.

    • Final Remix@lemmy.world · 28 points · 10 months ago

      *monkey’s paw curls*

      Granted! Everything’s just rendered internally at 25% scale with massive amounts of TAA.

    • flintheart_glomgold@lemmy.world · 7 points · 10 months ago

      For TV manufacturers the 1K/4K/8K nonsense is a marketing trap of their own making - but it also serves their interests.

      TV makers DON’T WANT consumers to easily compare models or understand what makes a good TV. Manufacturers profit mightily by selling crap to misinformed consumers.

    • bruhduh@lemmy.world · 4 points · edited · 10 months ago

      Divide the resolution by 3, though: current-gen upscaling tech can cover that much. 4K = upscaled 720p, and 8K = upscaled 1440p.
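The arithmetic here checks out if "divide by 3" means per axis, which is what the most aggressive upscaler presets (e.g. DLSS Ultra Performance) do; a quick sketch (the table of resolutions is mine, not from the thread):

```python
# "Divide by 3" per axis means the upscaler fills in ~9x the pixels.
resolutions = {            # (width, height)
    "720p":  (1280, 720),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

def upscaled(src, factor=3):
    """Output resolution after scaling each axis by `factor`."""
    w, h = resolutions[src]
    return (w * factor, h * factor)

assert upscaled("720p") == resolutions["4K"]    # 4K = 3x-upscaled 720p
assert upscaled("1440p") == resolutions["8K"]   # 8K = 3x-upscaled 1440p
```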

      • AngryMob@lemmy.one · 4 points · 10 months ago

        Can doesn’t mean should.

        720p to 4K using DLSS is okay, but you start to see visual tradeoffs made strictly for the extra performance.

        To me it really shines at 1080p to 4K, where it is basically indistinguishable from native for a still-large performance increase.

        Or even 1440p to 4K, where it actually looks better than native with a moderate performance increase.

        For 8K the same tiers hold true: go for better than native, or match native visuals. There is no real need to go below native just to get more performance; at that point the hardware is mismatched.
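The three tiers described above line up with the commonly published DLSS preset scale factors (per-axis render-scale ratios); this sketch maps them onto a 4K output, with the preset names and ratios taken from NVIDIA's public materials, not from the thread:

```python
# Per-axis render scale for the standard DLSS presets at 4K output.
presets = {
    "Ultra Performance": 1/3,   # 720p  -> 4K: most fps, visible tradeoffs
    "Performance":       1/2,   # 1080p -> 4K: near-indistinguishable
    "Quality":           2/3,   # 1440p -> 4K: can look better than native
}

target_w, target_h = 3840, 2160  # 4K output

for name, scale in presets.items():
    w, h = round(target_w * scale), round(target_h * scale)
    print(f"{name:17s}: renders {w}x{h}, upscales to {target_w}x{target_h}")
```

Running this prints the internal render resolution for each preset, e.g. Quality renders 2560x1440, matching the "1440p to 4K" tier above.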

        • bruhduh@lemmy.world · 1 point · edited · 10 months ago

          Devs already use it as a substitute for optimisation. What makes you think bosses won’t push it even further, given deadlines and quarterly profits? Immortals of Aveum is an example, and we’re not even at the end of the generation, only halfway. (I agree with you from a user standpoint, though.)