Stacked 3D cache is coming to Intel CPUs, and gamers should be excited (should we?)
  • deranger@sh.itjust.works · 1 year ago

    I dunno, I ran a 2080 on the same PSU that I used in a 2013 build, a 650 W Seasonic. Got some graphs? Power consumption didn’t seem to jump that badly until the latest gen.

    My current 3090 is a power hog though; that’s when I’d say it started for Nvidia (3000 series). For AMD, the 7000 series CPUs, and I’m not really sure for Intel. The 9900K was the last Intel CPU I ran, and it seemed fine. I was running a 9900K/2080 on the same PSU as the 2500K/570 build.

    • candyman337@sh.itjust.works · 1 year ago

      As far as the 2080 goes, like I said, it was big FOR THE TIME, and power hungry FOR THE TIME. It’s still reasonable, even by today’s standards.

      As for the last two gens, the 3000 and 4000 series, they’re known to draw more than their rated power requirements. Going by minimum recommended PSU wattage, the 3080 was 50 W more than the 2080 (750 W), and the 4080 was 100 W more than that (850 W).
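      A quick back-of-the-envelope check of those numbers, as a rough Python sketch using only the figures in this comment — the 700 W for the 2080 is just what the “50 W more” comparison implies, not an official spec:

      ```python
      # Minimum recommended PSU wattage per card, using the figures cited above.
      # The 2080 value is only implied by the "50 W more" comparison, so treat it as rough.
      min_psu_watts = {
          "RTX 2080": 700,
          "RTX 3080": 750,
          "RTX 4080": 850,
      }

      my_psu_watts = 650  # e.g. the 650 W Seasonic mentioned earlier in the thread

      for gpu, required in min_psu_watts.items():
          margin = my_psu_watts - required
          verdict = "meets the minimum" if margin >= 0 else f"short by {-margin} W"
          print(f"{gpu}: needs {required} W -> {verdict}")
      ```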

      To add to that, both of these gens of cards can overdraw power when doing graphics-intensive things like gaming, and have been known to cause hard shutdowns in PCs with PSUs rated even slightly above their minimum recommendation. Before these last two gens you could get away with a slightly lower-than-recommended wattage PSU and sacrifice a little performance, but that is definitely no longer the case.
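      To put that overdraw point in rough numbers, here’s a sketch of a simple headroom check — the 1.4× transient multiplier and the example wattages are hypothetical illustrations, not measurements:

      ```python
      # Rough PSU headroom check for short GPU power spikes (transients).
      # The spike multiplier is a hypothetical rule of thumb for illustration only.
      def psu_has_headroom(psu_watts: int, gpu_board_power: int, rest_of_system: int,
                           spike_multiplier: float = 1.4) -> bool:
          """Return True if the PSU should ride out brief GPU power excursions."""
          worst_case = gpu_board_power * spike_multiplier + rest_of_system
          return psu_watts >= worst_case

      # A 650 W unit with a 320 W-class card: 320 * 1.4 + 150 = 598 W -> True, but thin margin.
      print(psu_has_headroom(650, 320, 150))
      # A 750 W unit with a 450 W-class card: 450 * 1.4 + 150 = 780 W -> False, a spike can trip it.
      print(psu_has_headroom(750, 450, 150))
      ```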

      And sure, performance per watt is better on the 3080, but they also run 10+ degrees hotter, and the 4000 series even more so.

      I just hope the 5000 series goes the way of power consumption refinement rather than smashing more chips onto a board or VRAM fuckery like with the 4060. I’d be happy with similar performance on the 5000 series if it was less power hungry.