I watched Nvidia's Computex 2024 keynote and it made my blood run cold - eviltoast
    • MudMan@fedia.io · 5 months ago

      Yeeeeah, you’re gonna have to break down that math for me.

      Because if an output takes a given amount of processing to generate, and your energy cost per unit of compute is higher, then either we're missing something in that equation or we're breaking the laws of thermodynamics.
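      Just to make that concrete, here's the back-of-the-envelope version; every number is a placeholder, not a measurement of any real hardware:

      ```python
      # Illustrative arithmetic only -- all values are hypothetical placeholders.
      # Energy per output = compute needed for the output * energy per unit of compute.
      compute_per_output = 1.0                 # arbitrary units of compute for one output

      energy_per_compute_efficient = 0.8       # hypothetical joules per unit of compute
      energy_per_compute_less_efficient = 1.2  # hypothetical joules per unit of compute

      # Same output, same compute: higher energy per unit of compute can only mean
      # more total energy, never less.
      print(compute_per_output * energy_per_compute_efficient)       # 0.8
      print(compute_per_output * energy_per_compute_less_efficient)  # 1.2
      ```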

      If the argument is that the industry uses more total energy because they keep the training in-house, or because they do more of the compute at this point in time, that doesn't change things much, does it? The more of those tasks get offloaded to the end user, the more the balance will shift for generating outputs. As for training, that's a fixed cost. Technically, the more you use a model, the more that cost spreads out per query, and it's not like distributing the training load itself among user-level hardware would make its energy cost go down.
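      A rough sketch of that amortization point, again with made-up numbers that aren't real figures for any actual model:

      ```python
      # Hypothetical numbers only -- the point is the shape of the curve, not the values.
      training_energy_kwh = 1_000_000   # one-off training cost (hypothetical)
      inference_energy_kwh = 0.003      # marginal energy per query (hypothetical)

      for num_queries in (1_000_000, 100_000_000, 10_000_000_000):
          per_query = training_energy_kwh / num_queries + inference_energy_kwh
          print(f"{num_queries:,} queries -> {per_query:.6f} kWh per query")
      # 1,000,000 queries -> 1.003000 kWh per query
      # 100,000,000 queries -> 0.013000 kWh per query
      # 10,000,000,000 queries -> 0.003100 kWh per query
      ```

      The fixed training cost shrinks per query as usage grows; it never disappears, and pushing the same training onto user-level hardware wouldn't reduce it.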

      The reality of it is that the entire thing was more than a bit demagogic. People are mad at the energy cost of chatbot search and image generation, but not at the same cost of image generation for videogame upscaling or frame interpolation, even if they’re using the same methods and hardware. Like I said earlier, it’s all carryover from the crypto outrage more than it is anything else.