Using GPT-4 to generate 100 words consumes up to 3 bottles of water — AI data centers also raise power and water bills for nearby residents

Net-zero emission goals went out the window with AI.

  • iAmTheTot@sh.itjust.works
    2 months ago

    My friend, you are naive at best if you think AI data centers are using closed loop water cooling. Look up evaporative cooling towers. It’s “consumed” in the sense that it is evaporated.

    • Zikeji@programming.dev
      2 months ago

      I specifically avoided saying they did because I wasn’t knowledgeable on the topic. But I agree, I could equally be accused of being disingenuous by phrasing it in a way that could lead people to assume they use closed loops.

      I did look those up, and while evaporative cooling isn’t the only method used, it also doesn’t evaporate all the water on each pass, only a portion of it (granted, “a portion” is all I found at a quick look, which isn’t actually that useful; a rough per-pass estimate is sketched below).

      I do agree though, the water usage is excessive, and even though that water only “changes form”, it’s still removed from a water source and only some of it may make its way back in.
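
      For what it’s worth, a simple energy balance puts a rough number on “a portion”. This is only a sketch: it assumes every bit of heat is rejected by evaporation and a typical ~10 °C temperature drop (the “range”) across the tower, both of which vary in practice.

      ```python
      # Rough estimate of the fraction of circulating water that evaporates per
      # pass through a cooling tower, from a simple energy balance.
      # ASSUMPTIONS: all heat rejection is evaporative, and the water is cooled
      # by about 10 °C per pass; real towers differ.

      CP_WATER_J_PER_KG_K = 4186.0   # specific heat of liquid water
      LATENT_HEAT_J_PER_KG = 2.26e6  # latent heat of vaporization of water
      RANGE_K = 10.0                 # assumed temperature drop across the tower

      evaporated_fraction = CP_WATER_J_PER_KG_K * RANGE_K / LATENT_HEAT_J_PER_KG
      print(f"~{evaporated_fraction:.1%} of the circulating water evaporates per pass")
      # prints ~1.9% -- the rest recirculates (minus drift and blowdown losses)
      ```

      So under those assumptions each pass evaporates on the order of a couple of percent of the circulating water; the remainder goes back around the loop.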

    • Pup Biru@aussie.zone
      2 months ago

      and it’s still absolute crap… the heat produced by 100 words of GPT inference is negligible - it CERTAINLY doesn’t take 3 L of water evaporating to cool it
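
      A quick back-of-envelope check of that claim, as a sketch only: it uses water’s latent heat of vaporization (~2.26 MJ/kg) and a hypothetical 1 Wh of electricity for a 100-word response, since published per-response energy estimates vary widely.

      ```python
      # Back-of-envelope: heat absorbed by evaporating 3 L of water vs. an
      # assumed energy budget for generating a 100-word response.

      LATENT_HEAT_J_PER_KG = 2.26e6  # latent heat of vaporization of water
      WATER_KG = 3.0                 # 3 L of water is roughly 3 kg
      J_PER_KWH = 3.6e6

      heat_absorbed_kwh = WATER_KG * LATENT_HEAT_J_PER_KG / J_PER_KWH
      print(f"Evaporating 3 L of water absorbs ~{heat_absorbed_kwh:.2f} kWh of heat")

      # ASSUMPTION: per-response inference energy; estimates for a short reply
      # range from well under 1 Wh to a few Wh, so treat this as illustrative.
      inference_wh = 1.0
      ratio = heat_absorbed_kwh * 1000 / inference_wh
      print(f"That is roughly {ratio:.0f}x the assumed {inference_wh:.0f} Wh of inference energy")
      ```

      Under those assumptions, evaporating 3 L would absorb on the order of a thousand times more heat than the inference itself produces, which is the point being made: direct evaporative cooling of the servers alone doesn’t account for litres of water per short response.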