‘Thirsty’ ChatGPT uses four times more water than previously thought

cross-posted from: https://lemmy.dbzer0.com/post/32023985

Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes one 500 ml bottle of water. It uses 140 Wh of energy, enough for 7 full charges of an iPhone Pro Max.
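A quick sanity check of those figures, as a minimal Python sketch that uses only the numbers quoted above (no outside data):

```python
# Back-of-envelope check of the figures quoted in the post (a sketch, not measured data).
# Every input below comes straight from the article's claims; nothing else is assumed.

WATER_PER_EMAIL_ML = 500      # one 500 ml bottle per 100-word email
ENERGY_PER_EMAIL_WH = 140     # 140 Wh per 100-word email
IPHONE_CHARGES_PER_EMAIL = 7  # "enough for 7 full charges of an iPhone Pro Max"

# Battery capacity implied by the phone comparison
implied_charge_wh = ENERGY_PER_EMAIL_WH / IPHONE_CHARGES_PER_EMAIL   # = 20 Wh

# Water cost per unit of energy under these numbers
water_per_wh_ml = WATER_PER_EMAIL_ML / ENERGY_PER_EMAIL_WH           # ~3.6 ml per Wh

print(f"Implied energy per full phone charge: {implied_charge_wh:.0f} Wh")
print(f"Implied water use per Wh of compute:  {water_per_wh_ml:.1f} ml")
```

Under those numbers the comparison assumes roughly 20 Wh per full charge, which is in the right ballpark for a large phone once charging losses are included.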

  • SlopppyEngineer@lemmy.world · 1 month ago
    That’s physics. For a closed-loop system, the cooling has to be done with air, which is less efficient, and that doesn’t work so well in, say, Texas in summer (a rough sketch of that scaling follows after the thread).

    • curbstickle@lemmy.dbzer0.com · 1 month ago

      Making it a stupid design, yes.

      Edit: putting your massive heat-generating data center (beyond what most DCs will do) in support of AI in Texas is stupid.

      Closed-loop systems absolutely have other options in design, which I’ve mentioned in another comment chain.

      As terrible as they are as companies, Meta, Apple, and others have made much more appropriate decisions - like locating their big-load DCs in cold climates and partnering with the locale to make use of the heat being generated, removing the need to burn power on those tasks. That makes them not only efficient designs but, compared to putting a DC in Texas like a dipshit (or LA, or NV, or anywhere else with a hot climate), better for the environment overall.

      Yes, it’s a stupid design.
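To put SlopppyEngineer’s physics point in concrete terms: with closed-loop (dry) cooling, heat is rejected to ambient air, and capacity scales with the gap between the exchanger temperature and the outside air. The sketch below uses made-up airflow and exhaust-temperature figures purely to show that scaling; it does not describe any real data center.

```python
# Rough sketch of dry (air-side) heat rejection: Q = m_dot * cp * dT.
# All numbers are illustrative assumptions, chosen only to show how capacity
# falls as ambient temperature rises.

CP_AIR = 1005.0        # J/(kg*K), specific heat of air at constant pressure
AIR_DENSITY = 1.2      # kg/m^3, near sea level
AIRFLOW_M3_S = 100.0   # assumed fan airflow through the heat exchanger
COOLANT_TEMP_C = 45.0  # assumed temperature of the air leaving the exchanger

def heat_rejected_kw(ambient_c: float) -> float:
    """Sensible heat rejected to ambient air at a given outdoor temperature."""
    mass_flow = AIRFLOW_M3_S * AIR_DENSITY            # kg/s of air moved by the fans
    delta_t = max(COOLANT_TEMP_C - ambient_c, 0.0)    # K; no rejection if ambient >= exhaust
    return mass_flow * CP_AIR * delta_t / 1000.0      # kW

for ambient in (10, 25, 35, 42):   # a cold climate vs. a Texas summer afternoon
    print(f"ambient {ambient:2d} C -> about {heat_rejected_kw(ambient):6.0f} kW rejected")
```

With the same fans and the same exchanger, the example sheds roughly a tenth of the heat at 42 °C that it does at 10 °C, which is the whole argument for siting hot, dense AI loads somewhere cold.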