Generative AI’s environmental costs are soaring — and mostly secret

One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.
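For scale, those figures translate roughly as follows. This is a minimal back-of-the-envelope sketch, assuming an average US household consumes about 10,500 kWh of electricity per year and a conventional web search costs on the order of 0.3 Wh per query; neither assumption comes from the article.

```python
# Rough unit conversions for the figures quoted above.
# Assumptions (not from the article): an average US household uses
# about 10,500 kWh of electricity per year, and a conventional web
# search costs roughly 0.3 Wh per query.

HOUSEHOLD_KWH_PER_YEAR = 10_500   # assumed average US household
CONVENTIONAL_SEARCH_WH = 0.3      # assumed energy per ordinary web search

homes = 33_000
annual_gwh = homes * HOUSEHOLD_KWH_PER_YEAR / 1e6        # kWh -> GWh
print(f"33,000 homes is roughly {annual_gwh:.0f} GWh per year")

low, high = 4 * CONVENTIONAL_SEARCH_WH, 5 * CONVENTIONAL_SEARCH_WH
print(f"A generative-AI search would be about {low:.1f}-{high:.1f} Wh per query")
```

On those assumptions, 33,000 homes works out to roughly 350 GWh of electricity per year, and a generative-AI search to a little over 1 Wh per query.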

  • Flying Squid@lemmy.world · 9 months ago

    You have a very strange definition of ‘modest.’ Because I would say one household’s worth of electricity is modest and 33,000 is a fuckload. Or did I miss something and we’re running houses off of AA batteries these days?

    • FaceDeer@kbin.social · edited · 9 months ago

      OpenAI is a global service. People all over the world are using it and doing a massive amount of work with it. According to this page, there are 180.5 million users, and openai.com got 1.6 billion visits in December last year. It is extremely modest on that scale.

      You need to account for what’s being done with the resources when trying to judge whether they are excessive (a rough per-visit calculation is sketched after this thread).

      • Flying Squid@lemmy.world · 9 months ago

        Do I have to account for that? Or can I say that 33,000 households, about 10,000 more than there are in this town of almost 60,000 people where I live, is a whole lot of energy regardless?

        And I’m really more concerned about the water anyway. You don’t seem to be, which is odd considering what I already posted about how much water is being used in Iowa and how little fresh water there will be available in the U.S. in 50 years. I guess because you’ll likely be dead by then anyway?
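The scale argument above can be sanity-checked with quick arithmetic. This is a minimal sketch, assuming (as before, and not stated in the thread) that an average US household uses about 10,500 kWh of electricity per year; the 33,000-homes, 180.5 million-user and 1.6 billion-visit figures are the ones quoted in the posts.

```python
# Back-of-the-envelope check of the "modest at scale" argument.
# Assumption (not from the thread): an average US household uses
# about 10,500 kWh of electricity per year. The other figures are
# quoted in the posts above.

HOUSEHOLD_KWH_PER_YEAR = 10_500

total_kwh_per_year = 33_000 * HOUSEHOLD_KWH_PER_YEAR   # ChatGPT's estimated draw
kwh_per_month = total_kwh_per_year / 12

visits_per_month = 1.6e9      # openai.com visits, December
users = 180.5e6               # reported user count

wh_per_visit = kwh_per_month * 1000 / visits_per_month
kwh_per_user_per_year = total_kwh_per_year / users

print(f"~{wh_per_visit:.0f} Wh per visit")                # about 18 Wh
print(f"~{kwh_per_user_per_year:.1f} kWh per user/year")  # about 1.9 kWh
```

On those numbers, the per-visit cost is on the order of 18 Wh and the per-user cost is under 2 kWh a year, which is the sense in which the total can be called modest; the sketch says nothing about whether the aggregate draw, or the water use raised in the reply, is acceptable.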