Generative AI’s environmental costs are soaring — and mostly secret

One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.

  • FaceDeer@kbin.social · 9 months ago

    Not to mention that increased usage of AI means AI is producing more useful work in the process, too.

    The people running these AIs are paying for the electricity they’re using. If the AIs weren’t doing enough work to be worth that expense, they wouldn’t be running them. If the general goal is “reduce electricity usage,” then there’s no need to target AI, or any other specific use for that matter. Just make electricity in general cost more, and usage will go down. It’s basic market forces.

    I suspect that most people raging about AIs wouldn’t want their energy bill to shoot up, though. They want everyone else to pay for their preferences.

    • El Barto@lemmy.world · (edited) · 9 months ago

      Not that you don’t have a point, but there’s this theory, paradox, or law or something (it escapes my memory at the moment) which says that when technology advances, so do requirements. So what’s going to happen is that when hardware is 100x more efficient, the fucking corporations will use 100x more, and nothing gets solved on the pollution front.

      I am betting on renewable energy as the best way to combat the environmental issues we’re facing.

      Also, “making electricity cost more” doesn’t sound like basic market forces — that’s an intervention, not the market.