Generative AI’s environmental costs are soaring — and mostly secret

One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.

  • MrMcGasion@lemmy.world · 9 months ago

    Yeah, but LLMs like ChatGPT aren’t where that advancement is being made. LLMs are driving investment in the technology, but they’re a mostly useless investor target that just happens to run on the same hardware that can be used for genuinely useful AI-powered research. Sure, it’s pushing hardware advancement forward maybe 10-15 years faster than it might have otherwise happened, but it’s coming with a lot of wasteful baggage as well, because LLMs are the golden boy investors want to throw money at.