Generative AI’s environmental costs are soaring — and mostly secret

One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.
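To put the “four to five times” figure in perspective, here is a rough back-of-envelope sketch in Python. The 0.3 Wh baseline for a conventional web search and the billion-queries-per-day volume are illustrative assumptions, not figures from the article; only the 4–5× multiplier comes from the excerpt above.

```python
# Back-of-envelope sketch of the "four to five times" energy claim.
# ASSUMPTIONS (not from the article): 0.3 Wh per conventional search,
# and a hypothetical volume of one billion queries per day for scale.

CONVENTIONAL_SEARCH_WH = 0.3          # assumed baseline, watt-hours per query
AI_MULTIPLIERS = (4, 5)               # "four to five times", per the article
QUERIES_PER_DAY = 1_000_000_000       # hypothetical query volume

for m in AI_MULTIPLIERS:
    ai_wh = CONVENTIONAL_SEARCH_WH * m          # energy per AI-driven query
    extra_wh = ai_wh - CONVENTIONAL_SEARCH_WH   # added energy vs. conventional
    # 1 GWh = 1e9 Wh, so extra Wh/query * 1e9 queries = that many GWh/day
    extra_gwh_per_day = extra_wh * QUERIES_PER_DAY / 1e9
    print(f"{m}x: {ai_wh:.2f} Wh per query, "
          f"+{extra_gwh_per_day:.2f} GWh/day at {QUERIES_PER_DAY:,} queries")
```

Under those assumptions the per-query cost rises from 0.3 Wh to 1.2–1.5 Wh, i.e. roughly an extra 0.9–1.2 GWh per day at a billion queries; the point of the sketch is that a small per-query multiplier compounds quickly at web-search scale.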

  • Dr. Dabbles@lemmy.world · 9 months ago

    Any power saved by hardware design improvements will be consumed by adding more transistors. You will not be seeing a power consumption decrease. Manufacturers of this hardware have been giving talks for the past two years calling for literal power plants to be built co-resident with datacenters.