I watched Nvidia's Computex 2024 keynote and it made my blood run cold - eviltoast
  • FiniteBanjo@lemmy.today · 5 months ago

    I guarantee you that much more power will be used as a result of the data centers regardless of how much efficiency they have per output.

      • FiniteBanjo@lemmy.today · 5 months ago

        Is this a joke? I said it. It was a single sentence, you can’t parse that?

        Total power used by all people WITH AI DATACENTERS

        is greater than

        Total power used by all people WITHOUT AI DATACENTERS

        Even if they’re more efficient, they’re also producing more output and taking more power as a result.

        • MudMan@fedia.io · 5 months ago

          Yes, no, you said that. But since that is a meaningless statement I was expecting some clarification.

          But nope, apparently we have now established that a device existing uses up more power than that device not existing.

          Which is… accurate, I suppose, but also true of everything. Turns out, televisions? Also consume less power if they don’t exist. Refrigerators. Washing machines? Lots less power by not existing.

          So I suppose you’re advocating a return to monke situation, but since I do appreciate having a telephone (which would, in fact, save power by not existing), we’re going to have to agree to disagree.

          • FiniteBanjo@lemmy.today · 5 months ago

            LLMs’ major use is mimicking human beings at the cost of incredible amounts of electricity. Last I checked we have plenty of human beings, and we will all die if our power consumption keeps going up, so it’s absolutely not worth it. Comparing it to literally any useful technology is disingenuous.

            And don’t go spouting some bullshit about it getting better over time, because the datacenters aren’t being built in the hypothetical future when it’s better; they’re being built NOW.

            • MudMan@fedia.io · 5 months ago

              Look, I can suggest you start this thread over and read it from the top, because the ways this doesn’t make much sense have been thoroughly explained.

              Because this is a long one, and if you were going to do that you would have already, I’ll at least summarize the headlines: LLMs exist whether you like them or not. They can be quantized down to more reasonable power usage, and they already run well locally on laptops and tablets, burning just a few watts for just a few seconds (NOW, as you put it).

              They are just one application of ML tech, and they are not useless at all (fuzzy searches with few specific parameters, accessibility features, context-rich explanations of out-of-context images or text), even if their valid uses are misrepresented by both advocates and detractors.

              They are also far from the only commonplace computing task that now uses a lot more power than its equivalent from a few years ago, which is a larger issue than just the popularity of ML apps. Granting that LLMs will exist in any case, running them in a data center is more efficient, and the issue isn’t just “power consumption” but also how the power is generated and what the reclamation of the waste products (in this case excess heat and used water) looks like on the other end.

              I genuinely would not recommend we engage in a back-and-forth breaking that down because, again, that’s what this very long thread has been about already, and a) I have heard every argument the AI moral panic has put forth (and the ones the dumb techbro singularity peddlers have put forth, too), and b) we’d just go down a circular rabbit hole of repeating what we’ve already established here over and over again and certainly not convince each other of anything (because see point A).

              • FiniteBanjo@lemmy.today · 5 months ago

                They exist at the current scale because we’re not regulating them, not whether we like it or not.

                • MudMan@fedia.io · 5 months ago

                  Absolutely not true. Regulations are both in place and in development, and none of them seem like they would prevent any of the applications currently on the market. I know the fearmongering side keeps arguing that a copyright case will stop the development of these, but, to be clear, that’s not going to happen. All it’ll take to mitigate that is an extra line in an EULA, or investing in the dataset of someone who already has that line in theirs (Twitter and Reddit already do; more to come for sure). The industry is actually quite fond of copyright-based training restrictions, as their main effect is most likely to close off open source alternatives and make it so that only Meta, Google, and MS/OpenAI can afford model training.

                  These are super not going away. Regulation is needed, but it’s not restricting or eliminating these applications in any way that would make a dent in the also poorly understood power consumption costs.