How would you feel about a law that restricts the ability to purchase hardware used for training AI?

Currently, AI models are trained using GPUs. In the future, though, generative AI will probably require its own specialized ASICs to achieve the best performance. The same shift happened with Bitcoin mining a few years ago, and it's also why big tech companies are designing their own CPUs now.

Since there are only a few companies on the planet capable of producing these chips in bulk, the government could easily place restrictions on the purchase of AI hardware. This would control who has access to the best AI.

Only the government and a few permitted parties would have access to the best AI. Everyone else would use weaker AI that, while still good enough for most people, could be detected by the government. The government could use its superior models to easily detect whether a post is AI-generated, for example, and provide that insight as a service to citizens.

Effectively, the government becomes the sole purveyor of truth, as opposed to that power being in the hands of whoever can afford the biggest computer.

  • Gigan@lemmy.world · 9 months ago

    Effectively, the government becomes the sole purveyor of truth

    Straight out of a dystopian novel. No thanks

    • nodsocket@lemmy.world (OP) · 9 months ago

      as opposed to that power being in the hands of whoever can afford the biggest computer.

      • Excrubulent@slrpnk.net · 9 months ago

        If they can afford the biggest computer, they can afford the licensing and legislative capture required to deal with your proposed restriction.

        You don’t seem to understand that corporations already rule the world. The government is not an effective check on them. Legislation gets written to cater to them. Yours would be no different. Even if it were a good idea on paper, it wouldn’t survive the lobbyists.

      • FaceDeer@kbin.social · 9 months ago

        Do you think the hardware would be free in this scenario? It adds restrictions, it doesn’t remove any.

  • nivenkos@lemmy.world · 9 months ago

    They already tried this sort of nonsense with the encryption controls in the 90s.

    No thanks.

  • peto (he/him)@lemm.ee · 9 months ago

    Only the government and a few permitted parties

    So a government and anyone who can pay a government’s fee. This isn’t really fixing the problem, just putting an extra barrier in the way of any smaller org that wants to get involved.

    Never mind the issue that there isn’t a government that can be trusted. Do you think the world is going to be improved by making perception manipulating tech the private weapon of whatever bunch of psychopaths happen to rule at the time?

    • nodsocket@lemmy.world (OP) · 9 months ago

      Would you rather let anyone with the money buy a nuke, or only let governments have them? At least this way there are fewer psychopaths to worry about.

      • peto (he/him)@lemm.ee · 9 months ago

        Yeah, totally the same thing. Utterly comparable, you clearly fully understand what it is capable of and the risks it poses.

        I also respect your knowledge of nuclear weapons and the reasons why every billionaire doesn’t have a home defence warhead.

      • gzrrt@kbin.social · 9 months ago

        I’d say LLMs are pretty comparable to an operating system (i.e., something anyone can buy, use and develop without any outside interference) and not comparable at all to nuclear weapons.

  • fuckwit_mcbumcrumble@lemmy.world · 9 months ago (edited)

    People would just buy gaming GPUs for “gaming”. Then, whoops, they end up being used for AI. Just like they’re currently doing in China.

    • nodsocket@lemmy.world (OP) · 9 months ago (edited)

      GPUs for gaming won’t be regulated because in the future, AI will require specialized ASICs to achieve the best performance. Only those ASICs would be regulated.

      • key@lemmy.keychat.org · 9 months ago

        It’s more a matter of cost efficiency than performance. That’s especially critical for cryptomining, where your only path to profit is competing against the cost of electricity on an hour-to-hour basis. That’s much less the case for training AI. They’d still use GPUs; they’d just spend more money on electricity and cooling than is ideal, furthering climate change and water insecurity. If you want to regulate AI, it would be a lot more efficient to just regulate AI.

  • OsrsNeedsF2P@lemmy.ml · 9 months ago (edited)

    Ignoring the fact that there are multiple governments in the world, how could you even detect whether something was made with AI? An artist who touches up their art with Stable Diffusion would probably never be “caught”. The way Stable Diffusion blends and alters images in Krita isn’t terribly different from the rest of the Krita toolset; it’s just faster and easier to control.

  • brygphilomena@lemmy.world · 9 months ago

    Based on your replies, it doesn’t seem like you want a discussion of the idea; you just want people to tell you how good it is.

    Truth is, it’s not. It’s a half-thought-out idea that can’t work. ASICs aren’t so unique that only a handful of chip manufacturers can make them. Existing companies can move quickly because they already have the infrastructure and processes in place, but other chip manufacturers can enter the space.

    This assumes there is no black market or secondary market for ASICs.

    This assumes that one government’s restrictions would be effective when there are chip companies in more countries than just the one imposing them.

    Restricting hardware also assumes that today’s hardware (or the ASICs of tomorrow) is going to remain the technology used for AI. It also hampers R&D on this type of hardware.

    It creates a barrier of entry to startups and smaller business that may use generative AI in positive ways.

    It implies that the use of generative AI is inherently dangerous and needs to be regulated.

    It assumes that consumer hardware wouldn’t be able to match ASICs. ASICs are certainly fast, but a large enough cluster of consumer GPUs could match the processing power of a single ASIC.

    It assumes the government is good, truthful, effective, honest, and moral.

    It assumes that truth is a black and white construct.

    It assumes that there will be a process to check, identify, communicate, and regulate AI generated information.

  • LibreFish@lemmy.world · 9 months ago

    How would you feel about a law that restricts the ability to purchase hardware used for training AI?

    No

    Effectively, the government becomes the sole purveyor of truth

    Extra no

  • Kiernian@lemmy.world · 9 months ago

    No limiting consumer access to computer hardware.

    Just no.

    We still haven’t recovered from the early crypto crap with GPUs.

    Fix the environmental rules for corpos so they can’t just stand up data farms and simultaneously wreak havoc on the grid and the environment without paying the full cost to offset the damage they’re doing.

  • Admiral Patrick@dubvee.org · 9 months ago (edited)

    I’d rather governments just ban generative “AI”. 🤷‍♂️

    I’m sick of hearing about it, sick of it being shoved into everything, sick of the hype, and just…all of it.

    • Rhynoplaz@lemmy.world · 9 months ago (edited)

      I’d rather see a ban on Organic Intelligence.

      It’s accomplished quite a bit over the years, but it’s also led us to the miserable situation we are in now.