FBI Arrests Man For Generating AI Child Sexual Abuse Imagery
  • msage@programming.dev · 6 months ago

    I hear you, and I don’t necessarily disagree with you; I just know that’s not how any of this works.

    Regulations work for big companies, but there isn’t a big company behind this specific case. And the small-time users behind cases like this have already scattered; you can’t stop them.

    It’s like trying to regulate cameras so they can’t store specific images. I get the sentiment, but sorry, no. It’s not that I wouldn’t like that; it’s just not possible.

    • retrospectology@lemmy.world · 6 months ago

      This argument could be applied to anything, though. A lot of people get away with murder; we should still try to do what we can to stop it from happening.

      You can’t sit in every car and force people to wear a seatbelt, but we still have seatbelt laws and regulations for manufacturers.

      • msage@programming.dev · 6 months ago

        Physical things are much easier to regulate than software, let alone software that runs without any server at all.

        We already regulate certain images, and it matters very little.

        The bigger payoff will come from educating the public and accepting that we can’t win every war.

        • retrospectology@lemmy.world · 6 months ago

          So accept defeat from the start? That’s really just a non-starter. AI models run on hardware, they are developed by specific people, their content is distributed by specific individuals, and code bases are hosted on hardware and on specific outlets.

          It really does sound like you’re just making excuses to avoid regulation, rather than having a genuinely good reason to think it’s not even possible to try.