Explicit deepfake scandal shuts down Pennsylvania school
  • webghost0101@sopuli.xyz · 2 hours ago

    I have mixed feelings about this prosecution of AI deepfakes.

    Obviously people should have protection against becoming victims of this, and perpetrators should be held accountable.

    But the line “feds are currently testing whether existing laws protecting kids against abuse are enough to shield kids from AI harms” would set an incredibly dangerous precedent, because those laws are mostly designed for actual physical sex crimes.

    As wrong as it is to create and distribute AI-generated sexual imagery of non-consenting people, it is not even remotely as bad as actual rape or distributing real photos.

    • Blueberrydreamer@lemmynsfw.com · 25 minutes ago

      I don’t think you’re on the right track here. There are definitely existing laws in most states regarding ‘revenge porn’, creating sexual media of minors, Photoshop porn, all kinds of things that are very similar to AI-generated deepfakes. In some cases AI deepfakes fall under existing laws, but often they don’t. Or, because of how the law is written, they exist in a legal grey area that will be argued in the courts for years.

      Nowhere is anyone suggesting that making deepfakes should be prosecuted as rape; that’s just complete nonsense. The question is where new laws need to be written, or existing laws updated, to make sure AI porn is treated the same as other illegal uses of someone’s likeness to make porn.

    • rottingleaf@lemmy.world · 1 hour ago

      Creating and distributing anything should be legal if no real person suffers during its creation and if it’s not intended for defamation, forgery, or similar purposes.

      • Alphane Moon@lemmy.worldOP · 29 minutes ago

        You would be fine with AI-gen porn images of your teenage daughter being distributed around the internet?

        • droporain@lemmynsfw.com · 18 minutes ago

          Meanwhile, in reality, check out what she is distributing through Snapchat and OnlyFans… Maybe pursue the actual crimes first; then, if there are spare resources, go after fiction.

          • Todd Bonzalez@lemm.ee · 8 minutes ago

            Big “but what was she wearing?” energy here.

            I don’t give a shit if she’s doing Shein bikini hauls on YouTube. If you use AI to nudify her pictures, you’re manufacturing child pornography and deserve the full consequences for doing that.

            As for OnlyFans, they are quite strict about age requirements. Children aren’t running OF accounts. You just hate women and needed to bring up OF to slut-shame.

  • 0x0@programming.dev · 3 hours ago

    Title is misleading?

    An AI-generated nude photo scandal has shut down a Pennsylvania private school. On Monday, classes were canceled after parents forced leaders to either resign or face a lawsuit potentially seeking criminal penalties and accusing the school of skipping mandatory reporting of the harmful images.

    Classes are planned to resume on Tuesday, Lancaster Online reported.

    So the school is still in operation.

  • NeoNachtwaechter@lemmy.world · 5 hours ago

    researchers concluded that “outlawing all deepfakes is unrealistic and unfeasible”—especially since all the harmful AI-generated images that are already out there are likely to “remain online indefinitely.”

    Just think a little bigger:

    It must be a crime to possess the harmful material.

    Have it on your PC or phone —> go to jail.
    Have it in your online account —> go to jail.
    Be a service provider and have it on your server —> go to jail.

    This will reduce the amount of such material.