Man Arrested for Creating Child Porn Using AI

A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting how dangerous and widespread the nefarious use of generative AI has become.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as officers led McCorkle, still in his work uniform, out of the theater in handcuffs.

  • pregnantwithrage@lemmy.world

    You would think so, but you’re basically making a patchwork version of the actual illicit media, so it’s a dark, dark gray area for sure.

      • medgremlin@midwest.social

        Generative AI is basically just really overpowered text/image prediction. It fills in the words or pixels that make the most sense based on the data it has been fed, so to get AI-generated CSAM… it either had to have been fed some amount of CSAM at some point, or it had to be heavily manipulated into generating the images in question.
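
        To make the “overpowered prediction” framing concrete, here is a toy sketch in Python (illustrative only; real generative models are vastly larger and operate on learned representations, not literal word counts): a bigram model that only ever emits whichever word most often followed the current one in its training data.

        ```python
        from collections import Counter, defaultdict

        # Tiny training corpus; real models ingest billions of documents.
        corpus = "the cat sat on the mat and the cat ate the fish".split()

        # Count which word follows which one in the training data.
        following = defaultdict(Counter)
        for prev, nxt in zip(corpus, corpus[1:]):
            following[prev][nxt] += 1

        def predict_next(word):
            """Return the word that most often followed `word` in training."""
            return following[word].most_common(1)[0][0]

        print(predict_next("the"))  # -> "cat", its most frequent successor
        ```

        Everything such a toy model can emit is a direct function of what it was fed, which is this commenter’s point; whether large models can also recombine what they learned is exactly what the reply below disputes.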

        • CommanderCloon@lemmy.ml

          > so to get AI generated CSAM…it had to have been fed some amount of CSAM

          No, actually: it can combine concepts that aren’t present together in its dataset. Does it know what a child looks like? Does it know what porn looks like? Then it can generate child porn without ever having had CSAM in its dataset. See the corn dog comment for the argument.

          Edit: corn dog
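
          To make the compositionality claim concrete, a toy sketch in Python (purely illustrative: the `concepts` table, the 8-dimensional vectors, and `mix` are all invented here, and no real image generator is this simple): a generator steered by a latent vector can be pointed at a blend of two concepts it only ever saw separately.

          ```python
          import numpy as np

          rng = np.random.default_rng(0)

          # Pretend these directions were learned from training data in
          # which "corn" and "dog" never appeared together.
          concepts = {
              "corn": rng.normal(size=8),
              "dog": rng.normal(size=8),
          }

          def mix(*names):
              """Average learned concept vectors into one novel condition."""
              return np.mean([concepts[n] for n in names], axis=0)

          # A conditioning point between "corn" and "dog": a region of
          # latent space no single training example ever occupied.
          print(mix("corn", "dog").round(2))
          ```

          Whether such a blend lands on what a human actually means by “corn dog” is precisely what the next reply disputes.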

          • medgremlin@midwest.social

            Some of the image generators have put up guardrails to prevent generating pictures of nude children, but the creators/managers haven’t been able to eradicate the behavior. There was also a Stanford University investigation showing that most of the really popular image generators had a not-insignificant amount of CSAM in their training data and could fairly easily be manipulated into making more.

            The creators and managers of these generative “AIs” have done little to nothing in the way of curation, and they have routinely tried to fob responsibility off onto their users, the same way Tesla has been doing with its “Full Self-Driving.”

          • emmy67@lemmy.world

            A dumb argument. “Corn” and “dog” were in the dataset, but what it generated isn’t a corn dog like the one we picture when we think “corn dog.”

            Hence it can’t get at what we know a corn dog is.

            You have proved the point for us, since it didn’t generate a corn dog.

        • BonesOfTheMoon@lemmy.world

          Ok, makes sense. Yuck, my skin crawls. I got exposed to CSAM via Twitter years ago; thankfully it was just a shot of nude children I saw and not the actual deed, but I was haunted by it.