Man Arrested for Creating Child Porn Using AI

A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting both the danger and the growing ubiquity of generative AI being used for nefarious purposes.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, producing dramatic footage of law enforcement leading McCorkle, still in his work uniform, out of the theater in handcuffs.

  • 2xsaiko@discuss.tchncs.de · 3 months ago

    “I’m not sure that increasing the supply of CSAM would necessarily increase demand for CSAM in people that aren’t already pedophiles though.”

    No, but allowing people to organize increases demand, because then those who want CSAM have a place to look for it and ask for it where it’s safe for them to do so, and maybe even pay for it to be created. It’s rather the other way around: the demand increases the supply, if you want to put it like that. I’m not saying lolicon being freely available turns people into pedophiles or anything like that, at all.

    • HelixDab2@lemm.ee · 2 months ago

      I guess where I come down is that, as long as no real people are being harmed (either directly, or because their likeness is being used), I’d rather see it out in the open than hidden. At least if it’s open, you have a better chance of knowing who is immediately unsafe around children, and can more easily exclude those people from positions where they’d have ready access to children (teachers, priests, etc.).

      Unfortunately, there’s also a risk of pedophilia being ‘normalized’ to the point where people let their guard down around pedophiles.