In Spain, dozens of girls are reporting AI-generated nude photos of them being circulated at school: ‘My heart skipped a beat’

The police investigation remains open. The photo of one of the minors included a fly — the logo of Clothoff, the application presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”

  • duxbellorum@lemm.ee · +4/−7 · edited · 1 year ago

    Why? They didn’t take or share any nudes, and nobody believes they did.

    This is only a nightmare if an ignorant adult tells them that it is.

    • 0x815@feddit.de (OP) · +5/−2 · 1 year ago

      @duxbellorum

      Why? They didn’t take or share any nudes, and nobody believes they did.

      This is only a nightmare if an ignorant adult tells them that it is.

      So you don’t have children, right?

    • ParsnipWitch@feddit.de · +4/−4 · edited · 1 year ago

      Did your picture get taken and shared when you were a teenager? Did you get heavily sexualized and harassed? Believe me, it feels like a nightmare even if no one is telling you that it should feel like a nightmare.

      Take your “sexual harassment is only bad to teenage girls if you tell them” shit elsewhere.