Explicit deepfake scandal shuts down Pennsylvania school - eviltoast
  • krashmo@lemmy.world
    1 month ago

    There’s a legitimate discussion to be had about harm reduction here. You’re approaching this topic with an all-or-nothing mindset, but there’s quite a bit of research indicating that’s not really how it works in practice. Specifically regarding child pornography, the argument goes that prohibiting artificial material leads to an increase in the production of actual child pornography, which means more real children are harmed than would be if artificial forms were not controlled in the same fashion. The same logic could apply to revenge porn, stolen selfies, or whatever else we’re calling the kind of material this article describes. It may not be an identical scenario, but I still think it’s fair to say that an AI-generated image is not as damaging as a real one.

    That is not to say that nothing should be done in these situations. I haven’t decided what I think the right move is given the options in front of us, but I think there’s quite a bit more nuance here than your comment indicates.

    • MagicShel@lemmy.zip
      1 month ago

      I think this is probably a really good point. I have no issue with AI-generated images per se, although obviously if they are used for something illegal such as harassment or defamation, those acts remain illegal.

      I’m of two minds when it comes to AI nudes of minors. The first is that if someone wants that and no actual person is harmed, I really don’t care. Let me add a caveat here: I suspect there are people out there who, if inundated with fake CP, will be driven to ideation about actual child abuse. And I think there is real harm done to that person, and potentially to children if they go on to enact those fantasies. However, I think we need more data before I’m willing to draw a firm conclusion.

      But the second is that a proliferation of AI CP means it will be very difficult to tell fakes from evidence of actual child abuse. For that reason alone, I think any distribution of CP, whether real or merely realistic, must be illegal, because at a minimum it wastes resources that could be used to help actual children and find their abusers.

      So, absent further information, I think whatever a person wants to generate for themselves in private is just fine, but as soon as it starts to be distributed, I think it must be illegal.